The Elements of Differentiable Programming

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • ForwardDiff.jl

    Forward Mode Automatic Differentiation for Julia

  • You seem somewhat obsessed with the idea that reverse-mode autodiff is not the same technique as forward-mode autodiff. It makes you... angry? It seems like such a trivial thing to act like a complete fool over.

    What's up with that?

    Anyway, here's a forward-differentiation package with a file that might interest you:

    https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/...

  • ceres-solver

    A large scale non-linear optimization library

  • I can't reply to the person saying Julia is the only one, but there are others.

    Ceres uses dual numbers:

    https://github.com/ceres-solver/ceres-solver/blob/master/inc...

    This library from Google is used everywhere in robotics, so it's hardly some backwater little side project.

    So does the C++ autodiff library. (A minimal dual-number sketch of the same idea appears after this list.)

  • autodiff

    automatic differentiation made easier for C++

  • Pytorch

    Tensors and Dynamic neural networks in Python with strong GPU acceleration

  • Sure, right here: https://github.com/pytorch/pytorch/blob/main/torch/autograd/...

    Here's the documentation: https://pytorch.org/tutorials/intermediate/forward_ad_usage....

    > When an input, which we call “primal”, is associated with a “direction” tensor, which we call “tangent”, the resultant new tensor object is called a “dual tensor” for its connection to dual numbers[0].

    (A short usage sketch of this forward-mode API appears after this list.)

  • jax

    Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

  • The dual numbers exist just as surely as the real numbers and have been in use for well over 100 years.

    https://en.m.wikipedia.org/wiki/Dual_number

    PyTorch has had them for many years:

    https://pytorch.org/docs/stable/generated/torch.autograd.for...

    JAX implements them and uses them exactly as stated in this thread. (A minimal jax.jvp example appears after this list.)

    https://github.com/google/jax/discussions/10157#discussionco...

    As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
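
For readers following the ForwardDiff.jl and Ceres comments above, here is a minimal, illustrative sketch of forward-mode automatic differentiation with dual numbers in plain Python. The class name Dual and the example function f are assumptions for illustration; ForwardDiff.jl and Ceres implement the same idea in Julia and C++, respectively, with far more care.

    # Illustrative dual-number type: a + b*eps with eps**2 == 0.
    # The coefficient of eps carries the derivative through every operation.
    class Dual:
        def __init__(self, real, dual=0.0):
            self.real = real  # primal value
            self.dual = dual  # tangent (derivative) value

        def _wrap(self, other):
            return other if isinstance(other, Dual) else Dual(other)

        def __add__(self, other):
            other = self._wrap(other)
            return Dual(self.real + other.real, self.dual + other.dual)

        __radd__ = __add__

        def __mul__(self, other):
            # The product rule falls out of (a + b*eps) * (c + d*eps).
            other = self._wrap(other)
            return Dual(self.real * other.real,
                        self.real * other.dual + self.dual * other.real)

        __rmul__ = __mul__


    def f(x):
        return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2


    # Seed the input with tangent 1.0 to get f'(3.0) alongside f(3.0).
    y = f(Dual(3.0, 1.0))
    print(y.real, y.dual)  # 33.0 29.0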
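
And here is a short usage sketch of the PyTorch forward-mode API referenced above (torch.autograd.forward_ad), following the pattern in the linked tutorial. The function f and the input values are assumptions chosen for illustration.

    import torch
    import torch.autograd.forward_ad as fwAD


    def f(x):
        return (x ** 2).sum()


    primal = torch.tensor([1.0, 2.0, 3.0])   # the "primal" input
    tangent = torch.tensor([1.0, 0.0, 0.0])  # the "tangent" direction

    with fwAD.dual_level():
        # Associating a primal with a tangent yields a "dual tensor".
        dual_x = fwAD.make_dual(primal, tangent)
        dual_out = f(dual_x)
        # unpack_dual returns (primal, tangent); the tangent is the JVP.
        jvp = fwAD.unpack_dual(dual_out).tangent
        print(jvp)  # grad of sum(x^2) is 2*x, dotted with the tangent: 2.0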
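
Finally, a minimal example of the equivalent forward-mode call in JAX, via jax.jvp, as discussed in the linked JAX thread. Again, the function and values are illustrative assumptions.

    import jax
    import jax.numpy as jnp


    def f(x):
        return jnp.sum(x ** 2)


    x = jnp.array([1.0, 2.0, 3.0])
    v = jnp.array([1.0, 0.0, 0.0])  # tangent direction

    # jax.jvp evaluates f and its Jacobian-vector product in one forward pass.
    primal_out, tangent_out = jax.jvp(f, (x,), (v,))
    print(primal_out, tangent_out)  # 14.0 2.0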


Related posts

  • The Julia language has a number of correctness flaws

    19 projects | news.ycombinator.com | 16 May 2022
  • Maxtext: A simple, performant and scalable Jax LLM

    10 projects | news.ycombinator.com | 23 Apr 2024
  • Deep Learning in Javascript

    1 project | dev.to | 1 Apr 2024
  • Julia 1.10 Released

    15 projects | news.ycombinator.com | 27 Dec 2023
  • Apple releases MLX for Apple Silicon

    4 projects | /r/LocalLLaMA | 8 Dec 2023