Top 23 Autograd Open-Source Projects
-
interviews.ai
It is my belief that you, the postgraduate students and job-seekers for whom this book is primarily meant, will benefit from reading it; it is my hope that even the most experienced researchers will find it fascinating as well.
-
pennylane
PennyLane is a cross-platform Python library for quantum computing, quantum machine learning, and quantum chemistry. Train a quantum computer the same way as a neural network.
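The "train a quantum computer the same way as a neural network" claim rests on quantum gradient rules such as the parameter-shift rule. A minimal sketch in plain Python of that rule for a single RX rotation, where the expectation ⟨Z⟩ after RX(θ) on |0⟩ is cos(θ) (an analytic toy for illustration, not PennyLane's API; the function names are assumptions):

```python
import math

def expval_z(theta):
    # Analytic expectation <Z> after RX(theta) applied to |0>: cos(theta)
    return math.cos(theta)

def parameter_shift_grad(f, theta, shift=math.pi / 2):
    # Parameter-shift rule: for gates like RX, the EXACT gradient is
    # (f(theta + pi/2) - f(theta - pi/2)) / 2 -- no finite-difference error.
    return (f(theta + shift) - f(theta - shift)) / 2
```

For ⟨Z⟩ = cos(θ) the rule returns exactly −sin(θ), which is why gradient-descent training of circuit parameters works just as it does for neural-network weights.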
-
machine_learning_refined
Notes, examples, and Python demos for the 2nd edition of the textbook "Machine Learning Refined" (published by Cambridge University Press).
-
Arraymancer
A fast, ergonomic and portable tensor library in Nim with a deep learning focus for CPU, GPU and embedded devices via OpenMP, Cuda and OpenCL backends
-
DL4S
Accelerated tensor operations and dynamic neural networks based on reverse mode automatic differentiation for every device that can run Swift - from watchOS to Linux
-
corgi
A neural network, and tensor dynamic automatic differentiation implementation for Rust. (by patricksongzy)
Project mention: Understanding GPT: How To Implement a Simple GPT Model with PyTorch | dev.to | 2024-05-31

In this guide, we provided a step-by-step explanation of how to implement a simple GPT (Generative Pre-trained Transformer) model using PyTorch: creating a custom dataset, building the GPT model, training it, and generating text. This hands-on implementation demonstrates the fundamental concepts behind the GPT architecture and serves as a foundation for more complex applications. With this basic understanding of how to create, train, and use a simple GPT model, you can experiment with different configurations, larger datasets, and additional techniques to improve performance, and apply transformer models to a range of NLP tasks. The approach follows the transformer architecture of Vaswani et al. (2017), whose self-attention mechanism processes sequences more effectively than earlier approaches, and builds on modern optimizers (Kingma & Ba, 2015).
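The self-attention step at the heart of the GPT architecture described above can be sketched without any framework. A minimal sketch in plain Python (not code from the article; `attention` and `softmax` are illustrative names), computing scaled dot-product attention over lists of row vectors:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    Q, K, V are lists of row vectors, one per token."""
    d = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        row = [sum(w * v[j] for w, v in zip(weights, V)) for j in range(len(V[0]))]
        out.append(row)
    return out
```

In a real GPT, Q, K, and V are learned linear projections of the token embeddings and a causal mask is applied to the scores, but the core computation is exactly this weighted average.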
See also https://github.com/unifyai/ivy which I have not tried but seems along the lines of what you are describing, working with all the major frameworks
Project mention: MatX: Efficient C++17 GPU numerical computing library with Python-like syntax | news.ycombinator.com | 2023-10-03

I think a comparison to PyTorch, TensorFlow and/or JAX is more relevant than a comparison to CuPy/NumPy.
And then maybe also a comparison to Flashlight (https://github.com/flashlight/flashlight) or other C/C++ based ML/computing libraries?
Also, there is no mention of it, so I suppose this does not support automatic differentiation?
Project mention: PennyLane: Python library for differentiable programming of quantum computers | news.ycombinator.com | 2024-05-07
It is a small DSL written using macros at https://github.com/mratsim/Arraymancer/blob/master/src/array....
Nim has pretty great meta-programming capabilities, and Arraymancer employs some cool features like emitting CUDA kernels on the fly using standard templates, depending on the backend!
Project mention: Owl project (OCaml scientific computing) formally concluded | news.ycombinator.com | 2024-02-19
Project mention: Custos – A minimal OpenCL, CUDA, Vulkan and host CPU array manipulation engine | news.ycombinator.com | 2024-03-01
Project mention: Yagrad – 100 SLOC autograd engine with complex numbers and fixed DAG | news.ycombinator.com | 2024-03-17
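Yagrad's pitch of a ~100 SLOC autograd engine is plausible because reverse-mode autodiff needs little machinery: record a DAG of operations during the forward pass, then walk it in reverse topological order accumulating gradients. A minimal scalar sketch in plain Python (micrograd-style; an illustrative assumption, not Yagrad's actual code, which also supports complex numbers):

```python
class Value:
    """A scalar that records its parents so gradients can flow backward."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then propagate gradients in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```

With just `+` and `*` this already differentiates any polynomial; tensor engines like PyTorch apply the same recipe with arrays and many more primitives.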
Autograd related posts
-
Understanding GPT: How To Implement a Simple GPT Model with PyTorch
-
Building a Simple Chatbot using GPT model - part 2
-
Tinygrad 0.9.0
-
PyTorch 2.3: User-Defined Triton Kernels, Tensor Parallelism in Distributed
-
Clasificador de imágenes con una red neuronal convolucional (CNN)
-
penzai: JAX research toolkit for building, editing, and visualizing neural nets
-
Shape Typing in Python
Index
What are some of the best open-source Autograd projects? This list, ordered by GitHub stars, will help you find them:
# | Project | Stars |
---|---|---|
1 | Pytorch | 78,852 |
2 | ivy | 14,027 |
3 | flashlight | 5,174 |
4 | MegEngine | 4,731 |
5 | interviews.ai | 4,437 |
6 | Deep Java Library (DJL) | 3,908 |
7 | pennylane | 2,141 |
8 | dfdx | 1,646 |
9 | machine_learning_refined | 1,605 |
10 | Arraymancer | 1,314 |
11 | awesome-jax | 1,322 |
12 | owl | 1,190 |
13 | pytorch_sparse | 965 |
14 | norse | 622 |
15 | neograd | 230 |
16 | MyGrad | 186 |
17 | bottle | 147 |
18 | vtl | 142 |
19 | DL4S | 102 |
20 | custos | 62 |
21 | vim-autograd | 26 |
22 | yagrad | 25 |
23 | corgi | 23 |