Top 11 Rust ONNX Projects
- burn: A comprehensive dynamic deep learning framework built in Rust, with flexibility, compute efficiency, and portability as its primary goals.
- wonnx: A WebGPU-accelerated ONNX inference runtime written 100% in Rust, ready for native and the web.
Project mention: 3 years of fulltime Rust game development, and why we're leaving Rust behind | news.ycombinator.com | 2024-04-26
You can use libtorch directly via `tch-rs`, and at present I'm porting over to Burn (see https://burn.dev), which appears incredibly promising. My impression is that it's in a good place, though of course not close to the ecosystem of Python/C++. At the very least, I've gotten my nn models training and running without too much difficulty. (I'm moving to Burn for the thread safety - their `Tensor` impl is `Sync` - libtorch doesn't have such a guarantee.)
Burn has Candle as one of its backends, which I understand is also quite popular.
Tract is the most well known ML crate in Rust, which I believe can compile to WASM - https://github.com/sonos/tract/. Burn may also be useful - https://github.com/burn-rs/burn.
Project mention: Intel CEO: 'The entire industry is motivated to eliminate the CUDA market' | news.ycombinator.com | 2023-12-14
The two I know of are IREE and Kompute[1]. I'm not sure how much momentum the latter has; I don't see it referenced much. There's also a growing body of work that uses Vulkan indirectly through WebGPU. This currently lags in performance due to the lack of subgroups and cooperative matrix multiplication, but I see that gap closing. There I think wonnx[2] has the most momentum, but I am aware of other efforts.
[1]: https://kompute.cc/
[2]: https://github.com/webonnx/wonnx
To solve this, we built a native extension in Edge Runtime that enables using ONNX Runtime via its Rust interface. This was made possible thanks to an excellent Rust wrapper called `ort`.
Project mention: Small inference runtime for deep neural networks | news.ycombinator.com | 2023-07-23
Rust ONNX related posts
- Small inference runtime for deep neural networks
- Are there any ML crates that would compile to WASM?
- WebGPU ONNX inference runtime written in Rust
- rustformers/llm: Run inference for Large Language Models on CPU, with Rust 🦀🚀🦙
- tract VS burn - a user suggested alternative (2 projects | 25 Mar 2023)
- onnxruntime
- Steelix - CLI for ONNX model analysis
Index
What are some of the best open-source ONNX projects in Rust? This list will help you:
# | Project | Stars
---|---|---
1 | burn | 7,384
2 | tract | 2,086
3 | wonnx | 1,517
4 | ort | 629
5 | blindai | 493
6 | rust-mlops-template | 277
7 | onnxruntime-rs | 262
8 | altius | 87
9 | tractjs | 76
10 | steelix | 43
11 | yolov5-api-rust | 27