- TransformerEngine VS Whisper
- TransformerEngine VS autocvd
- TransformerEngine VS warp-drive
- TransformerEngine VS ivy
- TransformerEngine VS nanoGPT
- TransformerEngine VS fastaudio
- TransformerEngine VS liberate-fhe
- TransformerEngine VS FastFold
- TransformerEngine VS PyTorch-Guide
- TransformerEngine VS Pytorch
TransformerEngine Alternatives
Similar projects and alternatives to TransformerEngine
- Whisper: High-performance GPGPU inference of OpenAI's Whisper automatic speech recognition (ASR) model (by Const-me)
- autocvd: Tool that automatically sets CUDA_VISIBLE_DEVICES based on GPU utilization; usable from the command line and from code
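The idea behind a tool like autocvd can be sketched in a few lines: query per-GPU utilization (e.g. via `nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits`) and point `CUDA_VISIBLE_DEVICES` at the least busy device. This is an illustrative sketch only; `pick_least_utilized` is a hypothetical helper, not autocvd's actual API, and autocvd's real implementation differs.

```python
import os

def pick_least_utilized(csv_lines):
    """Return the index of the GPU with the lowest utilization.

    csv_lines: lines like "0, 87" as produced by
    `nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits`.
    """
    stats = []
    for line in csv_lines:
        idx, util = (int(field) for field in line.split(","))
        stats.append((util, idx))
    # min() compares by utilization first; [1] extracts the GPU index.
    return min(stats)[1]

# Hypothetical sample output for a 3-GPU machine.
sample = ["0, 87", "1, 3", "2, 45"]
gpu = pick_least_utilized(sample)  # picks GPU 1 here (utilization 3%)
os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu)
```

In practice one would run `nvidia-smi` via `subprocess` to obtain the CSV lines instead of hardcoding them.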
- warp-drive: Extremely fast end-to-end deep multi-agent reinforcement learning framework on a GPU (JMLR 2022)
- liberate-fhe: A Fully Homomorphic Encryption (FHE) library for bridging the gap between theory and practice, with a focus on performance and accuracy
TransformerEngine reviews and mentions
- Benchmarking Large Language Models on NVIDIA H100 GPUs with CoreWeave (Part 1)
  The 4090 now has its 8-bit float (FP8) support enabled as well; see the [transformer engine issue](https://github.com/NVIDIA/TransformerEngine/issues/15).
- GPUs for Deep Learning in 2023 – An In-depth Analysis
  Would be curious to see your benchmarks. By the way, Nvidia will be providing support for FP8 in a future release of CUDA: https://github.com/NVIDIA/TransformerEngine/issues/15
  I think TMA may not matter as much for consumer cards, given the disproportionate amount of FP32/INT32 compute that they have.
  It would be interesting to see how close to the theoretical peak people are able to get once CUDA support comes through.
Stats
NVIDIA/TransformerEngine is an open-source project licensed under the Apache License 2.0, which is an OSI-approved license.
The primary programming language of TransformerEngine is Python.