Mixtral-offloading Alternatives
Similar projects and alternatives to mixtral-offloading based on common topics and language
-
makeMoE
From scratch implementation of a sparse mixture of experts language model inspired by Andrej Karpathy's makemore :)
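The description above names the core mechanism of such models: sparse top-k expert routing. A minimal, illustrative sketch of that routing step (hypothetical names, not makeMoE's actual API; random linear maps stand in for expert FFNs) might look like:

```python
import numpy as np

def topk_gating(logits, k=2):
    """Pick the top-k experts per token and renormalize their gate weights."""
    idx = np.argsort(logits)[::-1][:k]           # indices of the k largest gate logits
    w = np.exp(logits[idx] - logits[idx].max())  # softmax over only the selected logits
    w /= w.sum()
    return idx, w

def moe_forward(x, gate_W, experts, k=2):
    """Route one token vector x through its top-k experts and mix the outputs."""
    idx, w = topk_gating(gate_W @ x, k)
    return sum(wi * experts[i](x) for wi, i in zip(w, idx))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.standard_normal(d)
gate_W = rng.standard_normal((n_experts, d))
# Each "expert" here is just a random linear map standing in for a feed-forward block.
experts = [(lambda W: (lambda v: W @ v))(rng.standard_normal((d, d)))
           for _ in range(n_experts)]
y = moe_forward(x, gate_W, experts, k=2)
print(y.shape)  # → (8,)
```

Only k of the n_experts expert networks run per token, which is what makes the layer "sparse": compute scales with k, while parameter count scales with n_experts.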
-
xTuring
Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
mixtral-offloading reviews and mentions
-
DBRX: A New Open LLM
Waiting for mixed quantization with HQQ and MoE offloading [1]. With that I was able to run Mixtral-8x7B on my 10 GB VRAM RTX 3080... This should work for DBRX too and should shave off a ton of the VRAM requirement.
1. https://github.com/dvmazur/mixtral-offloading?tab=readme-ov-...
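The mention above relies on the key trick behind mixtral-offloading: since only a few experts fire per token, the full expert set can live in host RAM while a small pool of GPU slots caches the recently used ones. A minimal sketch of that caching idea (illustrative names and an LRU policy for demonstration; not the repo's actual implementation) could be:

```python
from collections import OrderedDict

class ExpertCache:
    """Toy model of MoE offloading: all expert weights stay in host memory,
    and a fixed number of 'GPU' slots caches the most recently used experts."""

    def __init__(self, cpu_experts, gpu_slots=2):
        self.cpu = cpu_experts    # expert_id -> weights, resident in RAM
        self.gpu = OrderedDict()  # expert_id -> weights "on GPU", in LRU order
        self.gpu_slots = gpu_slots
        self.loads = 0            # number of CPU -> GPU transfers performed

    def fetch(self, expert_id):
        if expert_id in self.gpu:                 # hit: mark as most recently used
            self.gpu.move_to_end(expert_id)
        else:                                     # miss: evict LRU expert, then load
            if len(self.gpu) >= self.gpu_slots:
                self.gpu.popitem(last=False)
            self.gpu[expert_id] = self.cpu[expert_id]
            self.loads += 1
        return self.gpu[expert_id]

cache = ExpertCache({i: f"weights_{i}" for i in range(8)}, gpu_slots=2)
for eid in [0, 1, 0, 2, 1]:  # a token routing pattern with some expert reuse
    cache.fetch(eid)
print(cache.loads)  # → 4 (the second fetch of expert 0 was a cache hit)
```

Because consecutive tokens tend to reuse experts, transfers happen far less often than once per expert call, which is how an 8x7B model can fit a 10 GB card; quantizing the cached weights shrinks both the slots and the transfer cost further.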
- Mixtral in Colab
- Run Mixtral-8x7B models in Colab or on consumer desktops
Stats
dvmazur/mixtral-offloading is an open-source project licensed under the MIT License, an OSI-approved license.
The primary programming language of mixtral-offloading is Python.