LaMini-LM Alternatives
Similar projects and alternatives to LaMini-LM
- llama_generative_agent: A generative agent implementation for LLaMA-based models, derived from LangChain's implementation.
NOTE:
The number of mentions on this list counts mentions in common posts plus user-suggested alternatives.
A higher number therefore suggests a better LaMini-LM alternative or greater similarity.
LaMini-LM reviews and mentions
Posts with mentions or reviews of LaMini-LM. We have used some of these posts to build our list of alternatives and similar projects. The most recent was on 2023-06-17.
- Show HN: Explore large language models on any computer with 512MB of RAM
That's correct. In particular, the current base model is LaMini-Flan-T5-248M, described in more detail here:
https://github.com/mbzuai-nlp/lamini-lm
I shared more details over on Reddit:
https://www.reddit.com/r/LocalLLaMA/comments/14btk3a/explore...
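For anyone wanting to try that base model, here is a minimal sketch of running it with the Hugging Face transformers pipeline; the model id `MBZUAI/LaMini-Flan-T5-248M` is an assumption based on the repo's naming and is not stated in the thread:

```python
# Minimal sketch: run LaMini-Flan-T5-248M with the transformers pipeline API.
# The model id "MBZUAI/LaMini-Flan-T5-248M" is assumed, not confirmed by the post.
from transformers import pipeline

generator = pipeline("text2text-generation", model="MBZUAI/LaMini-Flan-T5-248M")
out = generator("Explain knowledge distillation in one sentence.", max_length=64)
print(out[0]["generated_text"])
```

Being a 248M-parameter Flan-T5 variant, it is small enough to run on CPU; the 512MB figure in the post presumably relies on additional quantization beyond what this sketch does.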
- Generative agents with open-source large language models!
There are many models to try, but I saw this great benchmark of models here: https://github.com/mbzuai-nlp/LaMini-LM
- LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions
- "LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions", Wu et al 2023 (knowledge-distillation of instruction-tuning)
Here is the GitHub repo: https://github.com/mbzuai-nlp/LaMini-LM. They claim these 285M-parameter models are on par with LLaMA and beyond.
- New llama LoRA trained on WizardLM dataset
Have you seen LaMini-LM? They achieve Alpaca-7b performance in a 1.5b model through the use of a very large dataset (2M+ instructions).
- Good computer spec for the next 5 years to be able to run Local LLM
Stats

Basic LaMini-LM repo stats
- Mentions: 9
- Stars: 801
- Activity: 7.8
- Last commit: about 1 year ago