| | Woodpecker | Qwen |
|---|---|---|
| Mentions | 2 | 5 |
| Stars | 561 | 11,682 |
| Growth | - | 6.8% |
| Activity | 8.9 | 9.4 |
| Latest commit | 5 months ago | 5 days ago |
| Language | Python | Python |
| License | - | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity score of 9.0 means a project is among the top 10% of the most actively developed projects we track.
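The recency-weighted activity idea described above can be sketched in a few lines. This is only an illustration under assumed parameters (exponential decay with a 30-day half-life), not the site's actual scoring formula, which is not published here:

```python
from datetime import datetime, timedelta

def activity_score(commit_dates, now, half_life_days=30):
    """Hypothetical recency-weighted activity score: each commit
    contributes a weight that halves every `half_life_days` days,
    so recent commits dominate the total."""
    return sum(
        0.5 ** ((now - d).days / half_life_days)
        for d in commit_dates
    )

# Three commits in the last few days outweigh three from ~10 months ago:
now = datetime(2024, 1, 1)
recent = [now - timedelta(days=d) for d in (1, 2, 3)]
old = [now - timedelta(days=d) for d in (300, 310, 320)]
print(activity_score(recent, now) > activity_score(old, now))  # True
```

The raw sum would then be normalized across all tracked projects to produce a relative 0-10 score like the 8.9 and 9.4 shown above.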
Woodpecker

Woodpecker: Hallucination Correction for Multimodal Large Language Models (https://github.com/BradyFU/Woodpecker)

Posts mentioning Woodpecker:
- shining the spotlight on CogVLM
Qwen
What are some alternatives?
hallucination-leaderboard - Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
spacy-llm - 🦙 Integrating LLMs into structured NLP pipelines
unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
SqueezeLLM - [ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization
ChatGLM2-6B - ChatGLM2-6B: An Open Bilingual Chat LLM (open-source bilingual dialogue language model)
gsgen - [CVPR 2024] Text-to-3D using Gaussian Splatting
GPT4RoI - GPT4RoI: Instruction Tuning Large Language Model on Region-of-Interest
OuterFlightTracker - A flight tracker made in 6 hours on a flight home from OuterNet
deeplake - Database for AI. Store Vectors, Images, Texts, Videos, etc. Use with LLMs/LangChain. Store, query, version, & visualize any AI data. Stream data in real-time to PyTorch/TensorFlow. https://activeloop.ai
Baichuan-7B - A large-scale 7B pretraining language model developed by BaiChuan-Inc.
Chinese-LLaMA-Alpaca - Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment
Baichuan-13B - A 13B large language model developed by Baichuan Intelligent Technology