Top 3 Python ggml Projects
- **inference**: Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you can run inference with any open-source language, speech-recognition, or multimodal model, whether in the cloud, on-premises, or on your laptop.
- **InfluxDB** (sponsored): Power real-time data analytics at scale. Get real-time insights from all types of time series data with InfluxDB. Ingest, query, and analyze billions of data points in real-time with unbounded cardinality.
- **py.gpt.prompt**: PyGPTPrompt, a CLI tool that manages context windows for AI models, facilitating user interaction and data ingestion for optimized long-term memory and task automation.
Project mention: GreptimeAI + Xinference - Efficient Deployment and Monitoring of Your LLM Applications | dev.to | 2024-01-24

Xorbits Inference (Xinference) is an open-source platform that streamlines the operation and integration of a wide array of AI models. With Xinference, you can run inference with any open-source LLMs, embedding models, and multimodal models, either in the cloud or on your own premises, and build robust AI-driven applications. It provides a RESTful API compatible with the OpenAI API, a Python SDK, a CLI, and a WebUI. It also integrates third-party developer tools such as LangChain, LlamaIndex, and Dify, facilitating model integration and development.
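Because the server speaks an OpenAI-compatible REST API, swapping backends amounts to changing the base URL. A minimal stdlib sketch of that idea (the helper function, model names, and local port are illustrative assumptions, not Xinference's actual SDK):

```python
import json

# Hypothetical helper showing why an OpenAI-compatible REST API makes
# switching backends a one-line change: only `base_url` differs between
# OpenAI's hosted service and a local server. The endpoint path and
# payload shape follow the OpenAI chat-completions convention.
def make_chat_request(base_url, model, messages):
    """Build the URL and JSON body for a /v1/chat/completions call."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

messages = [{"role": "user", "content": "Hello"}]

# Hosted OpenAI endpoint:
url_openai, _ = make_chat_request("https://api.openai.com", "gpt-4", messages)

# Same request against a local server (host and port are assumptions;
# check your own deployment):
url_local, _ = make_chat_request("http://localhost:9997", "my-local-llm", messages)

print(url_openai)  # https://api.openai.com/v1/chat/completions
print(url_local)   # http://localhost:9997/v1/chat/completions
```

Everything except the base URL, and the model name the server knows, stays identical, which is what makes the "single line of code" claim above work in practice.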
Project mention: New open-source model with 8k context runs on CPU, outperforms GPT-3 | news.ycombinator.com | 2023-06-30
Project mention: A Python interface for the OpenAI REST API to automate GPT via prompts. | /r/OpenAI | 2023-05-30
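Managing a context window, as PyGPTPrompt's description suggests, typically means evicting the oldest chat turns once a token budget is exceeded while preserving the system prompt. A sketch of that idea (the word-count token estimate and function names are illustrative, not PyGPTPrompt's actual API):

```python
# Illustrative context-window management: keep system messages, drop the
# oldest chat turns until the history fits a token budget. Token counting
# here is a crude word count, not a real tokenizer.
def estimate_tokens(message):
    return len(message["content"].split())

def trim_context(messages, max_tokens):
    """Drop oldest non-system messages until the total fits max_tokens."""
    system = [m for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    total = sum(estimate_tokens(m) for m in system + chat)
    while chat and total > max_tokens:
        total -= estimate_tokens(chat.pop(0))  # evict the oldest turn
    return system + chat

history = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "one two three four five"},
    {"role": "assistant", "content": "six seven eight"},
    {"role": "user", "content": "nine ten"},
]
trimmed = trim_context(history, max_tokens=9)
print([m["content"] for m in trimmed])
# → ['You are helpful.', 'six seven eight', 'nine ten']
```

Real tools substitute an actual tokenizer for the word count and may summarize evicted turns into long-term memory instead of discarding them, but the budget-and-evict loop is the core mechanic.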
Python ggml related posts
- Ask HN: Cheapest way to run local LLMs?
- Minigpt4 Inference on CPU
- New open-source model with 8k context runs on CPU, outperforms GPT-3
- MPT 30B inference code using CPU
- The Coming of Local LLMs
A note from our sponsor - InfluxDB
www.influxdata.com | 29 May 2024
Index
What are some of the best open-source ggml projects in Python? This list will help you:
| # | Project | Stars |
|---|---------|-------|
| 1 | inference | 2,871 |
| 2 | mpt-30B-inference | 573 |
| 3 | py.gpt.prompt | 28 |