| | instructor | fructose |
| --- | --- | --- |
| Mentions | 17 | 3 |
| Stars | 5,417 | 700 |
| Growth | - | 13.1% |
| Activity | 9.8 | 9.1 |
| Latest commit | 2 days ago | about 1 month ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
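The site's exact activity formula isn't published; the toy sketch below only illustrates the idea of recency weighting described above, with an arbitrary half-life. The function name and decay constant are made up for illustration.

```python
import math

def activity_score(commit_ages_days, half_life_days=30.0):
    """Toy recency-weighted activity: each commit contributes
    2 ** (-age / half_life), so recent commits count more than old ones."""
    return sum(2 ** (-age / half_life_days) for age in commit_ages_days)

# A project with recent commits outscores one with the same
# number of commits made months ago.
recent = activity_score([1, 2, 3, 5])
stale = activity_score([90, 120, 150, 200])
```

With exponential decay, a single commit made today contributes a full point, while a commit from three months ago contributes almost nothing.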
instructor
- Instructor: Structured Outputs for LLMs
- Anthropic's Haiku Beats GPT-4 Turbo in Tool Use
Ah yes. Have you tried out instructor [0] or Guidance [1]?
[0]: https://github.com/jxnl/instructor/
- Instructor: Structured Data Like JSON from Large Language Models
- Show HN: Fructose, LLM calls as strongly typed functions
Good stuff. How does this compare to Instructor? I’ve been using this extensively.
https://jxnl.github.io/instructor/
- Show HN: Ellipsis – Automatic pull request reviews
It's super cool! Check out how the Instructor repo uses it to keep various parts of their docs in sync: https://github.com/jxnl/instructor/blob/main/ellipsis.yaml
- Pushing ChatGPT's Structured Data Support to Its Limits
I've been using the instructor[1] library recently and have found the abstractions simple and extremely helpful for getting great structured outputs from LLMs with pydantic.
[1]: https://github.com/jxnl/instructor/tree/main
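The pattern this comment describes — declare the shape you want, then validate the model's raw reply against it — can be sketched with the stdlib alone. Instructor does this with pydantic models, automatic retries, and native function calling; the dataclass-based validator, schema, and sample reply below are simplified stand-ins, not the library's API.

```python
import json
from dataclasses import dataclass, fields

@dataclass
class UserInfo:
    name: str
    age: int

def parse_structured(raw: str, schema):
    """Parse an LLM's JSON reply and check it against a dataclass schema,
    raising if a field is missing or has the wrong type.
    Assumes plain (non-stringized) annotations, so f.type is a real class."""
    data = json.loads(raw)
    kwargs = {}
    for f in fields(schema):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        if not isinstance(data[f.name], f.type):
            raise TypeError(f"{f.name}: expected {f.type.__name__}")
        kwargs[f.name] = data[f.name]
    return schema(**kwargs)

# Pretend this string came back from a chat completion call.
reply = '{"name": "Ada", "age": 36}'
user = parse_structured(reply, UserInfo)
```

The value of the real library is what this sketch omits: on a validation failure, instructor can feed the error back to the model and retry instead of raising.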
- Efficiently using python in GPTs
Maybe try using Jason Liu’s instructor package (https://github.com/jxnl/instructor) to structure the outputs with pydantic? It’s explained in his presentation from the AI Engineer Summit (https://youtu.be/yj-wSRJwrrc)
- Ask HN: Cheapest way to run local LLMs?
One of the most powerful ways to integrate LLMs with existing systems is constrained generation. Libraries such as outlines[1] and instructor[2] allow structural specification of the expected outputs as regex patterns, simple types, jsonschema or pydantic models.
These outputs often consume significantly fewer tokens than chat or text completion.
[1] https://github.com/outlines-dev/outlines
[2] https://github.com/jxnl/instructor
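The token-savings claim above is easy to see side by side: a constrained JSON reply carries the same facts as a chatty free-form answer in far less text. The replies below are invented, and whitespace splitting is only a crude stand-in for a real tokenizer.

```python
chat_reply = (
    "Sure! Based on the invoice you provided, the total comes to "
    "$1,234.56, it is due on 2024-04-01, and the vendor is Acme Corp. "
    "Let me know if you need anything else!"
)
constrained_reply = '{"total": 1234.56, "due": "2024-04-01", "vendor": "Acme Corp"}'

def rough_tokens(text: str) -> int:
    # Crude approximation of tokenizer output; real BPE counts differ.
    return len(text.split())

savings = rough_tokens(chat_reply) - rough_tokens(constrained_reply)
```

Beyond cost, the constrained reply is also directly machine-parseable, which is the point of libraries like outlines and instructor.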
- OpenAI Function Calls for Humans
- Unbounded Books: Search by ~Vibes
The best GPT-wrapper you’ll see today?
...but this one hasn't raised oodles of cash.
Mike (creator) here, excited to hear what HN-folks think. Anything to add/improve?
Had fun building, extra shout-out to Railway, NextJS, and https://github.com/jxnl/instructor
Check it out: https://www.unboundedbooks.com/
fructose
- FLaNK AI Weekly 18 March 2024
- Show HN: Fructose, LLM calls as strongly typed functions
This approach may be too high-level "magic" to the point of being difficult to work with and iterate upon.
Looking at the prompt templates (https://github.com/bananaml/fructose/tree/main/src/fructose/... ), they use a LangChain-esque "just try to make the output valid JSON" approach, even though APIs such as GPT-4 Turbo (which this library uses by default) now support function calling/structured data natively, and libraries such as outlines (https://github.com/outlines-dev/outlines), while more complex, can better ensure a dictionary output for local LLMs.
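Native function calling, which the comment above contrasts with prompt-based JSON coaxing, works by sending the model a JSON schema of a function's parameters rather than asking for valid JSON in the prompt text. A stdlib sketch of deriving such a schema from a typed signature — the function, type map, and schema shape here are illustrative, not fructose's or OpenAI's actual internals:

```python
import inspect

# Minimal mapping from Python annotations to JSON Schema type names.
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def to_tool_schema(fn):
    """Build an OpenAI-style tool schema from a typed function signature."""
    sig = inspect.signature(fn)
    props = {
        name: {"type": TYPE_MAP[param.annotation]}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "parameters": {
            "type": "object",
            "properties": props,
            "required": list(props),
        },
    }

def get_weather(city: str, days: int) -> str:
    """Hypothetical function the model may be asked to call."""
    return f"forecast for {city} over {days} days"

schema = to_tool_schema(get_weather)
```

Because the schema is enforced API-side, the model's reply arrives as arguments matching these types, which is what makes "LLM calls as strongly typed functions" possible without output-parsing heuristics.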
What are some alternatives?
langchainjs - 🦜🔗 Build context-aware reasoning applications 🦜🔗
outlines - Structured Text Generation
simpleaichat - Python package for easily interfacing with chat apps, with robust features and minimal code complexity.
grok-1 - Grok open release
chatgpt-localfiles - Make local files accessible to ChatGPT
PythonGPT - PythonGPT writes and indexes code to implement dynamic code execution using generative models. Younger sibling of DoctorGPT.
httpx - A next generation HTTP client for Python. 🦋