llm_utils Alternatives
Similar projects and alternatives to llm_utils
- litellm: Call all LLM APIs using the OpenAI format. Supports Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, SageMaker, Hugging Face, Replicate, and 100+ other LLM providers.
- openai-messages-token-helper: A utility library for counting tokens in messages sent to an LLM (currently OpenAI models only).
llm_utils reviews and mentions
Show HN: Token price calculator for 400+ LLMs
> tiktoken.encoding_for_model(model)
Calling this where model == 'gpt-4o' will encode with o200k_base, no?
But yes, I do agree with you. I had a hard time implementing non-tiktoken tokenizers for my project, and ended up manually adding tokenizer.json files to my repo.[1] The other option is downloading from Hugging Face, but the official repos where a model's tokenizer.json lives require agreeing to their terms before granting access. That means needing an HF key and accepting the terms, which is not a good experience for a consumer of the package.
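The vendoring approach described above works because tokenizer.json is a self-describing file. A minimal sketch of inspecting one with only the standard library (the file path and tiny sample vocabulary here are stand-ins, not the project's real data):

```python
import json
import os
import tempfile

def vocab_size(path: str) -> int:
    # A BPE tokenizer.json keeps its vocabulary under model.vocab,
    # mapping token strings to integer ids.
    with open(path) as f:
        return len(json.load(f)["model"]["vocab"])

# Tiny stand-in file showing the assumed shape of the JSON; a real
# tokenizer.json from a model repo is far larger.
sample = {"model": {"type": "BPE", "vocab": {"<s>": 0, "hello": 1, "world": 2}}}
path = os.path.join(tempfile.gettempdir(), "tokenizer.json")
with open(path, "w") as f:
    json.dump(sample, f)

print(vocab_size(path))  # 3
```

For actual tokenization of such a file, the Hugging Face `tokenizers` library can load it directly via `Tokenizer.from_file(path)` with no API key, since the file is already local.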
> Message frame tokens?
Do you mean the chat template tokens? Oh, that's another good point. Yeah, it counts OpenAI prompt tokens. I solved this by implementing a Jinja templating engine to create the full prompt. [2] Granted, both llama.cpp and mistral-rs do this on the backend, so it's purely for counting tokens. I guess it would make sense to add a function to convert tokens to dollars.
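The two ideas in that comment — rendering the chat template to get the full prompt, and converting a token count into dollars — can be sketched together. This is an illustration only: the ChatML-style template and the per-million-token prices below are assumptions, not the project's actual template or current provider pricing.

```python
from jinja2 import Template

# Hypothetical ChatML-style chat template; real templates ship in a
# model's tokenizer_config.json and vary per model family.
CHAT_TEMPLATE = (
    "{% for m in messages %}"
    "<|im_start|>{{ m['role'] }}\n{{ m['content'] }}<|im_end|>\n"
    "{% endfor %}"
)

def render_prompt(messages):
    # Render the full prompt string, including the "frame" tokens that
    # a bare concatenation of message contents would miss.
    return Template(CHAT_TEMPLATE).render(messages=messages)

# Hypothetical prices in USD per 1M tokens; check the provider's
# pricing page for real numbers, which change over time.
PRICES_PER_1M = {"gpt-4o": {"input": 2.50, "output": 10.00}}

def cost_usd(model, input_tokens, output_tokens=0):
    p = PRICES_PER_1M[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

prompt = render_prompt([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi!"},
])
print(prompt)
print(cost_usd("gpt-4o", input_tokens=1000))  # 0.0025
```

Tokenizing the rendered prompt (rather than the raw message contents) is what makes the count match what the backend actually sees.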
[1] https://github.com/ShelbyJenkins/llm_utils/tree/main/src/mod...
Stats
ShelbyJenkins/llm_utils is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of llm_utils is Rust.