llm_utils VS openai-messages-token-helper

Compare llm_utils and openai-messages-token-helper to see how they differ.

llm_utils

Utilities for llama.cpp, OpenAI, Anthropic, and mistral-rs. (by ShelbyJenkins)

openai-messages-token-helper

A utility library for dealing with token counting for messages sent to an LLM (currently OpenAI models only) (by pamelafox)
                llm_utils     openai-messages-token-helper
Mentions        2             3
Stars           24            15
Growth          -             -
Activity        6.0           7.6
Latest commit   22 days ago   27 days ago
Language        Rust          Python
License         MIT License   MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

llm_utils

Posts with mentions or reviews of llm_utils. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-06-17.
  • Show HN: Token price calculator for 400+ LLMs
    12 projects | news.ycombinator.com | 17 Jun 2024
    > tiktoken.encoding_for_model(model)

    Calling this where model == 'gpt-4o' will encode with o200k, no?

    But yes, I do agree with you. I had a hard time implementing non-tiktoken tokenizers for my project. I ended up manually adding tokenizer.json files into my repo.[1] The other option is downloading from HF, but the official repos where the model's tokenizer.json lives require agreeing to their terms to access. So it requires an HF key and agreeing to the terms, which is not a good experience for a consumer of the package.

    > Message frame tokens?

    Do you mean the chat template tokens? Oh, that's another good point. Yeah, it counts OpenAI prompt tokens. I solved this by implementing a Jinja templating engine to create the full prompt. [2] Granted, both llama.cpp and mistral-rs do this on the backend, so it's purely for counting tokens. I guess it would make sense to add a function to convert tokens to dollars.

    [1] https://github.com/ShelbyJenkins/llm_utils/tree/main/src/mod...
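The "chat template tokens" point above can be sketched in plain Python. This follows the widely circulated heuristic from OpenAI's cookbook for chat completions (roughly 3 framing tokens per message plus 3 to prime the reply); the word-count tokenizer below is a stand-in for a real BPE encoder, and the function name is illustrative, not either library's API:

```python
def estimate_chat_tokens(messages, count_tokens):
    """Estimate prompt tokens for a chat completion, including the
    framing tokens each message adds around its content."""
    TOKENS_PER_MESSAGE = 3  # role/separator framing per message (heuristic)
    REPLY_PRIMING = 3       # tokens that prime the assistant's reply
    total = REPLY_PRIMING
    for message in messages:
        total += TOKENS_PER_MESSAGE
        for value in message.values():
            total += count_tokens(value)
    return total

# Stand-in tokenizer: whitespace word count instead of a real BPE encoding.
word_count = lambda text: len(text.split())

messages = [
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Count my tokens please."},
]
print(estimate_chat_tokens(messages, word_count))  # → 18
```

Swapping `word_count` for a real encoder's `len(enc.encode(...))` gives the actual OpenAI prompt-token count; the per-message constants are what plain content-token counting misses.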

openai-messages-token-helper

Posts with mentions or reviews of openai-messages-token-helper. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-06-17.
  • Show HN: Token price calculator for 400+ LLMs
    12 projects | news.ycombinator.com | 17 Jun 2024
    I grappled with that issue for https://github.com/pamelafox/openai-messages-token-helper as I wanted to be able to use it for a quick token check with SLMs as well, so I ended up adding a parameter "fallback_to_default" for developers to indicate they're okay with assuming gpt-35 BPE encoding.
  • Lessons after a Half-billion GPT Tokens
    2 projects | news.ycombinator.com | 13 Apr 2024
    Lol, nice truncation logic! If anyone’s looking for something slightly fancier, I made a micro-package for our tiktoken-based truncation here: https://github.com/pamelafox/llm-messages-token-helper
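The `fallback_to_default` idea from the first comment can be sketched generically. The model-to-encoding table below is a small illustrative subset, not the library's actual lookup (tiktoken resolves this via `encoding_for_model`), and the function name is hypothetical:

```python
# Illustrative subset of model -> encoding names; real libraries keep
# a much larger table and return an encoder object, not a string.
KNOWN_ENCODINGS = {
    "gpt-4o": "o200k_base",
    "gpt-4": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}

def encoding_for(model, fallback_to_default=False, default="cl100k_base"):
    """Resolve a model name to an encoding, optionally falling back to a
    default BPE when the model is unknown (e.g. an SLM)."""
    try:
        return KNOWN_ENCODINGS[model]
    except KeyError:
        if fallback_to_default:
            return default
        raise

print(encoding_for("gpt-4o"))                           # → o200k_base
print(encoding_for("phi-3", fallback_to_default=True))  # → cl100k_base
```

Making the fallback opt-in keeps the default behavior strict: an unknown model raises rather than silently miscounting with the wrong encoding.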
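The truncation logic the second comment refers to typically means dropping the oldest conversation turns until the prompt fits a token budget. A minimal sketch under that assumption (word count stands in for a real tokenizer; the helper name is hypothetical, not the package's API):

```python
def truncate_messages(messages, max_tokens, count_tokens):
    """Drop the oldest non-system messages until the total fits the
    budget. The system message is always kept."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def total():
        return sum(count_tokens(m["content"]) for m in system + rest)

    while rest and total() > max_tokens:
        rest.pop(0)  # oldest turn goes first
    return system + rest

count = lambda text: len(text.split())
history = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "first question with several extra words here"},
    {"role": "assistant", "content": "first answer"},
    {"role": "user", "content": "second question"},
]
print([m["content"] for m in truncate_messages(history, 8, count)])
# → ['Be brief.', 'first answer', 'second question']
```

A fancier version would also account for the per-message framing tokens and leave headroom for the model's reply, but the keep-system, drop-oldest loop is the core of the technique.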

What are some alternatives?

When comparing llm_utils and openai-messages-token-helper you can also consider the following projects:

tokencost - Easy token price estimates for 400+ LLMs

