anthropic-tokenizer
Approximation of the Claude 3 tokenizer by inspecting the generation stream (by javirandor)
openai-messages-token-helper
A utility library for dealing with token counting for messages sent to an LLM (currently OpenAI models only) (by pamelafox)
| | anthropic-tokenizer | openai-messages-token-helper |
|---|---|---|
| Mentions | 3 | 3 |
| Stars | 87 | 15 |
| Growth | - | - |
| Activity | 7.8 | 7.6 |
| Last commit | 6 days ago | 27 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
anthropic-tokenizer
Posts with mentions or reviews of anthropic-tokenizer. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-06-17.
openai-messages-token-helper
Posts with mentions or reviews of openai-messages-token-helper. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-06-17.
- Show HN: Token price calculator for 400+ LLMs
I grappled with that issue for https://github.com/pamelafox/openai-messages-token-helper as I wanted to be able to use it for a quick token check with SLMs as well, so I ended up adding a parameter "fallback_to_default" for developers to indicate they're okay with assuming gpt-35 BPE encoding.
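The fallback idea described in that comment can be sketched in a few lines. Note this is a hypothetical illustration of the pattern, not the library's actual API: the `KNOWN_ENCODINGS` table, `resolve_encoding` name, and encoding values are stand-ins.

```python
# Hypothetical sketch of a "fallback_to_default"-style lookup for token
# counting: known models map to a BPE encoding name; unknown models (e.g.
# SLMs) either fall back to a default or raise, at the caller's choice.

KNOWN_ENCODINGS = {
    "gpt-4": "cl100k_base",
    "gpt-35-turbo": "cl100k_base",
}
DEFAULT_ENCODING = "cl100k_base"  # assume gpt-35-style BPE as the default


def resolve_encoding(model: str, fallback_to_default: bool = False) -> str:
    """Look up the encoding for a model, optionally falling back to a default."""
    if model in KNOWN_ENCODINGS:
        return KNOWN_ENCODINGS[model]
    if fallback_to_default:
        return DEFAULT_ENCODING
    raise ValueError(
        f"Unknown model {model!r}; pass fallback_to_default=True "
        f"to assume {DEFAULT_ENCODING}"
    )
```

The explicit opt-in flag keeps the quick-check convenience without silently miscounting tokens for models whose tokenizer is genuinely unknown.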
- Lessons after a Half-billion GPT Tokens
Lol, nice truncation logic! If anyone’s looking for something slightly fancier, I made a micro-package for our tiktoken-based truncation here: https://github.com/pamelafox/llm-messages-token-helper
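The kind of truncation logic being discussed can be sketched as follows. This is a toy illustration, not the package's implementation: `count_tokens` is a crude word-count stand-in for a real tiktoken encoder, and both function names are hypothetical.

```python
# Toy sketch of token-budget truncation for chat messages: keep system
# messages, then drop the oldest conversation turns until the total
# token count fits the budget.

def count_tokens(message: dict) -> int:
    # Stand-in for a real BPE token count (e.g. via tiktoken).
    return len(message["content"].split())


def truncate_messages(messages: list[dict], max_tokens: int) -> list[dict]:
    """Drop the oldest non-system messages until the total fits the budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(count_tokens(m) for m in system + rest) > max_tokens:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```

Dropping oldest-first preserves the system prompt and the most recent context, which is the usual trade-off when a conversation outgrows the model's context window.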
What are some alternatives?
When comparing anthropic-tokenizer and openai-messages-token-helper, you can also consider the following projects:
tokencost - Easy token price estimates for 400+ LLMs