Count LLM tokens in Azure AI Foundry models #1067
Comments
@dotnet/dev-proxy-maintainers any feedback?

When we get to work with other AI platforms (Ollama, OAI, etc.), we'll implement them in separate plugins.

Researching integration with OpenLIT, which has ready-to-use dashboards for LLM observability, including pricing.
Build a new plugin that lets customers count the number of LLM tokens per recorded session. For each recorded request, check whether it has a response and whether that response is an LLM response containing token information (sent and received). Calculate token totals per type (prompt, completion, total), group the totals per model, and show per-type totals plus a grand total for the session.
Initially, the plugin will support Azure AI Foundry URLs for grouping per model.
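The aggregation described above can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual implementation (Dev Proxy plugins are written in .NET): it assumes recorded entries are dicts whose optional `response` carries OpenAI-style `model` and `usage` fields (`prompt_tokens`, `completion_tokens`, `total_tokens`), which is the shape Azure AI Foundry's OpenAI-compatible endpoints return.

```python
from collections import defaultdict

def summarize_token_usage(records):
    """Aggregate LLM token counts per model and overall for one session.

    `records` is a list of recorded request entries; each entry may have a
    "response" dict with "model" and "usage" keys (assumed shape, mirroring
    OpenAI-compatible chat completion responses).
    """
    per_model = defaultdict(lambda: {"prompt": 0, "completion": 0, "total": 0})
    grand = {"prompt": 0, "completion": 0, "total": 0}

    for record in records:
        response = record.get("response")
        if not response:
            continue  # request was never answered; skip it
        usage = response.get("usage")
        if not usage:
            continue  # not an LLM response carrying token information
        model = response.get("model", "unknown")
        # Accumulate each token type both per model and into the grand total.
        for key, field in (("prompt", "prompt_tokens"),
                           ("completion", "completion_tokens"),
                           ("total", "total_tokens")):
            count = usage.get(field, 0)
            per_model[model][key] += count
            grand[key] += count

    return dict(per_model), grand
```

Grouping per model would additionally need to map Azure AI Foundry request URLs to deployment/model names, which is why the first iteration scopes support to those URLs.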
Originally proposed by @tomorgan