Adds logging LLM usage information using OpenTelemetry. Closes #1067 (#1167)
Merged

garrytrinder merged 2 commits into dotnet:main on May 13, 2025
Conversation
Contributor

In the first code snippet the plugin isn't enabled; is this correct? When I enable the plugin, I get a warning when executing Dev Proxy from the build files.
Collaborator (Author)

Duh! Of course the plugin needs to be enabled. Sorry for that. Let me check why there's a warning.
Collaborator (Author)

I can't repro the warning you're getting. I did notice that the sample prices file was missing a bracket, so I updated the sample. Could you please try again and see if that might've been it?
garrytrinder approved these changes on May 13, 2025

Adds logging LLM usage information using OpenTelemetry. Closes #1067
To test
devproxyrc.json:
{ "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v0.27.0/rc.schema.json", "plugins": [ { "name": "OpenAITelemetryPlugin", "enabled": true, "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll" } ], "urlsToWatch": [ "http://localhost:11434/*" ], "logLevel": "information", "newVersionNotification": "stable", "showSkipMessages": true, "showTimestamps": true, "validateSchemas": true }Visualize using Aspire dashboard:
Call Ollama using its OpenAI-compatible API:
```sh
curl -ikx http://127.0.0.1:8000 -X POST \
  -H "Content-Type: application/json" \
  -d '{"model":"llama3.2","messages":[{"role":"user","content":"What is the capital of France?"}],"stream":false}' \
  http://localhost:11434/v1/chat/completions
```

In the Aspire dashboard, you'll see tracing and metrics (it takes a moment for metrics to show up).
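Note that `-x http://127.0.0.1:8000` routes the request through Dev Proxy (8000 is its default port), which is what lets the plugin observe the OpenAI-compatible traffic on its way to Ollama.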
To test cost estimation
devproxyrc.json:
{ "$schema": "https://raw.githubusercontent.com/dotnet/dev-proxy/main/schemas/v0.27.0/rc.schema.json", "plugins": [ { "name": "OpenAITelemetryPlugin", "enabled": true, "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll", "configSection": "openAITelemetryPlugin" } ], "urlsToWatch": [ "http://localhost:11434/*" ], "openAITelemetryPlugin": { "includeCosts": true, "pricesFile": "llm-prices.json" }, "logLevel": "information", "newVersionNotification": "stable", "showSkipMessages": true, "showTimestamps": true, "validateSchemas": true }llm-prices.json:
{ "prices": { "llama3.2": { "input": 10.0, "output": 30.0 } } }Call Ollama again, check out the new span with cost information and wait for additional metrics to show up in the dashboard.