
feat: add llmobs proxy paths to trace agent #628


Merged 13 commits into main on Mar 31, 2025

Conversation

@sabrenner (Collaborator) commented Mar 27, 2025

Adds the LLM Observability eval metrics and spans endpoints to the trace agent. They are exposed as EVP proxy paths, and the agent then forwards each request to our actual backend (the proxy path with the /evp_proxy/v2 prefix stripped).
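As a rough sketch of the pass-through behavior described above (the helper name is hypothetical; the host and paths mirror the debug log later in this description):

```python
# Illustrative sketch only: the agent receives spans on an EVP proxy path
# and forwards them to the intake backend with the /evp_proxy/v2 prefix
# stripped. The function name is made up for this example.
EVP_PREFIX = "/evp_proxy/v2"
INTAKE_HOST = "https://llmobs-intake.datadoghq.com"

def backend_url(proxy_path: str) -> str:
    """Map an EVP proxy path to the backend intake URL."""
    if proxy_path.startswith(EVP_PREFIX):
        proxy_path = proxy_path[len(EVP_PREFIX):]
    return INTAKE_HOST + proxy_path

print(backend_url("/evp_proxy/v2/api/v2/llmobs"))
# -> https://llmobs-intake.datadoghq.com/api/v2/llmobs
```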

I verified this by using it as a layer in a small Lambda I have for LLM Observability testing. With a debug log I added, I noted

DD_EXTENSION | DEBUG | Got response from https://llmobs-intake.datadoghq.com/api/v2/llmobs backend: 202 Accepted

and was able to see the span in LLM Observability. This was verified for both Python and Node.js.

MLOB-2521

@sabrenner marked this pull request as ready for review March 28, 2025 19:51
@sabrenner requested a review from a team as a code owner March 28, 2025 19:51
@duncanista (Contributor) left a comment


What are the endpoint limits? What's the max size package? Does the tracer already batch the data accordingly?

@sabrenner (Collaborator, author) replied:

What are the endpoint limits?

I think the only limitation is payload size, which we handle in our SDKs and ingestion paths. Otherwise our intake is JSON-based, and all payload encoding is handled in our SDKs. This PR just adds the pass-through endpoints so we can leverage the API key on the agent.

What's the max size package?

I'm guessing you mean the payload size (correct me if I'm wrong!). We have a 1 MB limit on our spans, which we check for in the SDKs that will be using this endpoint. We're actively reconfiguring this to send spans as batched events (still just one event from our side, but read as a batch at ingestion).
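A minimal sketch of the kind of size check described here, assuming a hypothetical helper name and the 1 MB limit mentioned above (the real SDK checks differ in detail):

```python
import json

MAX_PAYLOAD_BYTES = 1 * 1024 * 1024  # the 1 MB span limit mentioned above

def within_limit(span_event: dict) -> bool:
    """Check a JSON-encoded span event against the intake size limit.

    Hypothetical helper for illustration; not the actual SDK code.
    """
    return len(json.dumps(span_event).encode("utf-8")) <= MAX_PAYLOAD_BYTES

print(within_limit({"name": "llm.request", "meta": {"input": "hi"}}))  # True
```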

Does the tracer already batch the data accordingly?

Yeah, even in its current state everything is handled by the tracer, but we are releasing a version of our SDK that submits spans as a batch.
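The batching described above might look roughly like this; the event shape below is purely hypothetical, since the real batched-event schema is internal to the SDK:

```python
def batch_spans(spans: list) -> dict:
    """Bundle many spans into one submission event (hypothetical shape:
    a single event that ingestion reads as a batch)."""
    return {"event_type": "span", "spans": spans}

batch = batch_spans([{"span_id": 1}, {"span_id": 2}])
print(len(batch["spans"]))  # 2
```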

Let me know if this answers your questions @duncanista!

@sabrenner force-pushed the sabrenner/add-llmobs-proxies branch from 318f56f to 825ec8b on March 31, 2025 18:46
@astuyve merged commit 0463249 into main Mar 31, 2025
46 checks passed
@astuyve deleted the sabrenner/add-llmobs-proxies branch March 31, 2025 20:02