
Integration for Griptape AI #4330

Open
smeubank opened this issue Apr 28, 2025 · 2 comments
Labels: Feature, New Integration (Integrating with a new framework or library), Python SDK

Comments

@smeubank
Member

Problem Statement

When using Griptape as the AI framework, which internally relies on the OpenAI Python SDK, Sentry's current OpenAI integration does not correctly capture token usage, request durations, or cost estimates.

This is potentially because Griptape wraps or abstracts the OpenAI SDK calls, so they are not captured by Sentry's current instrumentation.

Solution Brainstorm

Add support for Griptape as a first-class integration, similar to the existing OpenAI integration.
Alternatively, provide guidance or utilities for manually instrumenting Griptape's OpenAI calls to enable token tracking, durations, and cost estimates; a rough sketch of what such manual instrumentation could look like is below.
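
For example, a minimal sketch of the manual-instrumentation option, assuming a Griptape agent object and placeholder span/data key names (these are not Sentry's official AI conventions, and the `token_usage` attribute on the result is hypothetical):

```python
# A minimal sketch, assuming manual Sentry spans around a Griptape agent run.
# The span op and set_data keys are illustrative placeholders, and the
# token_usage attribute depends on where the Griptape prompt driver actually
# exposes usage data in the version in use.
import sentry_sdk

sentry_sdk.init(dsn="...", traces_sample_rate=1.0)


def run_with_tracing(agent, prompt: str):
    with sentry_sdk.start_transaction(op="ai.pipeline", name="griptape-agent-run"):
        with sentry_sdk.start_span(
            op="ai.chat_completions.create", description=prompt
        ) as span:
            result = agent.run(prompt)
            # Pull token counts from wherever the prompt driver exposes them.
            usage = getattr(result, "token_usage", None)
            if usage:
                span.set_data("ai.prompt_tokens.used", usage.get("prompt_tokens"))
                span.set_data("ai.completion_tokens.used", usage.get("completion_tokens"))
            return result
```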

@sentrivana added the New Integration (Integrating with a new framework or library) label Apr 28, 2025
@invke

invke commented Apr 30, 2025

Thanks @smeubank, yeah I imagine this will be increasingly valuable for framework integrations over AI SDK integrations, as more complex systems are built that aren't locking themselves to a single provider's set of models. Perhaps this is where OTel makes sense, with appropriate conventions.

I think they had a working group and some ideas around conventions for LLM spans, but I can't quite see where it ended up (open-telemetry/semantic-conventions#327).

It reads like OpenLLMetry's conventions that you mentioned were adopted: https://github.com/traceloop/openllmetry?tab=readme-ov-file#frameworks.

Griptape themselves provide an OpenTelemetryObservabilityDriver (an adapter-pattern sort of thing) which I haven't explicitly used, but I imagine it would be how I'd integrate to get Sentry populated with insights on the AI pipelines. However, I don't think I understand enough about the conventions it uses to record tokens.

My (not super familiar) understanding of how it might work: these frameworks' OTel adapters/drivers would emit spans using a standard convention (like the one above?), and Sentry's span processor could then pick them up and load them into the AI insights, since they'd match the LLM span conventions. A rough sketch of that wiring is below.
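
For what it's worth, a rough sketch of that wiring, assuming Sentry's experimental OpenTelemetry support (SentrySpanProcessor / SentryPropagator) and assuming Griptape's driver can be pointed at, or picks up, a globally configured tracer provider (I haven't verified the driver's constructor arguments):

```python
# A sketch of the "OTel driver -> Sentry span processor" idea, assuming
# Sentry's experimental OpenTelemetry support. The Griptape wiring at the
# bottom is commented out and hypothetical; check Griptape's docs for the
# driver's actual constructor arguments.
import sentry_sdk
from sentry_sdk.integrations.opentelemetry import SentryPropagator, SentrySpanProcessor
from opentelemetry import trace
from opentelemetry.propagate import set_global_textmap
from opentelemetry.sdk.trace import TracerProvider

sentry_sdk.init(
    dsn="...",
    traces_sample_rate=1.0,
    instrumenter="otel",  # let OpenTelemetry drive tracing instead of the SDK
)

provider = TracerProvider()
provider.add_span_processor(SentrySpanProcessor())  # forwards OTel spans to Sentry
trace.set_tracer_provider(provider)
set_global_textmap(SentryPropagator())

# Hypothetical Griptape side, assuming the driver uses the global provider:
# from griptape.drivers import OpenTelemetryObservabilityDriver
# from griptape.observability import Observability
#
# with Observability(observability_driver=OpenTelemetryObservabilityDriver()):
#     agent.run("...")
```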

@invke

invke commented May 1, 2025

Collin from Griptape pointed me at the right docs for the conventions (https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-metrics/) when I inquired about how their OTel driver works. By the sounds of it, they are currently looking into adding metrics with these conventions in mind: griptape-ai/griptape#1790.
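
For reference, recording token usage per those metric conventions would look roughly like this (metric and attribute names taken from the linked spec, so worth re-checking against the current semconv revision; a metrics exporter/reader would still need to be configured separately):

```python
# A small sketch of recording token usage per the GenAI metrics conventions
# linked above (the gen_ai.client.token.usage histogram). Names are copied
# from the spec as of writing and should be re-checked against the current
# semconv version.
from opentelemetry import metrics

meter = metrics.get_meter("griptape.instrumentation")
token_usage = meter.create_histogram(
    name="gen_ai.client.token.usage",
    unit="{token}",
    description="Number of input and output tokens used",
)


def record_usage(model: str, input_tokens: int, output_tokens: int) -> None:
    common = {"gen_ai.system": "openai", "gen_ai.request.model": model}
    token_usage.record(input_tokens, {**common, "gen_ai.token.type": "input"})
    token_usage.record(output_tokens, {**common, "gen_ai.token.type": "output"})
```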
