Using the OpenTelemetry SDK directly with Portkey gives you complete control over what gets traced while benefiting from Portkey’s intelligent gateway features like caching, fallbacks, and load balancing.
Why OpenTelemetry SDK + Portkey?
- Full Control: Manually instrument exactly what you need with custom spans and attributes
- Production Ready: Battle-tested OpenTelemetry standard used by enterprises worldwide
- Custom Attributes: Add any metadata you need to traces for debugging and analysis
- Gateway Intelligence: Portkey adds routing optimization and resilience to your LLM calls
Quick Start
Prerequisites
- Python
- Portkey account with API key
- OpenAI API key (or use Portkey’s virtual keys)
Step 1: Install Dependencies
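The original install command is not shown here; a typical package set for this guide, assuming pip and the OTLP/HTTP exporter used in the following steps, would be:

```shell
# OpenAI client plus the OpenTelemetry SDK and OTLP/HTTP exporter
pip install openai opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```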
Install the OpenAI client together with the OpenTelemetry SDK and an OTLP exporter.

Step 2: Configure OpenTelemetry
Set up the tracer provider and OTLP exporter.

Step 3: Configure Portkey Gateway
Set up the OpenAI client with Portkey’s gateway.

Step 4: Create Instrumented Functions
Manually instrument your LLM calls with custom spans.

Complete Example
Here’s a full working example.

Next Steps
- Configure Gateway: Set up intelligent routing, fallbacks, and caching
- Explore Virtual Keys: Secure your API keys with Portkey’s vault
- View Analytics: Analyze costs, performance, and usage patterns
- Set Up Alerts: Configure alerts for anomalies and performance issues
See Your Traces in Action
Once configured, navigate to the Portkey dashboard to see your custom OpenTelemetry traces enhanced with gateway intelligence: