Add enterprise-grade observability, cost tracking, and governance to your Roo AI coding assistant
Roo is an AI coding assistant that integrates directly into your VS Code environment, providing autonomous coding capabilities. While Roo offers powerful AI assistance for development tasks, Portkey adds the enterprise controls you need for production deployments: observability, cost tracking, access management, and governance.
This guide will walk you through integrating Portkey with Roo and setting up essential enterprise features including usage tracking, access controls, and budget management.
If you are an enterprise looking to standardize Roo usage across your development teams, check out this section.
Portkey allows you to use 250+ LLMs with your Roo setup, with minimal configuration required. Let’s set up the core components in Portkey that you’ll need for integration.
Create Virtual Key
Virtual Keys are Portkey’s secure way to manage your LLM provider API keys. Think of them like disposable credit cards for your LLM API keys, providing essential controls like budget limits, rate limits, and usage tracking.
To create a virtual key, go to Virtual Keys in the Portkey App, add your provider's API key, and save. Copy the virtual key ID - you’ll need it for the next step.
Create Default Config
Configs in Portkey are JSON objects that define how your requests are routed. They help with implementing features like advanced routing, fallbacks, and retries.
We need to create a default config to route our requests to the virtual key created in Step 1.
To create your config:
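For example, a minimal config that simply routes every request to the virtual key from Step 1 looks like this (replace the placeholder with your own virtual key ID):

```json
{
  "virtual_key": "YOUR_VIRTUAL_KEY_ID"
}
```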
This basic config connects to your virtual key. You can add more advanced Portkey features later.
Configure Portkey API Key
Now create a Portkey API key and attach the config you created in Step 2: in the Portkey App, generate a new API key and select that config as its default.
Save your API key securely - you’ll need it for Roo integration.
Now that you have your Portkey components set up, let’s connect them to Roo. Since Portkey provides OpenAI API compatibility, integration is straightforward and requires just a few configuration steps in your VS Code settings.
You need the Portkey API key you created above before going further.
This method uses the default config you created in Portkey, making it easier to manage model settings centrally.
In Roo's settings in VS Code, configure:
API Provider: OpenAI Compatible
Base URL: https://api.portkey.ai/v1
API Key: your Portkey API key
Model ID: dummy (since the model is defined in your Portkey config)

Using a default config with override_params is recommended, as it allows you to manage all model settings centrally in Portkey, reducing maintenance overhead.
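A sketch of such a config, with the model set via override_params (the model name and parameters shown are placeholders you can change centrally in Portkey at any time):

```json
{
  "virtual_key": "YOUR_VIRTUAL_KEY_ID",
  "override_params": {
    "model": "gpt-4o",
    "temperature": 0.2
  }
}
```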
If you prefer more direct control or need to use multiple providers dynamically, you can pass Portkey headers directly:
Configure the basic settings as in Method 1:
API Provider: OpenAI Compatible
Base URL: https://api.portkey.ai/v1
API Key: your Portkey API key
Model ID: the model you want to use (e.g. gpt-4o, claude-3-opus-20240229)

Then add your Portkey headers by clicking the + button in the Custom Headers section. Required headers identify which provider or virtual key to route to; optional headers can attach a config or metadata to each request.
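As an illustration, a typical header setup might look like the sketch below. These are Portkey's standard request headers, but the values are placeholders, and which headers you actually need depends on whether you route by virtual key, saved config, or provider:

```
x-portkey-api-key: YOUR_PORTKEY_API_KEY
x-portkey-virtual-key: YOUR_VIRTUAL_KEY_ID
x-portkey-config: YOUR_CONFIG_ID
x-portkey-metadata: {"_user": "dev-alice"}
```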
Custom headers give you flexibility but require updating headers in Roo whenever you want to change providers or models.
You can now use Roo with all of Portkey’s enterprise features enabled. Monitor your requests and usage in the Portkey Dashboard.
Why Enterprise Governance? When deploying Roo across development teams in your organization, you need to consider several governance aspects: cost management, model access control, usage visibility, and security and compliance.
Portkey adds a comprehensive governance layer to address these enterprise needs. Let’s implement these controls step by step.
Enterprise Implementation Guide
Step 1: Implement Budget Controls & Rate Limits
Virtual Keys enable granular control over LLM access at the team/developer level. This helps you set separate budget limits and rate limits for each team or developer and track their usage independently.
Step 2: Define Model Access Rules
As your development team scales, controlling which developers can access specific models becomes crucial. Portkey Configs provide this control layer with features like pinning approved models, routing between providers, and fallbacks.
Create your config on the Configs page in your Portkey dashboard.
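For instance, a config that pins developers to approved models and falls back from one provider to another might look like the sketch below (the virtual key names are placeholders for keys you have already created):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    {
      "virtual_key": "OPENAI_VIRTUAL_KEY",
      "override_params": { "model": "gpt-4o" }
    },
    {
      "virtual_key": "ANTHROPIC_VIRTUAL_KEY",
      "override_params": { "model": "claude-3-opus-20240229" }
    }
  ]
}
```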
Configs can be updated anytime to adjust controls without developers needing to update their Roo settings.
Step 3: Implement Developer Access Controls
Create developer-specific API keys that automatically track usage per developer, apply your chosen config and access rules, and attach metadata to every request for analytics.
Create API keys through the Portkey App or programmatically via the Admin API.
Example using Python SDK:
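The snippet below is a minimal sketch assuming the Admin API's key-creation endpoint as exposed by the Portkey Python SDK; the method and parameter names are illustrative, so check the API Keys documentation for the exact interface:

```python
# Illustrative sketch: create a scoped API key for one developer via the
# Portkey Admin API. Method and parameter names may differ from the
# current SDK; see the API Keys documentation.
from portkey_ai import Portkey

# Admin-level Portkey client
portkey = Portkey(api_key="YOUR_ADMIN_API_KEY")

developer_key = portkey.api_keys.create(
    name="dev-alice-roo",                # human-readable key name
    workspace_id="YOUR_WORKSPACE_ID",    # workspace the key belongs to
    defaults={
        "config_id": "YOUR_CONFIG_ID",   # config from Step 2 (models, routing)
        "metadata": {"_user": "dev-alice", "department": "engineering"},
    },
)

print(developer_key)  # distribute the returned key to the developer
```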
For detailed key management instructions, see our API Keys documentation.
Step 4: Deploy & Monitor
After distributing API keys to your developers, your enterprise-ready Roo setup is complete. Each developer can now use their designated API key with the appropriate access levels and budget controls.
Monitor usage in the Portkey dashboard: cost, token usage, and latency broken down by developer, team, or project.
Roo now has centralized access to 250+ LLMs, per-developer budget and rate limits, detailed usage analytics, and enterprise-grade governance controls.
Now that you have an enterprise-grade Roo setup, let’s explore the comprehensive features Portkey provides to ensure secure, efficient, and cost-effective AI-assisted development.
Using Portkey you can track 40+ key metrics including cost, token usage, response time, and performance across all your LLM providers in real time. Filter these metrics by developer, team, or project using custom metadata.
Portkey’s logging dashboard provides detailed logs for every request made by Roo. These logs include the full request and response, the model used, token counts, cost, latency, and any custom metadata.
Easily switch between 250+ LLMs for different coding tasks. Use GPT-4 for complex architecture decisions, Claude for detailed code reviews, or specialized models for specific languages - all through a single interface.
Track coding patterns and productivity metrics with custom metadata:
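For instance, you might tag each request with the developer, project, and task type. Apart from _user, which Portkey treats as the user identifier, the field names below are purely illustrative:

```json
{
  "_user": "dev-alice",
  "project": "checkout-service",
  "task_type": "refactoring"
}
```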
Set and manage spending limits per developer or team. Prevent budget overruns with automatic cutoffs.
Enterprise-grade SSO integration for seamless developer onboarding and offboarding.
Hierarchical structure with teams, projects, and role-based access control for development organizations.
Comprehensive audit logging for security compliance and code generation tracking.
Automatically switch between models if one fails, ensuring uninterrupted coding.
Route requests based on code complexity or language requirements.
Distribute requests across multiple API keys or providers.
Cache common code patterns to reduce costs and improve response times.
Automatic retry handling for failed requests with exponential backoff.
Enforce spending limits to control development costs.
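Several of these reliability features are expressed directly in a Portkey config. A minimal sketch combining retries and caching with a virtual key (treat the exact keys and values as illustrative and confirm them against the config builder in your dashboard):

```json
{
  "virtual_key": "YOUR_VIRTUAL_KEY_ID",
  "retry": { "attempts": 3 },
  "cache": { "mode": "simple", "max_age": 3600 }
}
```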
Protect your codebase and enhance security with real-time checks on AI interactions:
Implement real-time protection for your development environment with automatic detection and filtering of sensitive code, credentials, and security vulnerabilities.
How do I track costs per developer?
Portkey provides several ways to track developer costs: issue each developer their own API key, attach developer and team metadata to requests, and filter the analytics dashboard by developer, team, or project.
What happens if a developer exceeds their budget?
When a developer reaches their budget limit, further requests through their key are automatically cut off until an admin adjusts the limit.
Can I use Roo with local or self-hosted models?
Yes! Portkey supports local models through Ollama and other self-hosted solutions. Configure your local endpoint as a custom provider in Portkey and use it with Roo just like any other provider.
How do I ensure code security with AI assistance?
Portkey provides multiple security layers: real-time guardrails that detect and filter sensitive code and credentials, role-based access controls, and comprehensive audit logs.
Join our Community
Schedule a 1:1 call with our team to see how Portkey can transform your development workflow with Roo. Get personalized recommendations for your team’s specific needs.
For enterprise support and custom features for your development teams, contact our enterprise team.