Learn to integrate OpenAI with Portkey, enabling seamless completions, prompt management, and advanced functionalities like streaming, function calling and fine-tuning.
Portkey has native integrations with OpenAI SDKs for Node.js, Python, and its REST APIs. For OpenAI integration with other frameworks, explore our partnerships with Langchain, LlamaIndex, and others.
Provider Slug: `openai`
To integrate the Portkey gateway with OpenAI, set the `baseURL` to the Portkey Gateway URL and add Portkey-specific headers such as `provider`, `apiKey`, `virtualKey`, and others. Here's how to apply it to a chat completion request:
Install the Portkey SDK with npm
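As a minimal sketch (assuming a Portkey API key and an OpenAI virtual key already created in your Portkey dashboard), the install and a first chat completion look roughly like this:

```typescript
// npm install portkey-ai
import Portkey from 'portkey-ai';

// Placeholder credentials; substitute your own.
const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,        // your Portkey API key
  virtualKey: process.env.OPENAI_VIRTUAL_KEY, // virtual key wrapping your OpenAI key
});

const chatCompletion = await portkey.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Say this is a test' }],
});

console.log(chatCompletion.choices[0].message.content);
```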
This request will be automatically logged by Portkey. You can view this in your logs dashboard. Portkey logs the tokens utilized, execution time, and cost for each request. Additionally, you can delve into the details to review the precise request and response data.
Portkey supports OpenAI's new `developer` role in chat completions. With o1 models and newer, the `developer` role replaces the previous `system` role.
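For illustration, a hedged sketch of a request using the `developer` role (the model name and prompts are placeholders, and the client is the one from the quick-start above):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: process.env.OPENAI_VIRTUAL_KEY,
});

// The `developer` role replaces `system` for o1 and newer models.
const response = await portkey.chat.completions.create({
  model: 'o1',
  messages: [
    { role: 'developer', content: 'Answer in one concise sentence.' },
    { role: 'user', content: 'Why is the sky blue?' },
  ],
});

console.log(response.choices[0].message.content);
```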
OpenAI has released a new Responses API that combines the best of both Chat Completions and Assistants APIs. Portkey fully supports this new API, enabling you to use it with both the Portkey SDK and OpenAI SDK.
The Responses API provides a more flexible foundation for building agentic applications with built-in tools that execute automatically.
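As a rough sketch of a Responses API call through Portkey, here is the OpenAI SDK routed through the Portkey gateway (model and prompt are placeholders):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

// OpenAI SDK pointed at the Portkey gateway.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// A basic Responses API request.
const response = await openai.responses.create({
  model: 'gpt-4o',
  input: 'Write a one-line bedtime story about a unicorn.',
});

console.log(response.output_text);
```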
Portkey supports OpenAI's remote MCP tool on its Responses API. Learn More
Portkey allows you to track user IDs passed with the `user` parameter in OpenAI requests, enabling you to monitor user-level costs, requests, and more.
When you include the `user` parameter in your requests, Portkey logs will display the associated user ID, as shown in the image below:
In addition to the `user` parameter, Portkey allows you to send arbitrary custom metadata with your requests. This powerful feature enables you to associate additional context or information with each request, which can be useful for analysis, debugging, or other custom use cases.
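As a sketch of how both pieces can be sent together, here the `user` parameter goes in the request body and custom metadata is attached via `createHeaders` (the metadata keys are made-up examples; adjust them to your own schema):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({
    apiKey: process.env.PORTKEY_API_KEY,
    provider: 'openai',
    // Arbitrary custom metadata attached to requests from this client
    // (illustrative keys; not a prescribed schema).
    metadata: { environment: 'staging', team: 'search' },
  }),
});

const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  user: 'user_12345', // OpenAI's `user` parameter, surfaced in Portkey logs
  messages: [{ role: 'user', content: 'Hello!' }],
});
```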
Portkey also supports creating and managing prompt templates in the prompt library. This enables the collaborative development of prompts directly through the user interface.
Observe how this improves code readability and simplifies prompt updates via the UI without altering the codebase.
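For example, a sketch of rendering a saved prompt template by ID (the prompt ID and variable names are placeholders):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

// Run a prompt template stored in Portkey's prompt library.
// 'pp-my-prompt-id' and the `variables` keys are placeholders.
const promptCompletion = await portkey.prompts.completions.create({
  promptID: 'pp-my-prompt-id',
  variables: { topic: 'observability' },
});

console.log(promptCompletion);
```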
Portkey supports OpenAI’s Realtime API with a seamless integration. This allows you to use Portkey’s logging, cost tracking, and guardrail features while using the Realtime API.
Portkey supports streaming responses using Server-Sent Events (SSE).
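A minimal sketch of a streamed chat completion through Portkey (prompt and model are placeholders):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: process.env.OPENAI_VIRTUAL_KEY,
});

// Request a streamed response and consume the SSE chunks as they arrive.
const stream = await portkey.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Write a haiku about gateways.' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```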
You can also stream responses from the Responses API:
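Roughly, with the OpenAI SDK routed through the Portkey gateway (event handling kept deliberately minimal):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// Stream a Responses API request and print text deltas as they arrive.
const stream = await openai.responses.create({
  model: 'gpt-4o',
  input: 'Tell me a short story about a robot.',
  stream: true,
});

for await (const event of stream) {
  if (event.type === 'response.output_text.delta') {
    process.stdout.write(event.delta);
  }
}
```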
Portkey’s multimodal Gateway fully supports OpenAI vision models as well. See this guide for more info:
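In the meantime, a quick sketch of a vision request via chat completions (the image URL is a placeholder):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: process.env.OPENAI_VIRTUAL_KEY,
});

// Chat completion with an image input.
const visionResponse = await portkey.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What is in this image?' },
        { type: 'image_url', image_url: { url: 'https://example.com/photo.jpg' } },
      ],
    },
  ],
});

console.log(visionResponse.choices[0].message.content);
```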
You can also use the Responses API to process images alongside text:
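A comparable sketch using the Responses API (again via the OpenAI SDK routed through Portkey; the image URL is a placeholder):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// Mixed text and image input in a single Responses API request.
const response = await openai.responses.create({
  model: 'gpt-4o',
  input: [
    {
      role: 'user',
      content: [
        { type: 'input_text', text: 'Describe this image in one sentence.' },
        { type: 'input_image', image_url: 'https://example.com/photo.jpg' },
      ],
    },
  ],
});

console.log(response.output_text);
```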
Function calls within your OpenAI or Portkey SDK operations remain standard. These logs will appear in Portkey, highlighting the utilized functions and their outputs.
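For reference, a sketch of a standard tool/function definition in a chat completion routed through Portkey (the function name and schema are illustrative):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: process.env.OPENAI_VIRTUAL_KEY,
});

// Standard OpenAI-style tool definition.
const response = await portkey.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: "What's the weather in Paris?" }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
});

// Any tool calls appear on the message and are logged in Portkey.
console.log(response.choices[0].message.tool_calls);
```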
Additionally, you can define functions within your prompts and invoke the `portkey.prompts.completions.create` method as above.
The Responses API also supports function calling with the same powerful capabilities:
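Roughly, the same tool expressed for the Responses API, where function tools are defined at the top level of the tool object (schema is illustrative):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

const response = await openai.responses.create({
  model: 'gpt-4o',
  input: "What's the weather in Paris?",
  tools: [
    {
      type: 'function',
      name: 'get_weather',
      description: 'Get the current weather for a city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  ],
});

// Any function calls appear as items in `response.output`.
console.log(response.output);
```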
Please refer to our fine-tuning guides to take advantage of Portkey’s advanced continuous fine-tuning capabilities.
Portkey supports multiple modalities for OpenAI and you can make image generation requests through Portkey’s AI Gateway the same way as making completion calls.
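A brief sketch of an image generation request through the gateway (model and prompt are placeholders):

```typescript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: process.env.OPENAI_VIRTUAL_KEY,
});

// Image generation uses the same client; only the method changes.
const image = await portkey.images.generate({
  model: 'dall-e-3',
  prompt: 'A lighthouse on a foggy coast, watercolor style',
});

console.log(image.data?.[0]?.url);
```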
Portkey's fast AI gateway captures information about each request on your Portkey Dashboard. On the logs screen, you can see this request along with its full request and response data.
Log view for an image generation request on OpenAI
More information on image generation is available in the API Reference.
Portkey's multimodal Gateway also supports the `audio` methods on the OpenAI API. Check out the below guides for more info:
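As one small sketch, a speech-to-text (transcription) call routed through the gateway (the file path is a placeholder):

```typescript
import fs from 'fs';
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// Transcribe a local audio file; './meeting.mp3' is a placeholder path.
const transcription = await openai.audio.transcriptions.create({
  file: fs.createReadStream('./meeting.mp3'),
  model: 'whisper-1',
});

console.log(transcription.text);
```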
Web search delivers accurate, clearly cited answers from the web, using the same tool as search in ChatGPT:
Options for `search_context_size`:

- `high`: Most comprehensive context, higher cost, slower response
- `medium`: Balanced context, cost, and latency (default)
- `low`: Minimal context, lowest cost, fastest response

Responses include citations for URLs found in search results, with clickable references.
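A rough sketch of a web search request on the Responses API (the `web_search_preview` tool name reflects OpenAI's current preview naming and may change):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// Built-in web search tool with a balanced context size.
const response = await openai.responses.create({
  model: 'gpt-4o',
  input: 'What was a positive news story from today?',
  tools: [{ type: 'web_search_preview', search_context_size: 'medium' }],
});

// The answer text carries URL citations as annotations.
console.log(response.output_text);
```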
File search enables quick retrieval from your knowledge base across multiple file types:
This tool requires you to first create a vector store and upload files to it. It supports various file formats, including PDF, DOCX, TXT, and more. Results include file citations in the response.
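Roughly, a file search request looks like this (the vector store ID is a placeholder you create beforehand):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// File search over a previously created vector store ('vs_abc123' is a placeholder).
const response = await openai.responses.create({
  model: 'gpt-4o',
  input: 'What does our refund policy say about digital goods?',
  tools: [{ type: 'file_search', vector_store_ids: ['vs_abc123'] }],
});

console.log(response.output_text);
```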
Control the depth of model reasoning for more comprehensive analysis:
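For instance, a hedged sketch using the `reasoning.effort` parameter with a reasoning model (the model name is illustrative):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

// Ask a reasoning model to spend more effort before answering.
const response = await openai.responses.create({
  model: 'o3-mini',
  input: 'Outline a migration plan from REST to gRPC for a payments service.',
  reasoning: { effort: 'high' },
});

console.log(response.output_text);
```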
Portkey also supports the Computer Use Assistant (CUA) tool, which helps agents control computers or virtual machines through screenshots and actions. This feature is available for select developers as a research preview on premium tiers.
Learn more about the Computer Use tool here
When integrating OpenAI with Portkey, you can specify your OpenAI organization and project IDs along with your API key. This is particularly useful if you belong to multiple organizations or are accessing projects through a legacy user API key.
Specifying the organization and project IDs helps you maintain better control over your access rules, usage, and costs.
In Portkey, you can add your Org & Project details while creating your virtual key, in the gateway config, or directly in your requests.
Let’s explore each method in more detail.
When selecting OpenAI from the dropdown menu while creating a virtual key, Portkey automatically displays optional fields for the organization ID and project ID alongside the API key field.
Get your OpenAI API key from here, then add it to Portkey to create the virtual key that can be used throughout Portkey.
Portkey takes budget management a step further than OpenAI. While OpenAI allows setting budget limits per project, Portkey enables you to set budget limits for each virtual key you create. For more information on budget limits, refer to this documentation:
You can also specify the organization and project details in the gateway config, either at the root level or within a specific target.
You can also pass your organization and project details directly when making a request using curl, the OpenAI SDK, or the Portkey SDK.
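As a sketch of the direct approach with the OpenAI SDK routed through Portkey (the organization and project IDs are placeholders):

```typescript
import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

// Organization and project IDs below are placeholders.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  organization: 'org-xxxxxxxxxxxx',
  project: 'proj_xxxxxxxxxxxx',
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ apiKey: process.env.PORTKEY_API_KEY, provider: 'openai' }),
});

const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello from a specific org and project!' }],
});
```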
Portkey supports its full range of functionality via the OpenAI SDK, so you don't need to migrate away from it.
Please find more information in the relevant sections: