Understanding Azure AI Foundry Deployments
Azure AI Foundry offers three different ways to deploy models, each with unique endpoints and configurations:
- AI Services: Azure-managed models accessed through Azure AI Services endpoints
- Managed: User-managed deployments running on dedicated Azure compute resources
- Serverless: Seamless, scalable deployment without managing infrastructure
OpenAI models on Azure
If you’re specifically looking to use OpenAI models on Azure, you should use Azure OpenAI instead, which is optimized for OpenAI models.
Integrate
To integrate Azure AI Foundry with Portkey, you’ll need to create a virtual key. Integrations securely store your Azure AI Foundry credentials in Portkey’s vault, allowing you to use a simple identifier in your code instead of handling sensitive authentication details directly. Navigate to the Integrations section in Portkey and select “Azure AI Foundry” as your provider.
Creating Your Azure AI Foundry Integration
Integrate Azure AI Foundry with Portkey to centrally manage your AI models and deployments. This guide walks you through setting up the integration using API key authentication.
Prerequisites
Before creating your integration, you’ll need:
- An active Azure AI Foundry account
- Access to your Azure AI Foundry portal
- A deployed model on Azure AI Foundry
Step 1: Start Creating Your Integration
Navigate to the Integrations page in your Portkey dashboard and select Azure AI Foundry as your provider.
Step 2: Configure Integration Details
Fill in the basic information for your integration:
- Name: A descriptive name for this integration (e.g., “Azure AI Production”)
- Short Description: Optional context about this integration’s purpose
- Slug: A unique identifier used in API calls (e.g., “azure-ai-prod”)
Step 3: Set Up Authentication
Portkey supports three authentication methods for Azure AI Foundry. For most use cases, we recommend using the Default (API Key) method.
Gather Your Azure Credentials
From your Azure AI Foundry portal, you’ll need to collect:
- Navigate to your model deployment in Azure AI Foundry
- Click on the deployment to view details
- Copy the API Key from the authentication section
- Copy the Target URI - this is your endpoint URL
- Note the API Version from your deployment URL
- Azure Deployment Name (Optional): Only required for Managed Services deployments
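The Target URI you copy carries most of these values. A small sketch of pulling them apart with Node’s standard `URL` API; the resource name and API version below are hypothetical examples, not values from your account:

```javascript
// A sample Azure AI Foundry Target URI (hypothetical values) and the
// pieces Portkey asks for, separated with the standard URL API.
const targetUri =
  'https://my-resource.services.ai.azure.com/models/chat/completions?api-version=2024-05-01-preview';

const url = new URL(targetUri);
const endpoint = `${url.origin}${url.pathname}`;          // endpoint URL field
const apiVersion = url.searchParams.get('api-version');   // API version field

console.log(endpoint);
console.log(apiVersion);
```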
Enter Credentials in Portkey

Adding Multiple Models to Your Azure AI Foundry Integration
You can deploy multiple models through a single Azure AI Foundry integration by using Portkey’s custom models feature.
Steps to Add Additional Models
- Navigate to your Azure AI Foundry integration in Portkey
- Select the Model Provisioning step
- Click Add Model in the top-right corner

Configure Your Model
Enter the following details for your Azure deployment:
- Model Slug: Use your Azure Model Deployment name exactly as it appears in Azure AI Foundry (e.g., gpt-4 for GPT-4 deployments)
This is just for reference. If you can’t find the particular model, you can choose a similar model.
Sample Request
Once you’ve created your virtual key, you can start making requests to Azure AI Foundry models through Portkey.
Install the Portkey SDK with npm
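A minimal chat-completion sketch with the Node SDK; the integration slug (azure-ai-prod) and deployment name (gpt-4) are placeholders for your own values:

```javascript
// Install first: npm install portkey-ai
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: 'azure-ai-prod', // your integration slug
});

const response = await portkey.chat.completions.create({
  model: 'gpt-4', // your Azure deployment name
  messages: [{ role: 'user', content: 'Say hello!' }],
});

console.log(response.choices[0].message.content);
```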
Advanced Features
Function Calling
Azure AI Foundry supports function calling (tool calling) for compatible models. Here’s how to implement it with Portkey:
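A sketch of a tool-calling request, reusing the placeholder slug and deployment name from the sample request; the get_weather tool is hypothetical:

```javascript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: 'azure-ai-prod', // placeholder integration slug
});

// Describe the function the model may call; get_weather is hypothetical.
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get the current weather for a location',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City name' },
        },
        required: ['location'],
      },
    },
  },
];

const response = await portkey.chat.completions.create({
  model: 'gpt-4', // placeholder deployment name
  messages: [{ role: 'user', content: "What's the weather in Paris?" }],
  tools,
  tool_choice: 'auto',
});

// For a compatible model, the reply may contain tool calls to execute.
console.log(response.choices[0].message.tool_calls);
```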
Vision Capabilities
Process images alongside text using Azure AI Foundry’s vision capabilities:
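A sketch of a mixed text-and-image request in the OpenAI-compatible message format; the slug, deployment name, and image URL are placeholders, and the deployment must be a vision-capable model:

```javascript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: 'azure-ai-prod', // placeholder integration slug
});

const response = await portkey.chat.completions.create({
  model: 'gpt-4', // must be a vision-capable deployment
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What is in this image?' },
        {
          type: 'image_url',
          image_url: { url: 'https://example.com/photo.jpg' }, // placeholder
        },
      ],
    },
  ],
});

console.log(response.choices[0].message.content);
```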
Structured Outputs
Get consistent, parseable responses in specific formats:
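A sketch using the OpenAI-compatible `response_format` parameter to request JSON matching a schema, for models that support it; the schema and field names are illustrative:

```javascript
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  virtualKey: 'azure-ai-prod', // placeholder integration slug
});

const response = await portkey.chat.completions.create({
  model: 'gpt-4', // placeholder deployment name
  messages: [{ role: 'user', content: 'Extract: "Ada, 36, London"' }],
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'person', // illustrative schema
      schema: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          age: { type: 'integer' },
          city: { type: 'string' },
        },
        required: ['name', 'age', 'city'],
      },
    },
  },
});

// The message content is a JSON string conforming to the schema.
console.log(JSON.parse(response.choices[0].message.content));
```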
Relationship with Azure OpenAI
For Azure OpenAI-specific models and deployments, we recommend using the existing Azure OpenAI provider in Portkey:
Azure OpenAI Integration
Learn how to integrate Azure OpenAI with Portkey for access to OpenAI models hosted on Azure.
Portkey Features with Azure AI Foundry
Setting Up Fallbacks
Create fallback configurations to ensure reliability when working with Azure AI Foundry models:
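A gateway config sketch: try the Azure AI Foundry integration first and fall back to a second provider on failure. The virtual keys are placeholders:

```javascript
// Portkey gateway config: targets are tried in order until one succeeds.
// "azure-ai-prod" and "openai-prod" are placeholder virtual keys.
const fallbackConfig = {
  strategy: { mode: 'fallback' },
  targets: [
    { virtual_key: 'azure-ai-prod' },
    { virtual_key: 'openai-prod' },
  ],
};

// Pass it when constructing the client, e.g.
//   new Portkey({ apiKey: '...', config: fallbackConfig })
console.log(JSON.stringify(fallbackConfig, null, 2));
```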
Load Balancing Between Models
Distribute requests across multiple models for optimal performance:
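A gateway config sketch that splits traffic between two deployments by weight; both virtual keys are placeholders:

```javascript
// Portkey gateway config: requests are distributed across targets in
// proportion to their weights. Virtual keys are placeholders.
const loadbalanceConfig = {
  strategy: { mode: 'loadbalance' },
  targets: [
    { virtual_key: 'azure-ai-prod', weight: 0.7 },
    { virtual_key: 'azure-ai-backup', weight: 0.3 },
  ],
};

console.log(JSON.stringify(loadbalanceConfig, null, 2));
```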
Conditional Routing
Route requests based on specific conditions like user type or content requirements:
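A gateway config sketch that routes paid users to one target and everyone else to a default; the metadata key, target names, and virtual keys are placeholders:

```javascript
// Portkey gateway config: conditions are evaluated against request
// metadata; "then" / "default" point at named targets. All names and
// virtual keys below are placeholders.
const conditionalConfig = {
  strategy: {
    mode: 'conditional',
    conditions: [
      {
        query: { 'metadata.user_type': { $eq: 'paid' } },
        then: 'azure-premium',
      },
    ],
    default: 'azure-standard',
  },
  targets: [
    { name: 'azure-premium', virtual_key: 'azure-ai-prod' },
    { name: 'azure-standard', virtual_key: 'azure-ai-basic' },
  ],
};

console.log(JSON.stringify(conditionalConfig, null, 2));
```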
Managing Prompts with Azure AI Foundry
You can manage all prompts to Azure AI Foundry in the Prompt Library. Once you’ve created and tested a prompt in the library, use the portkey.prompts.completions.create interface to use the prompt in your application.
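A sketch of calling a saved prompt; the prompt ID (pp-azure-welcome) and the user_name variable are placeholders for values from your own Prompt Library:

```javascript
import Portkey from 'portkey-ai';

const portkey = new Portkey({ apiKey: process.env.PORTKEY_API_KEY });

// "pp-azure-welcome" is a hypothetical prompt ID from your Prompt Library;
// variables fill the template you defined there.
const promptCompletion = await portkey.prompts.completions.create({
  promptID: 'pp-azure-welcome',
  variables: { user_name: 'Ada' },
});

console.log(promptCompletion);
```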
Next Steps
Explore these additional resources to make the most of your Azure AI Foundry integration with Portkey:
Add Metadata
Learn how to add custom metadata to your Azure AI Foundry requests.
Gateway Configs
Configure advanced gateway features for your Azure AI Foundry requests.
Request Tracing
Trace your Azure AI Foundry requests for better observability.
Setup Fallbacks
Create fallback configurations between different providers.