Portkey’s virtual key system allows you to securely store your LLM API keys in our vault, utilizing a unique virtual identifier to streamline API key management.
We are upgrading the Virtual Key experience with the Model Catalog feature.
With Model Catalog, you can now reference a provider (formerly a virtual key) directly with the `model` param in your LLM requests, along with several other benefits.
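As a minimal sketch of what this looks like with the Portkey Python SDK, assuming a provider slug named `@openai-dev` and the `@provider/model` string format from the Model Catalog docs (both are placeholders here):

```python
# Hypothetical sketch: referencing a provider directly via the model param.
# "@openai-dev" is a placeholder provider slug from the Model Catalog.
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

completion = portkey.chat.completions.create(
    model="@openai-dev/gpt-4o",  # provider slug + model name (assumed format)
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)
```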
Your virtual keys can be managed within your account under the “Virtual Keys” tab.
Tip: You can register multiple keys for one provider or use different names for the same key for easy identification.
Azure Virtual Keys allow you to manage multiple Azure deployments under a single virtual key. This feature simplifies API key management and enables flexible usage of different Azure OpenAI models. You can create multiple deployments under the same resource group and manage them using a single virtual key.
Configure Multiple Azure Deployments
To use a specific deployment, simply pass its alias as the `model` in the LLM request body. If the model is left empty or the specified alias does not exist, the default deployment is used.
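For example, a rough sketch with the Portkey Python SDK, assuming a virtual key named `azure-virtual-key` that groups several deployments and a deployment alias `gpt-4o-prod` (both names are placeholders):

```python
# Sketch: selecting an Azure deployment by its alias via the model param.
# "azure-virtual-key" and "gpt-4o-prod" are placeholder names.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="azure-virtual-key",  # virtual key grouping multiple Azure deployments
)

completion = portkey.chat.completions.create(
    model="gpt-4o-prod",  # deployment alias; omit to fall back to the default deployment
    messages=[{"role": "user", "content": "Hello!"}],
)
```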
Your API keys are encrypted and stored in secure vaults, accessible only at the moment of a request. Decryption is performed exclusively in isolated workers and only when necessary, ensuring the highest level of data security.
We randomly generate virtual keys and link them separately to the securely stored keys. This means your raw API keys cannot be reverse-engineered from the virtual keys.
Add the virtual key directly to the initialization configuration for Portkey.
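A minimal sketch with the Portkey Python SDK (the virtual key value is a placeholder):

```python
from portkey_ai import Portkey

# Initialize the Portkey client with a virtual key.
portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY",  # placeholder: your provider's virtual key
)

completion = portkey.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```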
Alternatively, you can override the virtual key during the completions call as follows:
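One way to do this is a per-request override, sketched below assuming the SDK's `with_options` helper (verify against the current SDK reference; `VIRTUAL_KEY_2` is a placeholder):

```python
# Sketch: override the virtual key for a single request.
# Assumes the SDK's with_options helper; "VIRTUAL_KEY_2" is a placeholder.
completion = portkey.with_options(virtual_key="VIRTUAL_KEY_2").chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```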
Add the virtual key directly to the initialization configuration for the OpenAI client.
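A minimal sketch with the OpenAI Python SDK routed through Portkey's gateway (the virtual key value is a placeholder):

```python
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Point the OpenAI client at Portkey's gateway and attach the virtual key.
client = OpenAI(
    api_key="X",  # placeholder; the real provider key is resolved from the virtual key
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="VIRTUAL_KEY",
    ),
)

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```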
Alternatively, you can override the virtual key during the completions call as follows:
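For example, a sketch using the OpenAI SDK's per-request `extra_headers` option to attach a different virtual key for a single call (`VIRTUAL_KEY_2` is a placeholder):

```python
# Sketch: override the virtual key for a single call via per-request headers.
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="VIRTUAL_KEY_2",  # placeholder: a different provider's virtual key
    ),
)
```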
Portkey supports creating virtual keys for your privately hosted LLMs, allowing you to manage them alongside commercial providers.
This allows you to use your self-hosted models with all Portkey features including observability, reliability, and access control.
Configure Self-Hosted LLM
For more details, see Bring Your Own LLM.
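Once configured, a virtual key for a self-hosted model is used exactly like any other. A sketch with the Portkey Python SDK, where `self-hosted-llm` and the model name are placeholders:

```python
from portkey_ai import Portkey

# Sketch: a self-hosted model behind a virtual key is called like any other provider.
# "self-hosted-llm" and the model name are placeholders.
portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="self-hosted-llm",
)

completion = portkey.chat.completions.create(
    model="llama-3-8b-instruct",
    messages=[{"role": "user", "content": "Hello!"}],
)
```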
Portkey provides a simple way to set budget limits for any of your virtual keys and helps you manage your spending on AI providers (and LLMs) - giving you confidence and control over your application’s costs.
Choose your Virtual Key within Portkey’s prompt templates, and it will be automatically retrieved and ready for use.
Set the virtual key when utilizing Portkey’s custom LLM as shown below:
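The exact class depends on the framework integration you are using; as a hedged sketch, here is the equivalent pattern with LangChain's `ChatOpenAI` routed through Portkey's gateway, where the virtual key is supplied via headers (all names are placeholders):

```python
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Sketch: attach the virtual key via Portkey headers on an OpenAI-compatible chat model.
llm = ChatOpenAI(
    api_key="X",  # placeholder; the provider key is resolved from the virtual key
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(
        api_key="PORTKEY_API_KEY",
        virtual_key="VIRTUAL_KEY",
    ),
)

print(llm.invoke("Hello!").content)
```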