This feature is available on all Portkey plans.
Portkey Follows OpenAI Spec
Portkey API is powered by its battle-tested open-source AI Gateway, which converts all incoming requests to the OpenAI signature and returns OpenAI-compliant responses.

Switching Providers is a Breeze
Integrating Local or Private Models
Portkey can also route to and observe your locally or privately hosted LLMs, as long as the model is compliant with one of the 15+ providers supported by Portkey and the URL is exposed publicly. Simply specify the `custom_host` parameter along with the `provider` name, and Portkey will handle the communication with your local model.
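As a minimal sketch, a request routed through the gateway to a privately hosted model might look like the following. The header names (`x-portkey-provider`, `x-portkey-custom-host`) follow Portkey's header conventions, and the host URL and model name are placeholder assumptions; consult the API reference for exact values:

```python
import json
import urllib.request

# Hypothetical local deployment; note the /v1 version identifier in the URL.
CUSTOM_HOST = "https://my-llm.example.com/v1"

headers = {
    "Content-Type": "application/json",
    "x-portkey-api-key": "PORTKEY_API_KEY",   # placeholder key
    "x-portkey-provider": "openai",           # provider spec the local model follows
    "x-portkey-custom-host": CUSTOM_HOST,     # where Portkey should route the call
}

body = json.dumps({
    "model": "my-local-model",                # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
}).encode()

req = urllib.request.Request(
    "https://api.portkey.ai/v1/chat/completions",
    data=body,
    headers=headers,
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

The request is sent to Portkey as usual; the `custom_host` header tells the gateway to forward it to your own deployment instead of the provider's public endpoint.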
Note: When using `custom_host`, include the version identifier (e.g., `/v1`) in the URL. Portkey will append the actual endpoint path (`/chat/completions`, `/completions`, or `/embeddings`) automatically. (For Ollama models, this works differently. Check here)
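To illustrate how the note plays out, here is a hypothetical helper (not Portkey's actual implementation) showing how the final URL is assembled from a `custom_host` that carries the version identifier:

```python
def resolve_url(custom_host: str, endpoint: str) -> str:
    """Join a custom_host base (which includes /v1) with an endpoint path."""
    return custom_host.rstrip("/") + endpoint

# custom_host supplies the version identifier; the endpoint path is appended.
url = resolve_url("http://localhost:8080/v1", "/chat/completions")
# -> "http://localhost:8080/v1/chat/completions"
```

If you omitted `/v1` from `custom_host`, the gateway would call `http://localhost:8080/chat/completions`, which most OpenAI-compatible servers reject.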