Execute your saved prompt templates on Portkey
The Portkey Prompts API completely follows the OpenAI schema for both requests and responses, making it a drop-in replacement for your existing Chat or Completions calls.
Send Variables
Create your Prompt Template on the Portkey UI, define variables, and pass them with this API:
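As an illustrative sketch, here is how such a request body can be built in Python. The prompt ID and variable names are placeholders, and the URL assumes Portkey's standard prompt-completions route:

```python
import json

# Placeholder prompt ID -- use your own template's ID from the Portkey UI.
PROMPT_ID = "pp-welcome-note-abc123"

# Variables defined in the template are passed under the "variables" key.
payload = {
    "variables": {
        "customer_name": "Alice",
        "product": "Acme Widget",
    }
}

url = f"https://api.portkey.ai/v1/prompts/{PROMPT_ID}/completions"
body = json.dumps(payload)
print(url)
print(body)
```

An actual call would POST this body to the URL with your Portkey API key header (for example via requests.post(url, json=payload, headers=...)).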
Override Prompt Settings
You can override any model hyperparameter saved in the prompt template by sending its new value at the time of making a request:
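For example, a minimal sketch of an override (the parameter values here are arbitrary): the new hyperparameter values simply ride along at the root of the same request body, next to the variables.

```python
import json

# Root-level hyperparameters override whatever is saved in the template.
payload = {
    "variables": {"topic": "quarterly pricing"},
    "temperature": 0.2,   # overrides the template's saved temperature
    "max_tokens": 256,    # overrides the template's saved max_tokens
}

print(json.dumps(payload, indent=2))
```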
Call Specific Prompt Version
Passing the {promptId} alone always calls the Published version of your prompt. But you can also call a specific template version by appending its version number, like {promptId@12}:
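A small sketch of how the suffix changes the request path (the prompt ID below is a placeholder):

```python
# Placeholder prompt ID and the standard prompt-completions route.
PROMPT_ID = "pp-welcome-note-abc123"
BASE = "https://api.portkey.ai/v1/prompts"

published = f"{BASE}/{PROMPT_ID}/completions"         # no suffix -> Published version
pinned = f"{BASE}/{PROMPT_ID}@12/completions"         # specific version 12
latest = f"{BASE}/{PROMPT_ID}@latest/completions"     # latest version

print(published)
print(pinned)
print(latest)
```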
Version Tags:
@latest: Calls the latest version
@{NUMBER} (like @12): Calls the specified version number
No Suffix: Here, Portkey defaults to the Published version
Streaming
Prompts API also supports streaming responses, and completely follows the OpenAI schema. Set stream: true explicitly in your request to enable streaming. The promptId path parameter is the unique identifier of the prompt template to use.
Note: Although hyperparameters are shown grouped here (like messages, max_tokens, temperature, etc.), they should only be passed at the root level, alongside 'variables' and 'stream'.
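Putting the pieces together, a hedged sketch of a request body with everything at the root level, as the note above describes (the variable name and temperature value are arbitrary):

```python
import json

# Variables, stream, and hyperparameters all sit at the root of the body.
payload = {
    "variables": {"question": "What does this template do?"},
    "stream": True,       # enable server-sent streaming chunks
    "temperature": 0.7,   # root-level hyperparameter, not nested
}

print(json.dumps(payload))
```

With streaming enabled, the response arrives as OpenAI-style server-sent-event chunks, so a real client would iterate over the response stream rather than parse a single JSON object.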
Successful completion response
The response is of type object.