Integrate Mistral AI models with Portkey’s AI Gateway
Portkey provides a robust and secure gateway for integrating Large Language Models (LLMs), including Mistral AI’s models, into your applications. With Portkey, you get features like fast AI gateway access, observability, prompt management, and more, while securely managing your API keys through the Model Catalog.
```python
from portkey_ai import Portkey

# 1. Install: pip install portkey-ai
# 2. Add the @mistral-ai provider in the Model Catalog
# 3. Use it:
portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@mistral-ai/mistral-large-latest",
    messages=[{"role": "user", "content": "Say this is a test"}]
)
print(response.choices[0].message.content)
```
Tip: You can also set provider="@mistral-ai" in Portkey() and use just model="mistral-large-latest" in the request.
Mistral AI provides a dedicated Codestral endpoint for code generation. Use the `custom_host` parameter to route requests to it:
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@mistral-ai",
    custom_host="https://codestral.mistral.ai/v1"
)

code_completion = portkey.chat.completions.create(
    model="codestral-latest",
    messages=[{"role": "user", "content": "Write a minimalist Python code to validate the proof for the special number 1729"}]
)
print(code_completion.choices[0].message.content)
```
Your Codestral requests will show up in Portkey logs with code snippets rendered beautifully:
Tool calling lets models trigger external tools based on conversation context. You define the available functions, the model chooses when to use them, and your application executes them and returns the results. Portkey supports Mistral tool calling and makes it interoperable across multiple providers. With Portkey Prompts, you can templatize your prompts and tool schemas.
```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@mistral-ai"
)

tools = [{
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}]

response = portkey.chat.completions.create(
    model="mistral-large-latest",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi?"}
    ],
    tools=tools,
    tool_choice="auto"
)
print(response.choices[0].finish_reason)
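When the model decides to call a tool, your application is responsible for running it and sending the result back. The sketch below shows one way to dispatch the tool calls from a response; `get_weather` is a hypothetical stub (a real app would call an actual weather API), and the hard-coded tool-call dict mirrors the shape of a returned tool call for illustration.

```python
import json

# Hypothetical local implementation of the getWeather tool (stubbed values).
def get_weather(location, unit="celsius"):
    return {"location": location, "temperature": 30, "unit": unit}

def execute_tool_calls(tool_calls):
    """Run each tool call the model requested and build the
    'tool' role messages to send back in the next request."""
    results = []
    for call in tool_calls:
        if call["function"]["name"] == "getWeather":
            # Arguments arrive as a JSON string; parse before calling.
            args = json.loads(call["function"]["arguments"])
            result = get_weather(**args)
            results.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": json.dumps(result),
            })
    return results

# Illustrative tool call in the shape returned by the API:
calls = [{
    "id": "call_1",
    "function": {
        "name": "getWeather",
        "arguments": '{"location": "Delhi", "unit": "celsius"}',
    },
}]
print(execute_tool_calls(calls))
```

The resulting `tool` messages are appended to the conversation and sent in a follow-up `chat.completions.create` call so the model can produce its final answer.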
Manage all prompt templates for Mistral AI in the Prompt Library. All current Mistral AI models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to call a saved prompt from your application.
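As a minimal sketch of that interface: the `prompt_id` below is a placeholder for an ID from your own Prompt Library, and the `variables` keys are assumed to match the variables defined in that template.

```python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# YOUR_PROMPT_ID is a placeholder; copy the real ID from the Prompt Library.
response = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",
    variables={"city": "Delhi"},  # hypothetical template variable
)
print(response.choices[0].message.content)
```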