
Quick Start

Get started with Snowflake Cortex in under 2 minutes:
# 1. Install: pip install portkey-ai
# 2. Add the @cortex provider in the Model Catalog
# 3. Use it:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@cortex/claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Add Provider in Model Catalog

Before making requests, add Snowflake Cortex to your Model Catalog:
  1. Go to Model Catalog → Add Provider
  2. Select Snowflake Cortex
  3. Enter your Snowflake API key or JWT token
  4. Name your provider (e.g., cortex)
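The provider name you choose in step 4 becomes the prefix of the model slug used in requests. A minimal sketch of how the slug is composed, assuming the provider was named cortex:

```python
# The provider name from the Model Catalog, prefixed with "@" and joined
# to a model name, forms the model slug passed to chat.completions.create.
provider_name = "cortex"          # the name chosen in step 4
model_name = "claude-3-5-sonnet"  # a model served by Snowflake Cortex
model_slug = f"@{provider_name}/{model_name}"
# → "@cortex/claude-3-5-sonnet"
```

If you named the provider something else (e.g., snowflake-prod), use that name in the slug instead.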

Complete Setup Guide

See all setup options and detailed configuration instructions

Supported Models

Snowflake Cortex provides access to various AI models through the Snowflake platform:
  • Claude (Anthropic)
  • Llama (Meta)
  • Mistral
  • And other popular models
Check the Snowflake Cortex documentation for the complete model list.
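Because every model is reached through the same OpenAI-compatible interface, switching models only means changing the slug. A sketch of building request payloads for several models; the Llama and Mistral slugs below are illustrative, so confirm exact names in the Snowflake Cortex documentation:

```python
def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat payload, as passed to
    portkey.chat.completions.create(**payload)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

requests = [
    build_request(slug, "Hello!")
    for slug in (
        "@cortex/claude-3-5-sonnet",  # Claude (Anthropic)
        "@cortex/llama3.1-70b",       # Llama (Meta) — illustrative name
        "@cortex/mistral-large2",     # Mistral — illustrative name
    )
]
```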

Next Steps

Gateway Configs

Add fallbacks, load balancing, and more
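A hedged sketch of a gateway config that falls back from one Cortex model to another. The strategy/targets shape follows Portkey's config schema; the fallback model slug is hypothetical:

```python
import json

# Gateway config: try the first target, fall back to the second on failure.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"override_params": {"model": "@cortex/claude-3-5-sonnet"}},
        {"override_params": {"model": "@cortex/mistral-large2"}},  # hypothetical fallback
    ],
}
config_json = json.dumps(config)
```

A config like this can be attached to requests via the Portkey client or saved in the Portkey dashboard and referenced by its config ID; see the Gateway Configs docs for the exact wiring.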

Observability

Monitor and trace your Cortex requests
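Related requests can be grouped in Portkey's logs by sending a shared trace ID. A minimal sketch using the x-portkey-trace-id request header; the trace ID value here is illustrative:

```python
# Requests carrying the same trace id are grouped into one trace
# in Portkey's observability views.
trace_headers = {"x-portkey-trace-id": "cortex-onboarding-run-1"}
```

The SDK also exposes trace IDs as a client option; check the SDK Reference for the exact parameter.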

Prompt Library

Manage and version your prompts

Metadata

Add custom metadata to requests
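Custom metadata can be attached to a request as a JSON string in the x-portkey-metadata header, which then becomes filterable in Portkey's logs. A sketch with illustrative keys:

```python
import json

# Metadata travels as a JSON string in the x-portkey-metadata header.
# The keys below are illustrative; "_user" is a Portkey-recognized key.
metadata = {"_user": "user_123", "environment": "staging"}
headers = {"x-portkey-metadata": json.dumps(metadata)}
```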
For complete SDK documentation:

SDK Reference

Complete Portkey SDK documentation
Last modified on February 9, 2026