Quick Start

Get started with Workers AI in under 2 minutes:
from portkey_ai import Portkey

# 1. Install: pip install portkey-ai
# 2. Add @workers-ai provider in model catalog
# 3. Use it:

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@workers-ai/@cf/meta/llama-3.2-3b-instruct",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Add Provider in Model Catalog

Before making requests, add Workers AI to your Model Catalog:
  1. Go to Model Catalog → Add Provider
  2. Select Workers AI
  3. Enter your Cloudflare API key
  4. Name your provider (e.g., workers-ai)
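Once the provider is saved, the name you gave it in step 4 becomes the slug you prefix to the upstream model id in the `model` parameter. A small helper makes the pattern explicit (illustrative only; `catalog_model` is not part of the Portkey SDK):

```python
def catalog_model(provider_slug: str, model_id: str) -> str:
    """Build the model string Portkey expects: the catalog provider
    slug (prefixed with '@') followed by the upstream model id.

    `provider_slug` is whatever name you chose in step 4 above.
    """
    return f"@{provider_slug}/{model_id}"

# With the provider named "workers-ai" as in step 4:
print(catalog_model("workers-ai", "@cf/meta/llama-3.2-3b-instruct"))
# @workers-ai/@cf/meta/llama-3.2-3b-instruct
```

This is why the Quick Start's `model` value contains two `@` signs: one for the catalog slug, one that is part of Cloudflare's own model id.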

Complete Setup Guide

See all setup options and detailed configuration instructions

Workers AI Capabilities

Image Generation

Workers AI supports image generation with various models:
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY", provider="@workers-ai")

image = portkey.images.generate(
    model="@cf/stabilityai/stable-diffusion-xl-base-1.0",
    prompt="A beautiful sunset over mountains"
)

print(image.data[0].url)
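Depending on the model and response settings, each entry in `response.data` may carry a URL (as above) or an inline base64 payload. The sketch below saves an inline payload to disk; `save_image_item` is an illustrative helper, not part of the Portkey SDK, and it assumes the item exposes a `b64_json` attribute when inline data is returned:

```python
import base64

def save_image_item(item, path):
    """Save one entry of response.data to disk (illustrative helper).

    Assumes the item exposes either `b64_json` (inline base64 image
    bytes) or `url`. Returns the path on success, None if there is no
    inline payload and the caller should download `item.url` instead.
    """
    b64 = getattr(item, "b64_json", None)
    if b64:
        with open(path, "wb") as f:
            f.write(base64.b64decode(b64))
        return path
    return None
```

Usage: `save_image_item(image.data[0], "sunset.png")`.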

Supported Models

Cloudflare Workers AI provides serverless AI inference with various models.

Chat Models:
  • @cf/meta/llama-3.2-3b-instruct
  • @cf/meta/llama-3.1-8b-instruct
  • @cf/mistral/mistral-7b-instruct-v0.1
Image Generation Models:
  • @cf/stabilityai/stable-diffusion-xl-base-1.0
Check Cloudflare Workers AI documentation for the complete model list.
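One way to avoid scattering model ids through application code is to keep the models above in a small registry and pick by task. This is a sketch, not an SDK feature, and the ids should be checked against Cloudflare's current model list:

```python
# Registry of the Workers AI models listed above, keyed by task.
WORKERS_AI_MODELS = {
    "chat": [
        "@cf/meta/llama-3.2-3b-instruct",
        "@cf/meta/llama-3.1-8b-instruct",
        "@cf/mistral/mistral-7b-instruct-v0.1",
    ],
    "image": [
        "@cf/stabilityai/stable-diffusion-xl-base-1.0",
    ],
}

def default_model(task: str) -> str:
    """Return the first listed model for a task ('chat' or 'image')."""
    try:
        return WORKERS_AI_MODELS[task][0]
    except KeyError:
        raise ValueError(f"unknown task: {task!r}") from None

print(default_model("chat"))  # @cf/meta/llama-3.2-3b-instruct
```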

Next Steps

Gateway Configs

Add fallbacks, load balancing, and more

Observability

Monitor and trace your Workers AI requests

Prompt Library

Manage and version your prompts

Caching

Cache responses at the edge
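Several of the features above are enabled through a gateway config passed to the client. The sketch below combines fallback (try a larger Llama model, fall back to a smaller one) with simple response caching. Field names follow Portkey's gateway config schema as commonly documented; verify them against the Gateway Configs guide before relying on this:

```python
# Gateway config sketch: fallback across two Workers AI chat models,
# plus simple response caching with a one-hour TTL.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"override_params": {"model": "@workers-ai/@cf/meta/llama-3.1-8b-instruct"}},
        {"override_params": {"model": "@workers-ai/@cf/meta/llama-3.2-3b-instruct"}},
    ],
    "cache": {"mode": "simple", "max_age": 3600},  # max_age in seconds
}

# Pass it when constructing the client (kept commented to avoid a live call):
# portkey = Portkey(api_key="PORTKEY_API_KEY", config=config)
```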

For complete SDK documentation:

SDK Reference

Complete Portkey SDK documentation
Last modified on February 9, 2026