# OpenRouter 🔀

Access 100+ AI models through a single unified API.
**What you can do:** Route requests to any AI model (OpenAI, Anthropic, Google, Meta, and more), with automatic fallbacks, provider-specific optimizations, and cost tracking - all through one API.
## Setup

Add your OpenRouter API key in the ProtectMyAPI Dashboard.

## Chat Completions

### Basic Chat
```swift
let openRouter = ProtectMyAPI.openRouterService()

let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "anthropic/claude-3.5-sonnet",
        messages: [
            .system("You are a helpful assistant."),
            .user("Explain the difference between TCP and UDP")
        ]
    )
)

print(response.choices.first?.message.content ?? "")
```

### Switch Between Models
Use the same code with any model:
```swift
// OpenAI
let gpt = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "openai/gpt-4-turbo",
        messages: [.user("Hello!")]
    )
)

// Google
let gemini = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "google/gemini-pro-1.5",
        messages: [.user("Hello!")]
    )
)

// Meta
let llama = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "meta-llama/llama-3.1-70b-instruct",
        messages: [.user("Hello!")]
    )
)
```

### Streaming
```swift
for try await chunk in openRouter.createChatCompletionStream(
    request: OpenRouterChatRequest(
        model: "anthropic/claude-3.5-sonnet",
        messages: [.user("Write a detailed guide on Swift concurrency")]
    )
) {
    print(chunk.choices.first?.delta?.content ?? "", terminator: "")
}
```

### Vision (Image Analysis)
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "anthropic/claude-3.5-sonnet", // or "openai/gpt-4-vision-preview"
        messages: [
            .user(content: [
                .text("What's in this image?"),
                .imageUrl(url: "https://example.com/image.jpg")
            ])
        ]
    )
)

print(response.choices.first?.message.content ?? "")
```

### With Base64 Image
```swift
let imageData: Data = ... // your image data
let base64 = imageData.base64EncodedString()

let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "openai/gpt-4-vision-preview",
        messages: [
            .user(content: [
                .text("Describe this image in detail"),
                .imageUrl(url: "data:image/jpeg;base64,\(base64)")
            ])
        ]
    )
)
```

## Fallbacks & Routing
### Automatic Fallbacks

Route to backup models if the primary model fails:
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "anthropic/claude-3.5-sonnet",
        messages: [.user("Hello!")],
        route: "fallback",
        models: [
            "anthropic/claude-3.5-sonnet",
            "openai/gpt-4-turbo",
            "google/gemini-pro-1.5"
        ]
    )
)
```

### Provider Preferences
Specify preferred providers:
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "openai/gpt-4",
        messages: [.user("Hello!")],
        providerPreferences: OpenRouterProviderPreferences(
            order: ["OpenAI", "Azure"],
            allowFallbacks: true
        )
    )
)
```

## Function Calling
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "openai/gpt-4-turbo",
        messages: [.user("What's the weather in Tokyo?")],
        tools: [
            OpenRouterTool(
                type: "function",
                function: OpenRouterFunction(
                    name: "get_weather",
                    description: "Get weather for a location",
                    parameters: [
                        "type": "object",
                        "properties": [
                            "location": ["type": "string"]
                        ],
                        "required": ["location"]
                    ]
                )
            )
        ]
    )
)

if let toolCall = response.choices.first?.message.toolCalls?.first {
    print("Function: \(toolCall.function.name)")
    print("Args: \(toolCall.function.arguments)")
}
```

## JSON Mode
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "openai/gpt-4-turbo",
        messages: [
            .system("Output valid JSON only."),
            .user("List 3 fruits with colors")
        ],
        responseFormat: OpenRouterResponseFormat(type: "json_object")
    )
)
```

## List Available Models
```swift
let models = try await openRouter.listModels()

for model in models.data {
    print("\(model.id): \(model.name)")
    print("  Context: \(model.contextLength)")
    print("  Pricing: \(model.pricing.prompt)/\(model.pricing.completion)")
}
```

## Popular Models
### Premium Models

| Model | Provider | Context | Best For |
|---|---|---|---|
| `anthropic/claude-3.5-sonnet` | Anthropic | 200K | Coding, analysis |
| `openai/gpt-4-turbo` | OpenAI | 128K | General purpose |
| `openai/gpt-4o` | OpenAI | 128K | Fast, multimodal |
| `google/gemini-pro-1.5` | Google | 1M | Long context |
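Since switching providers only means changing the model id string, a small task-based picker can centralize the choice. This is a sketch, not part of the SDK; the `Task` enum and `modelId(for:)` helper are hypothetical, with ids taken from the table above:

```swift
// Hypothetical task-based model picker; ids from the Premium Models table.
enum Task { case coding, longContext, multimodal }

func modelId(for task: Task) -> String {
    switch task {
    case .coding:      return "anthropic/claude-3.5-sonnet"
    case .longContext: return "google/gemini-pro-1.5"
    case .multimodal:  return "openai/gpt-4o"
    }
}

print(modelId(for: .coding)) // anthropic/claude-3.5-sonnet
```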
### Open Source Models

| Model | Provider | Context | Best For |
|---|---|---|---|
| `meta-llama/llama-3.1-405b-instruct` | Meta | 128K | Highest open quality |
| `meta-llama/llama-3.1-70b-instruct` | Meta | 128K | Balance |
| `mistralai/mixtral-8x22b-instruct` | Mistral | 65K | MoE efficiency |
### Vision Models

| Model | Description |
|---|---|
| `openai/gpt-4-vision-preview` | GPT-4 with vision |
| `anthropic/claude-3.5-sonnet` | Claude with vision |
| `google/gemini-pro-vision` | Gemini with vision |
### Free Models

| Model | Description |
|---|---|
| `meta-llama/llama-3.1-8b-instruct:free` | Free Llama 8B |
| `google/gemma-7b-it:free` | Free Gemma |
| `mistralai/mistral-7b-instruct:free` | Free Mistral |
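The `:free` suffix is part of the model id itself, so free-tier variants can be picked out of any list of ids with a plain suffix check. A minimal sketch on id strings (the sample array is illustrative; in practice the ids would come from `listModels()`):

```swift
// Sample model ids; ids ending in ":free" are free-tier variants.
let modelIds = [
    "meta-llama/llama-3.1-8b-instruct:free",
    "openai/gpt-4-turbo",
    "mistralai/mistral-7b-instruct:free"
]

let freeIds = modelIds.filter { $0.hasSuffix(":free") }
print(freeIds.count) // 2
```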
## App Information
Include app info for analytics:
```swift
let response = try await openRouter.createChatCompletion(
    request: OpenRouterChatRequest(
        model: "anthropic/claude-3.5-sonnet",
        messages: [.user("Hello!")],
        httpReferer: "https://myapp.com",
        xTitle: "My Mobile App"
    )
)
```

## Pricing
OpenRouter passes through provider pricing with a small markup (~5%). Benefits:
- Single bill for all providers
- Automatic currency handling
- Credit-based system
- No separate accounts needed
Check the OpenRouter models page for current pricing.
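As a rough sketch of what the ~5% markup means in practice (the $3.00-per-million-token rate below is illustrative, not a real price):

```swift
import Foundation

// Illustrative only: hypothetical provider rate of $3.00 per 1M prompt tokens.
let ratePerMillionTokens = 3.00
let promptTokens = 10_000.0

let directCost = ratePerMillionTokens * promptTokens / 1_000_000   // $0.03 billed by the provider
let viaOpenRouter = directCost * 1.05                              // ~5% markup on top

print(String(format: "$%.4f", viaOpenRouter)) // $0.0315
```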