# Mistral AI
Integrate Mistral Large, Codestral, and Mixtral with ProtectMyAPI.
**Supported Models:** Mistral Large, Mistral Small, Codestral, Mixtral 8x7B
## Features
- ✅ Chat Completions
- ✅ Code generation (Codestral)
- ✅ Fill-in-the-middle (FIM)
- ✅ Function Calling
- ✅ JSON Mode
- ✅ Streaming responses
## Setup

### 1. Add your API key to Secrets

In the ProtectMyAPI dashboard:

- Go to your app → Secrets
- Add a secret named `MISTRAL_API_KEY`
- Paste your Mistral API key as the value
### 2. Create an endpoint

Create an endpoint with:

- Name: Mistral Chat
- Slug: `mistral-chat`
- Target URL: `https://api.mistral.ai/v1/chat/completions`
- Method: POST
- Auth Type: Bearer
- Auth Value: `{{MISTRAL_API_KEY}}`
## Code Examples

```swift
import ProtectMyAPI

// Simple chat
let response = try await ProtectMyAPI.shared.request(
    endpoint: "mistral-chat",
    method: .POST,
    body: [
        "model": "mistral-large-latest",
        "messages": [
            ["role": "user", "content": "Explain machine learning"]
        ]
    ]
)
```
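This guide doesn't show the SDK's response type, so the sketch below decodes a raw JSON payload in the shape Mistral's chat-completions endpoint returns. The `ChatCompletion` structs are illustrative, not part of the ProtectMyAPI SDK:

```swift
import Foundation

// Minimal Codable model for a Mistral chat completion.
// Field names follow Mistral's chat-completions response schema.
struct ChatCompletion: Codable {
    struct Choice: Codable {
        struct Message: Codable {
            let role: String
            let content: String
        }
        let message: Message
    }
    let choices: [Choice]
}

// Sample payload in the shape Mistral returns.
let sample = #"{"choices":[{"message":{"role":"assistant","content":"Machine learning is..."}}]}"#

let completion = try JSONDecoder().decode(
    ChatCompletion.self,
    from: Data(sample.utf8)
)
let text = completion.choices.first?.message.content ?? ""
print(text)  // → "Machine learning is..."
```

The same structs cover the Codestral and JSON-mode replies below, since all three calls go through `/v1/chat/completions`.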
```swift
// Code generation with Codestral
let codeResponse = try await ProtectMyAPI.shared.request(
    endpoint: "mistral-chat",
    method: .POST,
    body: [
        "model": "codestral-latest",
        "messages": [
            ["role": "user", "content": "Write a binary search in Python"]
        ]
    ]
)
```
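Chat models often wrap generated code in markdown fences, so the Codestral reply above may need unwrapping before you use it. A small helper (illustrative, not part of the SDK) can do this:

```swift
// Extracts the first markdown-fenced block from a model reply,
// or returns the text unchanged if no fence is found.
func extractCodeBlock(from reply: String) -> String {
    let lines = reply.components(separatedBy: "\n")
    guard let open = lines.firstIndex(where: { $0.hasPrefix("```") }) else {
        return reply
    }
    let rest = lines[(open + 1)...]
    guard let close = rest.firstIndex(where: { $0.hasPrefix("```") }) else {
        return reply
    }
    return lines[(open + 1)..<close].joined(separator: "\n")
}

let reply = "Here you go:\n```python\ndef search(): ...\n```"
print(extractCodeBlock(from: reply))  // → "def search(): ..."
```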
```swift
// JSON Mode
let jsonResponse = try await ProtectMyAPI.shared.request(
    endpoint: "mistral-chat",
    method: .POST,
    body: [
        "model": "mistral-large-latest",
        "messages": [
            ["role": "user", "content": "List 3 programming languages with their year of creation as JSON"]
        ],
        "response_format": ["type": "json_object"]
    ]
)
```

## Models
| Model | Best For | Context |
|---|---|---|
| `mistral-large-latest` | Complex tasks | 128K |
| `mistral-small-latest` | Fast, simple tasks | 128K |
| `codestral-latest` | Code generation | 32K |
| `open-mixtral-8x7b` | Open source | 32K |
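Streaming responses are listed under Features, but the SDK call for them isn't shown in this guide. As a sketch only — `requestStream` is a hypothetical method name, so check the ProtectMyAPI SDK reference for the real streaming API — token-by-token output might look like:

```swift
// Hypothetical streaming variant of request(endpoint:method:body:).
let stream = try await ProtectMyAPI.shared.requestStream(
    endpoint: "mistral-chat",
    method: .POST,
    body: [
        "model": "mistral-small-latest",
        "messages": [
            ["role": "user", "content": "Explain machine learning"]
        ],
        "stream": true  // tells Mistral to return server-sent events
    ]
)
for try await chunk in stream {
    print(chunk, terminator: "")  // append each delta as it arrives
}
```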
## Fill-in-the-Middle (FIM)

Codestral supports FIM for code completion. Create a second endpoint (with the same Bearer auth as above):

- Slug: `codestral-fim`
- Target URL: `https://api.mistral.ai/v1/fim/completions`
```swift
let fimResponse = try await ProtectMyAPI.shared.request(
    endpoint: "codestral-fim",
    method: .POST,
    body: [
        "model": "codestral-latest",
        "prompt": "def fibonacci(n):",
        "suffix": " return result",
        "max_tokens": 256
    ]
)
```

## Function Calling
```swift
let response = try await ProtectMyAPI.shared.request(
    endpoint: "mistral-chat",
    method: .POST,
    body: [
        "model": "mistral-large-latest",
        "messages": [
            ["role": "user", "content": "What's the weather in Paris?"]
        ],
        "tools": [
            [
                "type": "function",
                "function": [
                    "name": "get_weather",
                    "description": "Get weather for a location",
                    "parameters": [
                        "type": "object",
                        "properties": [
                            "location": ["type": "string"]
                        ],
                        "required": ["location"]
                    ]
                ]
            ]
        ]
    ]
)
```
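When the model opts to call the tool, the reply's `tool_calls` entries carry the arguments as a JSON-encoded string that has to be decoded a second time. The structs below are illustrative (not part of the SDK), following Mistral's response schema, and work from a raw JSON payload:

```swift
import Foundation

// Minimal Codable model for a tool-call reply (Mistral schema).
struct ToolCallReply: Codable {
    struct Choice: Codable {
        struct Message: Codable {
            struct ToolCall: Codable {
                struct Function: Codable {
                    let name: String
                    let arguments: String  // JSON-encoded string
                }
                let function: Function
            }
            let tool_calls: [ToolCall]?
        }
        let message: Message
    }
    let choices: [Choice]
}

// Sample payload in the shape Mistral returns for a tool call.
let sample = #"""
{"choices":[{"message":{"tool_calls":[{"function":
{"name":"get_weather","arguments":"{\"location\":\"Paris\"}"}}]}}]}
"""#

let reply = try JSONDecoder().decode(ToolCallReply.self, from: Data(sample.utf8))
if let call = reply.choices.first?.message.tool_calls?.first {
    // arguments is nested JSON: decode it a second time
    let args = try JSONDecoder().decode(
        [String: String].self,
        from: Data(call.function.arguments.utf8)
    )
    print(call.function.name, args["location"] ?? "")  // → get_weather Paris
}
```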