Flutter SDK 📦
Build secure cross-platform apps with hardware-backed device attestation and 20+ AI providers. One codebase - iOS AND Android protected!
How it works: Every API request is cryptographically verified using Apple's App Attest (iOS) or Google's Play Integrity (Android). This proves the request came from YOUR legitimate app on a real device - not a hacker, bot, or compromised device. Your API keys never touch the device.
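Concretely, this means a client call never contains a provider credential. A minimal sketch (the `post` call follows this SDK's request API; the URL and body are placeholders):

```dart
import 'package:protectmyapi/protectmyapi.dart';

Future<void> example() async {
  // No OpenAI/Anthropic/etc. key appears anywhere in client code.
  // The SDK attaches an attestation proof to the request, and the
  // ProtectMyAPI server injects the real provider key before forwarding.
  final answer = await ProtectMyAPI.instance.post<Map<String, dynamic>>(
    "https://api.example.com/chat",
    body: {"message": "Hello"},
  );
  print(answer);
}
```

A decompiled copy of your app binary therefore has no key to extract.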
Requirements
| Requirement | Details |
|---|---|
| Flutter 3.0+ | Modern Flutter version |
| Dart 2.17+ | Null safety and modern features |
| iOS 14+ | App Attest support |
| Android API 21+ | Play Integrity support |
| Physical Device | Simulators/emulators have limited attestation |
| ProtectMyAPI Account | Sign up free |
Testing: Test on real devices! iOS Simulator doesn't support App Attest, and Android Emulator has limited Play Integrity support.
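If you need to iterate on a simulator anyway, you can relax the simulator check in debug builds only. A sketch (`allowSimulator` is the configuration option described below; `kDebugMode` comes from `package:flutter/foundation.dart`):

```dart
import 'package:flutter/foundation.dart' show kDebugMode;
import 'package:protectmyapi/protectmyapi.dart';

Future<void> initProtection() async {
  await ProtectMyAPI.initialize(
    appId: "app_your_id_here",
    // Debug builds only: simulators get limited attestation, so treat
    // any traffic from them as untrusted test traffic.
    allowSimulator: kDebugMode,
  );
}
```

Release builds keep the default (`allowSimulator: false`), so this never weakens production.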
Installation
Add the Package
In your `pubspec.yaml`:

```yaml
dependencies:
  protectmyapi: ^1.0.0
```

Then run:

```bash
flutter pub get
```

iOS Setup
- Open `ios/Runner.xcworkspace` in Xcode
- Select the Runner target
- Go to Signing & Capabilities
- Click + Capability and add App Attest
Android Setup
- Go to Google Cloud Console
- Enable Play Integrity API
- In ProtectMyAPI Dashboard, add:
  - Your Package Name (e.g., `com.yourcompany.yourapp`)
  - Your SHA-256 fingerprint

```bash
# Find your SHA-256
cd android && ./gradlew signingReport
```

Ensure your `android/app/build.gradle` has:

```groovy
android {
  defaultConfig {
    minSdkVersion 21
  }
}
```

Initialize the SDK
```dart
import 'package:flutter/material.dart';
import 'package:protectmyapi/protectmyapi.dart';

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await ProtectMyAPI.initialize(
    appId: "app_your_id_here",
  );
  runApp(MyApp());
}
```

Configuration Options
The SDK offers extensive configuration:
```dart
await ProtectMyAPI.initialize(
  appId: "app_your_id_here",
  environment: Environment.production,

  // Security Options
  enableDeviceAttestation: true,    // App Attest / Play Integrity
  enableCertificatePinning: true,   // Pin TLS certificates
  pinnedCertificateHashes: [        // Custom certificate hashes
    "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
  ],
  enableSecurityChecks: true,       // Run security checks
  allowCompromisedDevices: false,   // Block jailbroken/rooted
  allowSimulator: false,            // Block simulators/emulators
  enableRequestSigning: true,       // Sign all requests

  // Network Options
  timeout: Duration(seconds: 30),   // Request timeout
  maxRetryAttempts: 3,              // Auto-retry failed requests
  retryDelay: Duration(seconds: 1), // Initial retry delay

  // Logging
  logLevel: LogLevel.info,          // Logging verbosity
);
```

Configuration Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `appId` | String | Required | Your app ID from the dashboard |
| `environment` | Environment | `production` | `production` or `development` |
| `enableDeviceAttestation` | bool | `true` | Enable App Attest / Play Integrity |
| `enableCertificatePinning` | bool | `true` | Pin TLS certificates |
| `pinnedCertificateHashes` | `List<String>` | `null` | Custom certificate pins |
| `enableSecurityChecks` | bool | `true` | Run jailbreak/root detection |
| `allowCompromisedDevices` | bool | `false` | Allow jailbroken/rooted devices |
| `allowSimulator` | bool | `false` | Allow simulators/emulators |
| `enableRequestSigning` | bool | `true` | Sign all requests |
| `timeout` | Duration | 30s | Network timeout |
| `maxRetryAttempts` | int | 3 | Automatic retries |
| `retryDelay` | Duration | 1s | Retry delay (exponential backoff) |
| `logLevel` | LogLevel | `info` | Logging verbosity |
Making Secure Requests
Basic Requests
```dart
import 'package:protectmyapi/protectmyapi.dart';

// GET request
final users = await ProtectMyAPI.instance.get<List<User>>(
  "https://api.example.com/users",
);

// POST request
final newUser = await ProtectMyAPI.instance.post<User>(
  "https://api.example.com/users",
  body: {"name": "John", "email": "[email protected]"},
);

// PUT request
final updated = await ProtectMyAPI.instance.put<User>(
  "https://api.example.com/users/123",
  body: {"name": "John Updated"},
);

// DELETE request
await ProtectMyAPI.instance.delete("https://api.example.com/users/123");

// Generic request with full control
final response = await ProtectMyAPI.instance.request<MyResponse>(
  url: "https://api.example.com/data",
  method: "PATCH",
  headers: {"X-Custom-Header": "value"},
  body: someData,
);
```

With FutureBuilder
```dart
FutureBuilder<List<User>>(
  future: ProtectMyAPI.instance.get("https://api.example.com/users"),
  builder: (context, snapshot) {
    if (snapshot.connectionState == ConnectionState.waiting) {
      return const CircularProgressIndicator();
    }
    if (snapshot.hasError) {
      return Text("Error: ${snapshot.error}");
    }
    return UserList(users: snapshot.data!);
  },
)
```

With Riverpod
```dart
final usersProvider = FutureProvider<List<User>>((ref) async {
  return ProtectMyAPI.instance.get("https://api.example.com/users");
});

// In your widget:
Consumer(
  builder: (context, ref, child) {
    final usersAsync = ref.watch(usersProvider);
    return usersAsync.when(
      data: (users) => UserList(users: users),
      loading: () => const CircularProgressIndicator(),
      error: (error, stack) => Text("Error: $error"),
    );
  },
)
```

With BLoC
```dart
class UserBloc extends Bloc<UserEvent, UserState> {
  UserBloc() : super(UserInitial()) {
    on<LoadUsers>(_onLoadUsers);
  }

  Future<void> _onLoadUsers(LoadUsers event, Emitter<UserState> emit) async {
    emit(UserLoading());
    try {
      final users = await ProtectMyAPI.instance.get<List<User>>(
        "https://api.example.com/users",
      );
      emit(UserLoaded(users));
    } on ProtectMyAPIException catch (e) {
      emit(UserError(e.message));
    }
  }
}
```

Fetching Secrets
```dart
// Get a single secret
final stripeKey = await ProtectMyAPI.instance.getSecret("stripe_publishable_key");

// Get multiple secrets
final secrets = await ProtectMyAPI.instance.getSecrets([
  "stripe_key",
  "pusher_key",
]);
```

Device Registration
```dart
// Register the device (usually automatic)
final registration = await ProtectMyAPI.instance.registerDevice();
print("Device ID: ${registration.deviceId}");
print("Attestation valid: ${registration.isValid}");
```

Security Features
Security Report
```dart
final report = await ProtectMyAPI.instance.getSecurityReport();
print("Jailbroken/Rooted: ${report.isCompromised}");
print("Debugger attached: ${report.isDebuggerAttached}");
print("Simulator/Emulator: ${report.isEmulator}");
print("Tampered: ${report.isTampered}");
print("Platform: ${report.platform}"); // iOS or Android
print("Failed checks: ${report.failedChecks}");
```

Manual Security Checks
```dart
final security = ProtectMyAPI.instance.security;

// Individual checks
final isCompromised = await security.isDeviceCompromised();
final isDebugged = await security.isDebuggerAttached();
final isEmulator = await security.isEmulator();
final isTampered = await security.isAppTampered();

// Block if compromised
if (isCompromised || isDebugged) {
  showSecurityAlert();
  return;
}
```

What's Detected
| Check | iOS | Android |
|---|---|---|
| Jailbreak/Root | Cydia, Sileo, Unc0ver | Magisk, SuperSU, Xposed |
| Debugger | LLDB, Xcode | ADB, JDWP |
| Simulator/Emulator | iOS Simulator | Genymotion, AVD |
| Hooking | Frida, Cycript | Frida, Xposed |
| Tampering | Resigned apps | Repackaged APKs |
AI Providers
ProtectMyAPI provides native Dart integrations with 20+ AI providers.
OpenAI (GPT-4, DALL-E, Whisper)
```dart
final openai = ProtectMyAPIAI.openAIService();

// Simple chat
final answer = await openai.chat(
  message: "Explain quantum computing",
  model: "gpt-4o-mini",
  systemPrompt: "You are a helpful assistant",
);

// Streaming response
await for (final chunk in openai.streamChat(message: "Write a story")) {
  stdout.write(chunk);
}

// Full request
final request = OpenAIChatCompletionRequest(
  model: "gpt-4o",
  messages: [
    OpenAIChatMessage.system("You are a coding expert"),
    OpenAIChatMessage.user("Write a Dart function"),
  ],
  maxTokens: 1000,
  temperature: 0.7,
);
final response = await openai.chatCompletion(body: request);

// Vision - analyze images
final imageBytes = await File('photo.jpg').readAsBytes();
final description = await openai.analyzeImage(
  prompt: "What's in this image?",
  imageData: imageBytes,
  model: "gpt-4o",
);

// Image generation with DALL-E
final images = await openai.generateImage(
  prompt: "A futuristic city at sunset",
  model: "dall-e-3",
  size: OpenAIImageSize.landscape1792x1024,
  quality: OpenAIImageQuality.hd,
);

// Text to speech
final audioData = await openai.textToSpeech(
  text: "Hello, welcome to my app!",
  voice: OpenAIVoice.nova,
);

// Audio transcription (Whisper)
final audioFile = await File('audio.mp3').readAsBytes();
final transcription = await openai.transcribeAudio(
  audioData: audioFile,
  language: "en",
);

// Embeddings
final embeddings = await openai.createEmbedding(
  input: "Hello world",
  model: "text-embedding-3-small",
);

// Multi-turn conversation
final session = openai.chatSession(
  model: "gpt-4o",
  systemPrompt: "You are helpful",
);
final response1 = await session.send("What is Dart?");
final response2 = await session.send("Compare it to Kotlin");
session.clearHistory();
```

Anthropic (Claude)
```dart
final claude = ProtectMyAPIAI.anthropicService();

// Simple chat
final answer = await claude.chat(
  message: "Explain machine learning",
  model: "claude-3-5-sonnet-20241022",
  maxTokens: 1024,
);

// Streaming
await for (final chunk in claude.streamChat(message: "Write an essay")) {
  stdout.write(chunk);
}

// With system prompt
final response = await claude.chat(
  message: "Review this code",
  model: "claude-3-5-sonnet-20241022",
  systemPrompt: "You are an expert code reviewer",
);

// Vision - analyze images
final analysis = await claude.analyzeImage(
  prompt: "Describe this diagram",
  imageData: imageBytes,
  mimeType: "image/png",
);

// Tool use (function calling)
final tools = [
  AnthropicTool(
    name: "get_weather",
    description: "Get weather for a location",
    inputSchema: {
      "type": "object",
      "properties": {
        "location": {"type": "string"},
      },
      "required": ["location"],
    },
  ),
];
final toolResponse = await claude.createMessageWithTools(
  message: "What's the weather in Tokyo?",
  tools: tools,
);

// Multi-turn conversation
final session = claude.chatSession(model: "claude-3-5-sonnet-20241022");
await session.send("What is Rust?");
await session.send("Show me an example");
```

Google Gemini
```dart
final gemini = ProtectMyAPIAI.geminiService();

// Simple text generation
final text = await gemini.generateText(
  prompt: "Write a haiku about coding",
  model: "gemini-2.0-flash",
);

// Streaming
await for (final chunk in gemini.streamContent(prompt: "Explain relativity")) {
  stdout.write(chunk);
}

// Vision - analyze images
final response = await gemini.analyzeImage(
  prompt: "What's in this photo?",
  imageData: imageBytes,
  mimeType: "image/jpeg",
);

// Audio transcription
final transcript = await gemini.transcribeAudio(
  audioData: audioFile,
  mimeType: "audio/mp4",
);

// Structured output (JSON mode)
final structured = await gemini.generateStructuredOutput(
  prompt: "List 3 programming languages",
  schema: {
    "type": "array",
    "items": {"type": "string"},
  },
);

// Image generation
final image = await gemini.generateImage(
  prompt: "A serene mountain landscape",
);

// Multi-turn chat
final chat = gemini.chat(systemInstruction: "You are a math tutor");
await chat.send("What is calculus?");
await chat.send("Give me an example");
```

Mistral AI
```dart
final mistral = ProtectMyAPIAI.mistralService();

// Chat completion
final answer = await mistral.chat(
  message: "Explain neural networks",
  model: MistralModel.mistralLarge,
);

// Streaming
await for (final chunk in mistral.streamChat(message: "Write a poem")) {
  stdout.write(chunk);
}

// Code generation
final code = await mistral.chat(
  message: "Write a Python quicksort",
  model: MistralModel.codestral,
  systemPrompt: "You are an expert programmer",
);

// Embeddings
final embeddings = await mistral.createEmbedding(
  input: "Hello world",
  model: MistralModel.mistralEmbed,
);

// JSON mode
final json = await mistral.chat(
  message: "List top 5 cities",
  responseFormat: ResponseFormat.json,
);
```

Groq (Ultra-Fast Inference)
```dart
final groq = ProtectMyAPIAI.groqService();

// Lightning-fast chat
final answer = await groq.chat(
  message: "What is the speed of light?",
  model: GroqModel.llama31_70b,
);

// Streaming
await for (final chunk in groq.streamChat(
  message: "Tell a joke",
  model: GroqModel.mixtral,
)) {
  stdout.write(chunk);
}

// Whisper transcription (fastest!)
final transcript = await groq.transcribeAudio(
  audioData: audioFile,
  model: GroqModel.whisperLargeV3,
);
```

DeepSeek
```dart
final deepseek = ProtectMyAPIAI.deepSeekService();

// Chat with DeepSeek V3
final answer = await deepseek.chat(
  message: "Explain transformer architecture",
  model: DeepSeekModel.deepseekChat,
);

// Code generation
final code = await deepseek.chat(
  message: "Implement binary search in Dart",
  model: DeepSeekModel.deepseekCoder,
);

// Reasoning (DeepSeek R1)
final reasoning = await deepseek.chat(
  message: "Solve this step by step...",
  model: DeepSeekModel.deepseekReasoner,
);
```

Stability AI (Image Generation)
```dart
final stability = ProtectMyAPIAI.stabilityService();

// Stable Image Ultra (highest quality)
final ultraImage = await stability.generateUltra(
  prompt: "A majestic castle on a cliff",
  negativePrompt: "blurry, low quality",
  aspectRatio: AspectRatio.landscape16x9,
);

// Stable Image Core
final coreImage = await stability.generateCore(
  prompt: "A cute robot",
  stylePreset: StylePreset.digitalArt,
);

// SD3.5
final sd35Image = await stability.generateSD35(
  prompt: "Abstract art",
  model: SD35Model.large,
);

// Upscale (4x)
final upscaled = await stability.upscaleFast(imageData: originalImage);

// Conservative upscale
final upscaledConservative = await stability.upscaleConservative(
  imageData: originalImage,
  prompt: "A detailed landscape",
);

// Image editing
final edited = await stability.searchAndReplace(
  imageData: originalImage,
  searchPrompt: "cat",
  prompt: "dog",
);

// Remove background
final noBackground = await stability.removeBackground(imageData: photo);

// Inpainting
final inpainted = await stability.inpaint(
  imageData: originalImage,
  maskData: mask,
  prompt: "A garden",
);

// Outpainting
final expanded = await stability.outpaint(
  imageData: originalImage,
  direction: Direction.right,
  prompt: "Continue the landscape",
);

// Style transfer
final stylized = await stability.controlStyle(
  imageData: contentImage,
  styleImageData: styleImage,
);

// Video generation
final video = await stability.imageToVideo(imageData: startImage);
```

ElevenLabs (Voice & Audio)
```dart
final elevenlabs = ProtectMyAPIAI.elevenLabsService();

// Text to speech
final audio = await elevenlabs.textToSpeech(
  text: "Hello, welcome to my app!",
  voiceId: "EXAVITQu4vr4xnSDxMaL", // Sarah
  modelId: ElevenLabsModel.multilingualV2,
  voiceSettings: VoiceSettings(
    stability: 0.5,
    similarityBoost: 0.75,
    style: 0.5,
  ),
);

// Streaming TTS
await for (final chunk in elevenlabs.streamTextToSpeech(text: "Long text...")) {
  audioPlayer.append(chunk);
}

// Speech to speech
final converted = await elevenlabs.speechToSpeech(
  audio: originalAudio,
  voiceId: "targetVoiceId",
  removeBackgroundNoise: true,
);

// Voice cloning
final clonedVoice = await elevenlabs.createVoiceClone(
  name: "My Custom Voice",
  files: [sample1, sample2, sample3],
);

// Sound effects
final soundEffect = await elevenlabs.generateSoundEffect(
  text: "A thunderstorm with heavy rain",
  durationSeconds: 10,
);

// Audio isolation
final cleanAudio = await elevenlabs.isolateAudio(audio: noisyAudio);
```

Perplexity (AI Search)
```dart
final perplexity = ProtectMyAPIAI.perplexityService();

// Web-grounded search
final response = await perplexity.createChatCompletion(
  request: PerplexityChatRequest(
    model: PerplexityModel.sonarLarge,
    messages: [PerplexityMessage.user("Latest iPhone features?")],
    searchRecencyFilter: RecencyFilter.week,
  ),
);

// Get answer with citations
print(response.choices.first?.message.content);
for (final citation in response.citations ?? []) {
  print("Source: $citation");
}

// Streaming
await for (final chunk in perplexity.createChatCompletionStream(request)) {
  stdout.write(chunk.choices.first?.delta?.content ?? "");
}
```

Together AI
```dart
final together = ProtectMyAPIAI.togetherService();

// Chat with open-source models
final answer = await together.createChatCompletion(
  request: TogetherChatRequest(
    model: "meta-llama/Llama-3.1-70B-Instruct",
    messages: [TogetherMessage.user("Explain transformers")],
  ),
);

// Image generation (FLUX)
final image = await together.generateImage(
  prompt: "A cyberpunk city",
  model: "black-forest-labs/FLUX.1-schnell",
  width: 1024,
  height: 768,
);

// Embeddings
final embeddings = await together.createEmbeddings(
  request: TogetherEmbeddingRequest(
    model: "togethercomputer/m2-bert-80M-8k-retrieval",
    input: ["Hello world"],
  ),
);
```

Replicate
```dart
final replicate = ProtectMyAPIAI.replicateService();

// Run any model
final prediction = await replicate.createPrediction(
  request: ReplicatePredictionRequest(
    model: "stability-ai/sdxl",
    input: {"prompt": "A beautiful sunset"},
  ),
  waitForCompletion: true,
);

// Flux Schnell
final fluxImage = await replicate.generateFluxSchnell(prompt: "A serene lake");

// Flux Pro
final fluxProImage = await replicate.generateFluxPro(prompt: "Detailed portrait");

// Manage predictions
final predictions = await replicate.listPredictions();
final status = await replicate.getPrediction(id: "prediction_id");
await replicate.cancelPrediction(id: "prediction_id");
```

Fireworks AI
```dart
final fireworks = ProtectMyAPIAI.fireworksService();

// Chat completion
final answer = await fireworks.chat(
  message: "Explain quantum computing",
  model: FireworksModel.llama3_70b,
);

// DeepSeek R1 via Fireworks
final reasoning = await fireworks.deepSeekR1(
  message: "Solve: If x + 5 = 12, what is x?",
);

// Streaming
await for (final chunk in fireworks.streamChat(message: "Write a story")) {
  stdout.write(chunk);
}

// Image generation
final image = await fireworks.generateImage(
  prompt: "A fantasy landscape",
  model: FireworksModel.stableDiffusionXL,
);
```

OpenRouter (200+ Models)
```dart
final openrouter = ProtectMyAPIAI.openRouterService();

// Chat with automatic fallback
final answer = await openrouter.chat(
  message: "Tell me a joke",
  models: [OpenRouterModel.gpt4o, OpenRouterModel.claudeSonnet],
  route: Route.fallback,
);

// Vision
final description = await openrouter.chatWithVision(
  message: "What's in this image?",
  imageURL: "https://example.com/image.jpg",
  models: [OpenRouterModel.gpt4o],
);

// Streaming
await for (final chunk in openrouter.streamChat(message: "Write a poem")) {
  stdout.write(chunk);
}
```

Brave Search
```dart
final brave = ProtectMyAPIAI.braveSearchService();

// Web search
final webResults = await brave.webSearch(
  query: "best programming languages 2024",
  count: 10,
  safesearch: SafeSearch.moderate,
  freshness: Freshness.month,
);

// News search
final news = await brave.newsSearch(
  query: "AI developments",
  freshness: Freshness.day,
);

// Image search
final images = await brave.imageSearch(query: "cute cats", count: 20);

// Video search
final videos = await brave.videoSearch(query: "Flutter tutorials");

// Local search
final local = await brave.localSearch(
  query: "coffee shops near me",
  country: "US",
);

// Suggestions
final suggestions = await brave.suggest(query: "how to");
```

DeepL (Translation)
```dart
final deepl = ProtectMyAPIAI.deepLService();

// Simple translation
final translated = await deepl.translate(
  text: "Hello, how are you?",
  targetLang: DeepLLanguage.german,
  sourceLang: DeepLLanguage.english,
  formality: Formality.formal,
);

// Batch translation
final translations = await deepl.translate(
  texts: ["Hello", "Goodbye", "Thank you"],
  targetLang: DeepLLanguage.spanish,
);

// With language detection
final detailed = await deepl.translateWithDetails(
  text: "Bonjour le monde",
  targetLang: DeepLLanguage.english,
);
print("Detected: ${detailed.detectedSourceLanguage}");

// Document translation
final docHandle = await deepl.translateDocument(
  documentData: pdfBytes,
  filename: "document.pdf",
  targetLang: DeepLLanguage.french,
);
```

Fal.ai (Fast Image Gen)
```dart
final fal = ProtectMyAPIAI.falService();

// Fast SDXL
final sdxlImage = await fal.generateFastSDXL(
  prompt: "A beautiful landscape",
  imageSize: FalImageSize.squareHD,
  numInferenceSteps: 25,
  guidanceScale: 7.5,
);

// Flux
final fluxImage = await fal.generateFlux(
  prompt: "Detailed portrait",
  imageSize: FalImageSize.landscapeHD,
);

// Flux Schnell (fastest)
final fastImage = await fal.generateFluxSchnell(
  prompt: "Quick sketch",
  numInferenceSteps: 4,
);

// Flux Pro
final proImage = await fal.generateFluxPro(prompt: "Professional quality");

// With custom LoRA
final loraImage = await fal.generateFluxLoRA(
  prompt: "A portrait in custom style",
  loras: [FalLoRAWeight(path: "url-to-lora", scale: 0.8)],
);

// Virtual try-on
final tryOn = await fal.virtualTryOn(
  personImage: personData,
  garmentImage: clothingData,
);
```

Open-Meteo (Weather)
```dart
final weather = ProtectMyAPIAI.openMeteoService();

// Simple forecast
final forecast = await weather.getSimpleForecast(
  latitude: 37.7749,
  longitude: -122.4194,
  days: 7,
);

// Detailed forecast
final detailed = await weather.getForecast(
  latitude: 37.7749,
  longitude: -122.4194,
  hourly: [
    HourlyVariable.temperature2m,
    HourlyVariable.precipitationProbability,
  ],
  daily: [
    DailyVariable.temperatureMax,
    DailyVariable.temperatureMin,
  ],
  temperatureUnit: TemperatureUnit.fahrenheit,
);

// Historical weather
final historical = await weather.getHistoricalWeather(
  latitude: 37.7749,
  longitude: -122.4194,
  startDate: "2024-01-01",
  endDate: "2024-01-31",
);

// Air quality
final airQuality = await weather.getAirQuality(
  latitude: 37.7749,
  longitude: -122.4194,
);

// Marine forecast
final marine = await weather.getMarineForecast(
  latitude: 37.7749,
  longitude: -122.4194,
);
```

Error Handling
```dart
try {
  final response = await ProtectMyAPI.instance.get("https://api.example.com/data");
  // Success!
} on ProtectMyAPIException catch (e) {
  switch (e.type) {
    case ProtectMyAPIExceptionType.attestationFailed:
      showAlert("Device couldn't be verified. ${e.message}");
      break;
    case ProtectMyAPIExceptionType.networkError:
      showAlert("Network error: ${e.message}");
      break;
    case ProtectMyAPIExceptionType.serverError:
      showAlert("Server error ${e.statusCode}: ${e.message}");
      break;
    case ProtectMyAPIExceptionType.rateLimited:
      showAlert("Too many requests. Try again in ${e.retryAfter}s");
      break;
    case ProtectMyAPIExceptionType.unauthorized:
      showAlert("Invalid credentials");
      break;
    case ProtectMyAPIExceptionType.notInitialized:
      showAlert("SDK not initialized");
      break;
    default:
      showAlert("Error: ${e.message}");
  }
} catch (e) {
  showAlert("Unexpected error: $e");
}
```

Best Practices
Do:
- Initialize ProtectMyAPI before `runApp()`
- Test on real devices (both iOS and Android)
- Handle all error cases gracefully
- Use streaming for long AI responses
- Enable certificate pinning in production
- Keep `allowCompromisedDevices` and `allowSimulator` false in production
Don't:
- Test only on simulators/emulators
- Put API keys in your app code
- Ignore error handling
- Disable security checks in production
- Trust attestation from debug builds
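Putting the first "Do" items together, a guarded startup might look like this (a sketch: `ProtectMyAPIException` is the error type from the Error Handling section above, while `InitFailedApp` is a hypothetical fallback widget you would supply):

```dart
import 'package:flutter/material.dart';
import 'package:protectmyapi/protectmyapi.dart';

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  try {
    // Initialize before runApp so every request is protected from the start.
    await ProtectMyAPI.initialize(appId: "app_your_id_here");
    runApp(const MyApp());
  } on ProtectMyAPIException catch (e) {
    // Don't launch unprotected; show a retry/error screen instead.
    runApp(InitFailedApp(message: e.message));
  }
}
```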
Platform-Specific Considerations
iOS
- App Attest uses the Secure Enclave (hardware)
- Requires physical device for testing
- Add App Attest capability in Xcode
Android
- Play Integrity uses TEE/StrongBox (hardware)
- Emulator has limited support
- Requires Google Play Services
- Add SHA-256 fingerprint to dashboard
Cross-Platform Tips
```dart
import 'dart:io';

// Check platform
if (Platform.isIOS) {
  // iOS-specific logic
} else if (Platform.isAndroid) {
  // Android-specific logic
}

// Use platform-specific UI patterns
final isIOS = Theme.of(context).platform == TargetPlatform.iOS;
```

How It Works
```
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│    Your App     │      │  ProtectMyAPI   │      │   AI Provider   │
│    (Flutter)    │      │     Server      │      │  (OpenAI, etc)  │
└────────┬────────┘      └────────┬────────┘      └────────┬────────┘
         │                        │                        │
         │ 1. Request + attestation                        │
         │ ──────────────────────>│                        │
         │                        │                        │
         │                        │ 2. Verify with         │
         │                        │    Apple/Google        │
         │                        │                        │
         │                        │ 3. Add API key & forward
         │                        │ ──────────────────────>│
         │                        │                        │
         │                        │ 4. Get response        │
         │                        │<───────────────────────│
         │                        │                        │
         │ 5. Return response     │                        │
         │<───────────────────────│                        │
```

- Your app makes a request → SDK adds attestation (App Attest or Play Integrity)
- ProtectMyAPI verifies → Apple/Google confirms device integrity
- API key is added server-side → Your secrets never touch the device
- Request is forwarded → ProtectMyAPI proxies to the AI provider
- Response returns → Your app gets the result
FAQ
Q: Why limited attestation on simulators/emulators?
Attestation requires hardware security (Secure Enclave/TEE) not available in virtual environments.
Q: Will this work on jailbroken/rooted devices?
No. Compromised devices fail attestation by design.
Q: Does this slow down my app?
First request: ~200-300ms extra. Subsequent requests: ~20-30ms overhead.
Q: What if ProtectMyAPI servers are down?
We maintain 99.9% uptime globally, but you should still implement error handling so that a failed request degrades gracefully.
Q: Can I use different settings per platform?
Yes! Check `Platform.isIOS` or `Platform.isAndroid` for conditional logic.
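For example, initialization itself can branch on platform. A sketch (option names follow the configuration table above; the `LogLevel.debug` value is an assumption alongside the documented `LogLevel.info`):

```dart
import 'dart:io' show Platform;
import 'package:protectmyapi/protectmyapi.dart';

Future<void> initForPlatform() async {
  await ProtectMyAPI.initialize(
    appId: "app_your_id_here",
    // Hypothetical example: log more verbosely on Android while
    // debugging Play Integrity behavior, quieter on iOS.
    logLevel: Platform.isAndroid ? LogLevel.debug : LogLevel.info,
  );
}
```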
Next Steps
- iOS SDK → iOS-specific features
- Android SDK → Android-specific features
- AI Providers → Integrate 20+ AI services
- Dashboard Guide → Manage your apps