Android SDK
Build secure Android apps with hardware-backed device attestation and 20+ AI providers.
How it works: Every API request from your app is cryptographically verified using Google's Play Integrity (TEE/StrongBox). This proves the request came from YOUR legitimate app on a genuine Android device, not a hacker, bot, or rooted device. Your API keys never touch the device.
Requirements
| Requirement | Details |
|---|---|
| Android 5.0+ (API 21) | Minimum supported version |
| Google Play Services | Required for Play Integrity |
| Android Studio | For building your app |
| ProtectMyAPI Account | Sign up free |
Testing: For best results, test on a real device with Google Play Services. Emulators have limited Play Integrity support.
Installation
Add the SDK
In your app's build.gradle.kts (Module level):
dependencies {
implementation("com.protectmyapi:android-sdk:1.0.0")
}
Groovy:
dependencies {
implementation 'com.protectmyapi:android-sdk:1.0.0'
}
Click Sync Now in Android Studio.
Set Up Play Integrity
- Go to Google Cloud Console
- Create or select a project
- Enable Play Integrity API
- In ProtectMyAPI Dashboard, add:
- Your Package Name (e.g., com.yourcompany.yourapp)
- Your SHA-256 fingerprint
Find your SHA-256:
./gradlew signingReport
# Look for SHA-256 under your release variant
Initialize the SDK
Create an Application class (or modify your existing one):
import android.app.Application
import com.protectmyapi.sdk.ProtectMyAPI
import com.protectmyapi.sdk.ProtectMyAPIConfiguration
class MyApp : Application() {
override fun onCreate() {
super.onCreate()
val config = ProtectMyAPIConfiguration(
appId = "app_your_id_here",
environment = ProtectMyAPIConfiguration.Environment.PRODUCTION
)
ProtectMyAPI.initialize(this, config)
}
}
Register in AndroidManifest.xml:
<application
android:name=".MyApp"
android:label="@string/app_name"
...>
<!-- your activities -->
</application>
Configuration Options
The SDK offers extensive configuration for security and behavior:
val config = ProtectMyAPIConfiguration(
appId = "app_your_id_here",
environment = ProtectMyAPIConfiguration.Environment.PRODUCTION,
// Security Options
enablePlayIntegrity = true, // Use Play Integrity (default: true)
enableCertificatePinning = true, // Pin TLS certificates (default: true)
pinnedCertificateHashes = listOf( // Custom certificate hashes
"sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
),
enableSecurityChecks = true, // Run security checks (default: true)
allowRootedDevices = false, // Block rooted devices (default: false)
allowEmulator = false, // Block emulator (default: false)
enableRequestSigning = true, // Sign all requests (default: true)
// Network Options
timeout = 30_000L, // Request timeout in milliseconds
maxRetryAttempts = 3, // Auto-retry failed requests
retryDelayMillis = 1000L, // Initial retry delay
// Logging
logLevel = ProtectMyAPIConfiguration.LogLevel.INFO
)
ProtectMyAPI.initialize(this, config)
Configuration Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| appId | String | Required | Your app ID from the dashboard |
| environment | Environment | PRODUCTION | PRODUCTION or DEVELOPMENT |
| enablePlayIntegrity | Boolean | true | Enable Play Integrity verification |
| enableCertificatePinning | Boolean | true | Pin TLS certificates to prevent MITM |
| pinnedCertificateHashes | List<String> | null | Custom certificate hash pins |
| enableSecurityChecks | Boolean | true | Run root, debugger, and tampering checks |
| allowRootedDevices | Boolean | false | Allow requests from rooted devices |
| allowEmulator | Boolean | false | Allow requests from emulators |
| enableRequestSigning | Boolean | true | Cryptographically sign all requests |
| timeout | Long | 30000 | Network request timeout (ms) |
| maxRetryAttempts | Int | 3 | Number of automatic retries |
| retryDelayMillis | Long | 1000 | Initial delay between retries (ms) |
| logLevel | LogLevel | INFO | Logging verbosity |
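The retry parameters above can be sketched as a minimal backoff policy. This is an illustration, not the SDK's internal code: the function name `withRetry` and the doubling schedule are assumptions, since the document does not specify the SDK's exact backoff curve.

```kotlin
// Hypothetical sketch of the behavior described by maxRetryAttempts and
// retryDelayMillis: retry failed calls with an exponentially growing delay,
// then let the final failure propagate to the caller.
fun <T> withRetry(
    maxAttempts: Int = 3,
    initialDelayMillis: Long = 1_000L,
    block: () -> T
): T {
    var delayMillis = initialDelayMillis
    repeat(maxAttempts - 1) {
        try {
            return block()
        } catch (_: Exception) {
            Thread.sleep(delayMillis) // back off before the next attempt
            delayMillis *= 2          // exponential growth: 1s, 2s, 4s, ...
        }
    }
    return block() // final attempt; its exception reaches the caller
}
```

In the real SDK this happens automatically inside every request; the sketch only shows the shape of the policy the two parameters control.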
Making Secure Requests
Basic Requests
import com.protectmyapi.sdk.ProtectMyAPI
import kotlinx.coroutines.launch
// In a ViewModel or Coroutine scope
viewModelScope.launch {
try {
// GET request
val users = ProtectMyAPI.get<List<User>>("https://api.example.com/users")
// POST request
val newUser = ProtectMyAPI.post<User>(
url = "https://api.example.com/users",
body = mapOf("name" to "John", "email" to "[email protected]")
)
// PUT request
val updated = ProtectMyAPI.put<User>(
url = "https://api.example.com/users/123",
body = mapOf("name" to "John Updated")
)
// DELETE request
ProtectMyAPI.delete("https://api.example.com/users/123")
// Generic request with full control
val response = ProtectMyAPI.request<MyResponse>(
url = "https://api.example.com/data",
method = "PATCH",
headers = mapOf("X-Custom-Header" to "value"),
body = someData
)
} catch (e: ProtectMyAPIException) {
// Handle error
}
}
With Jetpack Compose
@Composable
fun UserScreen(viewModel: UserViewModel = viewModel()) {
val uiState by viewModel.uiState.collectAsState()
LaunchedEffect(Unit) {
viewModel.loadUsers()
}
when (val state = uiState) {
is UiState.Loading -> CircularProgressIndicator()
is UiState.Success -> UserList(state.users)
is UiState.Error -> Text("Error: ${state.message}")
}
}
class UserViewModel : ViewModel() {
private val _uiState = MutableStateFlow<UiState>(UiState.Loading)
val uiState: StateFlow<UiState> = _uiState
fun loadUsers() {
viewModelScope.launch {
try {
val users = ProtectMyAPI.get<List<User>>("https://api.example.com/users")
_uiState.value = UiState.Success(users)
} catch (e: ProtectMyAPIException) {
_uiState.value = UiState.Error(e.message ?: "Unknown error")
}
}
}
}
Fetching Secrets
// Get a single secret
val stripeKey = ProtectMyAPI.getSecret("stripe_publishable_key")
// Get multiple secrets
val secrets = ProtectMyAPI.getSecrets(listOf("stripe_key", "pusher_key"))
Device Registration
// Register the device (usually automatic)
val registration = ProtectMyAPI.registerDevice()
println("Device ID: ${registration.deviceId}")
println("Attestation valid: ${registration.isValid}")
// Verify integrity manually
val nonce = ProtectMyAPI.requestNonce()
val isValid = ProtectMyAPI.verifyIntegrity(nonce)
Security Features
Security Report
Get a comprehensive security assessment:
val checker = ProtectMyAPI.getSecurityChecker()
val report = checker.performSecurityChecks()
println("Rooted: ${report.isRooted}")
println("Debugger attached: ${report.isDebuggerAttached}")
println("Running in Emulator: ${report.isEmulator}")
println("Tampered: ${report.isTampered}")
println("Failed checks: ${report.failedChecks}")
Manual Security Checks
val checker = ProtectMyAPI.getSecurityChecker()
// Individual checks
val isRooted = checker.isDeviceRooted()
val isDebugged = checker.isDebuggerAttached()
val isEmulator = checker.isEmulator()
val isTampered = checker.isAppTampered()
val hasHooks = checker.detectHookingFrameworks()
// Block if compromised
if (isRooted || isDebugged) {
showSecurityAlert()
return
}
What's Detected
| Check | Description |
|---|---|
| Root | Magisk, SuperSU, Xposed, and 100+ root indicators |
| Debugger | ADB, JDWP, Android Studio debugging |
| Emulator | Genymotion, Android Emulator, virtual devices |
| Hooking | Frida, Xposed, LSPosed, EdXposed, substrate |
| Tampering | Repackaged apps, modified signatures, code injection |
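To give a feel for what one of the root indicators above looks like, here is a deliberately minimal check for an `su` binary on well-known Android paths. This is an illustration only, not the SDK's actual implementation; the SDK combines 100+ signals, and any single check in isolation is trivial to evade.

```kotlin
import java.io.File

// Illustrative only: one common root signal is an `su` binary on paths
// where stock Android never installs one.
fun hasSuBinary(): Boolean {
    val knownSuPaths = listOf(
        "/system/bin/su",
        "/system/xbin/su",
        "/system/sd/xbin/su",
        "/su/bin/su"
    )
    return knownSuPaths.any { File(it).exists() }
}
```

A real checker would also inspect build tags, writable system partitions, and installed root-manager packages, which is why the SDK's aggregate report is preferable to any hand-rolled check.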
AI Providers
ProtectMyAPI provides native Kotlin integrations with 20+ AI providers. All API keys are stored securely on our servers.
OpenAI (GPT-4, DALL-E, Whisper)
val openai = ProtectMyAPIAI.openAIService()
// Simple chat
val answer = openai.chat(
message = "Explain quantum computing",
model = OpenAIModel.GPT4O_MINI,
systemPrompt = "You are a helpful assistant"
)
// Streaming response
openai.streamChat(message = "Write a story").collect { chunk ->
print(chunk)
}
// Full request with all options
val request = OpenAIChatRequest(
model = "gpt-4o",
messages = listOf(
OpenAIMessage.system("You are a coding expert"),
OpenAIMessage.user("Write a Kotlin function to sort a list")
),
maxTokens = 1000,
temperature = 0.7,
topP = 0.9
)
val response = openai.chatCompletion(request)
// Vision - analyze images
val bitmap = BitmapFactory.decodeResource(resources, R.drawable.photo)
val description = openai.analyzeImage(
prompt = "What's in this image?",
imageData = bitmap.toByteArray(),
model = OpenAIModel.GPT4O
)
// Image generation with DALL-E
val images = openai.generateImage(
prompt = "A futuristic city at sunset",
model = "dall-e-3",
size = ImageSize.LANDSCAPE_1792x1024,
quality = ImageQuality.HD,
style = ImageStyle.VIVID
)
// Text to speech
val audioData = openai.textToSpeech(
text = "Hello, welcome to my app!",
voice = Voice.NOVA,
model = "tts-1-hd"
)
// Audio transcription (Whisper)
val audioFile: ByteArray = TODO() // ... load audio
val transcription = openai.transcribeAudio(
audioData = audioFile,
language = "en"
)
// Embeddings
val embeddings = openai.createEmbedding(
input = "Hello world",
model = "text-embedding-3-small"
)
// Multi-turn conversation
val session = openai.chatSession(model = OpenAIModel.GPT4O)
val response1 = session.send("What is Kotlin?")
val response2 = session.send("How does it compare to Java?")
session.clearHistory()
Anthropic (Claude)
val claude = ProtectMyAPIAI.anthropicService()
// Simple chat
val answer = claude.chat(
message = "Explain machine learning",
model = ClaudeModel.CLAUDE_35_SONNET,
maxTokens = 1024
)
// Streaming
claude.streamChat(message = "Write an essay").collect { chunk ->
print(chunk)
}
// With system prompt
val response = claude.chat(
message = "Review this code",
model = ClaudeModel.CLAUDE_35_SONNET,
systemPrompt = "You are an expert code reviewer"
)
// Vision - analyze images
val analysis = claude.analyzeImage(
prompt = "Describe this diagram",
imageData = imageBytes,
mimeType = "image/png"
)
// Tool use (function calling)
val tools = listOf(
AnthropicTool(
name = "get_weather",
description = "Get weather for a location",
inputSchema = mapOf(
"type" to "object",
"properties" to mapOf(
"location" to mapOf("type" to "string")
),
"required" to listOf("location")
)
)
)
val toolResponse = claude.createMessageWithTools(
message = "What's the weather in Tokyo?",
tools = tools
)
// Multi-turn conversation
val session = claude.chatSession(model = ClaudeModel.CLAUDE_35_SONNET)
session.send("What is Rust?")
session.send("Show me an example")
Google Gemini
val gemini = ProtectMyAPIAI.geminiService()
// Simple text generation
val text = gemini.generateText(
prompt = "Write a haiku about coding",
model = "gemini-2.0-flash"
)
// Streaming
gemini.streamContent(prompt = "Explain relativity").collect { chunk ->
print(chunk)
}
// Vision - analyze images
val response = gemini.analyzeImage(
prompt = "What's in this photo?",
imageData = imageBytes,
mimeType = "image/jpeg"
)
// Audio transcription
val transcript = gemini.transcribeAudio(
audioData = audioFile,
mimeType = "audio/mp4"
)
// Structured output (JSON mode)
val structured = gemini.generateStructuredOutput(
prompt = "List 3 programming languages",
schema = mapOf(
"type" to "array",
"items" to mapOf("type" to "string")
)
)
// Image generation
val image = gemini.generateImage(
prompt = "A serene mountain landscape"
)
// Multi-turn chat
val chat = gemini.chat(systemInstruction = "You are a math tutor")
chat.send("What is calculus?")
chat.send("Give me an example")
Mistral AI
val mistral = ProtectMyAPIAI.mistralService()
// Chat completion
val answer = mistral.chat(
message = "Explain neural networks",
model = MistralModel.MISTRAL_LARGE
)
// Streaming
mistral.streamChat(message = "Write a poem").collect { chunk ->
print(chunk)
}
// Code generation
val code = mistral.chat(
message = "Write a Python quicksort",
model = MistralModel.CODESTRAL,
systemPrompt = "You are an expert programmer"
)
// Embeddings
val embeddings = mistral.createEmbedding(
input = "Hello world",
model = MistralModel.MISTRAL_EMBED
)
// JSON mode
val json = mistral.chat(
message = "List top 5 cities",
model = MistralModel.MISTRAL_LARGE,
responseFormat = ResponseFormat.JSON
)
Groq (Ultra-Fast Inference)
val groq = ProtectMyAPIAI.groqService()
// Lightning-fast chat
val answer = groq.chat(
message = "What is the speed of light?",
model = GroqModel.LLAMA31_70B
)
// Streaming
groq.streamChat(message = "Tell a joke", model = GroqModel.MIXTRAL).collect {
print(it)
}
// Whisper transcription (fastest!)
val transcript = groq.transcribeAudio(
audioData = audioFile,
model = GroqModel.WHISPER_LARGE_V3
)
DeepSeek
val deepseek = ProtectMyAPIAI.deepSeekService()
// Chat with DeepSeek V3
val answer = deepseek.chat(
message = "Explain transformer architecture",
model = DeepSeekModel.DEEPSEEK_CHAT
)
// Code generation
val code = deepseek.chat(
message = "Implement binary search in Kotlin",
model = DeepSeekModel.DEEPSEEK_CODER
)
// Reasoning (DeepSeek R1)
val reasoning = deepseek.chat(
message = "Solve this step by step...",
model = DeepSeekModel.DEEPSEEK_REASONER
)
Stability AI (Image Generation)
val stability = ProtectMyAPIAI.stabilityService()
// Stable Image Ultra (highest quality)
val ultraImage = stability.generateUltra(
prompt = "A majestic castle on a cliff",
negativePrompt = "blurry, low quality",
aspectRatio = AspectRatio.LANDSCAPE_16_9
)
// Stable Image Core (balanced)
val coreImage = stability.generateCore(
prompt = "A cute robot",
stylePreset = StylePreset.DIGITAL_ART
)
// SD3.5 (latest model)
val sd35Image = stability.generateSD35(
prompt = "Abstract art",
model = SD35Model.LARGE
)
// Upscale images (4x)
val upscaled = stability.upscaleFast(imageData = originalImage)
// Conservative upscale
val upscaledConservative = stability.upscaleConservative(
imageData = originalImage,
prompt = "A detailed landscape"
)
// Image editing
val edited = stability.searchAndReplace(
imageData = originalImage,
searchPrompt = "cat",
prompt = "dog"
)
// Remove background
val noBackground = stability.removeBackground(imageData = photo)
// Inpainting
val inpainted = stability.inpaint(
imageData = originalImage,
maskData = mask,
prompt = "A garden"
)
// Outpainting
val expanded = stability.outpaint(
imageData = originalImage,
direction = Direction.RIGHT,
prompt = "Continue the landscape"
)
// Style transfer
val stylized = stability.controlStyle(
imageData = contentImage,
styleImageData = styleImage
)
// Video generation
val video = stability.imageToVideo(imageData = startImage)
ElevenLabs (Voice & Audio)
val elevenlabs = ProtectMyAPIAI.elevenLabsService()
// Text to speech
val audio = elevenlabs.textToSpeech(
text = "Hello, welcome to my app!",
voiceId = "EXAVITQu4vr4xnSDxMaL", // Sarah
modelId = ElevenLabsModel.MULTILINGUAL_V2,
voiceSettings = VoiceSettings(
stability = 0.5f,
similarityBoost = 0.75f,
style = 0.5f
)
)
// Streaming TTS
elevenlabs.streamTextToSpeech(text = "Long text...").collect { chunk ->
audioPlayer.append(chunk)
}
// Speech to speech
val converted = elevenlabs.speechToSpeech(
audio = originalAudio,
voiceId = "targetVoiceId",
removeBackgroundNoise = true
)
// Voice cloning
val clonedVoice = elevenlabs.createVoiceClone(
name = "My Custom Voice",
files = listOf(sample1, sample2, sample3)
)
// Sound effects
val soundEffect = elevenlabs.generateSoundEffect(
text = "A thunderstorm with heavy rain",
durationSeconds = 10
)
// Audio isolation
val cleanAudio = elevenlabs.isolateAudio(audio = noisyAudio)
Perplexity (AI Search)
val perplexity = ProtectMyAPIAI.perplexityService()
// Web-grounded search
val response = perplexity.createChatCompletion(
request = PerplexityChatRequest(
model = PerplexityModel.SONAR_LARGE,
messages = listOf(PerplexityMessage.user("Latest iPhone features?")),
searchRecencyFilter = RecencyFilter.WEEK
)
)
// Get answer with citations
println(response.choices.firstOrNull()?.message?.content)
response.citations?.forEach { println("Source: $it") }
// Streaming
perplexity.createChatCompletionStream(request).collect { chunk ->
print(chunk.choices.firstOrNull()?.delta?.content ?: "")
}
Together AI
val together = ProtectMyAPIAI.togetherService()
// Chat with open-source models
val answer = together.createChatCompletion(
request = TogetherChatRequest(
model = "meta-llama/Llama-3.1-70B-Instruct",
messages = listOf(TogetherMessage.user("Explain transformers"))
)
)
// Image generation (FLUX)
val image = together.generateImage(
prompt = "A cyberpunk city",
model = "black-forest-labs/FLUX.1-schnell",
width = 1024,
height = 768
)
// Embeddings
val embeddings = together.createEmbeddings(
request = TogetherEmbeddingRequest(
model = "togethercomputer/m2-bert-80M-8k-retrieval",
input = listOf("Hello world")
)
)
Replicate
val replicate = ProtectMyAPIAI.replicateService()
// Run any model
val prediction = replicate.createPrediction(
request = ReplicatePredictionRequest(
model = "stability-ai/sdxl",
input = mapOf("prompt" to "A beautiful sunset")
),
waitForCompletion = true
)
// Flux Schnell
val fluxImage = replicate.generateFluxSchnell(prompt = "A serene lake")
// Flux Pro
val fluxProImage = replicate.generateFluxPro(prompt = "Detailed portrait")
// Manage predictions
val predictions = replicate.listPredictions()
val status = replicate.getPrediction(id = "prediction_id")
replicate.cancelPrediction(id = "prediction_id")
Fireworks AI
val fireworks = ProtectMyAPIAI.fireworksService()
// Chat completion
val answer = fireworks.chat(
message = "Explain quantum computing",
model = FireworksModel.LLAMA3_70B
)
// DeepSeek R1 via Fireworks
val reasoning = fireworks.deepSeekR1(
message = "Solve: If x + 5 = 12, what is x?"
)
// Streaming
fireworks.streamChat(message = "Write a story").collect {
print(it)
}
// Image generation
val image = fireworks.generateImage(
prompt = "A fantasy landscape",
model = FireworksModel.STABLE_DIFFUSION_XL
)
OpenRouter (200+ Models)
val openrouter = ProtectMyAPIAI.openRouterService()
// Chat with automatic fallback
val answer = openrouter.chat(
message = "Tell me a joke",
models = listOf(OpenRouterModel.GPT4O, OpenRouterModel.CLAUDE_SONNET),
route = Route.FALLBACK
)
// Vision
val description = openrouter.chatWithVision(
message = "What's in this image?",
imageURL = "https://example.com/image.jpg",
models = listOf(OpenRouterModel.GPT4O)
)
// Streaming
openrouter.streamChat(message = "Write a poem").collect {
print(it)
}
Brave Search
val brave = ProtectMyAPIAI.braveSearchService()
// Web search
val webResults = brave.webSearch(
query = "best programming languages 2024",
count = 10,
safesearch = SafeSearch.MODERATE,
freshness = Freshness.MONTH
)
// News search
val news = brave.newsSearch(
query = "AI developments",
freshness = Freshness.DAY
)
// Image search
val images = brave.imageSearch(query = "cute cats", count = 20)
// Video search
val videos = brave.videoSearch(query = "Kotlin tutorials")
// Local search
val local = brave.localSearch(
query = "coffee shops near me",
country = "US"
)
// Suggestions
val suggestions = brave.suggest(query = "how to")
DeepL (Translation)
val deepl = ProtectMyAPIAI.deepLService()
// Simple translation
val translated = deepl.translate(
text = "Hello, how are you?",
targetLang = DeepLLanguage.GERMAN,
sourceLang = DeepLLanguage.ENGLISH,
formality = Formality.FORMAL
)
// Batch translation
val translations = deepl.translate(
texts = listOf("Hello", "Goodbye", "Thank you"),
targetLang = DeepLLanguage.SPANISH
)
// With language detection
val detailed = deepl.translateWithDetails(
text = "Bonjour le monde",
targetLang = DeepLLanguage.ENGLISH
)
println("Detected: ${detailed.detectedSourceLanguage}")
// Document translation
val docHandle = deepl.translateDocument(
documentData = pdfData,
filename = "document.pdf",
targetLang = DeepLLanguage.FRENCH
)
Fal.ai (Fast Image Gen)
val fal = ProtectMyAPIAI.falService()
// Fast SDXL
val sdxlImage = fal.generateFastSDXL(
prompt = "A beautiful landscape",
imageSize = FalImageSize.SQUARE_HD,
numInferenceSteps = 25,
guidanceScale = 7.5f
)
// Flux
val fluxImage = fal.generateFlux(
prompt = "Detailed portrait",
imageSize = FalImageSize.LANDSCAPE_HD
)
// Flux Schnell (fastest)
val fastImage = fal.generateFluxSchnell(
prompt = "Quick sketch",
numInferenceSteps = 4
)
// Flux Pro
val proImage = fal.generateFluxPro(
prompt = "Professional quality"
)
// With custom LoRA
val loraImage = fal.generateFluxLoRA(
prompt = "A portrait in custom style",
loras = listOf(FalLoRAWeight(path = "url-to-lora", scale = 0.8f))
)
// Virtual try-on
val tryOn = fal.virtualTryOn(
personImage = personData,
garmentImage = clothingData
)
Open-Meteo (Weather)
val weather = ProtectMyAPIAI.openMeteoService()
// Simple forecast
val forecast = weather.getSimpleForecast(
latitude = 37.7749,
longitude = -122.4194,
days = 7
)
// Detailed forecast
val detailed = weather.getForecast(
latitude = 37.7749,
longitude = -122.4194,
hourly = listOf(
HourlyVariable.TEMPERATURE_2M,
HourlyVariable.PRECIPITATION_PROBABILITY
),
daily = listOf(
DailyVariable.TEMPERATURE_MAX,
DailyVariable.TEMPERATURE_MIN
),
temperatureUnit = TemperatureUnit.FAHRENHEIT
)
// Historical weather
val historical = weather.getHistoricalWeather(
latitude = 37.7749,
longitude = -122.4194,
startDate = "2024-01-01",
endDate = "2024-01-31"
)
// Air quality
val airQuality = weather.getAirQuality(
latitude = 37.7749,
longitude = -122.4194
)
// Marine forecast
val marine = weather.getMarineForecast(
latitude = 37.7749,
longitude = -122.4194
)
Error Handling
try {
val response = ProtectMyAPI.get<MyData>("https://api.example.com/data")
// Success!
} catch (e: ProtectMyAPIException.IntegrityCheckFailed) {
// Device couldn't be verified (rooted, emulator, etc.)
when (e.reason) {
IntegrityFailureReason.ROOTED -> showAlert("Rooted device detected")
IntegrityFailureReason.EMULATOR -> showAlert("Please use a real device")
IntegrityFailureReason.TAMPERED -> showAlert("App integrity compromised")
else -> showAlert("Verification failed: ${e.message}")
}
} catch (e: ProtectMyAPIException.NetworkError) {
showAlert("Network error: ${e.message}")
} catch (e: ProtectMyAPIException.ServerError) {
showAlert("Server error ${e.statusCode}: ${e.message}")
} catch (e: ProtectMyAPIException.RateLimited) {
showAlert("Too many requests. Try again in ${e.retryAfterSeconds}s")
} catch (e: ProtectMyAPIException.Unauthorized) {
showAlert("Invalid credentials")
} catch (e: ProtectMyAPIException.NotInitialized) {
showAlert("SDK not initialized")
} catch (e: Exception) {
showAlert("Error: ${e.message}")
}
Best Practices
Do:
- Initialize ProtectMyAPI in your Application class
- Test on real devices with Google Play Services
- Handle all error cases gracefully
- Use streaming for long AI responses with Kotlin Flow
- Enable certificate pinning in production
- Keep allowRootedDevices and allowEmulator false in production
Don't:
- Test only on emulators
- Put API keys in your app code
- Ignore error handling
- Disable security checks in production
- Trust attestation results from debug builds
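The "keep strict settings in production" points above can be wired to the build type. This is a sketch assuming the standard generated BuildConfig.DEBUG flag; the configuration fields match the parameters table earlier in this page.

```kotlin
// Sketch: strict settings in release builds, relaxed only where
// debugging genuinely needs it. BuildConfig.DEBUG is the standard
// Android flag generated per build variant.
val config = ProtectMyAPIConfiguration(
    appId = "app_your_id_here",
    environment = if (BuildConfig.DEBUG)
        ProtectMyAPIConfiguration.Environment.DEVELOPMENT
    else
        ProtectMyAPIConfiguration.Environment.PRODUCTION,
    allowRootedDevices = false,       // never relax in production
    allowEmulator = BuildConfig.DEBUG // emulators only while debugging
)
ProtectMyAPI.initialize(this, config)
```

Because the flag is resolved at build time, there is no way for a release binary to accidentally ship with the relaxed settings.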
How It Works
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│    Your App     │      │  ProtectMyAPI   │      │   AI Provider   │
│                 │      │     Server      │      │  (OpenAI, etc)  │
└────────┬────────┘      └────────┬────────┘      └────────┬────────┘
         │                        │                        │
         │ 1. Request + attestation                        │
         │ ───────────────────────>                        │
         │                        │                        │
         │                        │ 2. Verify with Google  │
         │                        │                        │
         │                        │ 3. Add API key & forward
         │                        │ ───────────────────────>
         │                        │                        │
         │                        │ 4. Get response        │
         │                        │ <───────────────────────
         │                        │                        │
         │ 5. Return response     │                        │
         │ <───────────────────────                        │

- Your app makes a request → SDK adds Play Integrity token
- ProtectMyAPI verifies with Google → Google confirms the device and app integrity
- API key is added server-side → Your secrets never touch the device
- Request is forwarded → ProtectMyAPI proxies to the AI provider
- Response returns → Your app gets the result
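To see why binding each request to a fresh server-issued nonce matters, here is a conceptual sketch using a plain HMAC. This is an illustration only: the real flow uses Google-signed Play Integrity verdicts, not a shared secret, but the replay-protection idea is the same.

```kotlin
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

// Conceptual illustration: a token bound to a server-issued nonce cannot be
// replayed, because the server issues each nonce only once and a captured
// token will not verify against any fresh nonce.
fun signNonce(nonce: String, secret: ByteArray): ByteArray {
    val mac = Mac.getInstance("HmacSHA256")
    mac.init(SecretKeySpec(secret, "HmacSHA256"))
    return mac.doFinal(nonce.toByteArray())
}

fun verifyNonce(nonce: String, token: ByteArray, secret: ByteArray): Boolean =
    signNonce(nonce, secret).contentEquals(token)
```

In the real protocol, Google's attestation service plays the signing role and ProtectMyAPI's server plays the verifying role, so no secret ever lives on the device.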
FAQ
Q: Why are attestation checks limited on emulators?
Play Integrity uses hardware security features (TEE/StrongBox) not available on emulators.
Q: Will this work on rooted devices?
No. Rooted devices fail integrity checks by design, protecting you from compromised environments.
Q: Does this slow down my app?
First request takes ~300ms extra for attestation. Subsequent requests add ~30ms overhead.
Q: What if ProtectMyAPI servers are down?
We maintain 99.9% uptime with global redundancy. Always implement error handling.
Q: Can I use my own backend alongside ProtectMyAPI?
Yes! Use ProtectMyAPI for AI and secrets, call your backend directly for other endpoints.
Next Steps
- iOS SDK → Add iOS support
- Flutter SDK → Cross-platform development
- AI Providers → Integrate 20+ AI services
- Dashboard Guide → Manage your apps