AI Assistant Integration 🤖

Let your AI coding assistant help you integrate ProtectMyAPI - no manual coding required!

Vibe Coding Friendly! We provide context files that AI assistants (Cursor, GitHub Copilot, Windsurf, etc.) can use to understand our SDK and help you integrate it automatically.


Quick Setup for AI Assistants

Option 1: Use the Hosted Context URL

Most AI assistants can fetch context from a URL. Simply tell your assistant:

Use https://docs.protectmyapi.com/llms-full.txt as context for ProtectMyAPI integration

Or paste this into your conversation:

@https://docs.protectmyapi.com/llms-full.txt

Option 2: Download Context Files

Prefer a local copy? Download the context files into your project using the platform-specific steps below.


Platform-Specific Setup

Cursor

Add to Project Rules

Create a .cursorrules file in your project root:

curl -o .cursorrules https://docs.protectmyapi.com/cursorrules.txt
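If you already have a .cursorrules file, the command above overwrites it. Here is a minimal sketch of a merge helper that appends the downloaded rules instead; the marker line is our own convention for detecting an earlier merge, not part of any official file format:

```shell
# Append ProtectMyAPI rules to an existing .cursorrules without clobbering it.
# The marker comment is a hypothetical convention, used to avoid duplicates.
merge_rules() {
  local rules_file="$1" new_rules="$2"
  local marker="# --- ProtectMyAPI rules ---"
  if [ -f "$rules_file" ] && grep -qF "$marker" "$rules_file"; then
    echo "ProtectMyAPI rules already present in $rules_file"
    return 0
  fi
  # Append a marker line followed by the new rules.
  { printf '\n%s\n' "$marker"; cat "$new_rules"; } >> "$rules_file"
  echo "Merged ProtectMyAPI rules into $rules_file"
}
```

Usage: download the rules to a temporary file first, e.g. `curl -o /tmp/pmapi-rules.txt https://docs.protectmyapi.com/cursorrules.txt && merge_rules .cursorrules /tmp/pmapi-rules.txt`.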

Or Add Full Context

Create a docs/ folder and add the full context:

mkdir -p docs
curl -o docs/protectmyapi.md https://docs.protectmyapi.com/llms-full.txt
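A quick sanity check after downloading can save a confusing debugging session later: without curl's `-f` flag, a failed request can leave an HTML error page where the context file should be. This sketch flags empty files and HTML payloads:

```shell
# Sanity-check a downloaded context file: reject empty files and
# HTML error pages accidentally saved as plain text.
check_context_file() {
  local file="$1"
  if [ ! -s "$file" ]; then
    echo "error: $file is missing or empty" >&2
    return 1
  fi
  # Context files are plain text; an <html tag suggests an error page.
  if head -c 512 "$file" | grep -qi "<html"; then
    echo "error: $file looks like an HTML error page, not plain text" >&2
    return 1
  fi
  echo "$file looks ok ($(wc -c < "$file" | tr -d ' ') bytes)"
}
```

Run it right after the download: `check_context_file docs/protectmyapi.md`.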

Start Coding

Now just tell Cursor what you want:

“Add ProtectMyAPI to my iOS app and create a chat screen using OpenAI”

Cursor will automatically use the context to generate correct code!


Example Prompts

Once your AI assistant has context, try these prompts:

Basic Integration

“Set up ProtectMyAPI in my iOS app and add a simple OpenAI chat”

“Initialize ProtectMyAPI in my Android Application class”

“Add ProtectMyAPI to my Flutter app with proper error handling”

AI Features

“Create a chat UI with streaming responses using OpenAI via ProtectMyAPI”

“Add image generation with Stability AI, show loading state and save to photos”

“Build a voice assistant using ElevenLabs text-to-speech”

“Create a translation feature using DeepL”

Advanced

“Add all 20 AI providers to my app with a provider picker”

“Implement conversation history with OpenAI including system prompts”

“Add security checks to block jailbroken devices”

“Create an AI search feature with Perplexity including source citations”


What the AI Gets

The context files include:

Quick Start: Init code for iOS, Android, Flutter

All Services: How to get each AI service

Code Examples: Chat, streaming, images, TTS, search

Configuration: All options explained

Error Handling: Try/catch patterns

Model Reference: Popular models for each provider

Security: Device checks and attestation

Tips for Best Results

💡 Be Specific: Instead of “add AI”, say “add OpenAI chat with streaming using ProtectMyAPI”

  1. Mention the platform: “iOS”, “Android”, or “Flutter”
  2. Specify the AI provider: “OpenAI”, “Anthropic”, “Stability AI”, etc.
  3. Describe the feature: “chat”, “image generation”, “voice synthesis”
  4. Include UI if needed: “with a SwiftUI chat interface”

Good Prompt Examples

✅ “Using ProtectMyAPI, create an iOS SwiftUI view with an OpenAI chat interface that supports streaming responses”

✅ “Add Stability AI image generation to my Android app with ProtectMyAPI, include a loading indicator”

✅ “Build a Flutter screen that uses ProtectMyAPI’s ElevenLabs service for text-to-speech with voice selection”

Less Effective Prompts

❌ “Add AI to my app” (too vague)

❌ “Make a chatbot” (doesn’t mention ProtectMyAPI or platform)


Keep Context Updated

We update our context files when we add new features. To get the latest:

# Update your local context file
curl -o .cursorrules https://docs.protectmyapi.com/cursorrules.txt
 
# Or the full context
curl -o docs/protectmyapi.md https://docs.protectmyapi.com/llms-full.txt
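If you want a reminder rather than a scheduled re-download, a small staleness check works; the 30-day threshold below is an arbitrary choice, not an official recommendation:

```shell
# Report whether the local context file is missing or older than max_days.
# Returns 0 (stale) when a refresh is advisable.
context_is_stale() {
  local file="$1" max_days="${2:-30}"
  [ -f "$file" ] || return 0          # missing counts as stale
  # find prints the file only if it was modified more than max_days ago
  [ -n "$(find "$file" -mtime +"$max_days" -print 2>/dev/null)" ]
}

if context_is_stale docs/protectmyapi.md 30; then
  echo "Context file is missing or older than 30 days - re-download it:"
  echo "  curl -o docs/protectmyapi.md https://docs.protectmyapi.com/llms-full.txt"
fi
```

You could run this from a pre-commit hook or CI step so stale context gets noticed early.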

Need Help?