Overview
Confidential AI API provides a secure, OpenAI-compatible interface for running AI models in a TEE on GPU hardware. This enables developers to integrate AI into their applications with hardware-level privacy protection, ensuring user data remains confidential during inference.
Prerequisites
Before you begin, make sure you have enough funds to create an API key: you need at least $5 in your account. Go to the Dashboard and click Deposit to add funds. Then navigate to Dashboard → Confidential AI API and click Enable. Finally, create your first API key and click the key to copy it.
Make Your Secure Request
Replace <API_KEY> with your actual API key in the examples below. We use the DeepSeek V3 0324 model as an example, but you can choose any of the other available models.
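Because the API is OpenAI-compatible, a standard OpenAI client can talk to it by pointing its base URL at the Confidential AI endpoint. The sketch below assumes the official openai Python package; the base URL is a placeholder, so substitute the endpoint shown in your dashboard.

```python
from openai import OpenAI

# Placeholder endpoint: replace with the Confidential AI API URL from your dashboard.
client = OpenAI(
    api_key="<API_KEY>",
    base_url="https://<your-confidential-ai-endpoint>/v1",
)

# Model name taken from the Available Models table below.
response = client.chat.completions.create(
    model="phala/deepseek-chat-v3-0324",
    messages=[
        {"role": "user", "content": "Explain what a TEE is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```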
Available Models
We support 6+ models running in GPU TEE. Click the GPU TEE checkbox to see all options.

| Model | Name | Context |
|---|---|---|
| DeepSeek V3 0324 | phala/deepseek-chat-v3-0324 | 163K |
| Llama 3.3 70B Instruct | phala/llama-3.3-70b-instruct | 131K |
| GPT OSS 120B | phala/gpt-oss-120b | 131K |
| Qwen3 Coder | phala/qwen3-coder | 262K |
| Qwen2.5 7B Instruct | phala/qwen-2.5-7b-instruct | 32K |
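Since the interface follows the OpenAI API shape, the standard models endpoint is one way to see which model IDs your key can use. This is a minimal sketch under that assumption; the base URL is the same placeholder as above, and the dashboard remains the authoritative list.

```python
from openai import OpenAI

client = OpenAI(
    api_key="<API_KEY>",
    base_url="https://<your-confidential-ai-endpoint>/v1",  # placeholder endpoint
)

# Print the model IDs exposed to your key, e.g. "phala/deepseek-chat-v3-0324".
for model in client.models.list():
    print(model.id)
```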
Verify Your AI is Running Securely
Once you have made your secure request, every response comes with cryptographic proof that it ran in a secure TEE. This proof is generated by the TEE and ensures the response is secure and trustworthy. Click Verify to learn how to verify your AI is running securely.
Next Steps
There are some advanced features you can use with the Confidential AI API.
- Tool Calling helps you call tools from your AI models.
- Images and Vision helps you use images and vision models in Confidential AI.
- Structured Output helps you get structured output from your AI models.
- Streaming helps you receive streaming responses from your AI models (see the sketch after this list).
- Playground lets you experiment with Confidential AI models in a private environment.
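As an illustration of the Streaming feature mentioned above, here is a minimal sketch that requests a streamed response through the same OpenAI-compatible interface. The base URL is again a placeholder for the endpoint shown in your dashboard.

```python
from openai import OpenAI

client = OpenAI(
    api_key="<API_KEY>",
    base_url="https://<your-confidential-ai-endpoint>/v1",  # placeholder endpoint
)

# stream=True yields incremental chunks instead of a single response object.
stream = client.chat.completions.create(
    model="phala/deepseek-chat-v3-0324",
    messages=[{"role": "user", "content": "Write a haiku about confidential computing."}],
    stream=True,
)

# Print each token fragment as it arrives.
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```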