Use Perplexity’s Sonar API with OpenAI’s client libraries for seamless integration.
Perplexity’s Sonar API is fully compatible with OpenAI’s Chat Completions API format. This means you can use existing OpenAI client libraries and simply change the base URL to start using Perplexity’s web search-powered models.
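For example, a minimal client setup might look like the sketch below. The base URL and the API key placeholder are assumptions; substitute the values from your own Perplexity account.

```python
from openai import OpenAI

# Point the standard OpenAI client at Perplexity's API endpoint.
# The base URL and key placeholder here are assumptions; use your own values.
client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",
    base_url="https://api.perplexity.ai",
)
```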
These parameters work exactly the same as in OpenAI’s API:
- `model` - Model name (use Perplexity model names)
- `messages` - Chat messages array
- `temperature` - Sampling temperature (0-2)
- `max_tokens` - Maximum tokens in response
- `top_p` - Nucleus sampling parameter
- `frequency_penalty` - Frequency penalty (-2.0 to 2.0)
- `presence_penalty` - Presence penalty (-2.0 to 2.0)
- `stream` - Enable streaming responses

These Perplexity-specific parameters are also included:
- `search_domain_filter` - Limit or exclude specific domains
- `search_recency_filter` - Filter by content recency
- `return_citations` - Include citation URLs in response
- `return_images` - Include image URLs in response
- `return_related_questions` - Include related questions
- `search_mode` - “web” (default) or “academic” mode selector

Install the OpenAI library: `pip install openai`
After making a request, your response object will include both standard OpenAI fields and Perplexity-specific fields:
- `choices[0].message.content`: The main model response
- `model`: The model used
- `usage`: Token usage details
- `citations`: (Perplexity) List of source URLs
- `search_results`: (Perplexity) Array of search result objects

Remember to use Perplexity model names (`sonar-pro`, `sonar-reasoning`, etc.) and to send your Perplexity API key in Bearer token format in the Authorization header.
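The sketch below shows one way to read both kinds of fields. Because the Perplexity-specific fields are not part of the OpenAI SDK's typed response model, this example reads them defensively from the serialized payload; the prompt and model name are assumptions.

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",
    base_url="https://api.perplexity.ai",
)

response = client.chat.completions.create(
    model="sonar",  # assumption: any Perplexity model name works here
    messages=[{"role": "user", "content": "Summarize today's top science news."}],
)

# Standard OpenAI fields
print(response.choices[0].message.content)  # main model response
print(response.model)                       # model used
print(response.usage)                       # token usage details

# Perplexity-specific fields (citations, search_results) are extra fields,
# so read them from the raw payload rather than as typed attributes.
data = response.model_dump()
print(data.get("citations"))       # list of source URLs, if present
print(data.get("search_results"))  # array of search result objects, if present
```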