API Updates February 2024

Announcing Our Newest Model

New Model: mixtral-8x7b-instruct

We're excited to announce that pplx-api now serves the latest open-source mixture-of-experts model, mixtral-8x7b-instruct, at the blazing-fast inference speeds you're accustomed to.
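As a rough sketch of how you might call the new model: pplx-api follows the familiar OpenAI-style chat completions format. The endpoint URL, system prompt, and the `PERPLEXITY_API_KEY` environment variable here are assumptions for illustration, not guaranteed specifics; consult the API docs for the authoritative request shape.

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible chat completions endpoint.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_request(model, user_message):
    """Build an OpenAI-style chat completion payload for pplx-api."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Be precise and concise."},
            {"role": "user", "content": user_message},
        ],
    }

def ask(model, user_message):
    """Send the request; assumes an API key in PERPLEXITY_API_KEY."""
    payload = build_request(model, user_message)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['PERPLEXITY_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Build (but don't send) a request for the new mixture-of-experts model.
payload = build_request(
    "mixtral-8x7b-instruct",
    "Explain mixture-of-experts in one sentence.",
)
```

Because the payload is plain JSON, the same helper works for every model listed in this changelog; only the model name changes.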

Online LLMs and general availability for pplx-api

We're excited to share two new PPLX models: pplx-7b-online and pplx-70b-online. These first-of-their-kind models are integrated with our in-house search technology for factual grounding. Read our blog post for more information!
https://blog.perplexity.ai/blog/introducing-pplx-online-llms
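The online models are served through the same chat completions interface; only the model name differs. A minimal sketch, assuming the OpenAI-style request shape (the helper and its parameters are illustrative, not part of the official client):

```python
# Hypothetical helper for choosing an online (search-grounded) model.
# The payload shape is the same OpenAI-style chat format used for the
# other pplx models; only the "model" field changes.
def online_payload(user_question, large=False):
    model = "pplx-70b-online" if large else "pplx-7b-online"
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_question}],
    }

small = online_payload("What happened in the news today?")
big = online_payload("What happened in the news today?", large=True)
```

Since the online models ground their answers with live search results, they are best suited to questions whose answers change over time.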

Models removed: replit-code-v1.5-3b and openhermes-2-mistral-7b

We have removed support for replit-code-v1.5-3b and openhermes-2-mistral-7b, and there are no immediate plans to bring them back. If you enjoyed openhermes-2-mistral-7b, try our in-house models, pplx-7b-chat and pplx-70b-chat, instead!
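One simple way to migrate is a model-name mapping in client code. The removed and replacement names below come from this changelog; the mapping helper itself is an illustrative sketch, not an official API feature.

```python
# Removed models mapped to the suggested in-house replacement
# (per this changelog entry).
REPLACEMENTS = {
    "openhermes-2-mistral-7b": "pplx-7b-chat",
}

def resolve_model(name):
    """Return the model to request, substituting removed models."""
    return REPLACEMENTS.get(name, name)
```

Routing every request through a helper like this keeps the rest of your client code unchanged when models are added or retired.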

Versioning

The Perplexity AI API is currently in beta release v0. Clients are not protected from backwards-incompatible changes and cannot specify a desired API version. Examples of backwards-incompatible changes include...