Perplexity Llama 3.1 Sonar

Perplexity

Overview

A Perplexity model built on Llama 3.1 with a 128,000-token context window, designed for real-time information retrieval. It combines web search with knowledge-intensive training to produce factually grounded responses for applications such as research assistance, question answering, and current-event analysis.
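
As a rough sketch of how a model like this is typically called, assuming Perplexity's OpenAI-compatible chat completions endpoint and a hypothetical model identifier (verify the current name in the provider's documentation):

    # Minimal sketch of querying a Sonar-style model through Perplexity's
    # OpenAI-compatible API. The model identifier below is an assumption;
    # check the current model list before relying on it.
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_PERPLEXITY_API_KEY",          # set via an environment variable in practice
        base_url="https://api.perplexity.ai",       # Perplexity's OpenAI-compatible endpoint
    )

    response = client.chat.completions.create(
        model="llama-3.1-sonar-small-128k-online",  # assumed identifier; confirm against the docs
        messages=[
            {"role": "system", "content": "Answer concisely and cite current sources."},
            {"role": "user", "content": "What changed in the latest stable Python release?"},
        ],
    )

    print(response.choices[0].message.content)

Because the endpoint follows the OpenAI chat completions shape, existing client code can usually be pointed at it by changing only the base URL and model name.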

Key Strengths

Information retrieval
Web search capabilities
Factual responses

Capabilities

Text Generation
Code Generation
Function Calling
Reasoning
Web Search

Categories

General Purpose
Information Retrieval

Specifications

Context Size

128,000 tokens

Pricing

Input: $0.40 per 1M tokens
Output: $1.20 per 1M tokens
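
These rates make per-request budgeting a simple multiplication; a minimal sketch, with illustrative token counts:

    # Rough cost estimate at the listed rates: $0.40 per 1M input tokens,
    # $1.20 per 1M output tokens. Token counts below are illustrative only.
    INPUT_RATE = 0.40 / 1_000_000   # USD per input token
    OUTPUT_RATE = 1.20 / 1_000_000  # USD per output token

    def request_cost(input_tokens: int, output_tokens: int) -> float:
        """Return the estimated USD cost of a single request."""
        return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

    # Example: a 2,000-token prompt with a 500-token answer
    print(f"${request_cost(2_000, 500):.6f}")  # 0.0008 + 0.0006 = $0.001400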
