Mistral: Ministral 8B

mistralai/ministral-8b

About Mistral: Ministral 8B

Ministral 8B is an 8B parameter model featuring a unique interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it perfect for low-latency, privacy-first applications.
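As a sketch of how this model might be called through an OpenAI-compatible chat-completions endpoint using its `mistralai/ministral-8b` slug — the endpoint URL and the `OPENROUTER_API_KEY` environment variable below are assumptions, not details taken from this page:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"  # assumed endpoint


def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build a chat-completions payload targeting Ministral 8B."""
    return {
        "model": "mistralai/ministral-8b",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


if __name__ == "__main__":
    payload = build_request("Summarize sliding-window attention in one sentence.")
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            # Assumed auth scheme: Bearer token from the environment.
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

The payload shape is the standard `messages` list; the full 128k context applies to the combined prompt and completion tokens.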

Specifications

Context Length: 128,000 tokens
Tokenizer: Mistral

Pricing

Prompt: $0.099 / 1M tokens
Completion: $0.099 / 1M tokens
Image: $0
Request: $0
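Assuming the prompt and completion figures above are USD per million tokens (the page lists bare numbers, so the unit is an assumption), the cost of a single request can be estimated from its token counts:

```python
PROMPT_PRICE = 0.099      # USD per 1M prompt tokens (unit assumed)
COMPLETION_PRICE = 0.099  # USD per 1M completion tokens (unit assumed)


def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate request cost in USD from token counts."""
    return (prompt_tokens * PROMPT_PRICE
            + completion_tokens * COMPLETION_PRICE) / 1_000_000


# e.g. a 10,000-token prompt with a 1,000-token completion:
cost = estimate_cost(10_000, 1_000)  # ≈ $0.001089
```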

Last updated: 4/11/2025