Mistral: Codestral Mamba

mistralai/codestral-mamba

About Mistral: Codestral Mamba

A 7.3B parameter Mamba-based model designed for code and reasoning tasks.

  • Linear-time inference, allowing for theoretically infinite sequence lengths
  • 256k token context window
  • Optimized for quick responses, especially beneficial for code productivity
  • Performs comparably to state-of-the-art transformer models in code and reasoning tasks
  • Available under the Apache 2.0 license for free use, modification, and distribution
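
The model is typically reached through an API by its slug, mistralai/codestral-mamba. Below is a minimal sketch of one way to call it, assuming an OpenAI-compatible chat completions endpoint (here OpenRouter's https://openrouter.ai/api/v1) and an OPENROUTER_API_KEY environment variable; both are assumptions to adapt for your provider.

    # Minimal sketch: calling Codestral Mamba via an OpenAI-compatible
    # chat completions endpoint. The base URL and API key variable are
    # assumptions -- adapt them to your provider.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",   # assumed endpoint
        api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var
    )

    response = client.chat.completions.create(
        model="mistralai/codestral-mamba",  # model slug from this page
        messages=[
            {
                "role": "user",
                "content": "Write a Python function that checks whether a string is a palindrome.",
            },
        ],
    )

    print(response.choices[0].message.content)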

Specifications

  • Context Length: 262,144 tokens
  • Tokenizer: Mistral

Pricing

  • Prompt: $0.25 / 1M tokens
  • Completion: $0.25 / 1M tokens
  • Image: $0
  • Request: $0
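
As a rough illustration of how these rates (taken here as USD per 1M tokens) translate into the cost of a single request; the estimate_cost helper below is hypothetical, not part of any provider SDK:

    # Rough cost estimate, assuming the listed prices are USD per 1M tokens.
    PROMPT_PRICE_PER_M = 0.25      # prompt (input) tokens
    COMPLETION_PRICE_PER_M = 0.25  # completion (output) tokens

    def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
        """Return the estimated cost in USD for one request."""
        return (prompt_tokens * PROMPT_PRICE_PER_M
                + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

    # Example: a 2,000-token prompt with a 500-token completion
    # comes to (2000 + 500) * 0.25 / 1e6 = $0.000625.
    print(f"${estimate_cost(2000, 500):.6f}")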

Last updated: 4/11/2025