AI21: Jamba Mini 1.6

ai21/jamba-1.6-mini

About AI21: Jamba Mini 1.6

AI21 Jamba Mini 1.6 is a hybrid foundation model combining State Space Models (Mamba) with Transformer attention mechanisms. With 12 billion active parameters (52 billion total), this model excels in extremely long-context tasks (up to 256K tokens) and achieves superior inference efficiency, outperforming comparable open models on tasks such as retrieval-augmented generation (RAG) and grounded question answering. Jamba Mini 1.6 supports multilingual tasks across English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew, along with structured JSON output and tool-use capabilities.
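Since the model advertises structured JSON output, a request for it can be sketched as an OpenAI-compatible chat-completions payload. This is a minimal sketch: the endpoint URL and auth header depend on your provider and are omitted, and the `response_format` field is the common OpenAI-style convention, assumed (not confirmed) to apply here.

```python
import json

# Hypothetical chat-completions payload for this model (endpoint and
# authentication are provider-specific and intentionally omitted).
payload = {
    "model": "ai21/jamba-1.6-mini",
    "messages": [
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": "Summarize the attached report as JSON."},
    ],
    # Structured JSON output is commonly requested via response_format in
    # OpenAI-compatible APIs; assumed to work the same way with this provider.
    "response_format": {"type": "json_object"},
}

body = json.dumps(payload)  # this string would become the POST request body
print(body[:17])
```

The same payload shape works for tool use by adding a `tools` array, again following the OpenAI-compatible convention.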

Usage of this model is subject to the Jamba Open Model License.

Specifications

Context Length: 256,000 tokens
Tokenizer: Other

Pricing (USD)

Prompt: $0.199 / 1M tokens
Completion: $0.399 / 1M tokens
Image: $0
Request: $0
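Given the per-token rates above, the cost of a request is simple arithmetic. A small sketch, assuming the listed prompt and completion prices are USD per one million tokens:

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  prompt_rate: float = 0.199, completion_rate: float = 0.399) -> float:
    """Estimate request cost in USD, assuming rates are per 1M tokens."""
    return (prompt_tokens * prompt_rate
            + completion_tokens * completion_rate) / 1_000_000

# e.g. a 200K-token prompt (near the 256K context limit) with a 2K-token reply
cost = estimate_cost(200_000, 2_000)
print(round(cost, 4))  # → 0.0406
```

Image and request surcharges are $0 for this model, so they drop out of the estimate.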

Last updated: 4/11/2025