AI21: Jamba 1.5 Large
ai21/jamba-1-5-large
About AI21: Jamba 1.5 Large
Jamba 1.5 Large is part of AI21's family of open models, designed for speed, efficiency, and quality.
It features a 256K effective context window, among the longest of open models, enabling improved performance on long-context tasks like document summarization and analysis.
Built on a hybrid SSM-Transformer architecture, AI21 reports that it outperforms larger models such as Llama 3.1 70B on benchmarks while remaining resource-efficient.
Read their announcement to learn more.
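The slug above (`ai21/jamba-1-5-large`) is the identifier used to request this model through an OpenAI-compatible chat completions API. As a minimal sketch, assuming an OpenRouter-style endpoint (the URL and header names are assumptions, not part of this listing):

```python
import json

# Assumed OpenRouter-style endpoint; not stated on this page.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payload(prompt: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-compatible request body using this listing's model slug."""
    return {
        "model": "ai21/jamba-1-5-large",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_payload("Summarize the attached contract in three bullet points.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the endpoint with an `Authorization: Bearer <key>` header; consult the provider's API reference for exact details.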
Specifications
Context Length
256,000
Tokenizer
Other
Pricing
Prompt
$2.00 / 1M tokens
Completion
$8.00 / 1M tokens
Image
$0
Request
$0
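Assuming the figures above are USD per million tokens (the usual convention on model listings), the cost of a single request can be estimated as follows; the token counts in the example are illustrative:

```python
# Assumed interpretation of the listed rates: USD per 1M tokens.
PROMPT_PRICE_PER_M = 2.0      # $ per 1M prompt tokens
COMPLETION_PRICE_PER_M = 8.0  # $ per 1M completion tokens

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

# e.g. a long-document summarization using most of the 256K context window:
print(f"${estimate_cost(200_000, 1_000):.4f}")  # → $0.4080
```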
Last updated: 4/11/2025