AI21 Jamba 1.5 Large

Jamba 1.5 Large is a state-of-the-art, hybrid SSM-Transformer instruction-following foundation model. It is a Mixture-of-Experts model with 398B total parameters and 94B active parameters. The Jamba family of models is the most powerful and efficient long-context model family on the market, offering a 256K context window, the longest available. On long-context input, these models deliver up to 2.5x faster inference than leading models of comparable size. Jamba supports function calling/tool use, structured output (JSON), and grounded generation with a citation mode and a documents API. Jamba officially supports English, French, Spanish, Portuguese, German, Arabic, and Hebrew, but can also work in many other languages.
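As a minimal sketch of how the structured-output (JSON) capability might be exercised, the snippet below assumes the model is served behind an OpenAI-compatible chat-completions endpoint (for example via vLLM or a hosting provider). The base URL, API key, and model identifier are placeholders, and JSON-mode support depends on the serving stack; this is not an official AI21 example.

```python
# Minimal sketch: JSON-mode chat completion against an assumed
# OpenAI-compatible endpoint serving Jamba 1.5 Large.
# base_url, api_key, and the model name are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-host/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",              # hypothetical credential
)

response = client.chat.completions.create(
    model="jamba-1.5-large",  # identifier depends on the host
    messages=[
        {"role": "system", "content": "Reply with a JSON object only."},
        {
            "role": "user",
            "content": 'List three officially supported Jamba languages as {"languages": [...]}.',
        },
    ],
    response_format={"type": "json_object"},  # structured (JSON) output mode
    max_tokens=200,
)

print(response.choices[0].message.content)
```

The same endpoint style is commonly used for the other capabilities listed above (tool use via a `tools` parameter, grounded generation via provider-specific document fields), but the exact parameters vary by host.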
Model Developer: AI21 Labs
Model Name: Jamba 1.5 Large
Model Architecture: Hybrid SSM-Transformer with Mixture-of-Experts layers
Parameters: 398B total, 94B active
Input: Text only
Output: Text only
Training: Trained in Q3 2024, with a training-data cutoff in early March 2024
| Name | Params | Context Length |
|---|---|---|
| Jamba 1.5 Mini | 52B total (12B active) | 256K |
| Jamba 1.5 Large | 398B total (94B active) | 256K |