AI21 Jamba 1.5 Mini

Jamba 1.5 Mini is a state-of-the-art, hybrid SSM-Transformer instruction-following foundation model. It is a Mixture-of-Experts model with 52B total parameters and 12B active parameters. The Jamba family of models is the most powerful and efficient long-context family on the market, offering a 256K context window, the longest available. For long-context input, they deliver up to 2.5x faster inference than leading models of comparable size. Jamba supports function calling/tool use, structured output (JSON), and grounded generation with a citation mode and documents API. Jamba officially supports English, French, Spanish, Portuguese, German, Arabic, and Hebrew, but can also work in many other languages.
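As an illustration, here is a minimal sketch of querying the model through the AI21 Python SDK (`ai21` package). The client class, method names, and the `jamba-1.5-mini` model identifier follow the SDK's chat-completions interface, but treat them as assumptions and check the AI21 documentation for your deployment:

```python
# Minimal sketch: a chat-completions call to Jamba 1.5 Mini via the
# AI21 Python SDK. Assumes `pip install ai21` and an AI21_API_KEY
# environment variable; the model id and parameter names may vary by
# deployment (e.g. cloud model catalogs).
from ai21 import AI21Client
from ai21.models.chat import ChatMessage

client = AI21Client()  # picks up AI21_API_KEY from the environment

response = client.chat.completions.create(
    model="jamba-1.5-mini",
    messages=[
        ChatMessage(
            role="user",
            content="Summarize the trade-offs of hybrid SSM-Transformer models.",
        ),
    ],
    max_tokens=200,
)
print(response.choices[0].message.content)
```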
- Model developer: AI21 Labs
- Model architecture: hybrid SSM-Transformer instruction-following foundation model, Mixture-of-Experts with 52B total parameters and 12B active parameters
- Model input: text only
- Model output: text only
- Training data: trained in Q3 2024 on data covering through early March 2024
| Name | Params | Context Length |
|---|---|---|
| Jamba 1.5 Mini | 52B (12B active) | 256K |
| Jamba 1.5 Large | 398B (94B active) | 256K |
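For the structured output (JSON) capability mentioned above, here is a hedged sketch against the raw REST chat-completions endpoint; the URL and the `response_format` field are assumptions based on AI21's published API and should be verified against the current documentation:

```python
# Sketch: requesting structured JSON output from Jamba 1.5 Mini over
# the REST API. The endpoint URL and the "response_format" field are
# assumptions; verify against AI21's current API reference.
import os
import requests

resp = requests.post(
    "https://api.ai21.com/studio/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"},
    json={
        "model": "jamba-1.5-mini",
        "messages": [
            {
                "role": "user",
                "content": "Return a JSON object with keys 'model' and 'context_window'.",
            }
        ],
        "response_format": {"type": "json_object"},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```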