Describe the feature
Is it possible to add support for Mamba, a deep learning architecture focused on long-sequence modeling? For details, please see https://en.wikipedia.org/wiki/Deep_learning
Your proposal
Just asking
I’d like to formally propose expanding Jina Serve’s architecture support to include Mamba-based models for long-sequence processing workloads.
Mamba has demonstrated promising results as a linear-time alternative to transformers, especially in scenarios requiring efficient processing of long sequences (e.g., time series, logs, streaming text). Adding Mamba support aligns well with Jina’s modularity and future-facing AI service goals.
🧠 Proposed Design Consideration
Rather than integrating Mamba as a core dependency, I propose:

- Developing a `MambaExecutor` template (similar to other model executors)
- Exposing the Mamba config (e.g., sequence length, hidden dimensions) via `Executor` arguments or environment variables
- Making it deployable through Jina Hub or Flow YAML for on-demand use
This would keep Jina Serve agnostic of any one architecture while giving users the flexibility to adopt Mamba where it makes sense.
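To make the idea concrete, here is a minimal sketch of what such an executor template could look like. It assumes a Jina 3.x-style `Executor` with docarray's `DocumentArray`; the class name `MambaExecutor`, the default model name, and the `_load_model` hook are all illustrative placeholders rather than anything that exists in Jina Serve today, and the actual Mamba backend (e.g., `mamba-ssm` or a Hugging Face checkpoint) would be plugged in by the user.

```python
# Hypothetical MambaExecutor sketch. Names and defaults are illustrative,
# not part of Jina Serve; the model backend is left abstract on purpose.
from docarray import DocumentArray
from jina import Executor, requests


class MambaExecutor(Executor):
    """Serves a Mamba model behind Jina's Executor interface."""

    def __init__(
        self,
        model_name: str = 'state-spaces/mamba-130m',  # placeholder checkpoint
        max_seq_len: int = 8192,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.max_seq_len = max_seq_len
        # Backend-specific loading (mamba-ssm, transformers, ...) goes here.
        self.model = self._load_model(model_name)

    def _load_model(self, model_name: str):
        # Left unimplemented in this sketch; swap in the backend of your choice.
        raise NotImplementedError('plug in a Mamba backend here')

    @requests
    def encode(self, docs: DocumentArray, **kwargs):
        for doc in docs:
            # Naive character-level truncation to the configured window,
            # then store the model output as the document embedding.
            doc.embedding = self.model(doc.text[: self.max_seq_len])
```

With this shape, users could wire the executor into a Flow and override the config per deployment, e.g. `Flow().add(uses=MambaExecutor, uses_with={'max_seq_len': 4096})`, or publish it to Jina Hub so it stays entirely opt-in.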