Motivation
Recently, Structured State Space Models (SSMs) such as S4, S5, and Mamba have achieved strong performance on long-sequence modeling tasks. Compared to Transformers, SSMs capture long-range dependencies with time complexity that is linear in sequence length, making them well suited to large-scale financial time series forecasting.
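For intuition, below is a minimal, self-contained sketch (not Qlib code; all names are illustrative) of the discretized linear SSM recurrence that underlies S4/Mamba, showing how a length-T sequence is processed in a single O(T) pass with a fixed-size hidden state:

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Minimal discretized linear SSM: h_t = A @ h_{t-1} + B @ x_t, y_t = C @ h_t.

    x: (T, d_in) input sequence; A: (d_state, d_state); B: (d_state, d_in); C: (d_out, d_state).
    One linear scan over time -> O(T) cost with a constant-size state,
    in contrast to self-attention's O(T^2) pairwise interactions.
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:               # single pass over the sequence
        h = A @ h + B @ x_t     # state update carries long-range information
        ys.append(C @ h)        # readout at each step
    return np.stack(ys)

# Toy usage: a 1,000-step sequence processed in one pass.
T, d_in, d_state, d_out = 1000, 8, 16, 1
rng = np.random.default_rng(0)
y = ssm_scan(rng.normal(size=(T, d_in)),
             A=0.95 * np.eye(d_state),
             B=0.1 * rng.normal(size=(d_state, d_in)),
             C=0.1 * rng.normal(size=(d_out, d_state)))
print(y.shape)  # (1000, 1)
```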
Proposal
I would like to suggest adding support for SSM-based deep learning models (starting with Mamba) to Qlib's model zoo. This would give users a new sequence modeling architecture alongside the existing RNN- and Transformer-based models.
Benefits
- Improved efficiency in modeling long financial time series compared to Transformers.
- Strong scalability with lower computational overhead.
- Enriched model selection for researchers and practitioners using Qlib.
Possible Next Steps
- Implement a basic SSM/Mamba model class under qlib/model/ (a rough interface sketch follows this list).
- Provide training and inference examples for the Alpha158/Alpha360 datasets.
- Add benchmark results to compare against Transformer-based models.
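For the first step, here is a rough sketch of what the wrapper could look like, assuming it follows Qlib's existing Model interface (fit/predict) and delegates the sequence backbone to an external Mamba implementation. The class name, hyperparameters, and data-preparation calls below are illustrative assumptions, not a finished design:

```python
# Hypothetical skeleton for qlib/model/ (names and hyperparameters are illustrative).
import torch.nn as nn
from qlib.model.base import Model  # Qlib's base model interface (fit / predict)


class MambaModel(Model):
    """Sketch of an SSM/Mamba forecasting model wrapped in Qlib's Model interface."""

    def __init__(self, d_feat=158, d_model=64, n_layers=2, lr=1e-3, n_epochs=20):
        self.d_feat = d_feat
        self.lr = lr
        self.n_epochs = n_epochs
        # The sequence backbone would come from an external Mamba implementation
        # (e.g. the mamba-ssm package); a plain placeholder network is used here.
        self.net = nn.Sequential(
            nn.Linear(d_feat, d_model),
            # ... a stack of Mamba / SSM blocks would go here ...
            nn.Linear(d_model, 1),
        )

    def fit(self, dataset):
        # Prepare train/valid segments from the dataset handler (Alpha158/Alpha360),
        # then run a standard training loop; omitted here for brevity.
        df_train = dataset.prepare("train", col_set=["feature", "label"])
        ...

    def predict(self, dataset):
        # Score the test segment and return predictions indexed by (datetime, instrument).
        df_test = dataset.prepare("test", col_set="feature")
        ...
```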