# KALFormer: Knowledge-Augmented Attention Learning for Long-Term Time Series Forecasting

Official implementation of KALFormer, a hybrid deep learning model that integrates LSTM, self-attention, a Knowledge Graph Network (KAN), and Transformer modules to address long-range dependencies, complex dynamics, and external factors in time series forecasting.


## 🧠 Motivation

Time series forecasting is critical in domains such as energy systems, financial markets, traffic management, and meteorology. However, it faces persistent challenges:

  1. Long-Term Dependencies – RNN/LSTM architectures struggle to retain distant temporal information.
  2. Complex Dynamics – Real-world time series exhibit non-linear, high-dimensional, and irregular patterns.
  3. External Influences – Exogenous variables (e.g., weather, policies, holidays) strongly impact prediction accuracy.

KALFormer is designed to overcome these limitations by fusing memory, attention, and structured knowledge integration.


## 📦 Project Structure

```
KALFormer/
├─ dataset/                 # Preprocessed benchmark datasets
├─ experiment/
│  ├─ Ablation_experiment/  # Reproduction of the ablation study
│  ├─ Compare_experiment/   # Baseline comparisons
│  └─ model/                # Core modules (LSTM, Attention, KAN, Transformer)
├─ utils/                   # Helper functions (data split, normalization, metrics)
├─ configs/                 # Configurations for each dataset & horizon
├─ scripts/                 # Shell scripts to reproduce results
├─ images/                  # Figures (high resolution, as used in the paper)
├─ README.md
└─ requirements.txt
```

## ⚙️ Installation

```bash
git clone https://github.com/dxpython/KALFormer.git
cd KALFormer
pip install -r requirements.txt
```

Tested on Python 3.9, PyTorch 1.12+, and CUDA 11.3.
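
As a quick sanity check of the environment (the versions above are the tested configuration, not strict requirements):

```python
import torch

print(torch.__version__)          # expected: 1.12 or newer
print(torch.cuda.is_available())  # True if a compatible CUDA runtime is visible
```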


## 📊 Datasets

KALFormer is evaluated on 5 real-world datasets:

| Dataset     | Timesteps | Features | Granularity |
|-------------|-----------|----------|-------------|
| Traffic     | 17,544    | 862      | Hourly      |
| Weather     | 52,696    | 21       | 10-min      |
| Electricity | 26,304    | 321      | Hourly      |
| ETTh1/2     | 17,420    | 7        | Hourly      |
| ETTm1/2     | 69,680    | 7        | 15-min      |
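
The preprocessed data live under `dataset/`, with split and normalization helpers in `utils/`. As a hypothetical illustration of the usual sliding-window preparation for long-horizon forecasting (the actual loader in `utils/` may differ in naming and normalization details):

```python
import numpy as np
import torch
from torch.utils.data import Dataset

class SlidingWindowDataset(Dataset):
    """Yields (lookback, horizon) pairs from a (timesteps, features) array.

    Illustrative sketch only; the repository's own loader may differ.
    """

    def __init__(self, series: np.ndarray, lookback: int = 96, horizon: int = 96):
        # Per-feature z-score normalization (real code should use train-split stats).
        mean = series.mean(axis=0, keepdims=True)
        std = series.std(axis=0, keepdims=True) + 1e-8
        self.series = (series - mean) / std
        self.lookback, self.horizon = lookback, horizon

    def __len__(self):
        return len(self.series) - self.lookback - self.horizon + 1

    def __getitem__(self, i):
        x = self.series[i : i + self.lookback]
        y = self.series[i + self.lookback : i + self.lookback + self.horizon]
        return torch.from_numpy(x).float(), torch.from_numpy(y).float()
```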

### 📥 Download Links (persistent DOI repositories)


## 🧱 Model Architecture

KALFormer consists of four key modules:

  1. LSTM Layers – Sequential encoder for local temporal patterns.
  2. Self-Attention – Captures long-range dependencies among time steps.
  3. Knowledge Graph Network (KAN) – Graph convolution to integrate external structured knowledge.
  4. Transformer Block (Multi-Head Attention) – Fuses global contextual signals with graph-informed features.

*(Figure: overall KALFormer framework.)*
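
To make the data flow concrete, here is a minimal, hypothetical PyTorch sketch of how the four modules could be chained. The class name, dimensions, and the additive fusion of knowledge features are illustrative assumptions (the stand-in linear projection replaces the paper's graph convolution); the official modules live in `experiment/model/`.

```python
import torch
import torch.nn as nn

class KALFormerSketch(nn.Module):
    """Illustrative wiring of the four modules; not the official model."""

    def __init__(self, n_features: int, k_dim: int = 16,
                 d_model: int = 128, horizon: int = 96):
        super().__init__()
        # 1. LSTM encoder for local temporal patterns.
        self.lstm = nn.LSTM(n_features, d_model, batch_first=True)
        # 2 & 4. Self-attention / multi-head Transformer block.
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        # 3. Stand-in for the knowledge module: a linear projection of
        #    external knowledge features (the paper uses graph convolution).
        self.knowledge_proj = nn.Linear(k_dim, d_model)
        self.head = nn.Linear(d_model, n_features * horizon)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, x, knowledge):
        h, _ = self.lstm(x)                     # (B, T, d_model)
        h = h + self.knowledge_proj(knowledge)  # fuse knowledge-informed features
        h = self.transformer(h)                 # long-range dependencies
        out = self.head(h[:, -1])               # decode from the last step
        return out.view(-1, self.horizon, self.n_features)

model = KALFormerSketch(n_features=7)
x = torch.randn(4, 96, 7)    # ETT-style input window
k = torch.randn(4, 96, 16)   # placeholder knowledge features
print(model(x, k).shape)     # torch.Size([4, 96, 7])
```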


## 📊 Evaluation

KALFormer is benchmarked against baselines (LSTM, Informer, Autoformer, FEDformer) using MSE, MAE, RMSE, and MAPE, reported with confidence intervals over 5 runs.
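
For reference, the reported metrics can be computed as in the plain NumPy sketch below; the repository's own metric helpers are in `utils/` and may differ.

```python
import numpy as np

def evaluate(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-8) -> dict:
    """Standard forecasting metrics over arrays of identical shape."""
    err = y_pred - y_true
    mse = float(np.mean(err ** 2))
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(mse))
    mape = float(np.mean(np.abs(err) / (np.abs(y_true) + eps)) * 100.0)  # percent
    return {"MSE": mse, "MAE": mae, "RMSE": rmse, "MAPE": mape}
```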

*(Figures: performance comparison results.)*
