Time Series Forecasting Dashboard

Multi-model time series forecasting with LightGBM, AutoTS, XGBoost, and Prophet

Production-grade forecasting dashboard demonstrating state-of-the-art time series models with automatic preprocessing, feature engineering, and model comparison.

Features

  • Multiple Forecasting Models:

    • LightGBM - Gradient boosting with time-based features
    • AutoTS - Automated ensemble model selection
    • XGBoost - Alternative gradient boosting implementation
    • Prophet - Facebook's production forecasting library
  • Automatic Preprocessing:

    • Missing value imputation
    • Outlier detection
    • Frequency normalization
    • Train/test splitting
  • Feature Engineering (see the pandas sketch after this list):

    • Time-based features (month, day, quarter, etc.)
    • Lag features (1, 7, 14, 30 periods)
    • Rolling statistics (mean, std)
    • Difference features (rate of change)
  • Model Comparison:

    • RMSE, MAE, MAPE metrics
    • Training time tracking
    • Automatic best model selection
    • Confidence intervals
  • Production Ready:

    • FastAPI backend with OpenAPI docs
    • Docker containerization
    • Health checks and monitoring
    • Rate limiting
    • CORS configuration
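
A minimal pandas sketch of the feature engineering steps listed above (column names and window sizes are illustrative; the dashboard's actual logic lives in backend/forecasting/preprocessing.py):

import pandas as pd

# Illustrative only: mirrors the calendar, lag, rolling, and difference features described above.
df = pd.read_csv("your_timeseries.csv", parse_dates=["date"]).set_index("date")

# Time-based features
df["month"] = df.index.month
df["day_of_week"] = df.index.dayofweek
df["quarter"] = df.index.quarter

# Lag features (1, 7, 14, 30 periods)
for lag in (1, 7, 14, 30):
    df[f"lag_{lag}"] = df["value"].shift(lag)

# Rolling statistics over a 7-period window
df["rolling_mean_7"] = df["value"].rolling(7).mean()
df["rolling_std_7"] = df["value"].rolling(7).std()

# Difference feature (rate of change)
df["diff_1"] = df["value"].diff()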

Quick Start

Local Development

  1. Install dependencies:
cd backend
pip install -r requirements.txt
  2. Run the server:
uvicorn main:app --reload --port 8005
  3. Access the API docs:
Open http://localhost:8005/docs in a browser; FastAPI serves interactive OpenAPI docs there by default.

Docker

cd backend
docker build -t timeseries-forecast .
docker run -p 8005:8005 timeseries-forecast

API Usage

1. Upload Dataset

curl -X POST "http://localhost:8005/api/upload" \
  -H "Content-Type: multipart/form-data" \
  -F "file=@your_timeseries.csv"

Expected CSV format:

date,value
2023-01-01,100.5
2023-01-02,102.3
2023-01-03,98.7
...

Response:

{
  "filename": "your_timeseries.csv",
  "rows": 365,
  "columns": ["date", "value"],
  "start_date": "2023-01-01",
  "end_date": "2023-12-31",
  "frequency": "D",
  "missing_values": 0
}

2. Generate Forecast

curl -X POST "http://localhost:8005/api/forecast?dataset_id=your_timeseries.csv" \
  -H "Content-Type: application/json" \
  -d '{
    "model_type": "all",
    "horizon": 30,
    "date_column": "date",
    "value_column": "value",
    "frequency": "D"
  }'

Model Types:

  • lightgbm - LightGBM only
  • autots - AutoTS only
  • xgboost - XGBoost only
  • prophet - Prophet only
  • all - Run all models and compare (default)

Response:

{
  "results": [
    {
      "model_name": "lightgbm",
      "forecast": [
        {
          "date": "2024-01-01",
          "forecast": 105.2,
          "lower_bound": 95.3,
          "upper_bound": 115.1
        },
        ...
      ],
      "metrics": {
        "model_name": "lightgbm",
        "rmse": 3.45,
        "mae": 2.67,
        "mape": 2.3,
        "training_time_seconds": 1.23
      },
      "historical_data": [...]
    },
    ...
  ],
  "best_model": "lightgbm"
}
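
The same two calls from Python, using the endpoints shown in the curl examples above (requests is assumed to be installed; adjust the base URL for your deployment):

import requests

BASE_URL = "http://localhost:8005"

# Upload a CSV with date,value columns.
with open("your_timeseries.csv", "rb") as f:
    upload = requests.post(f"{BASE_URL}/api/upload", files={"file": f})
upload.raise_for_status()

# Request a 30-period forecast from all models and read back the reported best one.
response = requests.post(
    f"{BASE_URL}/api/forecast",
    params={"dataset_id": "your_timeseries.csv"},
    json={
        "model_type": "all",
        "horizon": 30,
        "date_column": "date",
        "value_column": "value",
        "frequency": "D",
    },
)
response.raise_for_status()
print(response.json()["best_model"])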

Model Details

LightGBM

Strengths:

  • Fast training and inference
  • Handles large datasets well
  • Good with noisy data
  • Feature importance analysis

Use Cases:

  • High-frequency data (daily, hourly)
  • Large datasets (>10K points)
  • When speed is critical

Parameters:

  • Objective: regression
  • Boosting: GBDT
  • Learning rate: 0.05
  • Num leaves: 31
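
A minimal sketch of how these parameters map onto LightGBM's scikit-learn interface (toy data; the actual configuration is in backend/forecasting/models/lightgbm_forecaster.py):

import numpy as np
import lightgbm as lgb

# Toy feature matrix; in the dashboard, X holds the engineered time-based, lag, and rolling features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 2] * 0.8 + rng.normal(scale=0.1, size=200)

model = lgb.LGBMRegressor(
    objective="regression",
    boosting_type="gbdt",
    learning_rate=0.05,
    num_leaves=31,
)
model.fit(X[:150], y[:150])
preds = model.predict(X[150:])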

AutoTS

Strengths:

  • Fully automated model selection
  • Ensemble of multiple methods
  • No hyperparameter tuning needed
  • Built-in cross-validation

Use Cases:

  • When you want "best" model automatically
  • Exploring multiple approaches
  • Production systems with varying data

Methods Evaluated:

  • ARIMA, ETS, Prophet, LSTM
  • Regression models
  • Statistical models

XGBoost

Strengths:

  • Robust gradient boosting
  • Regularization prevents overfitting
  • Handles missing values
  • Parallel processing

Use Cases:

  • Alternative to LightGBM
  • When regularization is needed
  • Baseline comparison

Parameters:

  • Objective: regression
  • Max depth: 6
  • Learning rate: 0.05
  • Subsample: 0.8
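
The equivalent sketch for XGBoost (again illustrative; see backend/forecasting/models/xgboost_forecaster.py for the real configuration):

import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 5)), rng.normal(size=200)

model = XGBRegressor(
    objective="reg:squarederror",   # xgboost's identifier for plain squared-error regression
    max_depth=6,
    learning_rate=0.05,
    subsample=0.8,
)
model.fit(X[:150], y[:150])
preds = model.predict(X[150:])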

Prophet

Strengths:

  • Handles seasonality automatically
  • Robust to missing data
  • Interpretable components
  • Industry-proven (Facebook)

Use Cases:

  • Business metrics (sales, traffic)
  • Daily/weekly seasonality
  • Holiday effects
  • Trend changes

Features:

  • Automatic trend detection
  • Multiple seasonality
  • Holiday effects
  • Outlier handling
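
A minimal Prophet sketch showing how the forecast and confidence bounds are typically produced (assumes the prophet package and a date,value CSV; the dashboard's version is in backend/forecasting/models/prophet_forecaster.py):

import pandas as pd
from prophet import Prophet

# Prophet expects columns named ds (timestamp) and y (value).
history = pd.read_csv("your_timeseries.csv").rename(columns={"date": "ds", "value": "y"})

m = Prophet()                                 # trend and seasonality are handled automatically
m.fit(history)

future = m.make_future_dataframe(periods=30)  # 30-period horizon
forecast = m.predict(future)

# yhat, yhat_lower, and yhat_upper correspond to forecast, lower_bound, and upper_bound in the API response.
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())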

Technical Stack

  • Backend: FastAPI, Python 3.11
  • Models: LightGBM, AutoTS, XGBoost, Prophet
  • Data: pandas, numpy, scikit-learn
  • Deployment: Docker, Render.com
  • Testing: pytest, httpx

Performance

Typical Training Times (1000-point dataset):

  • LightGBM: ~1-2 seconds
  • XGBoost: ~1-2 seconds
  • Prophet: ~3-5 seconds
  • AutoTS: ~10-30 seconds (evaluates multiple models)

Inference: <100ms per forecast

Deployment

Render.com

  1. Fork this repository
  2. Create new Web Service on Render
  3. Connect repository
  4. Render will auto-detect render.yaml
  5. Deploy!

Environment Variables

See .env.example for all configuration options.

Required:

  • PORT - API port (default: 8005)
  • ENVIRONMENT - development/production

Optional:

  • MAX_FILE_SIZE_MB - Max CSV upload size (default: 50)
  • MAX_FORECAST_HORIZON - Max forecast periods (default: 365)
  • CORS_ORIGINS - Allowed origins
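
An illustrative .env combining the defaults above (the CORS_ORIGINS value is a placeholder; .env.example is the authoritative reference):

PORT=8005
ENVIRONMENT=development
MAX_FILE_SIZE_MB=50
MAX_FORECAST_HORIZON=365
CORS_ORIGINS=http://localhost:3000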

Development

Project Structure

timeseries-forecast-demo/
├── backend/
│   ├── api/
│   │   ├── __init__.py
│   │   └── models.py           # Pydantic models
│   ├── forecasting/
│   │   ├── __init__.py
│   │   ├── preprocessing.py    # Data cleaning, features
│   │   └── models/
│   │       ├── lightgbm_forecaster.py
│   │       ├── autots_forecaster.py
│   │       ├── xgboost_forecaster.py
│   │       └── prophet_forecaster.py
│   ├── tests/
│   ├── main.py                 # FastAPI app
│   ├── config.py               # Settings
│   ├── requirements.txt
│   └── Dockerfile
├── render.yaml
├── .env.example
└── README.md

Testing

cd backend
pytest tests/ -v --cov=. --cov-report=html

Adding New Models

  1. Create new forecaster in forecasting/models/
  2. Implement: train(), forecast(), evaluate()
  3. Add to main.py model selection
  4. Update API docs
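
A hypothetical skeleton for step 2, following the train()/forecast()/evaluate() convention (names and signatures are illustrative; match the existing forecasters in forecasting/models/):

import numpy as np
import pandas as pd

class NaiveForecaster:
    """Illustrative baseline: repeats the last observed value."""

    def train(self, df: pd.DataFrame, value_column: str = "value") -> None:
        self.last_value_ = df[value_column].iloc[-1]

    def forecast(self, horizon: int) -> np.ndarray:
        return np.full(horizon, self.last_value_)

    def evaluate(self, actual: np.ndarray, predicted: np.ndarray) -> dict:
        error = actual - predicted
        return {
            "rmse": float(np.sqrt(np.mean(error ** 2))),
            "mae": float(np.mean(np.abs(error))),
            "mape": float(np.mean(np.abs(error / actual)) * 100),
        }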

Use Cases

  • Financial: Stock price, revenue forecasting
  • Operations: Demand planning, inventory optimization
  • Marketing: Traffic prediction, campaign ROI
  • IoT: Sensor data, anomaly detection
  • Energy: Load forecasting, consumption prediction

Limitations

  • Data Requirements: Minimum 60-90 points recommended
  • Frequency: Best for daily/weekly data
  • Stationarity: Some models assume stationarity
  • Exogenous Variables: Not supported in current version

Future Enhancements

  • Multi-variate forecasting
  • Exogenous variables support
  • LSTM/GRU deep learning models
  • Anomaly detection
  • Interactive frontend dashboard
  • Model persistence/caching
  • Hyperparameter optimization

License

MIT

Author

Built to demonstrate production-grade ML engineering with modern forecasting techniques.

Technologies: LightGBM, AutoTS, XGBoost, Prophet, FastAPI, Docker
