Background
Currently Chronos relies on `orca.automl` for its single-node PyTorch model support. This has several drawbacks:
- The future `chronos-inference` package will rely on `orca` (and `dllib`), which makes the whole package fat.
- Chronos cannot use nano's speed-ups.
- Chronos misses out on the features the pytorch-lightning Trainer would bring (such as efficient lr/bs tuning).
Implementation Steps
We will implement the whole plan mainly in 5 PRs:
- Add a new `nano.pytorch.pl_base_model`; this model should add a time-series-specific training step ... (a minimal sketch of such a base module follows this list)
- Add `chronos.metric` to replace `orca.automl.metric` (Chronos: add independent `chronos.metric` #3608; see the standalone metric sketch below)
- Change the forecaster (`distributed=False`) path to bigdl-nano (pl) (Chronos: pytorch based forecasters on bigdl-nano #3539; see the fit-path sketch below)
- Change the TSPipeline path to bigdl-nano (pl) (Chronos: TSPipeline on bigdl-nano #3609)
- Change AutoModel (except `.fit`) to bigdl-nano (pl) (Chronos: AutoProphet & AutoARIMA without orca when inference #3745)
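
As a rough illustration of the first step, a shared base model could be a `pytorch_lightning.LightningModule` that wraps an arbitrary torch model and adds a time-series-style training/validation step. The class name `TSLightningModule` and its constructor are assumptions for this sketch, not the confirmed `nano.pytorch.pl_base_model` API:

```python
import torch
import pytorch_lightning as pl


class TSLightningModule(pl.LightningModule):  # hypothetical name
    """Wraps a raw torch model and adds a time-series training step."""

    def __init__(self, model: torch.nn.Module, loss=None, lr: float = 1e-3):
        super().__init__()
        self.model = model
        self.loss = loss or torch.nn.MSELoss()
        self.lr = lr

    def forward(self, x):
        return self.model(x)

    def training_step(self, batch, batch_idx):
        # batch is assumed to be (x, y): lookback window and target horizon
        x, y = batch
        loss = self.loss(self(x), y)
        self.log("train_loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", self.loss(self(x), y))

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```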
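For the second step, a self-contained `chronos.metric` module only needs numpy, so forecasters no longer pull in `orca.automl.metric`. The function names and the registry below are illustrative, not the confirmed API:

```python
import numpy as np


def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean squared error."""
    return float(np.mean((y_true - y_pred) ** 2))


def smape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Symmetric mean absolute percentage error (in percent)."""
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    diff = np.abs(y_pred - y_true)
    # treat points where both truth and prediction are zero as zero error
    ratio = np.divide(diff, denom, out=np.zeros_like(diff, dtype=float),
                      where=denom != 0)
    return float(100.0 * np.mean(ratio))


# simple registry so callers can look metrics up by name
METRICS = {"mse": mse, "smape": smape}
```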
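And for the forecaster change, the non-distributed fit path would build the LightningModule above and hand it to a Trainer. This sketch uses the plain pytorch-lightning Trainer and assumes bigdl-nano's Trainer is a drop-in replacement; the helper name and signature are assumptions:

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset


def fit_single_node(model, x, y, epochs=3, batch_size=32):
    """Hypothetical helper showing the shape of the new fit path."""
    dataset = TensorDataset(torch.as_tensor(x, dtype=torch.float32),
                            torch.as_tensor(y, dtype=torch.float32))
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    lit_model = TSLightningModule(model)  # from the earlier sketch
    trainer = pl.Trainer(max_epochs=epochs, logger=False,
                         enable_checkpointing=False)
    trainer.fit(lit_model, loader)
    return lit_model
```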