
Transformer implementation with PyTorch for remaining useful life prediction on turbofan engine with NASA CMAPSS data set. Inspired by Mo, Y., Wu, Q., Li, X., & Huang, B. (2021). Remaining useful life estimation via transformer encoder enhanced by a gated convolutional unit. Journal of Intelligent Manufacturing, 1-10.


PyTorch Transformer for RUL Prediction

Transferred from https://github.com/jiaxiang-cheng/PyTorch-Transformer-for-RUL-Prediction
An implementation of a Transformer encoder combined with convolution layers in PyTorch for remaining useful life prediction.
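The gating idea behind the "gated convolutional unit" in the referenced paper can be sketched loosely as a feature branch modulated by a sigmoid gate branch, each produced by its own convolution. This is a minimal NumPy sketch of that general mechanism only, not the paper's or this repository's exact architecture; the function name and kernel shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_conv_unit(x, w_feat, w_gate):
    """Sketch of a 1-D gated convolution (hypothetical, not the repo's code).

    x: (length,) input signal
    w_feat, w_gate: (k,) convolution kernels for the feature and gate branches
    """
    feat = np.convolve(x, w_feat, mode="same")            # feature branch
    gate = sigmoid(np.convolve(x, w_gate, mode="same"))   # gate branch in (0, 1)
    return feat * gate                                    # element-wise gating

rng = np.random.default_rng(0)
x = rng.standard_normal(16)
out = gated_conv_unit(x, rng.standard_normal(3), rng.standard_normal(3))
print(out.shape)  # (16,)
```

Because the gate is squashed into (0, 1), each output element is a damped copy of the corresponding feature element, which is what lets the unit suppress uninformative sensor channels.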
Author: Jiaxiang Cheng, Nanyang Technological University, Singapore


Quick Run

Simply run python train.py --dataset FD001. You will then see the training loss and test result for each epoch, where the RMSE is computed on the test set:

Epoch: 0, loss: 9474.43470, RMSE: 61.11946
Epoch: 1, loss: 5858.27227, RMSE: 46.03318
Epoch: 2, loss: 3208.53410, RMSE: 29.78244
Epoch: 3, loss: 1310.71390, RMSE: 22.94705
...
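The RMSE column above is the standard root-mean-squared error between predicted and true remaining useful life on the test set. A minimal stdlib sketch of that metric (the function name is ours, not from train.py):

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-squared error between true and predicted RUL values."""
    assert len(y_true) == len(y_pred)
    squared_error = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    return math.sqrt(squared_error / len(y_true))

# e.g. three engines with true vs. predicted RUL in cycles
print(rmse([100.0, 80.0, 60.0], [90.0, 85.0, 55.0]))  # sqrt(50) ~ 7.07107
```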

Evaluation on the test set runs after every epoch; since the data set is small this adds little overhead, but you may remove the per-epoch evaluation and evaluate only once after the final training epoch.
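Restructuring the loop to evaluate only once can be sketched as follows, where train_one_epoch and evaluate are hypothetical stand-ins (with dummy bodies) for the corresponding steps in train.py:

```python
def train_one_epoch(model):
    """Hypothetical stand-in for one training pass; returns a dummy loss."""
    model["epochs"] += 1
    return 1.0 / model["epochs"]

def evaluate(model):
    """Hypothetical stand-in for computing test-set RMSE; returns a dummy value."""
    return 10.0 / model["epochs"]

model = {"epochs": 0}
num_epochs = 4
for epoch in range(num_epochs):
    loss = train_one_epoch(model)
    print(f"Epoch: {epoch}, loss: {loss:.5f}")

final_rmse = evaluate(model)  # single evaluation after all epochs
print(f"Final RMSE: {final_rmse:.5f}")
```

The point is only the control flow: evaluate(model) moves outside the epoch loop, so the test set is touched once instead of once per epoch.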

Environment Details

python==3.8.8
numpy==1.20.1
pandas==1.2.4
matplotlib==3.3.4
pytorch==1.8.1

Credit

This work is inspired by Mo, Y., Wu, Q., Li, X., & Huang, B. (2021). Remaining useful life estimation via transformer encoder enhanced by a gated convolutional unit. Journal of Intelligent Manufacturing, 1-10.


License

