
Effinformer: A Deep-Learning-Based Data-Driven Modeling of DC–DC Bidirectional Converters

Python 3.6 · PyTorch 1.2 · cuDNN 7.3.1 · License: CC BY-NC-SA

This is the GitHub repository for the project "Effinformer: A Deep-Learning-Based Data-Driven Modeling of DC–DC Bidirectional Converters", published in IEEE Transactions on Instrumentation and Measurement (IEEE TIM), DOI: 10.1109/TIM.2023.3318701. The project develops a data-driven model of a DC–DC bidirectional converter using an efficient self-attention network (Effinformer).

We design a practical end-to-end deep-learning framework named efficient Informer (Effinformer), which leverages a sparse self-attention mechanism to reduce the computational cost of the network while significantly improving accuracy and efficiency. Specifically, distilling blocks based on dilated causal convolutional layers are constructed to obtain a larger receptive field and extract long-term historical information. To mine potential trend features and expedite computation, we propose an alternative to the original multi-head attention in the decoder of Informer. Finally, an appropriate gated linear unit (GLU) is chosen to improve prediction accuracy.


Fig. 5. Structure diagram of the encoder stacking three attention blocks and two dilated causal convolution layers.
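For intuition only, the sketch below combines the two ingredients just described, a dilated causal 1-D convolution and a GLU gate, into a single toy distilling block in PyTorch. All layer sizes, names, and the exact layout here are assumptions for illustration, not the repository's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedCausalDistilling(nn.Module):
    """Toy distilling block: dilated causal convolution + GLU gate + pooling.

    Only a sketch of the ideas described above (dilated causal convolutions
    for a larger receptive field, a GLU gate for accuracy); sizes and layout
    are assumptions, not the paper's implementation.
    """

    def __init__(self, d_model: int, kernel_size: int = 3, dilation: int = 2):
        super().__init__()
        # Left-only padding keeps the convolution causal: each output step
        # depends only on current and past time steps.
        self.causal_pad = (kernel_size - 1) * dilation
        # Double the channels so the GLU can split them into value/gate halves.
        self.conv = nn.Conv1d(d_model, 2 * d_model, kernel_size, dilation=dilation)
        self.glu = nn.GLU(dim=1)  # value * sigmoid(gate), split along channels
        # "Distilling": halve the sequence length passed to the next block.
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, seq_len, d_model]; Conv1d expects [batch, channels, seq_len]
        y = x.transpose(1, 2)
        y = F.pad(y, (self.causal_pad, 0))  # pad the past (left) side only
        y = self.glu(self.conv(y))          # [batch, d_model, seq_len]
        y = self.pool(y)                    # [batch, d_model, ~seq_len / 2]
        return y.transpose(1, 2)

# Example: batch of 8 sequences, 96 time steps, 512-dimensional features.
out = DilatedCausalDistilling(d_model=512)(torch.randn(8, 96, 512))
print(out.shape)  # torch.Size([8, 48, 512])
```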

Requirements

  • matplotlib == 3.1.1
  • numpy == 1.19.4
  • pandas == 0.25.1
  • scikit_learn == 0.21.3
  • torch == 1.9.0

Usage

Clone this repository to your local machine:

git clone https://github.com/SQY2021/Effinformer.git

Dependencies can be installed using the following command:

pip install -r requirements.txt

Train the model:

bash ./Effinformer.sh

Procedure to run this code

  • Click on Code and download the ZIP archive
  • Upload the archive to your Google Drive
  • Extract it with a ZIP extractor inside Drive
  • Open the notebook in Colab; it can be run only after your Drive has been mounted (a minimal mounting sketch follows this list)
  • Changes to the prediction length, epochs, dataset, etc. can be made in the Effinformer.py file
  • You may also change the dataset used and the prediction target
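As a minimal sketch of the mounting step, assuming the archive is uploaded to the top level of your Drive (the archive name and paths below are placeholders; drive.mount is the standard Colab API):

```python
# Run in a Colab cell. The archive name and paths below are placeholders;
# adjust them to match where you uploaded the downloaded ZIP.
from google.colab import drive

drive.mount('/content/drive')  # authorize access to your Google Drive

!unzip -o "/content/drive/MyDrive/Effinformer_IEEE-TIM-main.zip" -d /content/
%cd /content/Effinformer_IEEE-TIM-main
!pip install -r requirements.txt
```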

A Simple Example

To demonstrate the model's prediction process, we use a subset of our dataset consisting of 20,000 sampling points. This smaller-scale dataset allows us to showcase the model's capabilities effectively. See Predict.ipynb for the full workflow.
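To give a feel for the data handling before opening the notebook, the sketch below builds generic sliding (history, future) windows over a 20,000-point series. The synthetic series, window sizes, and names are placeholders chosen for illustration; Predict.ipynb remains the authoritative workflow.

```python
import numpy as np

# Synthetic stand-in for the 20,000-point subset; in practice, load the
# repository's data (see Predict.ipynb) instead of generating a series here.
series = np.sin(np.linspace(0, 200 * np.pi, 20_000)).astype(np.float32)

seq_len, pred_len = 96, 24  # assumed window sizes, not the paper's settings

def sliding_windows(x, seq_len, pred_len):
    """Split a 1-D series into (history, future) pairs for forecasting."""
    X, Y = [], []
    for i in range(len(x) - seq_len - pred_len + 1):
        X.append(x[i:i + seq_len])
        Y.append(x[i + seq_len:i + seq_len + pred_len])
    return np.stack(X), np.stack(Y)

X, Y = sliding_windows(series, seq_len, pred_len)
print(X.shape, Y.shape)  # (19881, 96) (19881, 24)
```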

^back to top

Baselines

In the article, several established forecasting models are also utilized for comparison; their reference implementations are acknowledged under References below.

References

We are grateful to the following GitHub repositories for their valuable code bases and datasets:

https://github.com/zhouhaoyi/Informer2020

https://github.com/thuml/Autoformer

https://github.com/locuslab/TCN

https://github.com/OrigamiSL/TCCT2021-Neurocomputing-

https://github.com/timeseriesAI/tsai

Citation

@article{Shang2023Effinformer,
  title={Effinformer: A Deep-Learning-Based Data-Driven Modeling of DC–DC Bidirectional Converters},
  author={Shang, Q. and Xiao, F. and Fan, Y. and others},
  journal={IEEE Transactions on Instrumentation and Measurement},
  volume={72},
  pages={1--13},
  year={2023},
  publisher={IEEE},
  doi={10.1109/TIM.2023.3318701}
}

Contact

If you have any questions, feel free to contact Qianyi Shang by email (21000504@nue.edu.cn) or through GitHub issues. Pull requests are welcome!

^back to top
