ADF & TransApp

Intro image

A Transformer-Based Framework for Appliance Detection Using Smart Meter Consumption Series (VLDB 2024)

News: This work has been accepted for publication in the Proceedings of the VLDB Endowment and will be presented at the 50th International Conference on Very Large Data Bases (VLDB 2024).

References

Adrien Petralia, Philippe Charpentier, and Themis Palpanas. ADF & TransApp: A Transformer-Based Framework for Appliance Detection Using Smart Meter Consumption Series. Proceedings of the VLDB Endowment (PVLDB), 17(3): 553-562, 2023. doi:10.14778/3632093.363211

Proposed approach

Appliance Detection Framework

We propose the Appliance Detection Framework (ADF) to detect the presence of appliances in households using real-world consumption series, which are sampled at very low frequency and are long and of variable length. ADF addresses these challenges by operating on individual subsequences of each consumption series rather than on each series in its entirety. The framework can be used with any time series classifier that predicts probabilities.

Framework image
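
To make the subsequence idea concrete, here is a minimal, framework-agnostic sketch: a variable-length series is cut into fixed-length windows, a probabilistic classifier scores each window, and the per-window probabilities are merged into a single household-level score. The window size, the zero padding, and the mean-based merging rule are illustrative assumptions, not the exact choices made in this repository.

```python
import numpy as np

def detect_appliance(series, classifier, window_size=1024):
    """Sketch of subsequence-based appliance detection (illustrative only).

    series     : 1D array of load measurements (variable length).
    classifier : any object exposing predict_proba(windows) -> (n, 2) probabilities.
    """
    # Slice the long series into non-overlapping fixed-length windows,
    # zero-padding the last window if it is incomplete.
    n_windows = int(np.ceil(len(series) / window_size))
    padded = np.zeros(n_windows * window_size)
    padded[: len(series)] = series
    windows = padded.reshape(n_windows, window_size)

    # Score each subsequence, then merge the per-window probabilities
    # into a single household-level presence probability.
    probs = classifier.predict_proba(windows)[:, 1]
    return float(probs.mean())
```

For example, `classifier` could be any scikit-learn-style estimator trained on labeled subsequences; the framework only requires per-window probability outputs.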

TransApp

We propose TransApp, a Transformer-based time series classifier that can first be pretrained in a self-supervised manner to enhance its performance on appliance detection tasks, significantly improving its accuracy.

Model architecture

The proposed architecture combines an embedding block made of dilated convolutional layers with a Transformer encoder that uses Diagonally Masked Self-Attention (DMSA).

TransAppModel image
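
The following PyTorch sketch illustrates this combination under stated assumptions: a stack of dilated 1D convolutions as the embedding block, a standard Transformer encoder whose attention mask blanks the diagonal (our reading of DMSA: a position cannot attend to itself), mean pooling, and a two-class head. Layer counts, dimensions, and activation choices are placeholders and may differ from the actual TransApp implementation.

```python
import torch
import torch.nn as nn

class TransAppSketch(nn.Module):
    """Minimal sketch: dilated convolutional embedding block followed by a
    Transformer encoder with a diagonally masked self-attention pattern."""

    def __init__(self, d_model=64, n_layers=3, n_heads=4):
        super().__init__()
        # Embedding block: stacked 1D convolutions with increasing dilation.
        self.embedding = nn.Sequential(
            nn.Conv1d(1, d_model, kernel_size=3, padding=1, dilation=1),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=2, dilation=2),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=4, dilation=4),
        )
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 2)  # binary appliance-presence head

    def forward(self, x):  # x: (batch, length) raw consumption values
        z = self.embedding(x.unsqueeze(1)).transpose(1, 2)  # (batch, length, d_model)
        # Diagonally masked self-attention: each position may not attend to itself.
        L = z.size(1)
        diag_mask = torch.eye(L, dtype=torch.bool, device=z.device)
        z = self.encoder(z, mask=diag_mask)
        return self.head(z.mean(dim=1))  # mean pooling + classification head
```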

Two-step training process

Self-supervised pretraining. Self-supervised pretraining of a Transformer architecture on an auxiliary task has been used in the past to boost model performance on downstream tasks. Our process is inspired by the mask-based pretraining of vision transformers and requires only the input consumption series, without any appliance label. It results in a reconstruction objective: the model is fed a corrupted (masked) time series and trained to reconstruct the original.

Supervised training. The supervised step is a simple binary classification process using labeled time series.

Two-step training image
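
Below is a minimal sketch of the two training steps, assuming a generic encoder that maps a batch of series to per-time-step features. The mask ratio, the zero-value corruption, and the choice to compute the reconstruction loss only on masked positions are assumptions made for illustration.

```python
import torch
import torch.nn as nn

def pretrain_step(encoder, recon_head, batch, mask_ratio=0.25):
    """One self-supervised pretraining step (illustrative sketch).

    encoder    : maps (batch, length) -> (batch, length, d_model).
    recon_head : e.g. nn.Linear(d_model, 1), reconstructs consumption values.
    batch      : float tensor of consumption series, shape (batch, length).
    """
    mask = torch.rand_like(batch) < mask_ratio      # True = corrupted time step
    corrupted = batch.masked_fill(mask, 0.0)        # zero out masked positions

    features = encoder(corrupted)                   # (batch, length, d_model)
    recon = recon_head(features).squeeze(-1)        # (batch, length)

    # Reconstruction loss computed on the masked positions only (assumption).
    return nn.functional.mse_loss(recon[mask], batch[mask])

def supervised_step(encoder, clf_head, batch, labels):
    """One supervised step: binary appliance-presence classification.

    clf_head : e.g. nn.Linear(d_model, 2); labels is a long tensor of 0/1.
    """
    features = encoder(batch)                       # (batch, length, d_model)
    logits = clf_head(features.mean(dim=1))         # pool over time -> (batch, 2)
    return nn.functional.cross_entropy(logits, labels)
```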

Experiments

We provide a Jupyter notebook example of using our Appliance Detection Framework combined with our TransApp classifier on the CER data: experiments/TransAppExample.ipynb.

In addition, use the following guidelines to reproduce the paper's experiments.

Pretraining TransApp

To pretrain TransApp in a self-supervised way using unlabeled data:

sh LaunchTransAppPretraining.sh

Appliance Detection with TransApp

To use our Appliance Detection Framework combined with TransApp to detect appliances in consumption time series:

sh LaunchTransAppClassif.sh

Appliance Detection with other time series classifiers

Inside our Appliance Detection Framework

To use our Appliance Detection Framework combined with ConvNet, ResNet, or InceptionTime to detect appliances in consumption time series:

sh LaunchModelsClassif.sh

Outside our Appliance Detection Framework

Please refer to the ApplianceDetectionBenchmark GitHub repository to reproduce these experiments; it contains an extensive evaluation of different time series classifiers, including on the datasets used in this study.

Getting Started

Prerequisites

Python version: >= 3.7

All required Python packages are listed in the requirements.txt file.

Installation

Use pip to install all the required libraries listed in the requirements.txt file.

pip install -r requirements.txt

Data

The data used in this project comes from two sources:

  • CER smart meter dataset from the ISSDA archive.
  • Private smart meter dataset provided by EDF (Électricité de France).

You may find more information on how to access the datasets in the data folder.

Contributors

Acknowledgments

We would like to thank Paul Boniol for the valuable discussions on this project. This work was supported by EDF R&D and the ANRT French program.