
Evaluation code for the ACM/IEEE TinyML Contest at ICCAD 2022

What's in this repository?

This repository contains the code used to evaluate three metrics of a neural network implemented with either X-CUBE-AI or another framework.

This code uses four main scripts to evaluate your model on the testing dataset.

How is the evaluation conducted?

The evaluation is conducted mainly on the MCU. During evaluation, the PC sends one IEGM segment to the board; the board runs inference on the received segment and sends the result back to the PC. Communication between the PC and the board is over UART.
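As an illustration only, a minimal sketch of the board-side loop under this protocol is shown below. It assumes the STM32 HAL UART API, and the segment length, UART handle, result format, and aiRun() signature are hypothetical; the provided C project defines the real framing, buffer sizes, and data types.

```c
/* Hypothetical board-side receive/infer/reply loop. SEG_LEN, huart2 and the
 * aiRun() signature are assumptions; the provided project defines the real
 * framing, buffer sizes and data types. */
#include "main.h"                 /* CubeMX-generated HAL handles (assumed) */

#define SEG_LEN 1250              /* assumed samples per IEGM segment */

extern UART_HandleTypeDef huart2; /* UART used to talk to the PC */
extern int aiRun(const void *in_data, void *out_data);

static float segment[SEG_LEN];    /* one IEGM segment received from the PC */
static float scores[2];           /* inference result sent back to the PC  */

void evaluation_loop(void)
{
    for (;;) {
        /* Block until the PC has pushed one full segment over UART. */
        if (HAL_UART_Receive(&huart2, (uint8_t *)segment,
                             sizeof(segment), HAL_MAX_DELAY) != HAL_OK)
            continue;

        /* Run inference on the segment that was just received. */
        aiRun(segment, scores);

        /* Return the result so the PC-side script can score and time it. */
        HAL_UART_Transmit(&huart2, (uint8_t *)scores,
                          sizeof(scores), HAL_MAX_DELAY);
    }
}
```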

How do I run these scripts?

You can run this evaluation code by first installing the requirements

pip install -r requirements.txt

then following How to validate X-CUBE-AI model on board.md or How to validate your own model on board.md to modify the C source code project and deploy it on the board,

and running

python validation.py

After running the scripts, the following metrics

  1. F-β score F-B (see the formula below)
  2. Average latency L over all segments in the testing dataset

will be reported.

The metric Flash Usage F is based on Code + RO Data + RW Data, as reported by Keil when building and loading the C code onto the board.
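For reference, the F-β score is the standard weighted harmonic mean of precision and recall; the value of β used for scoring is the one specified in the contest evaluation document:

F-B = (1 + β²) · Precision · Recall / (β² · Precision + Recall)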

Which parts of the C code can I change?

For teams using X-CUBE-AI, you do not need to modify any other files after replacing the 4 files in the generated C code project. If you have additional code to integrate into the project, you should only make changes inside the functions aiRun() and MX_CUBE_AI_Init().

For teams using their own framework, you should only make changes inside the functions aiRun() and Model_Init(), as indicated in How to validate the model with framework other than X-CUBE-AI on board.

The rest of the provided functions must be retained to enable a fair and consistent evaluation.
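For teams using their own framework, the two editable functions might look roughly like the sketch below. This is an illustration only: the trivial "network" (a single linear layer with made-up sizes SEG_LEN and NUM_CLASS) stands in for a real implementation, and the actual aiRun()/Model_Init() signatures are the ones defined in the provided C project.

```c
/* Hypothetical skeletons for the only two functions teams should edit when
 * using their own framework. The linear layer below is a placeholder; the
 * real signatures and model code come from the provided C project. */
#include <stddef.h>

#define SEG_LEN   1250   /* assumed samples per IEGM segment */
#define NUM_CLASS 2      /* assumed number of output classes  */

static float weights[NUM_CLASS][SEG_LEN]; /* placeholder parameters */
static float bias[NUM_CLASS];

void Model_Init(void)
{
    /* One-time setup: load or initialize parameters, allocate buffers, etc. */
    for (size_t c = 0; c < NUM_CLASS; ++c) {
        bias[c] = 0.0f;
        for (size_t i = 0; i < SEG_LEN; ++i)
            weights[c][i] = 0.0f;
    }
}

int aiRun(const void *in_data, void *out_data)
{
    /* Run one inference on a single IEGM segment and write the class scores. */
    const float *x = (const float *)in_data;
    float *y = (float *)out_data;

    for (size_t c = 0; c < NUM_CLASS; ++c) {
        float acc = bias[c];
        for (size_t i = 0; i < SEG_LEN; ++i)
            acc += weights[c][i] * x[i];
        y[c] = acc;
    }
    return 0;   /* 0 = success */
}
```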

How do I obtain the final scoring?

After obtaining the three metrics, you can compute your model's score using the scoring function specified in TinyML Contest 2022 evaluation.

How do I submit the design?

You can refer to TinyML Contest 2022 Submission for submission instructions.
