PANDA

Power efficiency is a critical design objective in modern microprocessor design. To evaluate the impact of architecture-level design decisions, an accurate yet efficient architecture-level power model is needed. However, widely adopted data-independent analytical power models such as McPAT and Wattch have been criticized for their unreliable accuracy. Machine learning (ML) methods have also been proposed for architecture-level power modeling, but they rely on a sufficient number of known designs for training and perform poorly when only a few designs are available, which is typically the case in realistic scenarios. We propose PANDA, an architecture-level power evaluation method that unifies analytical and ML solutions and combines the advantages of both. It achieves high accuracy on unseen designs even when very few designs are available for training, a common challenge in practice.
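To illustrate the idea of unifying the two model families, below is a minimal, hypothetical Python sketch (not the PANDA implementation): per-component power is written as an analytical resource function of architectural parameters, multiplied by an ML-learned correction fitted on the few available training designs. The parameter names, the proxy formula, and the use of scikit-learn's GradientBoostingRegressor are all assumptions for illustration.

# Hypothetical sketch only: analytical proxy x learned correction factor.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def analytical_proxy(params):
    # Hypothetical McPAT/Wattch-style data-independent estimate.
    return params["issue_width"] * np.sqrt(params["rob_entries"])

# Toy training data: a handful of known designs with measured power.
train_designs = [
    {"issue_width": 1, "rob_entries": 32,  "ipc": 0.8, "power_mw": 35.0},
    {"issue_width": 2, "rob_entries": 64,  "ipc": 1.3, "power_mw": 70.0},
    {"issue_width": 4, "rob_entries": 128, "ipc": 1.9, "power_mw": 160.0},
]

X, y = [], []
for d in train_designs:
    proxy = analytical_proxy(d)
    # The ML model learns the ratio between measured power and the
    # analytical proxy, from parameters plus simulated activity (IPC here).
    X.append([d["issue_width"], d["rob_entries"], d["ipc"]])
    y.append(d["power_mw"] / proxy)

ml_correction = GradientBoostingRegressor(n_estimators=50).fit(X, y)

# Prediction for an unseen configuration: analytical term x learned correction.
new_design = {"issue_width": 3, "rob_entries": 96, "ipc": 1.6}
pred = analytical_proxy(new_design) * ml_correction.predict(
    [[new_design["issue_width"], new_design["rob_entries"], new_design["ipc"]]]
)[0]
print(f"Predicted power: {pred:.1f} mW")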

Introduction

The modeling flow of McPAT-Calib can be divided into two parts; the focus here is on PANDA Power Evaluation:

  1. Microarchitecture Simulation: use the microarchitecture simulator (gem5) to simulate the given BOOM microarchitecture configuration and benchmark.
  2. PANDA Power Evaluation: use the configuration parameters and event statistics generated by the microarchitecture simulator to predict the power consumption (see the sketch after this list).
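The following sketch shows how step 2's inputs could be assembled: a gem5 stats dump plus the BOOM configuration turned into one feature row per (design, benchmark) pair. The event names, stats-file format, and helper functions are assumptions for illustration, not the repository's actual parsing code.

# Hypothetical sketch: building power-model features from gem5 output.
import re

def parse_gem5_stats(path, events=("system.cpu.numCycles",
                                   "system.cpu.committedInsts",
                                   "system.cpu.dcache.overallAccesses::total")):
    """Extract selected event counters from a gem5 stats.txt file (assumed names)."""
    counts = dict.fromkeys(events, 0.0)
    pattern = re.compile(r"^(\S+)\s+([\d.eE+-]+)")
    with open(path) as f:
        for line in f:
            m = pattern.match(line)
            if m and m.group(1) in counts:
                counts[m.group(1)] = float(m.group(2))
    return counts

def build_feature_row(config, stats):
    """Concatenate architectural parameters with per-cycle event statistics."""
    cycles = max(stats["system.cpu.numCycles"], 1.0)
    return [
        config["fetch_width"],
        config["rob_entries"],
        stats["system.cpu.committedInsts"] / cycles,                  # IPC
        stats["system.cpu.dcache.overallAccesses::total"] / cycles,   # cache activity
    ]

# Example (hypothetical paths and parameter names):
# row = build_feature_row({"fetch_width": 4, "rob_entries": 96},
#                         parse_gem5_stats("m5out/stats.txt"))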

Quick Start

We provide an example dataset in "example_data", containing a feature_set and a label_set, so you can run the PANDA power model directly. The results are visualized as figures in "result_figure". (The example dataset is coming soon!)

cd power_model
python PANDA.py
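For intuition, here is a hedged sketch of how predicted power could be compared against the label_set and saved as a figure in "result_figure". The file name, array shapes, and metrics are assumptions, not necessarily what PANDA.py produces.

# Hypothetical sketch: predicted vs. ground-truth power scatter plot.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # write files without a display
import matplotlib.pyplot as plt

def plot_accuracy(y_true, y_pred, out_path="pred_vs_actual.png"):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    mape = np.mean(np.abs(y_pred - y_true) / y_true) * 100   # mean absolute percentage error
    r = np.corrcoef(y_true, y_pred)[0, 1]                    # correlation coefficient

    fig, ax = plt.subplots(figsize=(4, 4))
    ax.scatter(y_true, y_pred, s=20)
    lim = [0, max(y_true.max(), y_pred.max()) * 1.1]
    ax.plot(lim, lim, "k--", linewidth=1)   # ideal y = x line
    ax.set_xlabel("Ground-truth power")
    ax.set_ylabel("Predicted power")
    ax.set_title(f"MAPE = {mape:.1f}%, R = {r:.2f}")
    fig.tight_layout()
    fig.savefig(out_path)

# Toy usage with made-up numbers:
plot_accuracy([1.0, 1.4, 2.1], [1.1, 1.3, 2.0])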
