INSTRUCTIONS.md
Buy Supplies

You will need:

  • EmonPi, CT & Voltage (from emonpi store) (~$350 USD)
  • 32 GB SD card ($10 USD)
  • Power strip ($10 USD)

Build Experimental Setup

  • Strip the insulation and clip the CT to the black or white wire. If you clip to the white (neutral) wire, you will need to invert your current readings in the firmware.

(photo: CT clipped onto the wire)
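Whether the inversion is done in the Arduino firmware or later on the pi, it is just a sign flip over the sample window. A minimal Python sketch, assuming samples arrive as a NumPy array (the `CLIPPED_TO_WHITE` flag and function name are hypothetical):

```python
import numpy as np

# Hypothetical flag: set True if the CT is clipped to the white (neutral) wire.
CLIPPED_TO_WHITE = True

def correct_polarity(window):
    """Invert a window of current samples when the CT sits on the white wire."""
    window = np.asarray(window, dtype=float)
    return -window if CLIPPED_TO_WHITE else window

samples = np.array([0.5, -1.2, 0.8])
print(correct_polarity(samples))  # -> [-0.5  1.2 -0.8]
```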

Calibrate EmonPi

Clone SD card to 32 GB

Since the stock 8 GB emonPi SD card is partitioned in a way that will not allow installation of TensorFlow, it is necessary to clone it to a larger card.

Install Required Packages

  • Tensorflow
  • NumPy / SciPy: sudo apt install python-numpy python-scipy (on a stock image you may first need sudo apt install --no-install-recommends python2.7-minimal python2.7)
  • Scikit-learn
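A quick way to confirm the packages above are importable before moving on (the package names are the standard PyPI/apt ones; nothing here is specific to this repo):

```python
import importlib.util

def check_packages(names):
    """Return {package: importable?} without fully importing heavy modules."""
    return {n: importlib.util.find_spec(n) is not None for n in names}

status = check_packages(["tensorflow", "numpy", "sklearn"])
for name, ok in status.items():
    print(name, "OK" if ok else "MISSING -- install it before continuing")
```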

Compile & Upload Firmware

Create Necessary Folders

In terminal on your pi:

  • mkdir /home/pi/DEV/ (where scripts will reside)
  • mkdir /home/pi/DEV/data_train (where training dataset will reside)

Train Data Collection

  • Run train.py and collect your desired number of samples of each appliance. We did 100 of each, for 700 samples total. Note that as you cycle appliances repeatedly, their properties will change as they warm up; letting each one cool between runs would help, but that would take a ton of time.
  • Run train.py --empty to collect another 100 empty windows for baseline comparison. This improves the accuracy of the model.
  • Copy the training folder from your pi to the local Google Drive folder you'll be using with Colab:

scp -r pi@(your pi's IP):/home/pi/DEV/data_train/ /Users/name/YourFavoriteGoogleDriveLocation/

  • Convert the folder to a .zip file, upload it to Google Drive, and grab the part of the sharing URL that follows 'id=', e.g. "1ShWZ4olv0SdT6jpRbVgF2C7q0tZZVKYc". Be sure to set this as the "zip_id" variable under the 'LINK DATA TO INSTANCE' title block in the Colab training file.
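train.py itself isn't reproduced here, but the heart of the collection loop is just saving labeled sample windows into data_train. A minimal sketch, assuming each window is a 1-D NumPy array (read_current_window and the file-naming scheme are hypothetical stand-ins, not the repo's actual code):

```python
import os
import numpy as np

DATA_DIR = "/home/pi/DEV/data_train"

def read_current_window(n_samples=1000):
    # Hypothetical stand-in for reading a window of CT samples from the emonPi.
    return np.random.randn(n_samples)

def collect(label, n_windows=100, data_dir=DATA_DIR):
    """Save n_windows labeled sample windows as .npy files under data_dir."""
    os.makedirs(data_dir, exist_ok=True)
    for i in range(n_windows):
        window = read_current_window()
        np.save(os.path.join(data_dir, "%s_%03d.npy" % (label, i)), window)

# e.g. collect("kettle"), or collect("none") for the empty/baseline windows
```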

Neural Network Model Creation/Upload

  • Open Google Colaboratory and run the Jupyter notebook "model_creation.ipynb"
  • Find the resulting model file in your Google Drive and download it locally. Navigate to that folder
  • Upload the .h5 file (the model) to your pi's main DEV folder, e.g.:

scp wb_model_1.h5 pi@(your pi's IP):/home/pi/DEV/

Test Model Functionality Locally

  • Here's a script that lets you test the model locally before uploading it to the pi, which saves a bit of hassle

Upload Classifier Encoding To Pi

  • In "model_creation.ipynb", copy the dictionary printout from the line print("appliance_dict = ", encoding)
  • Paste it into your main script:

appliance_dict = {0: 'cell', 1: 'desklamp', 2: 'fan', 3: 'kettle', 4: 'laptop', 5: 'monitor', 6: 'none', 7: 'sadlamp'} #N.B. update this with each model!
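The encoding printed by the notebook is just a class-index-to-label map. A sketch of how such a dict typically arises (assuming labels come from the training filenames and the encoder sorts them alphabetically, as sklearn's LabelEncoder does; the notebook's actual code may differ):

```python
def build_appliance_dict(labels):
    """Map class indices to sorted unique labels, matching LabelEncoder order."""
    return {i: name for i, name in enumerate(sorted(set(labels)))}

labels = ["kettle", "fan", "kettle", "none", "desklamp"]
print(build_appliance_dict(labels))
# -> {0: 'desklamp', 1: 'fan', 2: 'kettle', 3: 'none'}
```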

Classify!

  • Once you've put this script in the DEV folder on your pi, it will print out the classification result and its confidence.
  • See Future Work for inspiration on expanding the capabilities
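A sketch of the decode step the classification script performs, assuming the model outputs a softmax probability vector (decode_prediction and the example vector are hypothetical; the dict is the one shown earlier):

```python
import numpy as np

appliance_dict = {0: 'cell', 1: 'desklamp', 2: 'fan', 3: 'kettle',
                  4: 'laptop', 5: 'monitor', 6: 'none', 7: 'sadlamp'}

def decode_prediction(probs, mapping=appliance_dict):
    """Return (label, confidence) from a softmax probability vector."""
    probs = np.asarray(probs)
    idx = int(np.argmax(probs))
    return mapping[idx], float(probs[idx])

probs = [0.01, 0.02, 0.05, 0.80, 0.04, 0.03, 0.03, 0.02]
label, conf = decode_prediction(probs)
print("%s (%.0f%% confidence)" % (label, conf * 100))  # -> kettle (80% confidence)
```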

Appliance photos: Desk Lamp, Full Spectrum Lamp (sadlamp), Fan, Kettle, Laptop, Monitor, iPhone