
Hyperparameter tuning with Microsoft NNI for automated machine learning (AutoML) experiments. The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyperparameters in environments such as a local machine, remote servers, or the cloud.



Hyperparameter Tuning with Microsoft Neural Network Intelligence Toolkit

This repository demonstrates how to perform hyperparameter tuning using Microsoft’s NNI toolkit. NNI automates feature engineering, neural architecture search, hyperparameter tuning, and model compression for deep learning tasks.

The diagram below illustrates the high-level architecture of NNI:

🌱 ➤ Prerequisites

  • NNI requires Python >= 3.7. It is tested and supported on Ubuntu >= 18.04, Windows 10 >= 21H2, and macOS >= 11.
  • pip (the Python package manager)

👁️ ➤ Setup Instructions

👉 Create a Virtual Environment

  1. Open your terminal or command prompt.

  2. Clone the repository:

    git clone https://github.com/mohd-faizy/Hyperparameter-Tuning-with-Microsoft-Network-Intelligence-Toolkit-NNI.git

  3. Navigate to your project directory.

  4. Run the following command to create a virtual environment named venv:

    python -m venv venv

👉 Activate the Virtual Environment

  • On Windows:

    .\venv\Scripts\activate
  • On macOS/Linux:

    source ./venv/bin/activate

👉 Install Dependencies

pip install -r requirements.txt

➤ Run the Hyperparameter Tuning Experiment

  1. Creating the hyperparameter search space:
    1. To perform hyperparameter tuning, we first need to define a search space that describes the value range of each hyperparameter.
    2. We describe these ranges in a .json file: a dictionary of all the hyperparameter values we want to explore in our experiment.

🔸search_space.json

{
  "dropout_rate": {
    "_type": "uniform",
    "_value": [0.1, 0.9]
  },
  "num_units": {
    "_type": "choice",
    "_value": [32, 64, 128, 256, 512]
  },
  "lr": {
    "_type": "choice",
    "_value": [0.1, 0.01, 0.001, 0.03, 0.003, 0.06, 0.006]
  },
  "batch_size": {
    "_type": "choice",
    "_value": [16, 32, 64, 128, 256, 512]
  },
  "activation": {
    "_type": "choice",
    "_value": ["relu", "sigmoid", "tanh"]
  }
}
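Each trial receives one configuration sampled from this search space. As a minimal sketch of how a trial script could read it: NNI's `nni.get_next_parameter()` returns the sampled hyperparameters for the current trial; the default values and the merging helper below are illustrative assumptions, not code from this repository.

```python
def get_trial_config():
    """Merge the hyperparameters NNI sampled for this trial over defaults.

    Falls back to the defaults when run outside an NNI experiment
    (e.g. when nni is not installed or no tuner is attached).
    """
    try:
        import nni
        params = nni.get_next_parameter() or {}
    except Exception:
        params = {}
    defaults = {  # illustrative defaults, one per search_space.json key
        "dropout_rate": 0.5,
        "num_units": 128,
        "lr": 0.001,
        "batch_size": 64,
        "activation": "relu",
    }
    return {**defaults, **params}
```

A real trial script would build and train the model from this dictionary and report its final score back with `nni.report_final_result(score)`.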
  2. Next, we create a file called config.yml, which contains all of the configuration information for our experiment.

🔸config.yml

experimentName: mnist
trialConcurrency: 1
maxExperimentDuration: 1h
maxTrialNumber: 10
searchSpaceFile: search_space.json
useAnnotation: false
trialCommand: python main.py
trialCodeDirectory: .
tuner:
  name: TPE
  classArgs:
    optimize_mode: maximize
trainingService:
  platform: local
  3. Finally, execute the following command to start the NNI experiment:

    👇

    nnictl create --config config.yml

🛠️ Monitor the Experiment

  1. Open the NNI dashboard in your web browser (usually at http://localhost:8080).
    [2024-05-22 00:52:19] Web portal URLs: http://192.168.0.106:8080 http://172.25.208.1:8080 http://127.0.0.1:8080
    [2024-05-22 00:52:19] To stop experiment run "nnictl stop c8usp7bz" or "nnictl stop --all"
    [2024-05-22 00:52:19] Reference: <https://nni.readthedocs.io/en/stable/reference/nnictl.html>
    
  2. Observe the trial jobs, intermediate results, and final results.
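The intermediate-result curves on the dashboard come from the trial reporting a metric after each epoch. This is a hedged sketch of that pattern, with the reporter callbacks injected so it can run outside NNI; in a real trial they would be `nni.report_intermediate_result` and `nni.report_final_result`, and the accuracy values here are placeholders, not real results.

```python
def train_and_report(epochs, report_intermediate, report_final):
    """Dummy training loop showing the metric-reporting pattern NNI plots."""
    best = 0.0
    for epoch in range(epochs):
        # Placeholder "validation accuracy"; a real trial would evaluate the model.
        acc = min(0.99, 0.70 + 0.05 * epoch)
        report_intermediate(acc)  # one point on the intermediate-result curve
        best = max(best, acc)
    report_final(best)  # becomes the trial's "default metric" on the dashboard
    return best
```

Inside main.py this would be invoked as `train_and_report(epochs, nni.report_intermediate_result, nni.report_final_result)`.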

📄 ➤ Launching the NNI Dashboard

Hyperparameter Visualization

🔸Top 100%🔸

🔸Top 20%🔸

🔸Top 5%🔸

🔸Top 1%🔸

  • default metric: 0.9793

🔸Duration🔸

🔸Intermediate result🔸

🍰 ➤ Contributing

Contributions are welcome!

🗂️ ➤ Additional Resources

⚖ ➤ License

This project is licensed under the MIT License. See LICENSE for details.

❤️ ➤ Support

If you find this repository helpful, show your support by starring it! For questions or feedback, reach out on Twitter (X).

$\color{skyblue}{\textbf{Connect with me:}}$

🔃 ➤ If you have questions or feedback, feel free to reach out!

