NNI v2.6 Release
NOTE: NNI v2.6 is the last version that supports Python 3.6. From the next release, NNI will require Python 3.7+.
Hyper-Parameter Optimization
Experiment
- The legacy experiment config format is now deprecated. (doc of new config)
  - If you are still using the legacy format, nnictl will show the equivalent new config on startup. Please save it to replace the old one.
- nnictl now uses `nni.experiment.Experiment` APIs as backend; a sketch of this API follows this list. The output messages of the create, resume, and view commands have changed.
- Added Kubeflow and FrameworkController support to hybrid mode. (doc)
- The hidden tuner manifest file has been updated. This should be transparent to users, but if you encounter issues such as a tuner failing to be found, please try removing `~/.config/nni`.
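A minimal sketch of the new-style configuration driven through the `nni.experiment.Experiment` API, following the pattern in the NNI quickstart; the trial script `trial.py` and all field values are illustrative:

```python
from nni.experiment import Experiment

# New-style config expressed through the Python API that nnictl now uses;
# the fields mirror the new YAML format (snake_case here, camelCase in YAML).
experiment = Experiment('local')
experiment.config.experiment_name = 'example'
experiment.config.trial_command = 'python trial.py'  # hypothetical trial script
experiment.config.trial_code_directory = '.'
experiment.config.trial_concurrency = 2
experiment.config.max_trial_number = 20
experiment.config.search_space = {
    'lr': {'_type': 'loguniform', '_value': [1e-4, 1e-1]},
}
experiment.config.tuner.name = 'TPE'
experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}

# Start the experiment and the web portal on port 8080.
experiment.run(8080)
```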
Algorithms
- Random tuner now supports classArgs `seed`. (doc)
- TPE tuner is refactored (doc); a configuration sketch follows this list:
  - Supports classArgs `seed`.
  - Supports classArgs `tpe_args` for expert users to customize algorithm behavior.
  - Parallel optimization is now turned on by default. To turn it off, set `tpe_args.constant_liar_type` to `null` (or `None` in Python). `parallel_optimize` and `constant_liar_type` have been removed; if you are using them, please update your config to use `tpe_args.constant_liar_type` instead.
- Grid search tuner now supports all search space types, including uniform, normal, and nested choice. (doc)
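A sketch combining the points above, assuming the documented classArgs names (`seed`, `tpe_args`, `constant_liar_type`; the `'mean'` value is illustrative, consult the TPE doc for accepted values). The search space shows the uniform, normal, and nested choice types that grid search now also accepts:

```python
from nni.experiment import Experiment

experiment = Experiment('local')

# All of these search space types now also work with the grid search tuner.
experiment.config.search_space = {
    'lr': {'_type': 'uniform', '_value': [0.001, 0.1]},
    'momentum': {'_type': 'normal', '_value': [0.9, 0.05]},
    'conv': {  # nested choice: each branch carries its own sub-space
        '_type': 'choice',
        '_value': [
            {'_name': 'small', 'kernel': {'_type': 'choice', '_value': [3, 5]}},
            {'_name': 'large', 'kernel': {'_type': 'choice', '_value': [7, 9]}},
        ],
    },
}

# Refactored TPE: reproducible runs via seed, expert knobs under tpe_args.
experiment.config.tuner.name = 'TPE'
experiment.config.tuner.class_args = {
    'optimize_mode': 'maximize',
    'seed': 42,
    # Parallel optimization is on by default; set constant_liar_type to None
    # (null in YAML) to turn it off.
    'tpe_args': {'constant_liar_type': 'mean'},
}
```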
Neural Architecture Search
- Enhancements to serialization utilities (doc) and changes to the recommended practice for customizing evaluators (doc); see the evaluator sketch after this list.
- Support latency constraint on edge device for ProxylessNAS based on nn-Meter. (doc)
- Trial parameters are now displayed in a more user-friendly way in Retiarii experiments.
- Refactor NAS examples of ProxylessNAS and SPOS.
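A minimal sketch of the recommended evaluator customization, assuming the Retiarii `FunctionalEvaluator` API; the training body is elided and illustrative:

```python
import nni
from nni.retiarii.evaluator import FunctionalEvaluator

def evaluate_model(model_cls):
    """Train and score one sampled architecture (body is illustrative)."""
    model = model_cls()
    accuracy = 0.0  # ... train `model` and compute validation accuracy here ...
    nni.report_final_result(accuracy)

# The evaluator and everything it references must be serializable so Retiarii
# can ship it to trial processes. FunctionalEvaluator covers the top-level
# function; helper objects created elsewhere can be made traceable with
# nni.trace, e.g. a hypothetical `nni.trace(torch.optim.SGD)` factory.
evaluator = FunctionalEvaluator(evaluate_model)
```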
Model Compression
- New Pruner Supported in Pruning V2
- Support `nni.trace`-wrapped `Optimizer` in Pruning V2. To affect the user experience as little as possible, only the input parameters of the optimizer are traced. (doc) A pruning sketch follows this list.
- Optimized the memory usage of Taylor Pruner, APoZ Activation Pruner, and Mean Activation Pruner in V2.
- Add more examples for Pruning V2.
- Add documentation for the pruning config list. (doc)
- Parameter `masks_file` of `ModelSpeedup` now accepts a `pathlib.Path` object. (Thanks to @dosemeion) (doc)
- Bug fixes:
  - Fixed Slim Pruner in V2 not sparsifying the BN weights.
  - Fixed Simulated Annealing Task Generator generating configs that ignore 0 sparsity.
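A minimal end-to-end sketch of Pruning V2 with a traced optimizer, assuming the `nni.algorithms.compression.v2.pytorch.pruning` import path and the `TaylorFOWeightPruner(model, config_list, trainer, traced_optimizer, criterion, training_batches=...)` signature from the V2 pruning docs; the toy model, trainer, and loss are illustrative:

```python
from pathlib import Path

import nni
import torch
import torch.nn as nn
import torch.nn.functional as F
from nni.algorithms.compression.v2.pytorch.pruning import TaylorFOWeightPruner
from nni.compression.pytorch import ModelSpeedup

# Toy model, for illustration only.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 8, 3))

# Wrap the optimizer class with nni.trace so the pruner can re-create it;
# tracing records only the constructor arguments, leaving usage unchanged.
traced_optimizer = nni.trace(torch.optim.SGD)(model.parameters(), lr=0.01)

def trainer(model, optimizer, criterion):
    # Illustrative single training step on random data.
    data, target = torch.rand(8, 3, 32, 32), torch.rand(8, 8, 28, 28)
    optimizer.zero_grad()
    criterion(model(data), target).backward()
    optimizer.step()

# Pruning config list: sparsify 50% of the weights of every Conv2d layer.
config_list = [{'op_types': ['Conv2d'], 'sparsity': 0.5}]

pruner = TaylorFOWeightPruner(model, config_list, trainer, traced_optimizer,
                              F.mse_loss, training_batches=1)
_, masks = pruner.compress()
torch.save(masks, 'masks.pth')

# masks_file now also accepts a pathlib.Path object.
pruner._unwrap_model()  # restore the original modules before speedup
ModelSpeedup(model, torch.rand(8, 3, 32, 32),
             masks_file=Path('masks.pth')).speedup_model()
```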
Documentation
- Supported GitHub feature "Cite this repository".
- Updated index page of readthedocs.
- Updated Chinese documentation.
  - From now on, NNI only maintains translations for the most important docs and ensures they are up to date.
- Reorganized HPO tuners' doc.
Bugfixes
- Fixed a bug where a numpy array was used as a truth value. (Thanks to @khituras)
- Fixed a bug in updating search space.
- Fixed a bug where the HPO search space file did not support scientific notation and tab indentation.
  - For now, NNI does not support mixing scientific notation with YAML features; we are waiting for PyYAML to update.
- Fixed a bug that causes DARTS 2nd order to crash.
- Fixed a bug that causes deep copy of mutation primitives (e.g., LayerChoice) to crash.
- Removed blank space at the bottom of the Web UI overview page.