Usage Guide for NNI, Microsoft's Open-Source AutoML Toolkit


1. NNI - Introduction

• NNI (Neural Network Intelligence): An open source AutoML toolkit for neural architecture search and hyper-parameter tuning.

https://github.com/microsoft/nni

https://nni.readthedocs.io/en/latest/


2. NNI - Installation

• Install & Verify

• 1. python -m pip install --upgrade nni

• 2. git clone -b v1.1 https://github.com/Microsoft/nni.git

Steps 3-4 below are optional; you can also simply install with pip install nni.

• 3. cd nni

• 4. powershell -ExecutionPolicy Bypass -file install.ps1
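After installation, you can quickly verify that the command-line tool works (a minimal check; the nnictl CLI is installed together with the pip package):

```bash
# Print the installed NNI version to confirm the nnictl CLI is on your PATH
nnictl --version
```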

3. NNI - MNIST Example

• Run the MNIST example

• nnictl create --config nni\examples\trials\mnist\config_windows.yml

The result is as follows:

[Screenshot: output of the nnictl create command]

You can view the results by opening http://127.0.0.1:8080 on the local machine.

4. NNI - Viewing the Training Process in the WebUI

[Screenshots: NNI WebUI pages showing the training process]
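When you are done inspecting the experiment in the WebUI, it can be stopped from the command line (a minimal sketch using the standard nnictl subcommand):

```bash
# Stop the running experiment and shut down its WebUI
nnictl stop
```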

5. NNI Architecture & Key Concepts

[Figure: NNI architecture]

Key Concepts

Experiment: An experiment is one task of, for example, finding the best hyperparameters of a model or finding the best neural network architecture. It consists of trials and AutoML algorithms.

Search Space: The feasible region for tuning the model, for example, the value range of each hyperparameter.

Configuration: A configuration is an instance from the search space; that is, each hyperparameter has a specific value.

Trial: A trial is an individual attempt at applying a new configuration (e.g., a set of hyperparameter values, a specific neural architecture). Trial code should be able to run with the provided configuration.

Tuner: A tuner is an AutoML algorithm, which generates a new configuration for the next try. A new trial will run with this configuration.

Assessor: An assessor analyzes a trial's intermediate results (e.g., periodically evaluated accuracy on a test dataset) to tell whether the trial can be stopped early or not.

Training Platform: Where trials are executed. Depending on your experiment's configuration, it could be your local machine, remote servers, or a large-scale training platform (e.g., OpenPAI, Kubernetes).

The following sections introduce the two ways to use NNI: the API and annotations.

6. NNI API

• Step 1 - Prepare a Search Space

Define the tunable hyperparameters in a search space (parameters) file.

[Screenshots: example search space definition]

Reference: https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/SearchSpaceSpec.md
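For illustration, a search space file covering the hyperparameters used later in this article might look like the sketch below; the _type/_value format follows the SearchSpaceSpec reference above, and the concrete ranges are only examples.

```json
{
    "conv_size": { "_type": "choice", "_value": [2, 3, 5, 7] },
    "hidden_size": { "_type": "choice", "_value": [124, 512, 1024] },
    "learning_rate": { "_type": "uniform", "_value": [0.0001, 0.1] },
    "dropout_rate": { "_type": "uniform", "_value": [0.1, 0.5] }
}
```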

• Step 2 - Update the model code

• import nni

• Get configuration from Tuner

RECEIVED_PARAMS = nni.get_next_parameter()

RECEIVED_PARAMS is an object, for example:



{"conv\_size": 2, "hidden\_size":124, "learning\_rate":0.0307,   "dropout\_rate":0.2029}

• Report metric data periodically (optional)

nni.report_intermediate_result(metrics)

This metric is reported to the assessor.

Usually, the metric is a periodically evaluated loss or accuracy.

• Report performance of the configuration

nni.report_final_result(metrics)

This metric is reported to the tuner.

Reference: https://nni.readthedocs.io/en/latest/sdk_reference.html
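Putting Step 2 together, a minimal trial script could look like the sketch below. The nni calls are the ones listed above; the "training" loop is a dummy stand-in for your own model code, and the parameter names are assumed to match the search space from Step 1.

```python
import nni


def main():
    # 1. Get the next configuration from the tuner, e.g.
    #    {"conv_size": 2, "hidden_size": 124, "learning_rate": 0.0307, "dropout_rate": 0.2029}
    params = nni.get_next_parameter()
    learning_rate = params["learning_rate"]
    dropout_rate = params["dropout_rate"]

    # 2. Train and evaluate. A dummy "accuracy" stands in for real model training here.
    accuracy = 0.0
    for epoch in range(10):
        accuracy = 1.0 - 0.5 * dropout_rate - abs(learning_rate - 0.01)
        # Intermediate results go to the assessor (optional, may trigger early stopping)
        nni.report_intermediate_result(accuracy)

    # 3. The final metric goes to the tuner, which uses it to pick the next configuration
    nni.report_final_result(accuracy)


if __name__ == "__main__":
    main()
```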

• Step 3 - Enable NNI API

• useAnnotation: false

• searchSpacePath: /path/to/your/search_space.json

• Reference: https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/ExperimentConfig.md
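For reference, an experiment configuration for the API mode might look roughly like this under NNI v1.x (a sketch; the local training platform, the built-in TPE tuner, and the trial script name mnist.py are assumptions for illustration):

```yaml
authorName: default
experimentName: mnist_example
trialConcurrency: 1
maxExecDuration: 1h
maxTrialNum: 10
trainingServicePlatform: local    # run trials on the local machine
searchSpacePath: search_space.json
useAnnotation: false              # using the NNI API, not annotations
tuner:
  builtinTunerName: TPE           # built-in tuner that generates configurations
  classArgs:
    optimize_mode: maximize       # maximize the reported final metric
trial:
  command: python mnist.py        # how to launch one trial
  codeDir: .
  gpuNum: 0
```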

7. NNI Python Annotation

• Step 1 - Update code with annotations

• @nni.variable takes effect on the line that follows it, which must be an assignment statement; the assigned variable must be specified by the name keyword in @nni.variable. An example is sketched below.

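A sketch of annotated trial code (the annotation syntax follows the AnnotationSpec reference below; the concrete values and the dummy metric are only illustrations):

```python
"""@nni.variable(nni.choice(0.1, 0.01, 0.001), name=learning_rate)"""
learning_rate = 0.01    # default value; NNI substitutes a tuned value when annotations are enabled

"""@nni.variable(nni.uniform(0.1, 0.5), name=dropout_rate)"""
dropout_rate = 0.2

accuracy = 0.0
for epoch in range(10):
    # Dummy metric standing in for real training and evaluation
    accuracy = 1.0 - 0.5 * dropout_rate - abs(learning_rate - 0.01)
    """@nni.report_intermediate_result(accuracy)"""

"""@nni.report_final_result(accuracy)"""
```

Since the annotations are plain string literals, the file still runs as ordinary Python with the default values when NNI is not driving it.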

• Step 2 - Enable NNI Annotation

• useAnnotation: true

• Reference: https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/AnnotationSpec.md
