Commands for Compilation

Before compiling the training framework, you need to compile the inference engine for the platform of interest, such as Android.
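The exact inference-engine build steps depend on the target platform. As a rough sketch, a typical host (Linux/macOS) build looks like the following; the schema-regeneration step and the job count are assumptions to check against MNN's own build documentation:

cd MNN_ROOT
./schema/generate.sh   # regenerate flatbuffer schema headers (assumed step; see MNN build docs)
mkdir -p build && cd build
cmake ..
make -j4               # job count is illustrative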

Then run the following command under MNN_ROOT/build/ to configure the MNN training framework build:

cmake .. -DMNN_BUILD_TRAIN=ON -DMNN_BUILD_TRAIN_MINI=OFF -DMNN_USE_OPENCV=OFF
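Note that cmake only configures the build; follow it with make to actually produce the binaries (the job count below is illustrative):

make -j4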

Enabling the MNN_USE_OPENCV option builds with OpenCV support, which some of the demos require.
For mobile/embedded devices, it is recommended to set MNN_BUILD_TRAIN_MINI=ON, which skips compiling the built-in datasets and models; mobile/embedded deployments generally provide their own datasets and models.
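As a sketch of a mobile build, an Android cross-compile could combine the NDK's standard CMake toolchain options with the training flags above. The ABI, API level, and the $ANDROID_NDK environment variable are illustrative assumptions; verify the exact flag set against MNN's Android build documentation:

cmake .. \
    -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
    -DANDROID_ABI="arm64-v8a" \
    -DANDROID_NATIVE_API_LEVEL=android-21 \
    -DMNN_BUILD_TRAIN=ON \
    -DMNN_BUILD_TRAIN_MINI=ON \
    -DMNN_USE_OPENCV=OFF
make -j4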

Compilation Artifacts

  1. MNNTrain: the training framework library
  2. runTrainDemo.out: the entry program for the training framework demos
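To sanity-check the build, you can list the two artifacts from MNN_ROOT/build/. The library file name below assumes a Linux host; the prefix and suffix will differ on other platforms (for example .dylib on macOS):

ls libMNNTrain.* runTrainDemo.out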

Run the Demo Program

Run the compiled runTrainDemo.out without arguments:

./runTrainDemo.out

You should see output like the following:

Usage: ./runTrainDemo.out CASENAME [ARGS]
Valid Case: DataLoaderDemo DataLoaderTest DistillTrainQuant ImageDatasetDemo LinearRegress MatMulGradTest MnistInt8Train MnistTrain MnistTrainSnapshot MobilenetV2PostTrain MobilenetV2Train MobilenetV2TrainQuant MobilenetV2Transfer NNGrad NNGradV2 NNGradV3 OctaveMnist PostTrain PostTrainMobilenet QuanByMSE QuanMnist TestMSE

This lists all available demo examples. To run training on the MNIST dataset, for example, run:

./runTrainDemo.out MnistTrain

You should see the following usage message:

usage: ./runTrainDemo.out MnistTrain /path/to/unzipped/mnist/data/ [depthwise]

This indicates that you need to download and unzip the MNIST dataset, then pass the path of the unzipped data on the command line. The last parameter is optional; if set, an MNIST model using depthwise convolution is trained instead.
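Putting it together, a complete invocation might look like the following. The download URL and local paths are illustrative (MNIST mirrors vary), and the file layout the demo expects should be checked against the demo source:

# Download and unzip the four MNIST files into a directory of your choice.
mkdir -p ~/mnist
cd ~/mnist
wget http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz \
     http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz \
     http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz \
     http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz
gunzip *.gz

# Back in MNN_ROOT/build/, train the standard model, then the depthwise variant.
cd MNN_ROOT/build/
./runTrainDemo.out MnistTrain ~/mnist/
./runTrainDemo.out MnistTrain ~/mnist/ depthwise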

All executable demos are under the MNN_ROOT/tools/train/source/demo/ directory and can be used as references for implementing your own customized features.