Linux / macOS / Ubuntu
Build the benchmark tool first:
```bash
# in the root directory of MNN
mkdir build
cd build
cmake .. -DMNN_BUILD_BENCHMARK=true && make -j4
```
Then execute the command:
```bash
./benchmark.out models_folder [loop_count] [forwardtype]
```
forwardtype is one of these options: 0 -> CPU, 1 -> Metal, 3 -> OpenCL, 6 -> OpenGL, 7 -> Vulkan. Here are the benchmark models: models.
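When scripting benchmark runs, the forwardtype mapping above can be wrapped in a small helper. This is only a sketch: the function name `forward_type` is ours, and keep in mind that a non-CPU backend generally also has to be enabled when MNN itself is built.

```shell
#!/bin/sh
# Hypothetical helper: map a backend name to the forwardtype code
# expected by benchmark.out (mapping taken from the list above).
forward_type() {
  case "$1" in
    CPU)    echo 0 ;;
    Metal)  echo 1 ;;
    OpenCL) echo 3 ;;
    OpenGL) echo 6 ;;
    Vulkan) echo 7 ;;
    *)      echo "unknown backend: $1" >&2; return 1 ;;
  esac
}

forward_type CPU     # prints 0
forward_type Vulkan  # prints 7
```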
Android
You can directly execute the script bench_android.sh in the benchmark directory. It builds for the armeabi-v7a architecture by default, and for arm64-v8a if run with the parameter arm64-v8a. The benchmark models will be pushed to your device if the script is executed with the parameter -p. benchmark.txt will be generated in the benchmark directory after the execution.
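If you drive the script from your own automation, the option handling described above can be sketched as a tiny wrapper. This is a hypothetical helper (the name `bench_cmd` is ours); it only echoes the command it would run, so the sketch stays runnable without an Android device attached.

```shell
#!/bin/sh
# Hypothetical wrapper: assemble a bench_android.sh command line from
# the options described above, and echo it instead of executing it.
bench_cmd() {
  cmd="./bench_android.sh"
  for opt in "$@"; do
    case "$opt" in
      arm64-v8a) cmd="$cmd arm64-v8a" ;;  # build for arm64-v8a instead of armeabi-v7a
      push)      cmd="$cmd -p" ;;         # also push the benchmark models to the device
    esac
  done
  echo "$cmd"
}

bench_cmd                 # prints ./bench_android.sh
bench_cmd arm64-v8a push  # prints ./bench_android.sh arm64-v8a -p
```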
iOS
- Prepare the models by running the script get_model.sh in tools/script;
- Open the demo project in demo/iOS and run it. Tap the Benchmark button at the top-right corner; you can switch the model, forward type, and thread count for the benchmark with the bottom toolbar.
Benchmark for Models Constructed Using MNN Express
Use the following command for help:
```bash
./benchmarkExprModels.out help
```
Example:
```bash
./benchmarkExprModels.out MobileNetV1_100_1.0_224 10 0 4
./benchmarkExprModels.out MobileNetV2_100 10 0 4
./benchmarkExprModels.out ResNet_100_18 10 0 4
./benchmarkExprModels.out GoogLeNet_100 10 0 4
./benchmarkExprModels.out SqueezeNet_100 10 0 4
./benchmarkExprModels.out ShuffleNet_100_4 10 0 4
```
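Judging from the examples above, the trailing arguments appear to be loop count, forward type, and thread number (10 loops, forward type 0 = CPU, 4 threads); that reading is an assumption on our part. A small loop covering all listed models can be sketched as follows; it echoes each command instead of executing it, so it runs even without the built binary.

```shell
#!/bin/sh
# Sketch: print the benchmark command for every model listed above.
# The arguments 10 0 4 mirror the examples (assumed meaning: 10 loops,
# forward type 0 = CPU, 4 threads).
run_all() {
  for model in MobileNetV1_100_1.0_224 MobileNetV2_100 ResNet_100_18 \
               GoogLeNet_100 SqueezeNet_100 ShuffleNet_100_4; do
    echo "./benchmarkExprModels.out $model 10 0 4"
  done
}
run_all
```

Replacing `echo` with `exec` of the command itself (or simply dropping the `echo`) turns the sketch into a real batch run.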
The links to the papers for these models are in the header files, e.g. benchmark/exprModels/MobileNetExpr.hpp