
Atlas200dk / sample-classification (C++, Apache-2.0)



This sample is for learning purposes only. It provides no guarantee of the inference results and does not support commercial use.

Classification Network Application (C++)

This application can run on the Atlas 200 DK or the AI acceleration cloud server to implement inference on a common classification network and output the first n inference results.

The application in the current version branch is adapted to DDK & RunTime and later versions.


Before deploying this sample, ensure that:

  • Mind Studio has been installed.
  • The Atlas 200 DK developer board has been connected to Mind Studio, the cross compiler has been installed, the SD card has been prepared, and basic information has been configured.


You can use either of the following methods:

  1. Quick deployment: visit


    • The quick deployment script can be used to deploy multiple samples rapidly. Select classification.
    • The quick deployment script automatically completes code download, model conversion, and environment variable configuration. To learn about the detailed deployment process, select the common deployment mode. Go to 2. Common deployment.
  2. Common deployment: visit


    • In this deployment mode, you need to manually download code, convert models, and configure environment variables. After that, you will have a better understanding of the process.


  1. Open the project.

    Go to the directory that stores the decompressed installation package as the Mind Studio installation user in CLI mode, for example, $HOME/MindStudio-ubuntu/bin. Run the following command to start Mind Studio:


    Open the sample-classification project, as shown in Figure 1.

    Figure 1 Opening the classification project

  2. Configure project information in the src/param_configure.conf file.

    Figure 2 Configuration file path

    The default configurations of the configuration file are as follows:

    • remote_host: IP address of the Atlas 200 DK developer board
    • model_name: offline model name


    • All of the parameters in the file must be set. Otherwise, the build fails.
    • Do not use double quotation marks ("") during parameter settings.
    • Only one model name can be entered in the configuration file. The AlexNet model is used here as an example; you can replace it with any model listed in the common deployment guide by following the same operation procedure.
    • Modify the default configurations as required.
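    The configuration items above can be illustrated with a minimal sketch of src/param_configure.conf. The IP address and model name below are placeholder assumptions for illustration, not values taken from this sample:

```shell
# Hypothetical contents of src/param_configure.conf.
# The values are illustrative assumptions -- replace them with your own.
remote_host=192.168.1.2   # IP address of the Atlas 200 DK developer board
model_name=alexnet.om     # offline model name (no quotation marks)
```

    Note that, per the rules above, no double quotation marks are used and only one model name appears.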
  3. Run the script to adjust configuration parameters and download and compile the third-party library. Open the Terminal window of Mind Studio. By default, the home directory of the code is used. Run the script in the background to deploy the environment, as shown in Figure 3.

    Figure 3 Running the script


    • During the first deployment, if the third-party libraries have not been downloaded yet, the system automatically downloads and builds them, which may take a long time. The libraries can then be reused directly in subsequent builds.
    • During deployment, select the IP address of the host that communicates with the developer board; this is generally the IP address configured for the virtual NIC. If that address is in the same network segment as the developer board's address, it is selected automatically. Otherwise, manually enter the IP address of the host that communicates with the Atlas DK to complete the deployment.
  4. Start building. Open Mind Studio and choose Build > Build > Build-Configuration from the main menu. The build and run folders are generated in the directory, as shown in Figure 4.

    Figure 4 Generated build and run folders

    When you build a project for the first time, Build > Build is unavailable. You need to choose Build > Edit Build Configuration to set the parameters before building.

  5. Upload the images to be inferred to any directory of the HwHiAiUser user on the host side.

    The image requirements are as follows:

    • Format: jpg, png, and bmp
    • Width of the input image: an integer ranging from 16px to 4096px
    • Height of the input image: an integer ranging from 16px to 4096px
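    Before uploading images (for example, with scp) the format rule can be sanity-checked in shell. The `is_supported` helper below is invented for this sketch and is not part of the sample; the width/height limits (16 px to 4096 px) still apply separately:

```shell
#!/bin/sh
# Sketch: check whether a file name has one of the supported extensions
# (jpg, png, bmp). "is_supported" is a hypothetical helper for illustration.
is_supported() {
  case "$1" in
    *.jpg|*.png|*.bmp) echo yes ;;
    *) echo no ;;
  esac
}

is_supported example.jpg   # prints "yes"
is_supported example.gif   # prints "no"
```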


  1. On the toolbar of Mind Studio, click Run and choose Run > Run 'sample-classification'. As shown in Figure 5, the executable application is running on the developer board.

    Figure 5 Running application

    You can ignore the error information reported during execution, because Mind Studio cannot pass command-line parameters to an executable application. The preceding steps deploy the executable application and its dependent library files to the developer board; you need to log in to the developer board in SSH mode and run the files manually in the corresponding directory. For details, see the following steps.

  2. From the Ubuntu server where Mind Studio is located, log in to the host side as the HwHiAiUser user in SSH mode.

    ssh HwHiAiUser@host_ip

    For the Atlas 200 DK, the default value of host_ip is (USB connection mode) or (NIC connection mode).

  3. Go to the path of the executable files of the classification network application.

    cd ~/HIAI_PROJECTS/workspace_mind_studio/sample-classification_XXXXX/out


    • In this path, XXXXX in sample-classification_XXXXX is a combination of letters and digits generated randomly each time the application is built.
  4. Run the application.

    Run the script to print the inference result on the execution terminal.

    Command example:

    python3 -w 227 -h 227 -i ./example.jpg -n 10

    • -w/model_width: width of the model's input image. The value is an integer ranging from 16 to 4096. The sample model requires an input width of 227; if you use another model, refer to that model's required input data width.
    • -h/model_height: height of the model's input image. The value is an integer ranging from 16 to 4096. The sample model requires an input height of 227; if you use another model, refer to that model's required input data height.
    • -i/input_path: path of the input image. It can be a directory, in which case all images in that directory are used as input. Multiple inputs can be specified.
    • -n/top_n: the number of top inference results to output

    For other parameters, run the python3 --help command to check help information.
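    To make the -n/top_n option concrete, the selection it performs can be sketched in shell. The `top_n` helper and the label/score lines below are invented for illustration and are not part of the sample's output format:

```shell
#!/bin/sh
# Sketch: keep the n highest-scoring "label score" lines, mirroring what the
# -n/top_n option does with the network's classification probabilities.
# "top_n" is a hypothetical helper, not part of the sample.
top_n() {
  sort -k2 -rn | head -n "$1"
}

printf 'cat 0.70\ndog 0.20\nfox 0.10\n' | top_n 2
```

    Here `sort -k2 -rn` orders the lines by the numeric score in the second field, descending, and `head` keeps the first n of them.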
