# Using MindSpore Lite for Image Classification (C/C++)

## When to Use

You can use [MindSpore](../../reference/apis-mindspore-lite-kit/_mind_spore.md) to quickly deploy AI algorithms into your application to perform AI model inference for image classification.

Image classification can be used to recognize objects in images and is widely used in medical image analysis, autonomous driving, e-commerce, and facial recognition.

## Basic Concepts

- N-API: a set of native APIs used to build ArkTS components. N-API can be used to encapsulate C/C++ libraries into ArkTS modules.

## Development Process

1. Select an image classification model.
2. Use the MindSpore Lite inference model on the device to classify the selected images.

## Environment Setup

Install DevEco Studio 4.1 or later, and update the SDK to API version 11 or later.

## How to Develop

The following uses inference on an image in the album as an example to describe how to use MindSpore Lite to implement image classification.

### Selecting a Model

This sample application uses [mobilenetv2.ms](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/1.5/mobilenetv2.ms) as the image classification model. The model file is available in the **entry/src/main/resources/rawfile** directory of the project.

If you have other pre-trained models for image classification, convert the original model into the .ms format by referring to [Using MindSpore Lite for Model Conversion](mindspore-lite-converter-guidelines.md).

### Writing Code

#### Image Input and Preprocessing

1. Call [@ohos.file.picker](../../reference/apis-core-file-kit/js-apis-file-picker.md) to select the desired image from the album.

   ```ts
   import { photoAccessHelper } from '@kit.MediaLibraryKit';
   import { BusinessError } from '@kit.BasicServicesKit';

   let uris: Array<string> = [];

   // Create an image picker instance.
   let photoSelectOptions = new photoAccessHelper.PhotoSelectOptions();

   // Set the media file type to IMAGE and set the maximum number of media files that can be selected.
   photoSelectOptions.MIMEType = photoAccessHelper.PhotoViewMIMETypes.IMAGE_TYPE;
   photoSelectOptions.maxSelectNumber = 1;

   // Create an album picker instance and call select() to open the album page for file selection. After file selection is done, the result set is returned through photoSelectResult.
   let photoPicker = new photoAccessHelper.PhotoViewPicker();
   photoPicker.select(photoSelectOptions, async (
     err: BusinessError, photoSelectResult: photoAccessHelper.PhotoSelectResult) => {
     if (err) {
       console.error('MS_LITE_ERR: PhotoViewPicker.select failed with err: ' + JSON.stringify(err));
       return;
     }
     console.info('MS_LITE_LOG: PhotoViewPicker.select successfully, ' +
       'photoSelectResult uri: ' + JSON.stringify(photoSelectResult));
     uris = photoSelectResult.photoUris;
     console.info('MS_LITE_LOG: uri: ' + uris);
   })
   ```

2. Based on the input image size, call [@ohos.multimedia.image](../../reference/apis-image-kit/js-apis-image.md) and [@ohos.file.fs](../../reference/apis-core-file-kit/js-apis-file-fs.md) to crop the image, obtain the image buffer, and standardize the image. Each channel is standardized as (pixel / 255 − mean) / std, using the per-channel mean and standard deviation used during model training.

   ```ts
   import { image } from '@kit.ImageKit';
   import { fileIo } from '@kit.CoreFileKit';

   let modelInputHeight: number = 224;
   let modelInputWidth: number = 224;

   // Based on the URI returned by the picker, call fileIo.openSync to open the file and obtain the FD.
   let file = fileIo.openSync(uris[0], fileIo.OpenMode.READ_ONLY);
   console.info('MS_LITE_LOG: file fd: ' + file.fd);

   // Based on the FD, call fileIo.readSync to read the data in the file.
   let inputBuffer = new ArrayBuffer(4096000);
   let readLen = fileIo.readSync(file.fd, inputBuffer);
   console.info('MS_LITE_LOG: readSync data to file succeed and inputBuffer size is:' + readLen);

   // Perform image preprocessing through PixelMap.
   let imageSource = image.createImageSource(file.fd);
   imageSource.createPixelMap().then((pixelMap) => {
     pixelMap.getImageInfo().then((info) => {
       console.info('MS_LITE_LOG: info.width = ' + info.size.width);
       console.info('MS_LITE_LOG: info.height = ' + info.size.height);
       // Scale the image to 256 x 256, center-crop it to the model input size, and obtain the image buffer readBuffer.
       pixelMap.scale(256.0 / info.size.width, 256.0 / info.size.height).then(() => {
         pixelMap.crop(
           { x: 16, y: 16, size: { height: modelInputHeight, width: modelInputWidth } }
         ).then(async () => {
           let cropInfo = await pixelMap.getImageInfo();
           console.info('MS_LITE_LOG: crop info.width = ' + cropInfo.size.width);
           console.info('MS_LITE_LOG: crop info.height = ' + cropInfo.size.height);
           // Set the size of readBuffer: height x width x 4 bytes per pixel.
           let readBuffer = new ArrayBuffer(modelInputHeight * modelInputWidth * 4);
           await pixelMap.readPixelsToBuffer(readBuffer);
           console.info('MS_LITE_LOG: Succeeded in reading image pixel data, buffer: ' +
             readBuffer.byteLength);
           // Convert readBuffer to the float32 format, and standardize the image.
           const imageArr = new Uint8Array(
             readBuffer.slice(0, modelInputHeight * modelInputWidth * 4));
           console.info('MS_LITE_LOG: imageArr length: ' + imageArr.length);
           let means = [0.485, 0.456, 0.406];
           let stds = [0.229, 0.224, 0.225];
           let float32View = new Float32Array(modelInputHeight * modelInputWidth * 3);
           let index = 0;
           // Every 4 bytes form one pixel; drop the alpha channel and standardize the other three.
           for (let i = 0; i < imageArr.length; i++) {
             if ((i + 1) % 4 == 0) {
               float32View[index] = (imageArr[i - 3] / 255.0 - means[0]) / stds[0]; // B
               float32View[index + 1] = (imageArr[i - 2] / 255.0 - means[1]) / stds[1]; // G
               float32View[index + 2] = (imageArr[i - 1] / 255.0 - means[2]) / stds[2]; // R
               index += 3;
             }
           }
           console.info('MS_LITE_LOG: float32View length: ' + float32View.length);
           let printStr = 'float32View data:';
           for (let i = 0; i < 20; i++) {
             printStr += ' ' + float32View[i];
           }
           console.info('MS_LITE_LOG: ' + printStr);
         })
       })
     });
   });
   ```

#### Writing Inference Code

Call [MindSpore](../../reference/apis-mindspore-lite-kit/_mind_spore.md) to implement inference on the device. The operation process is as follows:

1. Include the corresponding header file.

   ```c++
   #include <iostream>
   #include <sstream>
   #include <stdlib.h>
   #include <hilog/log.h>
   #include <rawfile/raw_file_manager.h>
   #include <mindspore/types.h>
   #include <mindspore/model.h>
   #include <mindspore/context.h>
   #include <mindspore/status.h>
   #include <mindspore/tensor.h>
   #include "napi/native_api.h"
   ```

2. Read the model file.

   ```c++
   #define LOGI(...) ((void)OH_LOG_Print(LOG_APP, LOG_INFO, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGD(...) ((void)OH_LOG_Print(LOG_APP, LOG_DEBUG, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGW(...) ((void)OH_LOG_Print(LOG_APP, LOG_WARN, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGE(...) ((void)OH_LOG_Print(LOG_APP, LOG_ERROR, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))

   void *ReadModelFile(NativeResourceManager *nativeResourceManager, const std::string &modelName, size_t *modelSize) {
       auto rawFile = OH_ResourceManager_OpenRawFile(nativeResourceManager, modelName.c_str());
       if (rawFile == nullptr) {
           LOGE("MS_LITE_ERR: Open model file failed");
           return nullptr;
       }
       long fileSize = OH_ResourceManager_GetRawFileSize(rawFile);
       void *modelBuffer = malloc(fileSize);
       if (modelBuffer == nullptr) {
           LOGE("MS_LITE_ERR: Allocate model buffer failed");
           OH_ResourceManager_CloseRawFile(rawFile);
           return nullptr;
       }
       int ret = OH_ResourceManager_ReadRawFile(rawFile, modelBuffer, fileSize);
       if (ret == 0) {
           LOGE("MS_LITE_ERR: OH_ResourceManager_ReadRawFile failed");
           free(modelBuffer);
           OH_ResourceManager_CloseRawFile(rawFile);
           return nullptr;
       }
       OH_ResourceManager_CloseRawFile(rawFile);
       *modelSize = fileSize;
       return modelBuffer;
   }
   ```

3. Create a context, set parameters such as the number of threads and device type, and load the model. The code below configures a CPU device; a sketch of the thread settings follows this step.

   ```c++
   void DestroyModelBuffer(void **buffer) {
       if (buffer == nullptr) {
           return;
       }
       free(*buffer);
       *buffer = nullptr;
   }

   OH_AI_ContextHandle CreateMSLiteContext(void *modelBuffer) {
       // Set executing context for model.
       auto context = OH_AI_ContextCreate();
       if (context == nullptr) {
           DestroyModelBuffer(&modelBuffer);
           LOGE("MS_LITE_ERR: Create MSLite context failed.\n");
           return nullptr;
       }
       auto cpu_device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
       if (cpu_device_info == nullptr) {
           DestroyModelBuffer(&modelBuffer);
           OH_AI_ContextDestroy(&context);
           LOGE("MS_LITE_ERR: Create MSLite CPU device info failed.\n");
           return nullptr;
       }
       // Enable float16 inference on the CPU.
       OH_AI_DeviceInfoSetEnableFP16(cpu_device_info, true);
       OH_AI_ContextAddDeviceInfo(context, cpu_device_info);

       LOGI("MS_LITE_LOG: Build MSLite context success.\n");
       return context;
   }

   OH_AI_ModelHandle CreateMSLiteModel(void *modelBuffer, size_t modelSize, OH_AI_ContextHandle context) {
       // Create model
       auto model = OH_AI_ModelCreate();
       if (model == nullptr) {
           DestroyModelBuffer(&modelBuffer);
           LOGE("MS_LITE_ERR: Allocate MSLite Model failed.\n");
           return nullptr;
       }

       // Build model object. The model buffer can be released once the model is built.
       auto build_ret = OH_AI_ModelBuild(model, modelBuffer, modelSize, OH_AI_MODELTYPE_MINDIR, context);
       DestroyModelBuffer(&modelBuffer);
       if (build_ret != OH_AI_STATUS_SUCCESS) {
           OH_AI_ModelDestroy(&model);
           LOGE("MS_LITE_ERR: Build MSLite model failed.\n");
           return nullptr;
       }
       LOGI("MS_LITE_LOG: Build MSLite model success.\n");
       return model;
   }
   ```

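   The code above does not set the thread parameters explicitly, so MindSpore Lite uses its defaults. As a minimal sketch of the thread settings this step mentions, the context could be tuned before the device info is added; the helper name and the values below (2 threads, big-core affinity) are illustrative assumptions, not part of the sample.

   ```c++
   // Hypothetical helper: configure threading on a freshly created context.
   // Call it inside CreateMSLiteContext() before OH_AI_ContextAddDeviceInfo().
   void ConfigureMSLiteThreads(OH_AI_ContextHandle context) {
       // Run operators on up to two threads.
       OH_AI_ContextSetThreadNum(context, 2);
       // Thread affinity mode: 0 = no affinity, 1 = big cores first, 2 = little cores first.
       OH_AI_ContextSetThreadAffinityMode(context, 1);
   }
   ```
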
4. Set the model input data and perform model inference. A sketch for extracting the top-scoring class from the output follows the code block.

   ```c++
   constexpr int K_NUM_PRINT_OF_OUT_DATA = 20;

   // Set the model input data.
   int FillInputTensor(OH_AI_TensorHandle input, const std::vector<float> &input_data) {
       if (OH_AI_TensorGetDataType(input) != OH_AI_DATATYPE_NUMBERTYPE_FLOAT32) {
           return OH_AI_STATUS_LITE_ERROR;
       }
       int64_t element_num = OH_AI_TensorGetElementNum(input);
       if (input_data.size() < static_cast<size_t>(element_num)) {
           return OH_AI_STATUS_LITE_ERROR;
       }
       float *data = (float *)OH_AI_TensorGetMutableData(input);
       for (int64_t i = 0; i < element_num; i++) {
           data[i] = input_data[i];
       }
       return OH_AI_STATUS_SUCCESS;
   }

   // Execute model inference.
   int RunMSLiteModel(OH_AI_ModelHandle model, const std::vector<float> &input_data) {
       // Set input data for model.
       auto inputs = OH_AI_ModelGetInputs(model);

       auto ret = FillInputTensor(inputs.handle_list[0], input_data);
       if (ret != OH_AI_STATUS_SUCCESS) {
           LOGE("MS_LITE_ERR: RunMSLiteModel set input error.\n");
           return OH_AI_STATUS_LITE_ERROR;
       }
       // Get model output.
       auto outputs = OH_AI_ModelGetOutputs(model);
       // Predict model.
       auto predict_ret = OH_AI_ModelPredict(model, inputs, &outputs, nullptr, nullptr);
       if (predict_ret != OH_AI_STATUS_SUCCESS) {
           LOGE("MS_LITE_ERR: MSLite Predict error.\n");
           return OH_AI_STATUS_LITE_ERROR;
       }
       LOGI("MS_LITE_LOG: Run MSLite model Predict success.\n");
       // Print output tensor data.
       LOGI("MS_LITE_LOG: Get model outputs:\n");
       for (size_t i = 0; i < outputs.handle_num; i++) {
           auto tensor = outputs.handle_list[i];
           LOGI("MS_LITE_LOG: - Tensor %{public}d name is: %{public}s.\n", static_cast<int>(i),
                OH_AI_TensorGetName(tensor));
           LOGI("MS_LITE_LOG: - Tensor %{public}d size is: %{public}d.\n", static_cast<int>(i),
                (int)OH_AI_TensorGetDataSize(tensor));
           LOGI("MS_LITE_LOG: - Tensor data is:\n");
           auto out_data = reinterpret_cast<const float *>(OH_AI_TensorGetData(tensor));
           std::stringstream outStr;
           for (int64_t j = 0; (j < OH_AI_TensorGetElementNum(tensor)) && (j <= K_NUM_PRINT_OF_OUT_DATA); j++) {
               outStr << out_data[j] << " ";
           }
           LOGI("MS_LITE_LOG: %{public}s", outStr.str().c_str());
       }
       return OH_AI_STATUS_SUCCESS;
   }
   ```

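   The scores printed above are raw per-class confidences. As a usage sketch, the hypothetical helper below (not part of the sample or the MindSpore Lite API) shows one way to pull out the index of the highest-scoring class, assuming a single float32 output tensor:

   ```c++
   // Hypothetical helper: return the index of the highest-scoring class in the
   // first output tensor, or -1 on failure.
   int GetTopClassIndex(OH_AI_ModelHandle model) {
       auto outputs = OH_AI_ModelGetOutputs(model);
       if (outputs.handle_num == 0) {
           return -1;
       }
       auto tensor = outputs.handle_list[0];
       auto data = reinterpret_cast<const float *>(OH_AI_TensorGetData(tensor));
       int64_t num = OH_AI_TensorGetElementNum(tensor);
       if (data == nullptr || num <= 0) {
           return -1;
       }
       int best = 0;
       for (int64_t i = 1; i < num; i++) {
           if (data[i] > data[best]) {
               best = static_cast<int>(i);
           }
       }
       return best;
   }
   ```
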
5. Implement a complete model inference process. The registration boilerplate that exposes this function to ArkTS is sketched after this step.

   ```c++
   static napi_value RunDemo(napi_env env, napi_callback_info info) {
       LOGI("MS_LITE_LOG: Enter runDemo()");
       napi_value error_ret;
       napi_create_int32(env, -1, &error_ret);
       // Process the input data.
       size_t argc = 2;
       napi_value argv[2] = {nullptr};
       napi_get_cb_info(env, info, &argc, argv, nullptr, nullptr);
       bool isArray = false;
       napi_is_array(env, argv[0], &isArray);
       if (!isArray) {
           LOGE("MS_LITE_ERR: The first argument of runDemo() is not an array");
           return error_ret;
       }
       // Obtain the length of the array.
       uint32_t length = 0;
       napi_get_array_length(env, argv[0], &length);
       LOGI("MS_LITE_LOG: argv array length = %{public}d", length);
       std::vector<float> input_data;
       double param = 0;
       for (uint32_t i = 0; i < length; i++) {
           napi_value value;
           napi_get_element(env, argv[0], i, &value);
           napi_get_value_double(env, value, &param);
           input_data.push_back(static_cast<float>(param));
       }
       std::stringstream outstr;
       for (int i = 0; i < K_NUM_PRINT_OF_OUT_DATA && i < static_cast<int>(input_data.size()); i++) {
           outstr << input_data[i] << " ";
       }
       LOGI("MS_LITE_LOG: input_data = %{public}s", outstr.str().c_str());
       // Read model file
       const std::string modelName = "mobilenetv2.ms";
       LOGI("MS_LITE_LOG: Run model: %{public}s", modelName.c_str());
       size_t modelSize;
       auto resourcesManager = OH_ResourceManager_InitNativeResourceManager(env, argv[1]);
       auto modelBuffer = ReadModelFile(resourcesManager, modelName, &modelSize);
       OH_ResourceManager_ReleaseNativeResourceManager(resourcesManager);
       if (modelBuffer == nullptr) {
           LOGE("MS_LITE_ERR: Read model failed");
           return error_ret;
       }
       LOGI("MS_LITE_LOG: Read model file success");

       auto context = CreateMSLiteContext(modelBuffer);
       if (context == nullptr) {
           LOGE("MS_LITE_ERR: MSLiteFwk Build context failed.\n");
           return error_ret;
       }
       auto model = CreateMSLiteModel(modelBuffer, modelSize, context);
       if (model == nullptr) {
           OH_AI_ContextDestroy(&context);
           LOGE("MS_LITE_ERR: MSLiteFwk Build model failed.\n");
           return error_ret;
       }
       int ret = RunMSLiteModel(model, input_data);
       if (ret != OH_AI_STATUS_SUCCESS) {
           OH_AI_ModelDestroy(&model);
           OH_AI_ContextDestroy(&context);
           LOGE("MS_LITE_ERR: RunMSLiteModel failed.\n");
           return error_ret;
       }
       // Copy the first output tensor into a JS array.
       napi_value out_data;
       napi_create_array(env, &out_data);
       auto outputs = OH_AI_ModelGetOutputs(model);
       OH_AI_TensorHandle output_0 = outputs.handle_list[0];
       float *output0Data = reinterpret_cast<float *>(OH_AI_TensorGetMutableData(output_0));
       for (int64_t i = 0; i < OH_AI_TensorGetElementNum(output_0); i++) {
           napi_value element;
           napi_create_double(env, static_cast<double>(output0Data[i]), &element);
           napi_set_element(env, out_data, static_cast<uint32_t>(i), element);
       }
       OH_AI_ModelDestroy(&model);
       OH_AI_ContextDestroy(&context);
       LOGI("MS_LITE_LOG: Exit runDemo()");
       return out_data;
   }
   ```

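   For `runDemo` to be callable from ArkTS, the native library must register the function with the module system. The sample's registration code is not shown above; the following is standard N-API module boilerplate, assuming the source file is **mslite_napi.cpp** and the library is built as **libentry.so** (so the module name is `entry`):

   ```c++
   EXTERN_C_START
   static napi_value Init(napi_env env, napi_value exports) {
       // Expose the native RunDemo function as "runDemo" on the module's exports.
       napi_property_descriptor desc[] = {
           {"runDemo", nullptr, RunDemo, nullptr, nullptr, nullptr, napi_default, nullptr}
       };
       napi_define_properties(env, exports, sizeof(desc) / sizeof(desc[0]), desc);
       return exports;
   }
   EXTERN_C_END

   static napi_module demoModule = {
       .nm_version = 1,
       .nm_flags = 0,
       .nm_filename = nullptr,
       .nm_register_func = Init,
       .nm_modname = "entry",
       .nm_priv = nullptr,
       .reserved = {0},
   };

   // Register the module when the library is loaded.
   extern "C" __attribute__((constructor)) void RegisterEntryModule(void) {
       napi_module_register(&demoModule);
   }
   ```
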
6. Write the **CMake** script to link the MindSpore Lite dynamic library.

   ```cmake
   # The minimum version of CMake.
   cmake_minimum_required(VERSION 3.4.1)
   project(MindSporeLiteCDemo)

   set(NATIVERENDER_ROOT_PATH ${CMAKE_CURRENT_SOURCE_DIR})

   if(DEFINED PACKAGE_FIND_FILE)
       include(${PACKAGE_FIND_FILE})
   endif()

   include_directories(${NATIVERENDER_ROOT_PATH}
                       ${NATIVERENDER_ROOT_PATH}/include)

   add_library(entry SHARED mslite_napi.cpp)
   target_link_libraries(entry PUBLIC mindspore_lite_ndk)
   target_link_libraries(entry PUBLIC hilog_ndk.z)
   target_link_libraries(entry PUBLIC rawfile.z)
   target_link_libraries(entry PUBLIC ace_napi.z)
   ```

#### Encapsulating the C++ Dynamic Library into an ArkTS Module Using N-API

1. In **entry/src/main/cpp/types/libentry/Index.d.ts**, define the ArkTS API **runDemo()**. The content is as follows:

   ```ts
   export const runDemo: (a: number[], b: Object) => Array<number>;
   ```

2. In the **oh-package.json5** file, associate the API with the .so file to form a complete ArkTS module.

   ```json
   {
     "name": "libentry.so",
     "types": "./Index.d.ts",
     "version": "1.0.0",
     "description": "MindSpore Lite inference module"
   }
   ```

#### Invoking the Encapsulated ArkTS Module for Inference and Outputting the Result

In **entry/src/main/ets/pages/Index.ets**, call the encapsulated ArkTS module and process the inference result.

```ts
import msliteNapi from 'libentry.so';
import { resourceManager } from '@kit.LocalizationKit';

let resMgr: resourceManager.ResourceManager = getContext().getApplicationContext().resourceManager;
let max: number = 0;
let maxIndex: number = 0;
let maxArray: Array<number> = [];
let maxIndexArray: Array<number> = [];

// Call the runDemo function of C++. The buffer data of the input image is stored in float32View after preprocessing. For details, see Image Input and Preprocessing.
console.info('MS_LITE_LOG: *** Start MSLite Demo ***');
let output: Array<number> = msliteNapi.runDemo(Array.from(float32View), resMgr);
// Obtain the five categories with the highest confidence.
max = 0;
maxIndex = 0;
maxArray = [];
maxIndexArray = [];
let newArray = output.filter(value => value !== max);
for (let n = 0; n < 5; n++) {
  max = output[0];
  maxIndex = 0;
  for (let m = 0; m < newArray.length; m++) {
    if (newArray[m] > max) {
      max = newArray[m];
      maxIndex = m;
    }
  }
  maxArray.push(Math.round(max * 10000));
  maxIndexArray.push(maxIndex);
  // Remove the current maximum so that the next pass finds the runner-up.
  newArray = newArray.filter(value => value !== max);
}
console.info('MS_LITE_LOG: max:' + maxArray);
console.info('MS_LITE_LOG: maxIndex:' + maxIndexArray);
console.info('MS_LITE_LOG: *** Finished MSLite Demo ***');
```

### Debugging and Verification

1. In DevEco Studio, connect the device and click **Run entry** to build and run your own HAP.

   ```shell
   Launching com.samples.mindsporelitecdemo
   $ hdc shell aa force-stop com.samples.mindsporelitecdemo
   $ hdc shell mkdir data/local/tmp/xxx
   $ hdc file send C:\Users\xxx\MindSporeLiteCDemo\entry\build\default\outputs\default\entry-default-signed.hap "data/local/tmp/xxx"
   $ hdc shell bm install -p data/local/tmp/xxx
   $ hdc shell rm -rf data/local/tmp/xxx
   $ hdc shell aa start -a EntryAbility -b com.samples.mindsporelitecdemo
   ```

2. Touch the **photo** button on the device screen, select an image, and touch **OK**. The classification result of the selected image is displayed on the device screen. In the log output, filter by the keyword **MS_LITE**. The following information is displayed:

   ```verilog
   08-05 17:15:52.001   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: PhotoViewPicker.select successfully, photoSelectResult uri: {"photoUris":["file://media/Photo/13/IMG_1501955351_012/plant.jpg"]}
   ...
   08-05 17:15:52.627   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: crop info.width = 224
   08-05 17:15:52.627   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: crop info.height = 224
   08-05 17:15:52.628   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: Succeeded in reading image pixel data, buffer: 200704
   08-05 17:15:52.971   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: float32View data: 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143
   08-05 17:15:52.971   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: *** Start MSLite Demo ***
   08-05 17:15:53.454   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Build MSLite model success.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Run MSLite model Predict success.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Get model outputs:
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: - Tensor 0 name is: Default/head-MobileNetV2Head/Sigmoid-op466.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: - Tensor data is:
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: 3.43385e-06 1.40285e-05 9.11969e-07 4.91007e-05 9.50266e-07 3.94537e-07 0.0434676 3.97196e-05 0.00054832 0.000246202 1.576e-05 3.6494e-06 1.23553e-05 0.196977 5.3028e-05 3.29346e-05 4.90475e-07 1.66109e-06 7.03273e-06 8.83677e-07 3.1365e-06
   08-05 17:15:53.781   4684-4684    A03d00/JSAPP                   pid-4684              W     MS_LITE_WARN: output length =  500 ;value =  0.0000034338463592575863,0.000014028532859811094,9.119685273617506e-7,0.000049100715841632336,9.502661555416125e-7,3.945370394831116e-7,0.04346757382154465,0.00003971960904891603,0.0005483203567564487,0.00024620210751891136,0.000015759984307806008,0.0000036493988773145247,0.00001235533181898063,0.1969769448041916,0.000053027983085485175,0.000032934600312728435,4.904751449430478e-7,0.0000016610861166554969,0.000007032729172351537,8.836767619868624e-7
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: max:9497,7756,1970,435,46
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: maxIndex:323,46,13,6,349
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: *** Finished MSLite Demo ***
   ```

### Effects

Touch the **photo** button on the device screen, select an image, and touch **OK**. The top 4 categories of the image are displayed below the image.

<img src="figures/stepc1.png"  width="20%"/>     <img src="figures/step2.png" width="20%"/>     <img src="figures/step3.png" width="20%"/>     <img src="figures/stepc4.png" width="20%"/>