/ohos5.0/foundation/ai/neural_network_runtime/interfaces/kits/c/neural_network_runtime/ |
H A D | neural_network_core.h | 617 NN_Tensor *OH_NNTensor_Create(size_t deviceID, NN_TensorDesc *tensorDesc); 672 NN_Tensor *OH_NNTensor_CreateWithFd(size_t deviceID, 694 OH_NN_ReturnCode OH_NNTensor_Destroy(NN_Tensor **tensor); 714 NN_TensorDesc *OH_NNTensor_GetTensorDesc(const NN_Tensor *tensor); 732 void *OH_NNTensor_GetDataBuffer(const NN_Tensor *tensor); 750 OH_NN_ReturnCode OH_NNTensor_GetFd(const NN_Tensor *tensor, int *fd); 775 OH_NN_ReturnCode OH_NNTensor_GetSize(const NN_Tensor *tensor, size_t *size); 1017 NN_Tensor *inputTensor[], 1019 NN_Tensor *outputTensor[], 1062 NN_Tensor *inputTensor[], [all …]
|
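The neural_network_core.h entry above lists the full NN_Tensor lifecycle: create, query the descriptor/buffer/size/fd, and destroy. The following minimal sketch strings those calls together. The include path and the NN_TensorDesc helper functions (OH_NNTensorDesc_Create/SetShape/SetDataType/Destroy) are assumptions about the same header, not taken from the listing.

```cpp
// Minimal NN_Tensor lifecycle sketch (assumed include path and NN_TensorDesc setters).
#include <cstddef>
#include <cstdint>
#include <cstring>
#include "neural_network_runtime/neural_network_core.h"  // assumed kit include path

bool CreateFillDestroy(size_t deviceID)
{
    // Describe a 1x3x224x224 float32 tensor.
    NN_TensorDesc* desc = OH_NNTensorDesc_Create();
    if (desc == nullptr) {
        return false;
    }
    const int32_t shape[] = {1, 3, 224, 224};
    OH_NNTensorDesc_SetShape(desc, shape, sizeof(shape) / sizeof(shape[0]));
    OH_NNTensorDesc_SetDataType(desc, OH_NN_FLOAT32);

    // OH_NNTensor_Create(size_t deviceID, NN_TensorDesc*) allocates device shared memory.
    NN_Tensor* tensor = OH_NNTensor_Create(deviceID, desc);
    if (tensor == nullptr) {
        OH_NNTensorDesc_Destroy(&desc);  // creation failed, so the desc is still ours to free
        return false;
    }

    // Write through the mapped buffer; OH_NNTensor_GetSize reports the shared-memory size.
    size_t byteSize = 0;
    if (OH_NNTensor_GetSize(tensor, &byteSize) == OH_NN_SUCCESS) {
        void* data = OH_NNTensor_GetDataBuffer(tensor);
        if (data != nullptr) {
            std::memset(data, 0, byteSize);
        }
    }

    // OH_NNTensor_Destroy takes NN_Tensor** and resets the handle to nullptr.
    // (Ownership of desc after a successful create follows the header contract and is
    //  deliberately not asserted in this sketch.)
    return OH_NNTensor_Destroy(&tensor) == OH_NN_SUCCESS;
}
```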
H A D | neural_network_runtime_type.h | 100 typedef struct NN_Tensor NN_Tensor; typedef
|
/ohos5.0/docs/zh-cn/application-dev/reference/apis-neural-network-runtime-kit/ |
H A D | neural__network__core_8h.md | 54 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_Create](_neural_network_runtime.m… 55 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_CreateWithSize](_neural_network_r… 56 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_CreateWithFd](_neural_network_run… 57 …ork_runtime.md#oh_nntensor_destroy) ([NN_Tensor](_neural_network_runtime.md#nn_tensor) \*\*tensor)… 58 …e.md#oh_nntensor_gettensordesc) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor)… 59 …e.md#oh_nntensor_getdatabuffer) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor)… 60 …time.md#oh_nntensor_getfd) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, int … 61 …md#oh_nntensor_getsize) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, size_t … 62 …oh_nntensor_getoffset) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, size_t \… 73 …#oh_nnexecutor) \*executor, [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*inputTensor[], siz… [all …]
|
H A D | _neural_network_runtime.md | 42 | typedef struct [NN_Tensor](#nn_tensor) [NN_Tensor](#nn_tensor) | Tensor handle. | 219 ### NN_Tensor subsection 222 typedef struct NN_Tensor NN_Tensor 2500 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2537 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2572 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2583 Destroys an [NN_Tensor](#nn_tensor) instance. 2610 Obtains the memory address of [NN_Tensor](#nn_tensor) data. 2639 Obtains the file descriptor of the shared memory where [NN_Tensor](#nn_tensor) data is stored. 2667 Obtains the offset of [NN_Tensor](#nn_tensor) data in its shared memory. [all …]
|
H A D | _neural_nework_runtime.md | 42 | typedef struct [NN_Tensor](#nn_tensor) [NN_Tensor](#nn_tensor) | Tensor handle. | 219 ### NN_Tensor subsection 222 typedef struct NN_Tensor NN_Tensor 2500 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2537 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2572 Pointer to the [NN_Tensor](#nn_tensor) instance; NULL is returned if the creation fails. 2583 Destroys an [NN_Tensor](#nn_tensor) instance. 2610 Obtains the memory address of [NN_Tensor](#nn_tensor) data. 2639 Obtains the file descriptor of the shared memory where [NN_Tensor](#nn_tensor) data is stored. 2667 Obtains the offset of [NN_Tensor](#nn_tensor) data in its shared memory. [all …]
|
H A D | _o_h___n_n___memory.md | 12 **Substitute:** You are advised to use [NN_Tensor](_neural_network_runtime.md#nn_tensor) instead.
|
/ohos5.0/docs/en/application-dev/reference/apis-neural-network-runtime-kit/ |
H A D | neural__network__core_8h.md | 54 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_Create](_neural_network_runtime.m… 55 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_CreateWithSize](_neural_network_r… 56 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \* [OH_NNTensor_CreateWithFd](_neural_network_run… 57 …runtime.md#oh_nntensor_destroy) ([NN_Tensor](_neural_network_runtime.md#nn_tensor) \*\*tensor) | D… 58 …NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor) | Obtains an [NN_TensorDesc](_neural_ne… 59 …or_getdatabuffer) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor) | Obtains the… 60 …(const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, int \*fd) | Obtains the file de… 61 … (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, size_t \*size) | Obtains the s… 62 …r_getoffset) (const [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*tensor, size_t \*offset) |… 73 …#oh_nnexecutor) \*executor, [NN_Tensor](_neural_network_runtime.md#nn_tensor) \*inputTensor[], siz… [all …]
|
/ohos5.0/foundation/ai/neural_network_runtime/test/unittest/components/v1_0/neural_network_core_test/ |
H A D | neural_network_core_test.cpp | 1517 NN_Tensor* tensor = nullptr; 1531 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1544 const NN_Tensor* tensor = nullptr; 1558 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1585 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1614 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1630 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1658 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1674 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); 1703 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(hdiDevice->CreateTensor(tensorDesc)); [all …]
|
/ohos5.0/foundation/ai/neural_network_runtime/frameworks/native/neural_network_core/ |
H A D | executor.h | 49 virtual OH_NN_ReturnCode RunSync(NN_Tensor* inputTensors[], 51 NN_Tensor* outputTensors[], 53 virtual OH_NN_ReturnCode RunAsync(NN_Tensor* inputTensors[], 55 NN_Tensor* outputTensors[],
|
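executor.h declares the internal RunSync/RunAsync interface that the public OH_NNExecutor_RunSync wrapper eventually reaches. As a hedged usage sketch built only from the signatures visible in this listing (function and variable names here are illustrative, and the include path is assumed), a synchronous run followed by an output readback could look like this:

```cpp
// Illustrative synchronous run: inputs/outputs are NN_Tensor handle arrays, as in the header.
#include <cstddef>
#include "neural_network_runtime/neural_network_core.h"  // assumed kit include path

OH_NN_ReturnCode RunAndReadFirstOutput(OH_NNExecutor* executor,
                                       NN_Tensor* inputs[], size_t inputCount,
                                       NN_Tensor* outputs[], size_t outputCount)
{
    if (executor == nullptr || outputCount == 0) {
        return OH_NN_INVALID_PARAMETER;
    }
    OH_NN_ReturnCode ret = OH_NNExecutor_RunSync(executor, inputs, inputCount, outputs, outputCount);
    if (ret != OH_NN_SUCCESS) {
        return ret;
    }

    // After a successful run, the output tensors' buffers hold the inference results.
    size_t size = 0;
    ret = OH_NNTensor_GetSize(outputs[0], &size);
    if (ret != OH_NN_SUCCESS) {
        return ret;
    }
    const void* result = OH_NNTensor_GetDataBuffer(outputs[0]);
    return (result != nullptr) ? OH_NN_SUCCESS : OH_NN_INVALID_PARAMETER;
}
```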
H A D | neural_network_core.cpp | 1017 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); in OH_NNTensor_Create() 1049 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); in OH_NNTensor_CreateWithSize() 1053 NNRT_API NN_Tensor* OH_NNTensor_CreateWithFd(size_t deviceID, in OH_NNTensor_CreateWithFd() 1106 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); in OH_NNTensor_CreateWithFd() 1110 NNRT_API OH_NN_ReturnCode OH_NNTensor_Destroy(NN_Tensor **tensor) in OH_NNTensor_Destroy() 1139 NNRT_API NN_TensorDesc* OH_NNTensor_GetTensorDesc(const NN_Tensor *tensor) in OH_NNTensor_GetTensorDesc() 1157 NNRT_API void* OH_NNTensor_GetDataBuffer(const NN_Tensor *tensor) in OH_NNTensor_GetDataBuffer() 1639 NN_Tensor *inputTensor[], in RunSync() 1641 NN_Tensor *outputTensor[], in RunSync() 1686 NN_Tensor *inputTensor[], in OH_NNExecutor_RunSync() [all …]
|
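The neural_network_core.cpp snippets show the opaque-handle pattern used across the C boundary: the framework builds an internal tensor object and hands it back as an NN_Tensor* via reinterpret_cast, and OH_NNTensor_Destroy resets the caller's handle. A self-contained illustration of that pattern follows; InternalTensor, CreateHandle and DestroyHandle are hypothetical stand-ins, not the framework's own names.

```cpp
// Opaque-handle pattern sketch; InternalTensor is hypothetical, not the framework's class.
#include <new>

typedef struct NN_Tensor NN_Tensor;  // opaque to callers, as in neural_network_runtime_type.h

namespace {
class InternalTensor {
    // Internal state would live here: tensor desc, shared-memory fd, size, offset, ...
};
}  // namespace

NN_Tensor* CreateHandle()
{
    auto* impl = new (std::nothrow) InternalTensor();
    if (impl == nullptr) {
        return nullptr;
    }
    // Callers only ever see the opaque pointer type.
    return reinterpret_cast<NN_Tensor*>(impl);
}

void DestroyHandle(NN_Tensor** tensor)
{
    if (tensor == nullptr || *tensor == nullptr) {
        return;
    }
    delete reinterpret_cast<InternalTensor*>(*tensor);
    *tensor = nullptr;  // mirrors OH_NNTensor_Destroy resetting the caller's handle
}
```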
/ohos5.0/foundation/ai/neural_network_runtime/ |
H A D | neural-network-runtime-guidelines.md | 58 | typedef struct NN_Tensor NN_Tensor | Tensor handle of Neural Network Runtime, used to set the executor's inference input and output tensors. | 117 | void* OH_NNTensor_GetDataBuffer(const NN_Tensor *tensor) | Obtains the memory address of tensor data for reading and writing. | 119 | OH_NN_ReturnCode OH_NNTensor_GetSize(const NN_Tensor *tensor, size_t *size) | Obtains the size of the shared memory where tensor data is stored. | 121 | OH_NN_ReturnCode OH_NNTensor_Destroy(NN_Tensor **tensor) | Destroys a tensor instance. | 136 …e OH_NNExecutor_RunSync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 137 … OH_NNExecutor_RunAsync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 202 OH_NN_ReturnCode SetInputData(NN_Tensor* inputTensor[], size_t inputSize) 242 OH_NN_ReturnCode Print(NN_Tensor* outputTensor[], size_t outputSize) 509 NN_Tensor* inputTensors[inputCount]; 510 NN_Tensor* tensor = nullptr; [all …]
|
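The guideline's sample helpers (SetInputData, Print) iterate over the NN_Tensor arrays and access their buffers. A hedged re-sketch of the input-filling idea is shown below; the real guideline copies application input data, whereas this simplified stand-in only zero-fills the buffers, and the include path is assumed.

```cpp
// Zero-fill every input tensor buffer; a simplified stand-in for the guideline's SetInputData.
#include <cstddef>
#include <cstring>
#include "neural_network_runtime/neural_network_core.h"  // assumed kit include path

OH_NN_ReturnCode SetInputDataSketch(NN_Tensor* inputTensor[], size_t inputSize)
{
    for (size_t i = 0; i < inputSize; ++i) {
        size_t byteSize = 0;
        OH_NN_ReturnCode ret = OH_NNTensor_GetSize(inputTensor[i], &byteSize);
        if (ret != OH_NN_SUCCESS) {
            return ret;
        }
        void* data = OH_NNTensor_GetDataBuffer(inputTensor[i]);
        if (data == nullptr) {
            return OH_NN_INVALID_PARAMETER;
        }
        std::memset(data, 0, byteSize);  // real code would copy the application's input here
    }
    return OH_NN_SUCCESS;
}
```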
/ohos5.0/docs/zh-cn/application-dev/ai/nnrt/ |
H A D | neural-network-runtime-guidelines.md | 58 | typedef struct NN_Tensor NN_Tensor | Tensor handle of Neural Network Runtime, used to set the executor's inference input and output tensors. | 117 | void* OH_NNTensor_GetDataBuffer(const NN_Tensor *tensor) | Obtains the memory address of tensor data for reading and writing. | 119 | OH_NN_ReturnCode OH_NNTensor_GetSize(const NN_Tensor *tensor, size_t *size) | Obtains the size of the shared memory where tensor data is stored. | 121 | OH_NN_ReturnCode OH_NNTensor_Destroy(NN_Tensor **tensor) | Destroys a tensor instance. | 136 …e OH_NNExecutor_RunSync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 137 … OH_NNExecutor_RunAsync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 193 OH_NN_ReturnCode SetInputData(NN_Tensor* inputTensor[], size_t inputSize) 233 OH_NN_ReturnCode Print(NN_Tensor* outputTensor[], size_t outputSize) 500 NN_Tensor* inputTensors[inputCount]; 501 NN_Tensor* tensor = nullptr; [all …]
|
/ohos5.0/foundation/ai/neural_network_runtime/frameworks/native/neural_network_runtime/ |
H A D | nnexecutor.h | 48 OH_NN_ReturnCode RunSync(NN_Tensor* inputTensors[], 50 NN_Tensor* outputTensors[], 52 OH_NN_ReturnCode RunAsync(NN_Tensor* inputTensors[], 54 NN_Tensor* outputTensors[], 78 OH_NN_ReturnCode CheckInputDimRanges(NN_Tensor* inputTensors[], size_t inputSize);
|
H A D | prepared_model.h | 38 virtual OH_NN_ReturnCode Run(const std::vector<NN_Tensor*>& inputs, 39 const std::vector<NN_Tensor*>& outputs,
|
H A D | hdi_prepared_model_v1_0.h | 45 OH_NN_ReturnCode Run(const std::vector<NN_Tensor*>& inputs, 46 const std::vector<NN_Tensor*>& outputs,
|
H A D | hdi_prepared_model_v2_0.h | 46 OH_NN_ReturnCode Run(const std::vector<NN_Tensor*>& inputs, 47 const std::vector<NN_Tensor*>& outputs,
|
H A D | hdi_prepared_model_v2_1.h | 46 OH_NN_ReturnCode Run(const std::vector<NN_Tensor*>& inputs, 47 const std::vector<NN_Tensor*>& outputs,
|
H A D | hdi_prepared_model_v1_0.cpp | 96 OH_NN_ReturnCode TransIOTensor(const NN_Tensor* tensor, V1_0::IOTensor& ioTensor) in TransIOTensor() 242 OH_NN_ReturnCode HDIPreparedModelV1_0::Run(const std::vector<NN_Tensor*>& inputs, in Run() 243 const std::vector<NN_Tensor*>& outputs, std::vector<std::vector<int32_t>>& outputsDims, in Run()
|
H A D | hdi_prepared_model_v2_0.cpp | 97 OH_NN_ReturnCode TransIOTensor(const NN_Tensor* tensor, V2_0::IOTensor& ioTensor) in TransIOTensor() 245 OH_NN_ReturnCode HDIPreparedModelV2_0::Run(const std::vector<NN_Tensor*>& inputs, in Run() 246 const std::vector<NN_Tensor*>& outputs, std::vector<std::vector<int32_t>>& outputsDims, in Run()
|
H A D | hdi_prepared_model_v2_1.cpp | 97 OH_NN_ReturnCode TransIOTensor(const NN_Tensor* tensor, V2_1::IOTensor& ioTensor) in TransIOTensor() 245 OH_NN_ReturnCode HDIPreparedModelV2_1::Run(const std::vector<NN_Tensor*>& inputs, in Run() 246 const std::vector<NN_Tensor*>& outputs, std::vector<std::vector<int32_t>>& outputsDims, in Run()
|
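The hdi_prepared_model_*.cpp entries all define a TransIOTensor helper that converts an NN_Tensor into the driver-facing IOTensor before PreparedModel::Run. A hedged sketch of the information that conversion has to gather from the public handle follows; IOTensorLike is a hypothetical stand-in for the HDI-defined IOTensor struct, and only the getters listed above (OH_NNTensor_GetFd/GetSize/GetOffset) are used.

```cpp
// Gather the shared-memory fd, size and offset from an NN_Tensor, as a TransIOTensor-style
// conversion would; IOTensorLike is a hypothetical struct, not the HDI IDL type.
#include <cstddef>
#include "neural_network_runtime/neural_network_core.h"  // assumed kit include path

struct IOTensorLike {
    int fd = -1;
    size_t bufferSize = 0;
    size_t offset = 0;
};

OH_NN_ReturnCode FillIOTensorLike(const NN_Tensor* tensor, IOTensorLike& ioTensor)
{
    if (tensor == nullptr) {
        return OH_NN_INVALID_PARAMETER;
    }
    OH_NN_ReturnCode ret = OH_NNTensor_GetFd(tensor, &ioTensor.fd);
    if (ret != OH_NN_SUCCESS) {
        return ret;
    }
    ret = OH_NNTensor_GetSize(tensor, &ioTensor.bufferSize);
    if (ret != OH_NN_SUCCESS) {
        return ret;
    }
    return OH_NNTensor_GetOffset(tensor, &ioTensor.offset);
}
```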
H A D | nnexecutor.cpp | 240 OH_NN_ReturnCode NNExecutor::RunSync(NN_Tensor* inputTensors[], size_t inputSize, in RunSync() 241 NN_Tensor* outputTensors[], size_t outputSize) in RunSync() 262 std::vector<NN_Tensor*> inputTensorsVec; in RunSync() 271 std::vector<NN_Tensor*> outputTensorsVec; in RunSync() 318 OH_NN_ReturnCode NNExecutor::RunAsync(NN_Tensor* inputTensors[], size_t inputSize, in RunAsync() 319 NN_Tensor* outputTensors[], size_t outputSize, int32_t timeout, void* userData) in RunAsync() 341 OH_NN_ReturnCode NNExecutor::CheckInputDimRanges(NN_Tensor* inputTensors[], size_t inputSize) in CheckInputDimRanges()
|
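nnexecutor.cpp's RunSync collects the raw NN_Tensor* arrays handed in by the C API into std::vector containers before calling PreparedModel::Run. The adaptation itself is simple; a generic sketch of it (names illustrative, the opaque handle forward-declared as in neural_network_runtime_type.h):

```cpp
// Collect a raw handle array into a vector, as RunSync does before PreparedModel::Run.
#include <cstddef>
#include <vector>

typedef struct NN_Tensor NN_Tensor;  // opaque handle type

std::vector<NN_Tensor*> ToTensorVector(NN_Tensor* tensors[], size_t count)
{
    std::vector<NN_Tensor*> vec;
    vec.reserve(count);
    for (size_t i = 0; i < count; ++i) {
        vec.push_back(tensors[i]);
    }
    return vec;
}
```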
/ohos5.0/foundation/ai/neural_network_runtime/test/unittest/components/v1_0/hdi_prepared_model/ |
H A D | hdi_prepared_model_test.cpp | 653 std::vector<NN_Tensor*> inputs; 654 std::vector<NN_Tensor*> outputs; 677 std::vector<NN_Tensor*> inputs; 678 std::vector<NN_Tensor*> outputs; 683 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 705 std::vector<NN_Tensor*> inputs; 706 std::vector<NN_Tensor*> outputs; 715 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 735 std::vector<NN_Tensor*> inputs; 736 std::vector<NN_Tensor*> outputs; [all …]
|
/ohos5.0/foundation/ai/neural_network_runtime/test/unittest/components/v2_0/hdi_prepared_model/ |
H A D | hdi_prepared_model_test.cpp | 648 std::vector<NN_Tensor*> inputs; 649 std::vector<NN_Tensor*> outputs; 672 std::vector<NN_Tensor*> inputs; 673 std::vector<NN_Tensor*> outputs; 678 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 700 std::vector<NN_Tensor*> inputs; 701 std::vector<NN_Tensor*> outputs; 710 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 730 std::vector<NN_Tensor*> inputs; 731 std::vector<NN_Tensor*> outputs; [all …]
|
/ohos5.0/foundation/ai/neural_network_runtime/test/unittest/components/v2_1/hdi_prepared_model/ |
H A D | hdi_prepared_model_test.cpp | 647 std::vector<NN_Tensor*> inputs; 648 std::vector<NN_Tensor*> outputs; 671 std::vector<NN_Tensor*> inputs; 672 std::vector<NN_Tensor*> outputs; 677 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 699 std::vector<NN_Tensor*> inputs; 700 std::vector<NN_Tensor*> outputs; 709 NN_Tensor* tensor = reinterpret_cast<NN_Tensor*>(tensorImpl); 729 std::vector<NN_Tensor*> inputs; 730 std::vector<NN_Tensor*> outputs; [all …]
|
/ohos5.0/docs/en/application-dev/ai/nnrt/ |
H A D | neural-network-runtime-guidelines.md | 58 | typedef struct NN_Tensor NN_Tensor | Tensor handle, which is used to set the inference input and … 113 | NN_Tensor* OH_NNTensor_Create(size_t deviceID, NN_TensorDesc *tensorDesc) | Creates an **NN_Tenso… 114 | NN_Tensor* OH_NNTensor_CreateWithSize(size_t deviceID, NN_TensorDesc *tensorDesc, size_t size) | … 121 | OH_NN_ReturnCode OH_NNTensor_Destroy(NN_Tensor **tensor) | Destroys an **NN_Tensor** instance.| 136 …e OH_NNExecutor_RunSync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 137 … OH_NNExecutor_RunAsync(OH_NNExecutor *executor, NN_Tensor *inputTensor[], size_t inputCount, NN_T… 193 OH_NN_ReturnCode SetInputData(NN_Tensor* inputTensor[], size_t inputSize) 233 OH_NN_ReturnCode Print(NN_Tensor* outputTensor[], size_t outputSize) 500 NN_Tensor* inputTensors[inputCount]; 501 NN_Tensor* tensor = nullptr; [all …]
|