
I am currently using OpenVINO's Inference Engine for deep-learning inference in the field of computational fluid dynamics. The CFD software I am using is OpenFOAM; it does not support CMake, so the user must instead use its own build tool, so-called wmake.

In order to compile code against a third-party library like OpenVINO, I have to provide all the required header files and libraries for the compilation. However, I am not sure how to find out which ones OpenVINO's Inference Engine requires.

Can somebody explain to me how to find the correct headers to include and libraries to link against?

Thanks in advance!

1 Answer

To compile a project with OpenVINO you will need:

  1. Includes: OV_LOCATION\deployment_tools\inference_engine\include
  2. Libraries: OV_LOCATION\deployment_tools\inference_engine\lib

To run on CPU, from OV_LOCATION\deployment_tools\inference_engine\bin you will also need the inference_engine, mkl_tiny_tbb, MKLDNNPlugin, and tbb shared libraries.
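In OpenFOAM terms, these paths go into the Make/options file of your application. A minimal sketch, assuming a hypothetical install prefix of /opt/intel/openvino (adjust OV_LOCATION to your installation; exact library names and directory layout can differ between OpenVINO releases):

```make
# Make/options -- hypothetical paths, adjust to your OpenVINO install
OV_LOCATION = /opt/intel/openvino

EXE_INC = \
    -I$(OV_LOCATION)/deployment_tools/inference_engine/include

EXE_LIBS = \
    -L$(OV_LOCATION)/deployment_tools/inference_engine/lib/intel64 \
    -linference_engine \
    -ltbb
```

Note that device plugins such as MKLDNNPlugin are loaded by the Inference Engine at run time, so they typically do not need to be linked explicitly; they only have to be discoverable on the library search path (e.g. via LD_LIBRARY_PATH).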


2 Comments

Hello Dmitry, thanks for your reply. I guess I am able to include the headers, however in lib/intel64 there are some non-.so files: cache.json, hddl_perfcheck, MvNCAPI-ma2450.mvcmd, MvNCAPI-ma2480.mvcmd, myriad_compile, myriad_perfcheck, and a folder called cldnn_global_custom_kernels (containing cldnn_global_custom_kernels.xml, ctc_greedy_decoder.cl, grn.cl, interp.cl, prior_box_clustered.cl). Can you tell me what these are or whether I need them for anything? Moreover I have no folder OV_LOCATION\deployment_tools\inference_engine\bin, what would I be supposed to find there? Thanks!
Hi! My bad, bin is for Windows; on Linux everything will be under OV_LOCATION\deployment_tools\inference_engine\lib
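One way to check at run time which shared libraries a binary or plugin actually pulls in is ldd. A small sketch; /bin/sh is used here only as a stand-in target, point ldd at libinference_engine.so in the lib directory above instead:

```shell
#!/bin/sh
# ldd prints the shared libraries a dynamically linked file depends on,
# and where the dynamic loader resolves them on this machine.
# Replace /bin/sh with e.g.
#   "$OV_LOCATION"/deployment_tools/inference_engine/lib/intel64/libinference_engine.so
ldd /bin/sh
```

Any dependency that ldd reports as "not found" must be added to LD_LIBRARY_PATH (or an rpath) before the solver will start.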
