File openvino.spec of Package openvino
#
# spec file for package openvino
#
# Copyright (c) 2024 SUSE LLC
# Copyright (c) 2024 Alessandro de Oliveira Faria (A.K.A. CABELO) <cabelo@opensuse.org> or <alessandro.faria@owasp.org>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.

# Please submit bugfixes or comments via https://bugs.opensuse.org/
#

%define libversion 2400

Name:           openvino
Version:        2024.0.0
Release:        0
Summary:        OpenVINO™ Toolkit Loader
License:        Apache-2.0
URL:            https://github.com/openvinotoolkit/openvino
Source0:        %{url}/archive/%{version}.tar.gz
BuildRequires:  ade-devel
BuildRequires:  cmake
BuildRequires:  fdupes
BuildRequires:  gcc-c++
BuildRequires:  libva-devel
BuildRequires:  nlohmann_json-devel
BuildRequires:  ocl-icd-devel
BuildRequires:  opencl-cpp-headers
BuildRequires:  opencl-headers
BuildRequires:  pkgconfig
BuildRequires:  protobuf-devel
BuildRequires:  pugixml-devel
BuildRequires:  snappy-devel
BuildRequires:  tbb-devel
BuildRequires:  zlib-devel

%description
OpenVINO C / C++ Toolkit.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%package -n lib%{name}
Summary:        Intel OpenVINO™ Toolkit

%description -n lib%{name}
OpenVINO C / C++ Runtime libraries.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n %{name}-devel
Summary:        Intel OpenVINO™ Toolkit
Requires:       %{name}-sample
Requires:       %{name}-auto-batch-plugin = %{version}
Requires:       %{name}-auto-plugin = %{version}
Requires:       %{name}-hetero-plugin = %{version}
Requires:       %{name}-intel-cpu-plugin = %{version}
Requires:       lib%{name}-ir-frontend = %{version}
Requires:       lib%{name}-onnx-frontend = %{version}
Requires:       lib%{name}-paddle-frontend = %{version}
Requires:       lib%{name}-pytorch-frontend = %{version}
Requires:       lib%{name}-tensorflow-frontend = %{version}
Requires:       lib%{name}-tensorflow_lite-frontend = %{version}

%description -n %{name}-devel
Intel(R) Distribution of OpenVINO(TM) Toolkit C / C++ Development files.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%package -n %{name}-auto-batch-plugin
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}
#Requires: lib%{name}-%{version}

%description -n %{name}-auto-batch-plugin
OpenVINO Automatic Batching software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n %{name}-auto-plugin
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n %{name}-auto-plugin
OpenVINO Auto / Multi software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n %{name}-intel-cpu-plugin
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n %{name}-intel-cpu-plugin
Intel® CPU inference plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics.
It supports pre-trained models from Open Model Zoo, along with 100+ open
source and public models in popular formats such as TensorFlow, ONNX,
PaddlePaddle, MXNet, Caffe, Kaldi.

%package -n %{name}-hetero-plugin
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n %{name}-hetero-plugin
OpenVINO Hetero software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-onnx-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-onnx-frontend
OpenVINO ONNX Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-ir-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-ir-frontend
OpenVINO IR Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference.
This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-paddle-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-paddle-frontend
OpenVINO Paddle Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-pytorch-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-pytorch-frontend
OpenVINO PyTorch Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-tensorflow-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-tensorflow-frontend
OpenVINO TensorFlow Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-tensorflow_lite-frontend
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}

%description -n lib%{name}-tensorflow_lite-frontend
OpenVINO TensorFlow Lite Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n %{name}-sample
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name} = %{version}
BuildArch:      noarch

%description -n %{name}-sample
Intel(R) Distribution of OpenVINO(TM) Toolkit C / C++ Samples.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%prep
%setup -q

%build
%cmake \
    -DCMAKE_BUILD_TYPE=Release \
    -DBUILD_SHARED_LIBS=ON \
    -DENABLE_OV_ONNX_FRONTEND=ON \
    -DENABLE_OV_PADDLE_FRONTEND=ON \
    -DENABLE_OV_PYTORCH_FRONTEND=ON \
    -DENABLE_OV_IR_FRONTEND=ON \
    -DENABLE_OV_TF_FRONTEND=ON \
    -DENABLE_OV_TF_LITE_FRONTEND=ON \
    -DENABLE_INTEL_GPU=OFF \
    -DENABLE_INTEL_GNA=OFF \
    -DENABLE_PYTHON=OFF \
    -DENABLE_WHEEL=OFF \
    %{nil}
%cmake_build

%install
%cmake_install
mkdir %{buildroot}%{_libdir}
mkdir %{buildroot}%{_libdir}/pkgconfig
mkdir %{buildroot}%{_libdir}/cmake
mkdir %{buildroot}%{_libdir}/cmake/inferenceengine%{version}
mkdir %{buildroot}%{_libdir}/cmake/ngraph%{version}
mkdir %{buildroot}%{_libdir}/cmake/openvino%{version}
mkdir %{buildroot}%{_libdir}/openvino-%{version}
mkdir %{buildroot}%{_includedir}
mkdir -p %{buildroot}%{_datadir}/licenses/openvino
mkdir -p %{buildroot}%{_datadir}/openvino/samples/
mv %{buildroot}%{_prefix}/runtime/cmake/OpenVINOConfig-version.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}%{_prefix}/runtime/cmake/OpenVINOConfig.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}%{_prefix}/runtime/cmake/OpenVINOTargets-release.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}%{_prefix}/runtime/cmake/OpenVINOTargets.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}%{_prefix}/runtime/lib/intel64/pkgconfig/openvino.pc %{buildroot}%{_libdir}/pkgconfig
ls -l %{buildroot}%{_prefix}/runtime/lib/intel64/pkgconfig/
rmdir %{buildroot}%{_prefix}/runtime/lib/intel64/pkgconfig/
rm %{buildroot}%{_prefix}/install_dependencies/install_openvino_dependencies.sh
rm %{buildroot}%{_prefix}/setupvars.sh
rmdir %{buildroot}%{_prefix}/install_dependencies/
mv %{buildroot}%{_prefix}/runtime/include/* %{buildroot}%{_includedir}
mv %{buildroot}%{_prefix}/runtime/lib/intel64/*plugin* %{buildroot}%{_libdir}/openvino-%{version}/
mv %{buildroot}%{_prefix}/runtime/lib/intel64/* %{buildroot}%{_libdir}/
mv %{buildroot}%{_prefix}/samples/cpp %{buildroot}%{_datadir}/openvino/samples/
mv %{buildroot}%{_prefix}/samples/c %{buildroot}%{_datadir}/openvino/samples/
rm -rf %{buildroot}%{_prefix}/tools/deployment_manager
rm -rf %{buildroot}%{_prefix}/samples/python
cp /home/abuild/rpmbuild/BUILD/openvino-%{version}/LICENSE %{buildroot}%{_datadir}/licenses/openvino
mv %{buildroot}%{_prefix}/licenses/runtime-third-party-programs.txt %{buildroot}%{_datadir}/licenses/openvino
mv %{buildroot}%{_prefix}/licenses/onednn_third-party-programs.txt %{buildroot}%{_datadir}/licenses/openvino
mv %{buildroot}%{_prefix}/licenses/onetbb_third-party-programs.txt %{buildroot}%{_datadir}/licenses/openvino
sed -i "s/\${_IMPORT_PREFIX}\/runtime\//\/usr\//g" %{buildroot}%{_libdir}/cmake/openvino%{version}/OpenVINOTargets.cmake
sed -i "s/\/runtime\/lib\/intel64//g" %{buildroot}%{_libdir}/cmake/openvino%{version}/OpenVINOTargets-release.cmake
sed -i "s/\/runtime\/lib\/intel64//g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
sed -i "s/runtime\/include/include/g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
sed -i "s/\/ie/\/include\/ie/g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
rm -rf %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/*

%post -n lib%{name} -p /sbin/ldconfig
%postun -n lib%{name} -p /sbin/ldconfig
%post -n lib%{name}-onnx-frontend -p /sbin/ldconfig
%postun -n lib%{name}-onnx-frontend -p /sbin/ldconfig
%post -n lib%{name}-ir-frontend -p /sbin/ldconfig
%postun -n lib%{name}-ir-frontend -p /sbin/ldconfig
%post -n lib%{name}-paddle-frontend -p /sbin/ldconfig
%postun -n lib%{name}-paddle-frontend -p /sbin/ldconfig
%post -n lib%{name}-pytorch-frontend -p /sbin/ldconfig
%postun -n lib%{name}-pytorch-frontend -p /sbin/ldconfig
%post -n lib%{name}-tensorflow-frontend -p /sbin/ldconfig
%postun -n lib%{name}-tensorflow-frontend -p /sbin/ldconfig
%post -n lib%{name}-tensorflow_lite-frontend -p /sbin/ldconfig
%postun -n lib%{name}-tensorflow_lite-frontend -p /sbin/ldconfig

%files
%{_licensedir}/%{name}/
%{_licensedir}/openvino/*

%files -n lib%{name}
%{_libdir}/libopenvino.so.%{version}
%{_libdir}/libopenvino.so.%{libversion}
%{_libdir}/libopenvino_c.so.%{version}
%{_libdir}/libopenvino_c.so.%{libversion}

%files -n %{name}-auto-batch-plugin
%dir %{_libdir}/%{name}-%{version}/
%{_libdir}/%{name}-%{version}/libopenvino_auto_batch_plugin.so

%files -n %{name}-auto-plugin
%{_libdir}/%{name}-%{version}/libopenvino_auto_plugin.so

%files -n %{name}-intel-cpu-plugin
%{_libdir}/%{name}-%{version}/libopenvino_intel_cpu_plugin.so

%files -n %{name}-hetero-plugin
%{_libdir}/%{name}-%{version}/libopenvino_hetero_plugin.so

%files -n lib%{name}-onnx-frontend
%{_libdir}/libopenvino_onnx_frontend.so.%{version}
%{_libdir}/libopenvino_onnx_frontend.so.%{libversion}

%files -n lib%{name}-ir-frontend
%{_libdir}/libopenvino_ir_frontend.so.%{version}
%{_libdir}/libopenvino_ir_frontend.so.%{libversion}

%files -n lib%{name}-paddle-frontend
%{_libdir}/libopenvino_paddle_frontend.so.%{version}
%{_libdir}/libopenvino_paddle_frontend.so.%{libversion}

%files -n lib%{name}-pytorch-frontend
%{_libdir}/libopenvino_pytorch_frontend.so.%{version}
%{_libdir}/libopenvino_pytorch_frontend.so.%{libversion}

%files -n lib%{name}-tensorflow-frontend
%{_libdir}/libopenvino_tensorflow_frontend.so.%{version}
%{_libdir}/libopenvino_tensorflow_frontend.so.%{libversion}

%files -n lib%{name}-tensorflow_lite-frontend
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.%{version}
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.%{libversion}

%files -n %{name}-sample
%{_datadir}/%{name}/

%files -n %{name}-devel
%{_includedir}/%{name}
%{_libdir}/cmake/inferenceengine%{version}
%{_libdir}/cmake/ngraph%{version}
%{_libdir}/cmake/openvino%{version}
%{_libdir}/*.so
%{_libdir}/pkgconfig/openvino.pc

%changelog
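The `sed` calls in `%install` exist because upstream stages everything under a private `runtime/` prefix (`runtime/lib/intel64`, `runtime/include`), while the package relocates those files to standard FHS paths; the staged `openvino.pc` and CMake target files would otherwise still point at the old layout. A minimal sketch of the effect of two of those substitutions, run on stand-in lines (the sample file contents here are illustrative, not the real staged files):

```shell
# Stand-in for a libdir line from the staged openvino.pc.
pc_line='libdir=${prefix}/runtime/lib/intel64'
# Same substitution the spec applies: drop the /runtime/lib/intel64 segment
# so the variable resolves to the final library directory.
echo "$pc_line" | sed 's/\/runtime\/lib\/intel64//g'
# -> libdir=${prefix}

# Stand-in for an include-path line; the spec rewrites runtime/include -> include.
cflags_line='Cflags: -I${prefix}/runtime/include'
echo "$cflags_line" | sed 's/runtime\/include/include/g'
# -> Cflags: -I${prefix}/include
```

After these rewrites, consumers can resolve the packaged library the usual way, e.g. `pkg-config --cflags --libs openvino`, without any Intel-specific layout knowledge.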