File openvino.spec of Package openvino
#
# spec file for package openvino
#
# Copyright (c) 2023 SUSE LLC
# Copyright (c) 2023 Alessandro de Oliveira Faria (A.K.A. CABELO) <cabelo@opensuse.org> or <alessandro.faria@owasp.org>
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.

# Please submit bugfixes or comments via https://bugs.opensuse.org/
#

Name:           openvino
Version:        2024.1.0
Release:        20.1
Summary:        OpenVINO™ Toolkit Loader
License:        ASL 2.0
URL:            https://github.com/openvinotoolkit/openvino
Source0:        %{url}/archive/%{version}.tar.gz
# PATCH-FEATURE-OPENSUSE openvino-onnx-ml-defines.patch badshah400@gmail.com -- Define ONNX_ML at compile time when using system onnx to allow using 'onnx-ml.pb.h' instead of 'onnx.pb.h', the latter not being shipped with openSUSE's onnx-devel package
Patch0:         openvino-onnx-ml-defines.patch
# PATCH-FIX-UPSTREAM openvino-ComputeLibrary-include-string.patch badshah400@gmail.com -- Include header for std::string
Patch1:         openvino-ComputeLibrary-include-string.patch
# PATCH-FIX-UPSTREAM openvino-fix-build-sample-path.patch cabelo@opensuse.org -- Fix sample source path in build script
Patch2:         openvino-fix-build-sample-path.patch
Packager:       Alessandro de Oliveira Faria (A.K.A CABELO) <cabelo@opensuse.org>
ExclusiveArch:  x86_64

%define debug_package %{nil}
%define libversion 2410

BuildRequires:  gcc-c++ cmake make pkg-config gcc
BuildRequires:  opencl-headers ocl-icd-devel opencv-devel
BuildRequires:  pugixml-devel patchelf opencl-cpp-headers
BuildRequires:  python3-devel ccache nlohmann_json-devel
BuildRequires:  ninja scons git git-lfs patchelf fdupes
BuildRequires:  rpm-build ShellCheck tbb-devel libva-devel
BuildRequires:  snappy-devel ocl-icd-devel python3-base
BuildRequires:  opencl-headers zlib-devel gflags-devel-static python3-pip
BuildRequires:  protobuf-devel ade-devel
Requires:       lib%{name}-%{version}
Requires:       lib%{name}-devel-%{version}

%description
OpenVINO C / C++ Toolkit.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-%{version}
Summary:        Intel OpenVINO™ Toolkit

%description -n lib%{name}-%{version}
OpenVINO C / C++ Runtime libraries.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%package -n lib%{name}-devel-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-auto-batch-plugin-%{version}
Requires:       lib%{name}-auto-plugin-%{version}
%if 0%{?suse_version} <= 1600
Requires:       lib%{name}-intel-gpu-plugin-%{version}
Requires:       lib%{name}-intel-npu-plugin-%{version}
%endif
Requires:       lib%{name}-intel-cpu-plugin-%{version}
Requires:       lib%{name}-hetero-plugin-%{version}
Requires:       lib%{name}-onnx-frontend-%{version}
Requires:       lib%{name}-ir-frontend-%{version}
Requires:       lib%{name}-paddle-frontend-%{version}
Requires:       lib%{name}-pytorch-frontend-%{version}
Requires:       lib%{name}-tensorflow-frontend-%{version}
Requires:       lib%{name}-tensorflow_lite-frontend-%{version}
Requires:       %{name}-sample-%{version}

%description -n lib%{name}-devel-%{version}
Intel(R) Distribution of OpenVINO(TM) Toolkit C / C++ Development files.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-auto-batch-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-auto-batch-plugin-%{version}
OpenVINO Automatic Batching software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-auto-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-auto-plugin-%{version}
OpenVINO Auto / Multi software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%if 0%{?suse_version} <= 1600
%package -n lib%{name}-intel-gpu-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-intel-gpu-plugin-%{version}
Intel® GPU inference plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-intel-npu-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-intel-npu-plugin-%{version}
Intel® NPU inference plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%endif

%package -n lib%{name}-intel-cpu-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-intel-cpu-plugin-%{version}
Intel® CPU inference plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-hetero-plugin-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-hetero-plugin-%{version}
OpenVINO Hetero software plugin.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.
%package -n lib%{name}-onnx-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-onnx-frontend-%{version}
OpenVINO ONNX Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-ir-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-ir-frontend-%{version}
OpenVINO IR Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-paddle-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-paddle-frontend-%{version}
OpenVINO Paddle Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-pytorch-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-pytorch-frontend-%{version}
OpenVINO PyTorch Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-tensorflow-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-tensorflow-frontend-%{version}
OpenVINO TensorFlow Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n lib%{name}-tensorflow_lite-frontend-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}

%description -n lib%{name}-tensorflow_lite-frontend-%{version}
OpenVINO TensorFlow Lite Frontend.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%package -n %{name}-sample-%{version}
Summary:        Intel OpenVINO™ Toolkit
Requires:       lib%{name}-%{version}
BuildArch:      noarch

%description -n %{name}-sample-%{version}
Intel(R) Distribution of OpenVINO(TM) Toolkit C / C++ Samples.
OpenVINO™ is an open-source toolkit for optimizing and deploying AI
inference. This open-source version includes several components: namely
OpenVINO Model Converter (OVC), OpenVINO™ Runtime, as well as CPU, GPU,
multi device and heterogeneous plugins to accelerate deep learning
inference on Intel® CPUs and Intel® Processor Graphics. It supports
pre-trained models from Open Model Zoo, along with 100+ open source and
public models in popular formats such as TensorFlow, ONNX, PaddlePaddle,
MXNet, Caffe, Kaldi.

%prep
%autosetup -p1

%build
%cmake \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=../openvino_dist \
    -DBUILD_SHARED_LIBS=ON \
    -DENABLE_OV_ONNX_FRONTEND=ON \
    -DENABLE_OV_PADDLE_FRONTEND=ON \
    -DENABLE_OV_IR_FRONTEND=ON \
    -DENABLE_OV_PYTORCH_FRONTEND=ON \
    -DENABLE_OV_TF_FRONTEND=ON \
    -DENABLE_OV_TF_LITE_FRONTEND=ON \
    -DENABLE_INTEL_GNA=OFF \
    -DENABLE_JS=OFF \
    -DENABLE_INTEL_NPU=ON \
    -DENABLE_INTEL_GPU=ON \
    -DENABLE_SYSTEM_SNAPPY=ON \
    -DONNX_USE_PROTOBUF_SHARED_LIBS=ON \
    -DENABLE_SYSTEM_PROTOBUF=ON \
    -DProtobuf_USE_STATIC_LIBS=OFF \
    -DENABLE_SYSTEM_PUGIXML=ON \
    -DENABLE_SYSTEM_OPENCL=ON \
%if 0%{?suse_version} > 1600
    -DENABLE_INTEL_GPU=OFF \
    -DCMAKE_CXX_STANDARD=17 \
%endif
    -DENABLE_PYTHON=OFF \
    -DENABLE_WHEEL=OFF \
    ..
# -DPYTHON_EXECUTABLE=`which python3.11` \
# -DPYTHON_LIBRARY=/usr/lib64/libpython3.11.so \
# -DPYTHON_INCLUDE_DIR=/usr/include/python3.11 ..

%if 0%{?suse_version} > 1600
make -j 1 VERBOSE=1
%else
%cmake_build
#make --jobs=$(nproc --all) VERBOSE=1
%endif

%install
#%make_install -C build
%cmake_install
mkdir -p %{buildroot}%{_libdir}
mkdir -p %{buildroot}%{_libdir}/pkgconfig
mkdir -p %{buildroot}%{_libdir}/cmake
mkdir -p %{buildroot}%{_libdir}/cmake/inferenceengine%{version}
mkdir -p %{buildroot}%{_libdir}/cmake/ngraph%{version}
mkdir -p %{buildroot}%{_libdir}/cmake/openvino%{version}
mkdir -p %{buildroot}%{_libdir}/openvino-%{version}
mkdir -p %{buildroot}%{_includedir}
mkdir -p %{buildroot}/usr/share/openvino/samples/c
mkdir -p %{buildroot}/usr/share/openvino/samples/cpp
mkdir -p %{buildroot}/usr/share/licenses/openvino
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/include/* %{buildroot}%{_includedir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_onnx_frontend.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_paddle_frontend.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_pytorch_frontend.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_frontend.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_lite_frontend.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_c.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_c.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_c.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino.so %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/cmake/OpenVINOConfig-version.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/cmake/OpenVINOConfig.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/cmake/OpenVINOTargets-release.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/cmake/OpenVINOTargets.cmake %{buildroot}%{_libdir}/cmake/openvino%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/pkgconfig/openvino.pc %{buildroot}%{_libdir}/pkgconfig
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_auto_batch_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_auto_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_intel_cpu_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_hetero_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
%if 0%{?suse_version} <= 1600
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_intel_gpu_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_intel_npu_plugin.so %{buildroot}%{_libdir}/openvino-%{version}
%endif
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_ir_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_ir_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_onnx_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_onnx_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_paddle_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_paddle_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_pytorch_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_pytorch_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_lite_frontend.so.%{version} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/runtime/lib/intel64/libopenvino_tensorflow_lite_frontend.so.%{libversion} %{buildroot}%{_libdir}
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/samples/c/* %{buildroot}/usr/share/openvino/samples/c
mv %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/samples/cpp/* %{buildroot}/usr/share/openvino/samples/cpp
cp /home/abuild/rpmbuild/BUILD/openvino-%{version}/LICENSE %{buildroot}/usr/share/licenses/openvino
sed -i "s/\${_IMPORT_PREFIX}\/runtime\//\/usr\//g" %{buildroot}%{_libdir}/cmake/openvino%{version}/OpenVINOTargets.cmake
sed -i "s/\/runtime\/lib\/intel64//g" %{buildroot}%{_libdir}/cmake/openvino%{version}/OpenVINOTargets-release.cmake
sed -i "s/\/runtime\/lib\/intel64//g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
sed -i "s/runtime\/include/include/g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
sed -i "s/\/ie/\/include\/ie/g" %{buildroot}%{_libdir}/pkgconfig/openvino.pc
rm -rf %{buildroot}/home/abuild/rpmbuild/BUILD/openvino-%{version}/openvino_dist/*

%post -p /sbin/ldconfig
%postun -p /sbin/ldconfig
%post -n lib%{name}-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-%{version} -p /sbin/ldconfig
%post -n lib%{name}-auto-batch-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-auto-batch-plugin-%{version} -p /sbin/ldconfig
%post -n lib%{name}-auto-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-auto-plugin-%{version} -p /sbin/ldconfig
%if 0%{?suse_version} <= 1600
%post -n lib%{name}-intel-npu-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-intel-npu-plugin-%{version} -p /sbin/ldconfig
%post -n lib%{name}-intel-gpu-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-intel-gpu-plugin-%{version} -p /sbin/ldconfig
%endif
%post -n lib%{name}-intel-cpu-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-intel-cpu-plugin-%{version} -p /sbin/ldconfig
%post -n lib%{name}-hetero-plugin-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-hetero-plugin-%{version} -p /sbin/ldconfig
%post -n lib%{name}-onnx-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-onnx-frontend-%{version} -p /sbin/ldconfig
%post -n lib%{name}-ir-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-ir-frontend-%{version} -p /sbin/ldconfig
%post -n lib%{name}-paddle-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-paddle-frontend-%{version} -p /sbin/ldconfig
%post -n lib%{name}-pytorch-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-pytorch-frontend-%{version} -p /sbin/ldconfig
%post -n lib%{name}-tensorflow-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-tensorflow-frontend-%{version} -p /sbin/ldconfig
%post -n lib%{name}-tensorflow_lite-frontend-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-tensorflow_lite-frontend-%{version} -p /sbin/ldconfig
%post -n %{name}-sample-%{version} -p /sbin/ldconfig
%postun -n %{name}-sample-%{version} -p /sbin/ldconfig
%post -n lib%{name}-devel-%{version} -p /sbin/ldconfig
%postun -n lib%{name}-devel-%{version} -p /sbin/ldconfig

%files
%defattr(-,root,root)
%dir %{_licensedir}/openvino
%{_licensedir}/openvino/LICENSE

%files -n lib%{name}-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino.so.%{version}
%{_libdir}/libopenvino.so.%{libversion}
%{_libdir}/libopenvino_c.so.%{version}
%{_libdir}/libopenvino_c.so.%{libversion}

%files -n lib%{name}-auto-batch-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_auto_batch_plugin.so

%files -n lib%{name}-auto-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_auto_plugin.so

%if 0%{?suse_version} <= 1600
%files -n lib%{name}-intel-gpu-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_intel_gpu_plugin.so

%files -n lib%{name}-intel-npu-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_intel_npu_plugin.so
%endif

%files -n lib%{name}-intel-cpu-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_intel_cpu_plugin.so

%files -n lib%{name}-hetero-plugin-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/openvino-%{version}/libopenvino_hetero_plugin.so

%files -n lib%{name}-onnx-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_onnx_frontend.so.%{version}
%{_libdir}/libopenvino_onnx_frontend.so.%{libversion}

%files -n lib%{name}-ir-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_ir_frontend.so.%{version}
%{_libdir}/libopenvino_ir_frontend.so.%{libversion}

%files -n lib%{name}-paddle-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_paddle_frontend.so.%{version}
%{_libdir}/libopenvino_paddle_frontend.so.%{libversion}

%files -n lib%{name}-pytorch-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_pytorch_frontend.so.%{version}
%{_libdir}/libopenvino_pytorch_frontend.so.%{libversion}

%files -n lib%{name}-tensorflow-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_tensorflow_frontend.so.%{version}
%{_libdir}/libopenvino_tensorflow_frontend.so.%{libversion}

%files -n lib%{name}-tensorflow_lite-frontend-%{version}
%dir %{_libdir}/openvino-%{version}
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.%{version}
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.%{libversion}

%files -n %{name}-sample-%{version}
%defattr(-,root,root)
%dir %{_datadir}/openvino/samples/cpp
%dir %{_datadir}/openvino/samples/c
%dir %{_datadir}/openvino/samples
%dir %{_datadir}/openvino
%{_datadir}/openvino/samples/cpp/*
%{_datadir}/openvino/samples/c/*

%files -n lib%{name}-devel-%{version}
%dir %{_includedir}/
%dir %{_libdir}/pkgconfig
%dir %{_libdir}/cmake
%{_includedir}/*
%{_libdir}/cmake/*
%{_libdir}/libopenvino.so
%{_libdir}/libopenvino_c.so
%{_libdir}/libopenvino_onnx_frontend.so
%{_libdir}/libopenvino_paddle_frontend.so
%{_libdir}/libopenvino_pytorch_frontend.so
%{_libdir}/libopenvino_tensorflow_frontend.so
%{_libdir}/libopenvino_tensorflow_lite_frontend.so
%{_libdir}/pkgconfig/openvino.pc

%changelog