  1. OpenVINO 2025.2 Available Now! - Intel Community

    Jun 18, 2025 · We are excited to announce the release of OpenVINO™ 2025.2! This update brings expanded model coverage, GPU optimizations, and Gen AI enhancements, designed to maximize …

  2. Intel® Distribution of OpenVINO™ Toolkit

    Dec 12, 2025 · Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision-related work on Intel® platforms.

  3. OpenVINO™ Toolkit Execution Provider for ONNX Runtime — …

    Jun 24, 2022 · The OpenVINO™ Execution Provider for ONNX Runtime enables running inference on ONNX models through the ONNX Runtime APIs while using the OpenVINO™ toolkit as a backend (see the Python sketch after these results). With …

  4. OpenVINO™ 2024.6 Available Now! - Intel Community

    Dec 19, 2024 · We are excited to announce the release of OpenVINO™ 2024.6! In this release, you’ll see improvements in LLM performance and support for the latest Intel® Arc™ GPUs! What’s new in …

  5. No module named 'openvino.runtime'; 'openvino' is not a package

    Jun 26, 2023 · I installed OpenVINO with pip3 and tried the OpenVINO Runtime API Python tutorial. The first lines of the tutorial are `from openvino.runtime import Core`, `ie = Core()`, `devices = …` (a runnable version of these lines is sketched after these results).

  6. Llama2-7b inference using openvino-genai - Intel Community

    Nov 12, 2024 · Hi Shravanthi, Thanks for reaching out. Can you share a screenshot of your TinyLlama directory? Are the openvino_tokenizer (.xml and .bin) files available in the directory? I have exported …

  7. Installing openvino raspberry pi 4 - Intel Community

    Sep 28, 2023 · Hence, we additionally provide an OpenVINO™ Runtime archive file for Debian. Since this package doesn’t include the Model Optimizer/Model Downloader, the ideal scenario is to use another …

  8. Solved: OpenVINO GenAI chat_sample on NPU - Intel Community

    Mar 7, 2025 · Solved: Hello Intel Experts! I am currently testing out the chat_sample from `openvino_genai_windows_2025.0.0.0_x86_64` on the NPU (a minimal GenAI pipeline sketch follows these results). From

  9. ONNX Model GenAI C# Sample - Intel Community

    Feb 15, 2025 · Hi all, Is OpenVINO supported for ONNX GenAI Managed? If so, how do I configure the Execution Provider in the C# code? `using Config config = new Config(modelPath);` …

  10. Intel® Distribution of OpenVINO™ Toolkit

    Sep 24, 2023 · CPU: i7-1165G7, GPU: Intel Iris Xe Graphics. From the result you shared, your OpenVINO™ installation is correct; however, the GPU not being detected might be due to GPU …
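
For result 3, here is a minimal sketch of running an ONNX model through ONNX Runtime with the OpenVINO™ Execution Provider. It assumes the onnxruntime-openvino package is installed; the model file name, input shape, and the device_type option value are illustrative assumptions, not taken from the linked page.

```python
# Minimal sketch: ONNX Runtime inference with the OpenVINO Execution Provider.
# Assumes `pip install onnxruntime-openvino` and a local model.onnx (hypothetical file).
import numpy as np
import onnxruntime as ort

# Request the OpenVINO EP first, falling back to the default CPU provider.
session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"device_type": "CPU"}, {}],  # device_type value is an assumption
)

# Build a dummy input matching the model's first input (shape assumed here).
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: dummy})
print(session.get_providers(), outputs[0].shape)
```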
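
For result 5, a runnable version of the tutorial's opening lines, as a sketch. In recent releases Core is also exposed at the top level as openvino.Core; the openvino.runtime import shown in the snippet matches the older tutorial. A local file named openvino.py shadowing the installed package is one common cause of the "'openvino' is not a package" error, though the thread's actual resolution is not visible in the snippet.

```python
# Sketch of the tutorial's first lines: list the devices OpenVINO Runtime can see.
# Run this from a directory that does NOT contain a file named openvino.py,
# which would shadow the installed package.
import openvino as ov          # top-level API in current releases
# from openvino.runtime import Core   # older import used by the tutorial

core = ov.Core()
devices = core.available_devices
for device in devices:
    # FULL_DEVICE_NAME is a standard device property, e.g. the GPU's marketing name.
    print(device, core.get_property(device, "FULL_DEVICE_NAME"))
```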
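
For result 8, a minimal sketch of what the GenAI chat_sample does, written against the openvino-genai Python API and targeting the NPU. The model directory path, prompt, and token limit are placeholders; the C++ chat_sample shipped in the archive differs in detail.

```python
# Sketch: load an OpenVINO GenAI LLM pipeline on the NPU and generate a reply.
# Assumes `pip install openvino-genai` and an exported model directory
# (placeholder path) containing openvino_model.xml and openvino_tokenizer.xml.
import openvino_genai

model_dir = "TinyLlama-1.1B-Chat-ov"   # hypothetical exported model folder
pipe = openvino_genai.LLMPipeline(model_dir, "NPU")

pipe.start_chat()
reply = pipe.generate("What is OpenVINO?", max_new_tokens=128)
print(reply)
pipe.finish_chat()
```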