Intel® Core™ Ultra 200S and 200HX Series Processors
Datasheet, Volume 1 of 2
| ID | Date | Version | Classification |
|---|---|---|---|
| 832586 | 12/02/2025 | | Public |
Intel® Neural Processing Unit (Intel® NPU)
The NPU IP in the processor is an integrated accelerator for deep learning inference workloads.
The Intel® NPU is enumerated by the Host system as a PCIe device, and its functionality is exposed through a base set of registers. These registers provide access to the control and data path interfaces and reside in the Host and Processor subsystems of the Intel® NPU. All host communications are consumed by the Intel® NPU scheduler, a 32-bit LeonRT micro-controller. The LeonRT manages the command and response queues and handles runtime management of the IP itself.
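The command and response queues managed by the LeonRT scheduler follow a host-produces, device-consumes pattern. The C sketch below illustrates that pattern from the host side only; the descriptor fields, queue depth, and names (`npu_cmd_t`, `npu_submit`, and so on) are illustrative assumptions, not the actual Intel® NPU register layout or driver interface.

```c
#include <stdint.h>

/*
 * Hypothetical sketch of a host-side command/response queue pair.
 * Field names, sizes, and layout are assumptions for illustration,
 * not the Intel NPU's actual descriptor format.
 */
#define QUEUE_DEPTH 64u

typedef struct {
    uint64_t job_addr;   /* host-physical address of the job descriptor */
    uint32_t job_size;   /* size of the job descriptor in bytes         */
    uint32_t job_id;     /* host-assigned identifier echoed in response */
} npu_cmd_t;

typedef struct {
    uint32_t job_id;     /* matches the submitted command               */
    uint32_t status;     /* 0 = success, nonzero = error code           */
} npu_rsp_t;

typedef struct {
    npu_cmd_t cmd[QUEUE_DEPTH];   /* host writes, scheduler consumes */
    npu_rsp_t rsp[QUEUE_DEPTH];   /* scheduler writes, host consumes */
    volatile uint32_t cmd_head;   /* producer index, owned by host   */
    volatile uint32_t cmd_tail;   /* consumer index, owned by device */
} npu_queue_t;

/* Enqueue a command for the NPU scheduler; returns 0 on success, -1 if full. */
static int npu_submit(npu_queue_t *q, const npu_cmd_t *c)
{
    uint32_t next = (q->cmd_head + 1u) % QUEUE_DEPTH;
    if (next == q->cmd_tail)
        return -1;                /* ring is full */
    q->cmd[q->cmd_head] = *c;     /* copy descriptor into the ring */
    q->cmd_head = next;           /* publish the new producer index */
    /* A real driver would now write a doorbell register to notify the IP. */
    return 0;
}
```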
The Deep Learning capability of the NPU IP is provided by two Neural Compute Engine (NCE) Tiles, both managed by the NPU Scheduler. Each Tile includes a configurable number of Multiply Accumulate (MAC) engines, purpose-built for Deep Learning workloads, and two Intel® Movidius SHAVE DSP processors for optimal processing of custom deep learning operations.
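The MAC engines referenced above implement the multiply-accumulate arithmetic at the core of deep learning layers. As a point of reference, the scalar C sketch below shows one such operation as an INT8 dot product with a 32-bit accumulator; the NCE hardware performs many of these in parallel, and the function name and signature here are purely illustrative.

```c
#include <stdint.h>
#include <stddef.h>

/*
 * Scalar reference of the multiply-accumulate (MAC) operation that the
 * NCE MAC engines execute in parallel. Tiling, data layout, and the
 * split of work across the two NCE Tiles are handled by the NPU
 * compiler and scheduler, not by host code like this.
 */
static int32_t dot_product_i8(const int8_t *activations,
                              const int8_t *weights,
                              size_t n)
{
    int32_t acc = 0;   /* wide accumulator avoids INT8 overflow */
    for (size_t i = 0; i < n; i++)
        acc += (int32_t)activations[i] * (int32_t)weights[i];
    return acc;
}
```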
The Intel® NPU is supported in software through the OpenVINO™ NPU plugin.
The NPU plugin supports the following data types as inference precision of internal primitives: INT8 (I8/U8) and FP16.
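For reference, the sketch below shows the standard affine (scale and zero-point) mapping commonly used to express FP32 values in the U8 precision listed above. The helper names are hypothetical, and in practice the quantization parameters are produced by the model conversion and quantization toolchain rather than hand-written host code.

```c
#include <stdint.h>
#include <math.h>

/*
 * Minimal sketch of affine quantization to the U8 inference precision.
 * The scale and zero-point values are illustrative; real deployments
 * obtain them from the quantization toolchain.
 */
static uint8_t quantize_u8(float x, float scale, uint8_t zero_point)
{
    float q = roundf(x / scale) + (float)zero_point;
    if (q < 0.0f)   q = 0.0f;     /* clamp to the U8 range */
    if (q > 255.0f) q = 255.0f;
    return (uint8_t)q;
}

static float dequantize_u8(uint8_t q, float scale, uint8_t zero_point)
{
    return ((float)q - (float)zero_point) * scale;
}
```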