Edge Developer Toolbox Developer Guide
Benchmark Another Model or Target Architecture and Compare Performance
Repeat the steps in Import an AI Model, Select Target Environment, and Quantize Model, then continue with Benchmark the Model to generate a second set of results to compare.
This time, choose the custom AI model, My-Yolov8-Model, as the model for comparison.
For the “Select Target Environment” step, select the same processor, the Intel® Core™ i9-12900K processor. For the “Benchmark the Model” step, select Throughput Optimization for the algorithm. Continue with the rest of the steps.
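Choosing Throughput Optimization trades per-request latency for higher aggregate frames per second, typically by running several inference streams in parallel. A minimal sketch of that tradeoff, with invented helper names and illustrative numbers (not values produced by Edge Developer Toolbox):

```python
# Hypothetical illustration of the throughput-vs-latency tradeoff.
# The function and the numbers below are invented for illustration only.

def estimated_fps(streams: int, latency_ms_per_stream: float) -> float:
    """Aggregate frames per second across parallel inference streams."""
    return streams * 1000.0 / latency_ms_per_stream

# Latency-optimized: one stream, each request finishes quickly.
single = estimated_fps(streams=1, latency_ms_per_stream=8.0)

# Throughput-optimized: many parallel streams; each request is slower,
# but total frames per second is higher.
parallel = estimated_fps(streams=8, latency_ms_per_stream=20.0)

print(f"latency-optimized: {single:.0f} fps at 8 ms")
print(f"throughput-optimized: {parallel:.0f} fps at 20 ms")
```

Under these assumed numbers, the throughput-optimized configuration delivers more total FPS even though each individual request takes longer.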
After benchmarking completes, you will have a second set of benchmark results to compare. To compare them, click the processor family under the Target Platform column; in this example, it is the 12th Generation Intel® Core™ i9 Processors. New comparison options appear below, with the current model, My-Yolov8-Model, shown at the top:
Click the model dropdown under All Projects and select the previously benchmarked model, resnet-18-pytorch. In the Benchmark Runs section, select the CPU, the Intel® Core™ i9-12900K processor. Then click Show Graphs to continue:
Four comparison graphs are shown for the two models. For throughput, a higher frames-per-second (FPS) value is better; for latency, a lower millisecond (ms) value is better. In this example, resnet-18-pytorch clearly beats My-Yolov8-Model in both throughput and latency. Note also that while Average CPU Usage differs slightly, Average Memory Usage is nearly identical for the two models.
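The comparison the graphs make can be sketched in a few lines of code. The field names and numbers below are invented for illustration; they are not values reported by Edge Developer Toolbox:

```python
# Hypothetical benchmark records for two models on the same CPU.
# All field names and numbers are illustrative only.
runs = {
    "resnet-18-pytorch": {"throughput_fps": 310.0, "latency_ms": 3.2,
                          "avg_cpu_pct": 72.0, "avg_mem_mb": 1480.0},
    "My-Yolov8-Model":   {"throughput_fps": 95.0, "latency_ms": 10.5,
                          "avg_cpu_pct": 78.0, "avg_mem_mb": 1495.0},
}

def better(metric: str, higher_is_better: bool) -> str:
    """Return the model that wins on a given metric.

    Throughput (FPS) is higher-is-better; latency (ms) is lower-is-better.
    """
    pick = max if higher_is_better else min
    return pick(runs, key=lambda model: runs[model][metric])

print("throughput winner:", better("throughput_fps", higher_is_better=True))
print("latency winner:", better("latency_ms", higher_is_better=False))
```

With these assumed numbers, resnet-18-pytorch wins on both metrics, matching the outcome described above, while the memory figures for the two models stay close.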
You can continue by benchmarking another model, or you can return to Edge Developer Toolbox to use the imported models in any of the workflows.
Benchmark Another Target Architecture and Compare Performance
As in the example above, repeat the steps in Import an AI Model, Select Target Environment, and Quantize Model, then continue with Benchmark the Model to compare performance.
This time, keep the same model, but choose a different processor in the “Select Target Environment” step.