Huge Hot Hopper Exhibits Energy Efficiency
The Version 3.0 MLPerf Inference results show data-center AI engines making gains relative to tests from six months ago. Nvidia continues to post the highest scores, but Qualcomm achieved power-efficiency leadership on a couple of tests.
Increasing a processor’s clock rate and execution-path width aren’t the only ways to boost performance; software optimization can yield similar gains. The recent Version 3.0 results on the MLPerf Inference benchmark show data-center AI engines making gains relative to tests from six months ago.
Nvidia continues to stand out by offering the fastest AI chips and by posting scores on all MLPerf tests. Qualcomm delivered better power efficiency on image classification and object detection. Intel upgraded its Xeon scores, improving on the results it previewed last year. Some companies that previously reported inference results declined to post updates, however, suggesting their software has stagnated.
Subscribers can view the full article in the TechInsights Platform.