AMD will continue to bring AI technologies to Ryzen processors

AMD will continue to bring AI technologies to Ryzen processors, but not in the desktop segment

The AMD Ryzen 7000 family of mobile processors in January introduced models equipped with artificial intelligence acceleration hardware modules dubbed XDNA. The company thinks it makes sense to spread such blocks across the entire Ryzen processor range, but in the desktop segment it is in no hurry, and also emphasizes the importance of synchronous software development.

    Image source: AMD


PCWorld discussed the prospects for these technologies with AMD Corporate Vice President David McAfee, who leads the company's client business. AMD plans to introduce dedicated Inference Processing Unit (IPU) blocks into its processors to handle AI-specific workloads. These blocks operate with high energy efficiency, a benefit that will be felt most in the mobile segment, where laptops need to conserve battery power.

In a sense, the IPU is meant to resemble the CPU block responsible for video decoding, since AI-related workloads will become continuous, always-on tasks in the future, according to the AMD representative. Like the RDNA graphics architecture, the XDNA AI blocks will evolve through successive generations. McAfee noted that the industry still lacks a commonly accepted measure of AI performance, which makes it difficult for consumers to compare platforms on this criterion.

AMD is considering bringing AI accelerators to other Ryzen processor models, but for now the focus remains primarily on mobile solutions. According to AMD's corporate vice president, desktop PCs already have enough raw performance that adding a dedicated AI unit is hard to justify economically. The practical benefit of, say, equipping Ryzen Threadripper with dedicated AI acceleration would be small: interesting as a demonstration, but little more.

According to the AMD representative, companies will increasingly demand localized AI computing in the future. Today such operations are performed predominantly in the cloud, but not all companies and organizations are willing to entrust sensitive information to third-party server systems, and processors that can process this data locally with high efficiency should remedy that problem. Software must evolve in step with the hardware so that developers can demonstrate the effectiveness of these components. The next three years will be critical in this regard, McAfee added.

About the author

Dylan Harris

Dylan Harris is fascinated by tests and reviews of computer hardware.
