After Speaking With Intel, I

Taylor Bell

Published on Apr 24, 2024

Key Takeaways

  • Intel is ramping up its AI capabilities with specialized processors like Meteor Lake and Arrow Lake for improved performance.
  • The company is aiming to support a wide range of AI frameworks and tools, including being the first to enable DirectML on the NPU.
  • AI is crucial for Intel’s future success, as it enhances traditional performance and efficiency in CPUs, GPUs, and NPUs.

Intel has been going through a transformation over the last few months. Pushed by the improvements of chip competitor AMD and the AI dominance of Nvidia, the company has put its foot down and committed to supporting both AI and improving its fabrication nodes at a breakneck pace. Intel doesn’t just want to ensure its processors are powerful; it also wants them to be the smartest in the business.

I sat down with Robert Hallock, Senior Director of Technical Marketing at Intel, to talk about the company’s push for AI going forward.

Intel’s AI strategy going forward

It’s all about useful AI

Intel’s approach to AI is deeply integrated, involving a comprehensive upgrade to its hardware to better accommodate AI tasks directly on devices. Hallock explains the meaning of “AI PC” in this context, telling me that “the requirement to get that name is that all of the accelerators in the device must have specific AI accelerating extensions.” This includes enhancements like the VNNI instruction set in the CPU cores and other AI-centric features in its graphics chips and NPUs.
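For a sense of what an extension like VNNI actually accelerates: neural-network inference is dominated by low-precision multiply-accumulate work, and VNNI fuses that pattern into a single instruction (VPDPBUSD, for example, multiplies unsigned 8-bit activations by signed 8-bit weights and accumulates into 32-bit totals). The Python sketch below is purely illustrative rather than Intel code; it just spells out the operation that one such instruction performs across an entire vector register at once.

    import numpy as np

    # Illustrative sketch of the int8-multiply, int32-accumulate pattern that a
    # VNNI instruction fuses into one step. Sizes and values here are arbitrary.
    activations = np.random.randint(0, 256, size=64, dtype=np.uint8)  # quantized activations
    weights = np.random.randint(-128, 128, size=64, dtype=np.int8)    # quantized weights

    # Widen to int32 before multiplying so the products don't overflow, then sum.
    # VNNI does this widening, multiplying, and accumulating in hardware.
    acc = np.sum(activations.astype(np.int32) * weights.astype(np.int32), dtype=np.int32)

    print(acc)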

Hallock emphasizes the distinction between traditional processors and those equipped for AI: “previous generation processors […] can run [AI] on a CPU core, but it’s really slow.” The newer generations, however, are designed to handle these tasks much more efficiently, marking a shift away from chips like those in the Raptor Lake family, which, according to Hallock, “can run AI stuff, but it’s not really an AI processor because it doesn’t have the instruction sets to do it.” Meteor Lake and the like are the first true AI processors from Intel, with Arrow Lake in particular being a major focus for AI.

To that end, Hallock mentions that Intel’s strategy also focuses on the consumer market’s needs, particularly in gaming and content creation. The company’s Arrow Lake processors are aimed at “the gaming and creator market where you can’t really get an AI PC type processor today.” This initiative is part of Intel’s broader goal to deploy AI capabilities widely, with plans to “bring 100 million different AI accelerators to market through 2025.” The company’s upcoming Lunar Lake also makes up a pretty big part of this goal.

Finally, Hallock also talks about how Intel’s focus is on running things locally rather than passing data off to the cloud. He tells me that “we’re very focused on local inference, local applications, and the end-user experience. And on the client side, not so much in the training or data scientist piece of that equation. We’re trying to get users to the final applications that they can use.” In other words, it’s all about usable AI, rather than AI for the sake of it.

I’m naturally a skeptic when it comes to AI, so I asked Hallock what he thinks the future of the AI PC is. He told me that he thinks “the ultimate trajectory of the AI PC is that it just becomes a PC that in three to five years, pretty much every desktop computer has an AI accelerator in it. Much like graphics today was kind of a rarity 10 years ago, but it’s now super common.”

Intel is supporting the development of useful AI

They were the first to enable DirectML on the NPU

On the software side, Intel is not limiting itself to proprietary technology. Hallock discusses Intel’s support for a wide range of AI frameworks and tools, ensuring compatibility and optimization across various platforms. He notes, “We support DirectML from Microsoft… and we were the first to enable DirectML on the NPU.” It’s reflective of Intel’s commitment to multi-vendor, open standards, which facilitates broader adoption and innovation in AI applications.
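In practice, that support shows up through standard developer tooling rather than anything Intel-specific. As a rough sketch (the model path and input name below are placeholders, and the DirectML execution provider comes from the onnxruntime-directml package on Windows), running an ONNX model through DirectML with ONNX Runtime looks something like this:

    import numpy as np
    import onnxruntime as ort

    # Ask for the DirectML execution provider first, with a CPU fallback in case
    # DirectML isn't available on the machine.
    session = ort.InferenceSession(
        "model.onnx",  # placeholder path to any ONNX model
        providers=["DmlExecutionProvider", "CPUExecutionProvider"],
    )

    # Placeholder input: the name, shape, and dtype depend on the model you load.
    dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
    outputs = session.run(None, {"input": dummy})

    print(session.get_providers())  # shows which providers were actually enabled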

Intel also wants to deliver AI in any way that it can, supporting the development and acceleration across a multitude of applications and contexts. Hallock tells me:

Wherever people are going to consume AI from or deliver it through, we want to be there and enable that. We want developers to A) find us, get connected to resources, B) help us find these cool, interesting projects that we might not have known about, get them into our network, help support their development, get them hardware, get them technical support, yada, yada, yada. Through that program, we’ve already got 100 ISVs [Independent Software Vendors] signed up, and we’re planning to deliver and on track to deliver 300 different AI features by kind of the middle-ish of this year. So 300 different application features, and that is on top of all the features that Intel can run that we didn’t specifically help develop. So I’m thinking of many of the features in Adobe software. We’re compatible with all of them. There’s like 100 features there alone across the Adobe Creative Cloud Suite, and some of them we helped with, but not all of them. And we run all of those, too.

Hallock likens the development of AI features to the development of game engines. Just as you may like one game and not another, his view is that some AI features are going to evoke similar reactions in users, who may be apathetic to some but find real use in others. Because of this, and the hardware assortment that people have, he tells me that “it also means you need to have a very large software optimization organization at your company to handle all these frameworks, runtimes, APIs, models, engines, so on and so forth.” He says it’s Intel’s job to ensure that these things all run perfectly on a computer so that the user experience isn’t impeded.

On top of that, Hallock tells me that Intel has “over 500 optimized AI models” right now, and they all run on the Core Ultra. The problem, he says, is that people don’t necessarily understand what a model is; a model is analogous to a game engine in the sense that the hardware needs to support it properly for it to perform at its best.

Intel says you should care about AI

Not even for the reasons you might think, either

Hallock closes with an interesting point about Intel’s CPUs and why AI processing is important. He tells me that there’s a “massive appetite from the software industry to create AI-based applications,” and notes that without AI-specific extensions and processing capability in a CPU, the conventional things people like in their computers will suffer. He explains that “if you have a processor that doesn’t have an AI accelerator inside, you are upside down on performance, you’re upside down on power efficiency. All the things that people traditionally care about get worse without these accelerators or the instructions in the CPU core.”

He mentions as well that in the case of 3D TVs, it was only the hardware industry pushing for the technology; nobody else cared. In the AI field, by contrast, it’s the software vendors that need hardware to properly support what they’re doing, and that’s the demand Intel is trying to meet. “Software companies want this to happen because it makes their applications faster, more feature-rich, it makes them more capable,” he tells me, before continuing:

The other thing that I would say is users should expect that CPU cores, GPU cores, and NPU all play a role in AI. All of them have a role. The NPU for power efficiency reasons, the GPU for performance reasons, and even the CPU with AVX256 instructions can assist in many of the setup phases for these AI workloads. Stable Diffusion, for example, some of the early stages of generating an image run great on CPU. So it is not an all NPU concept. It is all three engines and will continue to be all three engines because software manufacturers want it to work that way.
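That three-engine split is something you can see directly in Intel’s OpenVINO runtime, which exposes the CPU, GPU, and NPU as separate inference targets. The sketch below is illustrative only (the model file is a placeholder, and which devices are listed depends on the machine, drivers, and OpenVINO version), but on a Core Ultra system you would typically see all three:

    from openvino import Core  # recent OpenVINO releases

    core = Core()
    print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on a Core Ultra laptop

    # Placeholder model file; the same network can be compiled for different
    # engines depending on whether performance or power efficiency matters more.
    model = core.read_model("model.xml")

    compiled_gpu = core.compile_model(model, "GPU")  # performance-oriented
    compiled_npu = core.compile_model(model, "NPU")  # power-efficiency-oriented
    compiled_cpu = core.compile_model(model, "CPU")  # setup work and fallback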

Intel needs AI to be the future, and it’s investing a ton of resources into it. As it stands, the company’s processors are the best around for on-device AI, but Nvidia is still the AI kingpin. With AMD also beginning to focus on AI, and with Zen 5 and Arrow Lake just around the corner, the next generation of processors will be an interesting battle, to say the least.
