Intel VP talks AI strategy as the company takes on Nvidia

Intel is on an artificial intelligence (AI) mission, one the company sees as very much possible.

The company is the world’s largest semiconductor chipmaker by revenue and is best known for its dominance of the CPU market, reinforced by its well-known “Intel Inside” campaign reminding us of what resides inside our personal computers. However, in an era when AI chips are all the rage, the company finds itself chasing competitors, notably Nvidia, which has a huge head start in AI processing with its GPUs.

There are significant benefits to catching up in this space. According to one report, the AI chip market was worth around $8 billion in 2020, but is expected to grow to nearly $200 billion by 2030.

At Intel’s Vision event in May, the company’s new CEO, Pat Gelsinger, highlighted AI as a core element of the company’s future products, while predicting that AI’s need for ever-higher levels of computing performance makes it a key driver of Intel’s overall strategy.

Gelsinger said he envisioned four superpowers that spur innovation at Intel: pervasive connectivity, ubiquitous computing, AI, and cloud-to-edge infrastructure.

That requires high-performance hardware and software systems, including the tools and frameworks used to implement end-to-end AI and data pipelines. As a result, Intel’s strategy is to “build a stable of open source chips and software that meet a wide range of computing needs as AI becomes more prevalent,” as a recent Wall Street Journal report noted.

“Each of these superpowers is awesome on its own, but when they come together, that’s magical,” Gelsinger said at the Vision event. “If you’re not applying AI to every one of your business processes, you’re falling behind. We are seeing this in all industries.”

It’s in that context that VentureBeat recently spoke with Wei Li, vice president and general manager of AI and analytics at Intel. He is responsible for AI and analytical hardware and software acceleration for deep learning, machine learning, and big data analytics on Intel CPUs, GPUs, AI accelerators, and XPUs with heterogeneous and distributed computing.

Intel Hardware and Software Connection

According to Li, it is Intel’s tight connection between software and hardware that makes the company stand out and positions it to compete in the AI space.

“The biggest problem we’re trying to solve is bridging data and insights,” he said. “The bridge needs to be wide enough to handle a lot of traffic, and the traffic needs to have speed and not get stuck.”

That means AI needs software that runs efficiently and quickly, with a full ecosystem that enables data scientists to take in huge amounts of data and devise solutions, as well as hardware acceleration that provides the ability to process that data efficiently.

“On the hardware side, when we add specific acceleration within the hardware, we need to know what we’re accelerating,” Li said. “So we’re doing a lot of co-design, where the software team works very closely with the hardware team.”

The two groups operate almost as one team, he added, to understand models, discover performance bottlenecks and add hardware capacity.

“It’s an outside-in approach, a tightly integrated co-design, to make sure the hardware is designed the right way,” he said, adding that the original GPU wasn’t designed for AI but turned out to have the right amount of compute and bandwidth. Since then, GPUs have evolved.

“When we design GPUs today, we look at AI as an important workload to drive GPU design,” he said. “There are specific features within the GPU that are just for AI. That is the advantage of being in a company where we have both software and hardware teams.”

Intel’s goal is to scale its AI efforts, Li said, adding that it is about developing an ecosystem rather than separate solutions.

“It comes down to how we lead and nurture an open AI software ecosystem,” he explained. “Intel has always been an open ecosystem that enables competition, which allows Intel technologies to get to market faster at scale.”

Intel’s AI Reference Kits Increase Speed

Historically, Intel has done a lot of work on the software side to get better performance, essentially widening the bridge between data and insights.

Last month, Intel released its trained AI reference kits to the open source community, which Li said is one of the steps the company is taking to increase the speed of crossing the bridge.

“Traditionally, AI software was designed for specialists, for the most part,” he said. “But we want to target a much broader set of developers.”

The AI models in the reference kits have been designed, trained, and tested against thousands of models for specific use cases, while data scientists can customize and tune the models with their own data.

“You get a combination of ease of use because you start with something almost pre-cooked, plus you get all the optimized software as part of the package so you can get to your solution quickly,” Li explained.
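
To make that concrete, here is a minimal sketch of the workflow Li describes: Intel-optimized software supplies the “pre-cooked” layer, and a data scientist then tunes a model on their own data. It uses Intel Extension for Scikit-learn as the optimized component, though a given reference kit may bundle different pieces; the dataset path and column names are hypothetical placeholders, not drawn from any specific kit.

```python
# Minimal sketch of the "pre-cooked software plus your own data" pattern.
# Intel Extension for Scikit-learn (pip install scikit-learn-intelex) patches
# scikit-learn with optimized implementations; the CSV path and column names
# below are hypothetical placeholders.
from sklearnex import patch_sklearn

patch_sklearn()  # must run before importing scikit-learn estimators

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Bring your own data for the specific use case.
df = pd.read_csv("my_use_case.csv")
X, y = df.drop(columns=["target"]), df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Customize and tune the model, then evaluate on held-out data.
model = RandomForestClassifier(n_estimators=200, max_depth=8)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```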

Priorities for next year

In the coming year, one of Intel’s biggest AI priorities is on the software side.

“We will put more effort into focusing on ease of use,” Li said.

On the hardware side, he added, new products will focus heavily on performance, including the Sapphire Rapids Xeon server processor due to launch in 2023.

“It’s like a CPU with a GPU built into it because of how many computing capabilities it has,” Li said. “It’s a game changer to have all the acceleration inside the CPU.”
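
Li does not spell out a specific software path for tapping that built-in acceleration, but as an illustrative assumption, Intel’s extension for PyTorch shows the general pattern: an optimization pass prepares a model to use the CPU’s lower-precision bfloat16 math and matrix acceleration.

```python
# Illustrative sketch only: using Intel Extension for PyTorch
# (pip install intel-extension-for-pytorch) to prepare a model for CPU inference
# with bfloat16, the kind of built-in acceleration newer Xeon processors expose.
# The toy model below is a placeholder.
import torch
import intel_extension_for_pytorch as ipex

model = torch.nn.Sequential(torch.nn.Linear(1024, 1024), torch.nn.ReLU()).eval()
model = ipex.optimize(model, dtype=torch.bfloat16)  # apply CPU-side optimizations

with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    output = model(torch.randn(8, 1024))
print(output.shape)
```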

Additionally, Intel is focusing on its data center GPU performance, working with customers such as Argonne National Laboratory to serve its customers and developers.

Intel’s Biggest AI Challenges

Li said the biggest challenge his team faces is executing Intel’s AI vision.

“We really want to make sure we execute well so we can deliver on the right schedule and make sure we run fast,” he said. “We want to have a torrid pace, which is not easy as a big company.”

However, Li will not blame external factors that create challenges for Intel, such as the economy or inflation.

“Everyone has headwinds, but I want to make sure we do the best we can with the things that we have control over as a team,” he said. “So I’m pretty optimistic, particularly in the AI domain. It’s like going back to my graduate student days: you really can think big. Everything is possible.”
