How the Intel Ice Lake processor’s new AI powers will improve your PC

Intel is trying to change how we think of AI

29 May 2019

Those who buy an Intel notebook with Ice Lake inside this autumn may start to see artificial intelligence (AI) crop up more often in the software they use.

That is not to say AI capabilities are exclusive to Ice Lake, or that desktop software will not improve dramatically without it. But some of the ‘whoa’ moments app developers are working on depend on the artificial intelligence and machine learning capabilities Intel is building into its new 10th-gen Core chips.

Some can already be seen. Microsoft’s Photos app, for example, uses AI image analysis to come up with its own assessment of what it is “seeing”: a beach scene, say, or snow. Microsoft Photos and Google Photos also identify and group the subjects of your photos, recognising who is in them.
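
To make that concrete, here is a minimal sketch of that kind of on-device scene tagging, written in Python around a generic pretrained classifier. The model, file name and preprocessing are illustrative assumptions; Microsoft and Google do not publish the models behind their Photos apps.

```python
# Illustrative sketch only: a generic pretrained ImageNet classifier standing
# in for the unpublished models used by Microsoft Photos and Google Photos.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a small pretrained classifier and put it in inference mode.
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.eval()

# Standard ImageNet preprocessing: resize, centre-crop, normalise.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("holiday_photo.jpg")    # hypothetical input file
batch = preprocess(image).unsqueeze(0)     # add a batch dimension

with torch.no_grad():                      # inference only, no training
    probs = model(batch).softmax(dim=1)
top = probs.topk(3)                        # the three most likely classes
print(top.indices.tolist(), top.values.tolist())
```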

But on the PC, “AI” tends to be equated with digital assistants like Cortana. Intel is trying to redefine how we think of AI.

What will AI do for you?

In a briefing before Computex, Intel showed off other AI examples: stylising a video in real time as it plays, much as you would apply a filter in Snapchat; removing unwanted background noise from a call or chat; and accelerating CyberLink PhotoDirector 10’s ability to de-blur photos. Microsoft’s Skype and Teams apps can already pick you out of a video call and blur or even replace the background. AI acceleration will make that faster.

What is a Gaussian Neural Accelerator?

Intel’s secret ingredient is what it calls a Gaussian Neural Accelerator, a tuned piece of logic found within the Ice Lake chip package. It works hand in hand with the CPU cores, whose architecture adds what is known as DL Boost, a set of instructions that accelerate inferencing on Ice Lake. (Inferencing applies a trained model to new data to draw conclusions from it, as opposed to training the model in the first place.) The Gaussian Neural Accelerator, meanwhile, can run under very low-power conditions to perform a specialised task, such as real-time translation of an ongoing conversation.
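
On Ice Lake, DL Boost boils down to AVX-512 VNNI instructions that fuse the low-precision multiply-accumulate at the heart of neural-network inferencing. The numpy sketch below emulates that fused operation with made-up numbers; it shows the arithmetic VNNI performs, not Intel’s implementation.

```python
# A numpy emulation of the int8 multiply-accumulate that VNNI fuses into a
# single instruction: unsigned 8-bit activations times signed 8-bit weights,
# summed into a 32-bit accumulator. All values here are made up.
import numpy as np

activations = np.array([12, 200, 7, 90], dtype=np.uint8)  # u8 inputs
weights = np.array([-3, 5, 127, -8], dtype=np.int8)       # s8 weights
accumulator = np.int32(1000)                              # running 32-bit sum

# Widen to 32 bits before multiplying so the products cannot overflow, then
# accumulate. VNNI performs this whole group as one fused operation, which
# is why quantised (int8) inferencing gets such a lift on the CPU.
accumulator += np.dot(activations.astype(np.int32),
                      weights.astype(np.int32))
print(int(accumulator))  # 1000 + (12*-3) + (200*5) + (7*127) + (90*-8)
```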

New functions inside a PC typically face a tug-of-war between running on the general-purpose CPU and running on dedicated add-on hardware. In the early days of the PC, for example, multimedia functions were accelerated by Native Signal Processing and Intel’s MMX multimedia instructions, then migrated to dedicated sound and graphics cards. Over time, basic graphics and audio capabilities moved back into the CPU and chipset as that became more cost-effective.

Figuring out where AI will be processed is still at that early stage, too. For now, Intel is hedging its bets, splitting the workload between the CPU’s DL Boost instructions, the Gaussian Neural Accelerator, and the more traditional Iris Plus integrated GPU. “The thing that we’re good at is providing primitives that they [software developers] can build upon,” Ronak Singhal, an Intel Fellow, told reporters before Computex.
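
The article does not name the software layer that exposes that split, but Intel’s OpenVINO toolkit is one real example: it presents the CPU, the integrated GPU and the GNA as selectable inference devices. Below is a minimal Python sketch assuming an OpenVINO-converted model; the file names are hypothetical, and the GNA plugin only accepts networks built from its restricted layer set.

```python
# A minimal sketch of targeting Ice Lake's three inference engines through
# OpenVINO's Inference Engine API. model.xml/model.bin are hypothetical
# files produced beforehand by OpenVINO's model converter.
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")

# The same network can be loaded onto whichever engine suits the workload:
# "CPU" exercises the DL Boost instructions, "GPU" the Iris Plus integrated
# graphics, and "GNA" the low-power Gaussian Neural Accelerator.
for device in ("CPU", "GPU", "GNA"):
    exec_net = ie.load_network(network=net, device_name=device)
    print(device, "ready for inference")
```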

Not the only game in town

Intel, obviously, is not alone. Qualcomm, in fact, was first to show off eliminating background noise from an audio or video call, at the launch of the Snapdragon 8cx last December. AMD’s Lisa Su has also told reporters that her company is working on AI inferencing, but offered no details.

For users, though, it is not quite clear where AI-powered functions will pop up. One example, executives said, is how Adobe Photoshop has grown steadily smarter at identifying the subject of a photo through AI-powered recognition, so that it can be lifted automatically onto another background. But that “magic lasso” function existed before Ice Lake, too.

It probably will be up to the app developers themselves to communicate what capabilities they are delivering in a particular app or game, and which of them are expressly powered by AI. If history holds, Nvidia will have something to say about all of this, too. Ageia and others pitched hardware physics accelerators after PhysX and other physics engines began giving games realistic approximations of how objects ricochet and fall in the real world; those functions were later absorbed into discrete GPUs when Nvidia bought Ageia in 2008. It is very possible a similar fight will play out over which chip gets to run AI inferencing inside your PC.

IDG News Service
