Artificial intelligence may not need networks at all

22 December 2017

The advancement of edge computing, along with increasingly powerful chips, may make it possible for artificial intelligence (AI) to operate without wide-area networks (WANs).

Researchers working on a project at the University of Waterloo say they can make AI adapt as computational power and memory are removed. If they can do that, it would allow neural networks to function free of the internet and the cloud, with advantages including better privacy, lower data-transmission costs, portability and the use of AI applications in geographically remote areas.

The scientists say they can teach AI to learn that it does not need lots of resources.

The group claims to do this by copying nature, placing the neural network in a virtual environment. They "then progressively and repeatedly deprive it of resources." The AI subsequently evolves and adapts, the team members say in a news article on the university's website.

The engine essentially learns to work around the fact that it does not have huge resources to draw on; deep-learning AI typically demands a lot of power and processing capability.

“The deep-learning AI responds by adapting and changing itself to keep functioning,” the researchers say.
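The article does not spell out how that deprivation works in practice. As a rough, hypothetical illustration only, something similar can be approximated with iterative weight pruning followed by retraining: weights are repeatedly removed and the network adjusts to keep performing. The Python (PyTorch) sketch below is an assumption-laden analogy, not the Waterloo team's published method; the toy model, the 20% pruning schedule and the placeholder data are all invented for illustration.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy stand-in for a deep-learning engine; NOT the Waterloo method.
model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 10),
)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 784)              # placeholder training batch
y = torch.randint(0, 10, (32,))

for round_no in range(5):             # progressively and repeatedly deprive resources
    for module in model:
        if isinstance(module, nn.Linear):
            # Remove a further 20% of the remaining weights ("resources").
            prune.l1_unstructured(module, name="weight", amount=0.2)
    # Let the shrunken network adapt so it keeps functioning.
    optimiser.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimiser.step()
    print(f"round {round_no}: loss = {loss.item():.3f}")

Each round removes a slice of the network's weights and then retrains briefly, which is the general idea behind shrinking a model while preserving its behaviour.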

Making AI smaller
Whenever computational power or memory is removed from the school’s experimental AI, it becomes smaller and is thus “able to survive in these environments,” says Mohammad Javad Shafiee, a research professor at Waterloo and the system’s co-creator.

Fitting the deep-learning engine onto a chip for use in robots, smartphones or drones, where both connectivity and weight can be issues, is one possible use for the technology, the researchers say.

“When put on a chip and embedded in a smartphone, such compact AI could run its speech-activated virtual assistant and other intelligent features,” the news article continues.

Edge AI
The University of Waterloo’s stand-alone AI is not the first edge-ified AI we have seen, though. Unrelated to the Waterloo project, Intel earlier this year launched its Movidius Neural Compute Stick.

That ground-breaking, plug-and-play neural compute device (retailing at under $100) is geared towards prototyping and then deploying vision neural networks at the edge, with no cloud or internet connection required. It is no larger than a USB memory stick.

Gaining momentum from that launch, Movidius’s technology is also being used in Google’s upcoming Raspberry Pi-based hobbyist AIY Vision Kit, a do-it-yourself neural vision processor for the Pi camera that costs less than $50. It, too, is portable, requiring only the Pi computer, the camera and the Movidius-powered VisionBonnet add-on board for the Raspberry Pi. Again, no network is needed. The Google TensorFlow-based software can recognise common objects, faces and animals. Movidius’s vision processing can also now be found in security cameras, drones and industrial machines.
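For a sense of what no-network inference looks like on devices like these, the hypothetical Python sketch below loads a quantised TensorFlow Lite image classifier and runs it entirely on-device. The model file name, label file and input handling are assumptions made for illustration; they are not the Vision Kit's actual software stack.

import numpy as np
from PIL import Image
from tflite_runtime.interpreter import Interpreter   # with full TensorFlow: tf.lite.Interpreter

# Assumed local files: a quantised classifier and its label list (placeholders).
interpreter = Interpreter(model_path="mobilenet_v1_quant.tflite")
interpreter.allocate_tensors()
input_info = interpreter.get_input_details()[0]
output_info = interpreter.get_output_details()[0]

# Prepare the image entirely on the device; no data leaves it.
height, width = input_info["shape"][1], input_info["shape"][2]
image = Image.open("photo.jpg").convert("RGB").resize((width, height))
interpreter.set_tensor(input_info["index"],
                       np.expand_dims(np.asarray(image, dtype=np.uint8), 0))

interpreter.invoke()                                  # local inference, no cloud round-trip
scores = interpreter.get_tensor(output_info["index"])[0]
labels = open("labels.txt").read().splitlines()
print("top prediction:", labels[int(np.argmax(scores))])

The point is simply that the whole pipeline, from image to prediction, runs locally on modest hardware, which is what makes network-free edge AI attractive.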

In the case of the University of Waterloo’s AI project, the researchers say they have achieved a 200-fold reduction in the size of deep-learning AI software for object recognition.

Add to that the absence of a need for a network, and “this could be an enabler in many fields where people are struggling to get deep-learning AI in an operational form,” the Waterloo scientists say.

 

IDG News Service
