
Gluon brings AI devs self-tuning machine learning

17 October 2017

Deep learning systems have long been tough to work with, due to all the fine-tuning and knob-twiddling needed to get good results from them. Gluon is a joint effort by Microsoft and Amazon Web Services to reduce all that fiddling effort.

Gluon works with the Apache MXNet and Microsoft’s Cognitive Toolkit frameworks to optimise deep-learning network training on those systems.

How Gluon works
Neural networks, like those used in deep learning systems, work in roughly three phases:

  1. The developer hard-codes the behaviour of the network.
  2. The developer adjusts how the data is weighted and handled by the network, changing settings until it produces useful results.
  3. The finished network is used to serve predictions.

The problem with steps 1 and 2 is that they are tedious and inflexible. Hard-coding a network is slow, and altering that code to improve the network's behaviour is also slow. Likewise, figuring out the best weights to use in a network is a task ripe for automation.

Gluon offers a way to write neural networks that are defined more like datasets than code. A developer can instantiate a network declaratively, using common patterns like chains of neural network layers. Gluon code is meant to be easy to write and easy to comprehend, and it takes advantage of native features in the language used (for example, Python’s context managers).
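
As a rough sketch of that declarative style, the snippet below (assuming MXNet 0.11 or later with the Gluon package installed) chains two fully connected layers and uses a Python context manager to scope their names; the layer sizes are arbitrary placeholders.

```python
# Minimal sketch of declaring a network with the Gluon API in MXNet.
import mxnet as mx
from mxnet.gluon import nn

net = nn.Sequential()
with net.name_scope():                          # Python context manager scopes layer names
    net.add(nn.Dense(256, activation='relu'))   # hidden layer (size chosen arbitrarily)
    net.add(nn.Dense(10))                       # output layer, e.g. for 10 classes
net.initialize(mx.init.Xavier())                # parameters are created lazily, shaped on first use
```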

Where Gluon helps
The most basic way Gluon helps the developer is by making it easier to both define a network and modify it.

In Gluon, a neural network can be described in the conventional way, as a fixed block of code that does not change. But it can also be described as if it were a data structure, so it can be altered on the fly to accommodate changes during training.
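
A hedged illustration of that flexibility, again assuming MXNet's Gluon package: the hypothetical ToyNet below is an ordinary Python object whose forward pass runs imperatively, and a single call to hybridize() asks Gluon to compile it into a fixed graph once its structure has settled.

```python
import mxnet as mx
from mxnet import nd
from mxnet.gluon import nn

class ToyNet(nn.HybridBlock):
    """A small network defined as a Python object rather than a frozen graph."""
    def __init__(self, **kwargs):
        super(ToyNet, self).__init__(**kwargs)
        with self.name_scope():
            self.hidden = nn.Dense(64, activation='relu')
            self.output = nn.Dense(10)

    def hybrid_forward(self, F, x):
        # While the network runs imperatively, ordinary Python logic can sit here.
        return self.output(self.hidden(x))

net = ToyNet()
net.initialize()
net(nd.ones((1, 100)))   # executes layer by layer, like normal Python code
net.hybridize()          # optionally converts to a static graph for faster training
```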

Code written in Gluon can take advantage of GPU-accelerated and distributed processing features in both MXNet and Cognitive Toolkit, so training jobs can be distributed across multiple nodes. Its creators say Gluon can do that without any performance compromises compared to the manual approach.
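
For example, a single training step on one GPU might look like the sketch below (assuming a CUDA build of MXNet and placeholder data); spreading the same job across several devices or machines is largely a matter of passing a list of contexts and a distributed key-value store to the same APIs.

```python
import mxnet as mx
from mxnet import autograd, gluon, nd
from mxnet.gluon import nn

ctx = mx.gpu(0)   # assumes a CUDA-enabled MXNet build; use mx.cpu() otherwise

net = nn.Sequential()
with net.name_scope():
    net.add(nn.Dense(256, activation='relu'))
    net.add(nn.Dense(10))
net.initialize(mx.init.Xavier(), ctx=ctx)   # parameters live on the chosen device

loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})

data = nd.ones((32, 100), ctx=ctx)    # placeholder batch of 32 examples
label = nd.zeros((32,), ctx=ctx)      # placeholder class labels
with autograd.record():               # record the forward pass for autodiff
    loss = loss_fn(net(data), label)
loss.backward()                       # backpropagate
trainer.step(batch_size=32)           # apply the gradient update
```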

Where Gluon can be used
Gluon works today with MXNet. The Python front end for MXNet 0.11 and later includes the Gluon library. Gluon also works transparently with MXNet's GPU-accelerated builds and with the Intel Math Kernel Library extensions that accelerate CPU-bound processing.

Microsoft has not yet released a version of the Cognitive Toolkit with Gluon support, but it has promised to add it in a future release.


IDG News Service
