Micron challenges conventional computer architecture with new chip
21 November 2013
Micron is challenging conventional computer architectures conceived decades ago with Automata, a highly parallel processor that can adapt its behaviour to the task at hand.
The Automata processor, which was announced this week, has thousands of modified memory cells that can be turned into processing units, said Paul Dlugosch, director of Micron’s Automata processor technology development. The memory cells are non-volatile, and can be erased and reprogrammed to solve a certain problem, Dlugosch said.
“This is indeed a new architecture, it’s based on memory,” Dlugosch said, adding that the processor has been under development for seven years.
The customised columns of memory in Automata can gang up to process tasks faster than conventional computers can, Dlugosch said. There are no fixed data sizes, and with a compiler, instructions can be created on the fly to target specific problems. Data is spread across the memory units in parallel for processing, so unlike in conventional computers, there is no need to wait for data to be shifted out of memory.
Dlugosch said Automata challenges the conventional computer architecture, derived in the 1940s by mathematician John von Neumann, in which instructions and logic units push data into a processor, operate on it and push it back into memory. Chip-level limitations and programming languages hamper the ability of current CPUs and GPUs to parallelise tasks, Dlugosch said.
Automata combines logic and DDR memory interfaces, but will not replace conventional CPUs, Dlugosch said. Automata needs a CPU, field-programmable gate array (FPGA), network processor or other host computing unit to feed it high-level instructions.
“We make no claims that the Automata processor will run on its own,” Dlugosch said. “The Automata processor must be programmed.”
For now, Automata can be used as a co-processor for applications in areas such as bioinformatics, security and video processing.
“We’ll see the Automata processor grow in popularity and grow as the dominant analysis engine for unstructured data,” Dlugosch said.
The Automata DRAM DIMM must be thought of as a black box, said Jim Handy, analyst at Objective Analysis. A host processor loads data from another memory, a hard drive or some other source, writes code into another part of the DIMM’s DRAM, and then tells Automata to get to work.
“The host then goes off and does something else until the Automata signals completion, whereupon the host reads the results,” Handy said.
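The black-box workflow Handy describes can be sketched roughly as follows. This is an illustrative model only; the `AutomataDIMM` class and its method names are assumptions for the sake of the example, not Micron's actual interface.

```python
# Rough sketch of the host/co-processor workflow Handy describes.
# AutomataDIMM and its methods are hypothetical names, not Micron's API.

class AutomataDIMM:
    """Black box: the host writes input data and a compiled pattern set,
    starts processing, and reads results back when signalled."""

    def __init__(self):
        self.data = ""
        self.patterns = []
        self.done = False
        self.results = []

    def load_data(self, data):        # host copies raw input into DRAM
        self.data = data

    def load_program(self, patterns): # host writes the compiled pattern set
        self.patterns = patterns

    def run(self):                    # host tells Automata to get to work
        self.results = [p for p in self.patterns if p in self.data]
        self.done = True              # completion signal back to the host


# Host side: configure the DIMM, start it, then collect results.
dimm = AutomataDIMM()
dimm.load_data("GATTACAGATTACA")
dimm.load_program(["TTAC", "CCGG"])
dimm.run()
if dimm.done:                         # in hardware the host would poll or
    print(dimm.results)               # take an interrupt instead
```

In real hardware the host would go off and do other work between `run()` and the completion signal, as Handy notes, rather than blocking on the call.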
Automata could be an attempt to get the memory bus out of the way and put the processor in the same package as the memory cells, said Nathan Brookwood, analyst at Insight 64.
“They’re basically arguing that in order to get better performance, you have to put processing close to memory,” Brookwood said.
The concept behind Automata is not new, and a handful of start-ups have pursued tight integration of memory and processing elements, analysts said. Earlier attempts were constrained by programming models or memory implementations.
“The basic notion has been around for decades, but the DRAM companies have always seen themselves in a silo that doesn’t include processors and the processor guys have always looked at DRAM as a nasty business, so neither has ever tried to invade the other’s turf,” Handy said.
But with recent technology advances, Micron has a chance to succeed with Automata, though it could be years until tangible results surface, analysts said.
“Maybe this time something will actually work,” Brookwood said.
A lot of Automata’s effectiveness lies in the compiler provided with Micron’s software development kit. The compiler configures Automata’s architecture and memory units and defines the behaviour of the processing units; raw data is then streamed through the processor, which identifies patterns in it and crunches the data accordingly.
Dlugosch called Automata a “zero instruction set” processor with the ability to create its own instructions to focus on the targeted problem. The chip has interface logic to buffer input streams and uses high-level instructions from a host processor to control the device at a system level. Automata doesn’t receive instructions that represent a program or algorithm.
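The “zero instruction set” idea — loading patterns rather than instructions — can be illustrated with a toy automaton simulation. This is a hedged sketch of the general technique (many automaton states examining each input symbol simultaneously, one symbol per cycle), not Micron’s implementation; the function names and structure are assumptions.

```python
# Toy illustration of pattern matching with parallel automaton states,
# the style of computation described for Automata. Not Micron's design;
# function names and data structures are assumptions.

def compile_pattern(pattern):
    """'Compile' a literal pattern into a chain of state elements:
    state i advances to state i+1 when it sees pattern[i]."""
    return {i: sym for i, sym in enumerate(pattern)}

def stream(data, table):
    """Feed one symbol per cycle. Every active state examines the
    symbol simultaneously, as the memory-cell elements would in
    hardware, so overlapping matches are found in a single pass."""
    n = len(table)
    active = set()                      # states currently "lit up"
    matches = []
    for pos, sym in enumerate(data):
        nxt = set()
        for s in active | {0}:          # state 0 re-arms every cycle
            if table[s] == sym:
                if s + 1 == n:
                    matches.append(pos - n + 1)  # full pattern seen
                else:
                    nxt.add(s + 1)
        active = nxt
    return matches

print(stream("ababa", compile_pattern("aba")))  # → [0, 2]
```

Note that the overlapping occurrences at positions 0 and 2 fall out of a single pass over the input, because both chains of states are active at once — the kind of parallelism a von Neumann machine would need backtracking or multiple passes to achieve.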
Automata may be years away from practical use, and Dlugosch said it won’t replace DRAM. Outside of DDR memory, Handy said the Automata processor architecture has the flexibility to be built around SRAM, flash or emerging types of memory such as magneto-resistive RAM (MRAM), phase-change memory (PCM) and resistive RAM (RRAM).
Automata won’t replace FPGAs either, Dlugosch said. FPGAs, which are widely used for hardware prototyping, are functionally similar to Automata with fast throughput and the ability to be reprogrammed. By comparison, Automata is based on memory architecture and applies more to data analytics than hardware or code testing, Dlugosch said.
Micron has partnered with the University of Missouri and University of Virginia to research, test and write applications for the Automata processor. The company did not say when the chip would officially ship, but the Automata software development kit will be available next year.