Extreme Data

Image: Stockfresh

The big volume, velocity and variety Vs of Big Data may be giving way to many more designations, all in the name of making things clearer, writes LESLIE FAUGHNAN

15 May 2014

Big Data is just so yesterday. Merely ‘big’ simply does not cut it anymore. To be interesting, much less sexy, in any sense these days, both the volume of the data and its characteristics have to be extreme. Of course what we now think of as extreme will be tomorrow’s biggish corporate workloads, but frontiers are for crossing. All of us with an interest in technology — and it is to be presumed most of us who earn a living from it — track with interest the progress of processor speeds. The continued relevance of Moore’s Law is in itself a fascination, but now that we have long since mastered the arcane art of ganging up processor cores we probably need a new comparator.


Computing has its origins unequivocally in mechanising routine work. Astrophysics and galactic modelling or aircraft digital prototyping are some of the wonderful capabilities we have reached but we started by trying to develop the mechanical adding machine as a sort of advanced abacus. Adding electricity did bring us eventually to electronics, while display screens and various input devices enabled us to work directly with computers. But our understandable — and fun — preoccupation with the go-ever-faster stripes of the technology should not obscure the basic fact that all of this technology is for human purposes. Those may be about building better civilisations or better bombs, or just making more money, but in fact that does not really matter.

Jargon bust
Accenture’s management scientist Kieran Towey is not too keen on the currently conventional industry jargon about Big Data, not least because, like many real experts, he is inclined to dismiss the various Vs, and particularly the Volume characteristic, as the least challenging. “I would rather think of it as Five Ts: Technology, Talent, Types, Techniques and Targets. Technology has somewhat dominated the discussion, and that’s understandable given our terrific rate of progress. But I think Talent, the people who are going to accomplish our objectives, is right up there in the mix of essentials. Types are almost self-explanatory, but cover an extreme range from video to sensor data and back to data warehouses. By Techniques I mean our new sets of tools for analysis, which are not taught in standard statistics or economics or similar degrees.

“I include Targets,” Towey says, “because there is no reason and certainly no budget to do any analytics unless there is some problem to be solved or advantage to be gained. That certainly includes problems and questions that have been intractable in the past but that perhaps we have the technology and tools to crack today. I have a colleague, for example, who has started working on network and traffic analysis for a Chinese telco with over 500 million subscribers. It is still almost incredible that we have the technology to do that but it’s perhaps a good example of how the advances in technology drive our ambitions in terms of analytics and trying to better understand how things work.”

