Named a Gartner top 10 strategic technology trend last year, digital twins can provide critical insights without risk in the real world, writes JASON WALSH
12 December 2018
The explosion in data, both storage and the ability to process it, has created countless opportunities for online businesses, but what about those industries such as manufacturing and aerospace that have both feet firmly planted in the material world?
Even these can be transformed, including by the internet of things: enter ‘digital twinning’.
The ‘digital twin’, perhaps like ‘the cloud’, is one of those technology terms that straddles the line between useful and irksome: is this really new? Or is it just a new name? After all, back in the Second World War era simulation was what digital computers were invented to do in the first place, and, later, it was also one of the first applications of the early commercial mainframes.
But not only are today’s computers very different from those of the 1940s or 1960s, so too is the amount of information available to them. So, is digital twinning an idea whose time has come?
Gartner expects half of all major industrial companies to be using digital twins by 2021, and that doing so will increase those organisations’ effectiveness by 10%. In the meantime, digital twins are already being deployed both in production environments and in research.
Marc O’Regan, chief technology officer of Dell EMC, told TechPro the company was working with organisations that are looking to implement digital twins in the development of ‘smart cities’.
The idea is one that has caught the attention of politicians, researchers and businesses worldwide: how can we use the vast amount of data that cities produce to make them more liveable?
O’Regan gives one example.
“To build a city first you have to build a village,” he said. “It’s [a case of] start with a village, so if you want to make something it usually starts with [twinning] a building or a small area in a city.”
Working alongside academia and looking for collaboration with municipal authorities, Dell EMC is providing the foundation for understanding how people behave on the streets.
Digital twinning may require significant processing grunt for modelling, but, says O’Regan, that does not mean the smart city requires us all to be kitted out with smart phones and sensors.
“An interesting thing is that it’s not always about the latest and greatest tech. It’s more about taking the technology and making it more relevant. We saw that in Rio de Janeiro, where there was a garbage overflow.
“We considered microservices on mobile phones but instead we published a method for texting it in and that goes into the database, which sends out the bin trucks to clean it up.”
The end result was a real change in people’s lives, he said.
“Not only was it [the rubbish overflow] discomforting, it caused rat infestations, knock-on effects with diseases, hospital beds and so on. By doing something simple, building a text-based algorithm, we were able to help with that,” he said.
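The pipeline O’Regan describes — an inbound text message recorded in a database, which then triggers a collection dispatch — can be sketched in a few lines. Everything here (the schema, field names, the dispatch step) is a hypothetical illustration; the article does not describe the actual Rio system’s design.

```python
import sqlite3

# Hypothetical sketch of the text-it-in flow: an SMS reporting an
# overflowing bin is parsed, stored, and a truck dispatch is triggered.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reports (location TEXT, status TEXT)")

def dispatch_truck(location: str) -> None:
    # In a real system this would notify the fleet; here we just log it.
    print(f"Dispatching bin truck to {location}")

def receive_sms(sender: str, body: str) -> str:
    # Assume the message body is simply the street location of the bin.
    location = body.strip()
    db.execute("INSERT INTO reports VALUES (?, 'open')", (location,))
    dispatch_truck(location)
    return f"Thanks, a crew has been sent to {location}."

receive_sms("+55 21 0000 0000", "Rua do Catete 100")
```

The point of the example is O’Regan’s: no smartphones or sensors are required on the reporting side, only a channel people already use.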
Much of the recent excitement around digital twins follows Gartner’s research, particularly the 2017 paper ‘Prepare for the Impact of Digital Twins’.
In it, Gartner noted that, while product and process engineering teams have used 3D renderings of computer-aided design (CAD) models, asset models and process simulations for more than 30 years to ensure and validate manufacturability, new data sources will offer a powerful way to monitor and control assets and processes.
“A digital twin is a management model. That’s the long and the short of it,” said Alfonso Velosa, research vice president for internet of things and digital business transformation at Gartner.
For Velosa, the key factor is how data can change how we conceptualise an item or process.
“If you think of American Airlines, they’ll have, let’s say for the sake of argument, a thousand aircraft. You have to be able to distinguish between jet engine 50 and jet engine 223.
“It’s CAD-CAM, but it’s about the data,” he said.
Velosa says that the basis for digital twinning is modelling, but that this focus on data is where it develops its transformative power.
“It is exactly modelling, but how do we blend the physical and digital worlds, and why would you? NASA did it for a specific reason. Now you would do it with a specific asset, something like what is going on with an asset, or even just ‘is it on or off’.”
One obvious area of benefit is predictive maintenance, but, says Velosa, it can also provide insights that will themselves drive revenue as more and more devices are network-enabled.
“If you’re the CEO of a manufacturer of connected coffee machines, say, you want to know which cartridges are being used, how often. Can we send them coupons on whatever kind they have? What’s going on with the asset, and how can we serve the customer better?
“For any connected asset, if you don’t have a digital twin you can be sure your competitor is at least looking at it,” he said.
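Velosa’s point — distinguishing jet engine 50 from jet engine 223, or knowing which cartridge a particular coffee machine is using — comes down to keeping one live, per-asset record that telemetry continuously updates. A minimal sketch of that idea (the class and field names are illustrative assumptions, not Gartner’s model):

```python
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """A minimal digital twin: one record per physical asset."""
    asset_id: str
    state: dict = field(default_factory=dict)    # latest known values
    history: list = field(default_factory=list)  # archive for analytics

    def ingest(self, reading: dict) -> None:
        # Each telemetry reading updates the live state and is archived
        # so historical behaviour can be analysed later.
        self.history.append(reading)
        self.state.update(reading)

# One twin per machine, so engine 50 and engine 223 stay distinct.
fleet = {i: AssetTwin(asset_id=f"engine-{i}") for i in (50, 223)}
fleet[50].ingest({"powered": True, "temp_c": 641})
fleet[223].ingest({"powered": False})
```

Even something as simple as the ‘is it on or off’ question Velosa mentions is answered per asset by reading its twin’s state rather than polling the machine.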
Putting on the brakes
Not everyone is impressed with digital twinning, however; or at least with how it is being promoted and reported.
Mike Hinchey, Irish Computer Society vice president and fellow as well as former director of the Irish software research centre Lero, pours some cold water on the concept.
“To me, all it is is a simulation, and we’ve been doing that since the 60s,” he said.
Hinchey, who formerly worked at NASA, the birthplace of the digital twinning concept, has no problem with the process, but fears that it represents the latest hype cycle of breathless IT reporting.
“A lot of what we hear [about everything in technology] is hot air,” he said, referring in particular to a recent article in a prominent American business magazine.
“I worked for NASA for fifteen years, and every mission had a simulation, not just digitally but also physically. Look at something like the Hubble [space telescope] repair: we had the world’s biggest clean room and we had a replica of the telescope, everything apart from the lens.
“Everything that was done in space was done not only in space but was done on earth,” he said.
One business that has taken a major interest in the concept is Siemens.
For Siemens, there are three basic kinds of digital twin: product digital twins, production digital twins, and performance digital twins. The first is for product design, the second for use in streamlining production and manufacturing environments, and the third is a digital twin that uses connected devices out in the wild to generate live operational data.
Real world data is clearly at the centre of Siemens’ focus. Indeed, Siemens recently developed, and issued a white paper on, MindSphere, a cloud-based, open operating system for the internet of things.
The benefits of digital twinning in extremely high-end industry—such as aerospace, nuclear power and so on—are as clear as day, but is there anything in it for more traditional businesses, such as mid-sized manufacturing?
Simon Roberts, senior analyst at Manufacture 2030, says that there is.
“From our perspective, we’ve been looking at resource efficiency simulation [because] that’s the groundwork before getting into digital twinning. The idea essentially is to simulate your material environment so that you can understand how it is consuming energy and consuming resources,” he said.
Where traditional simulation for manufacturing tends to be in production optimisation, digital twinning has the potential to go much deeper, he says.
“Typically, you have live data that is going into a system. The power is what you can do with that representation: predictive analytics and machine learning. You take that representation, you look at historical performance and you project the future from that,” he said.
“If you look at where digital twins are being applied, let’s say the automotive sector, the production of a car body for example: how the robotics are moving in real time. An operation might take a minute or 90 seconds. You’re looking at every single movement of that robot in that time period.”
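Roberts’s description — take the live representation, look at historical performance, and project forward — can be illustrated with the simplest possible predictor: a linear trend fitted to past readings, used to estimate when a maintenance threshold will be crossed. The readings, units and threshold below are invented purely for illustration; real deployments would use far richer models.

```python
import math

def steps_to_threshold(history, threshold):
    """Fit a least-squares line to the readings and return how many
    steps ahead the trend reaches the threshold (None if it won't)."""
    n = len(history)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    # Least-squares slope and intercept of y = a + b*x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    b /= sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    if b <= 0:
        return None  # no upward trend: threshold never crossed
    x_cross = math.ceil((threshold - a) / b)
    return max(0, x_cross - (n - 1))

# Invented vibration readings from a twin's history, one per cycle.
vibration = [10, 12, 14, 16, 18]
print(steps_to_threshold(vibration, threshold=30))  # trend crosses in 6 cycles
```

The same shape of calculation, run continuously against a twin’s archived telemetry, is what turns a passive model into the predictive-maintenance tool described above.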
Roberts expects sectors outside the very high end to get involved.
“What typically is the case is that high-value sectors like aerospace and automotive tend to be early adopters. They want to model the lifecycle and they have the resources to invest in that sort of stuff. If you’re looking at that much data you are looking at expensive equipment [but] where we are in terms of industry, in terms of the cost of sensors and equipment, that’s coming down and the barriers to entry are coming down.”
Indeed, Manufacture 2030 is working on a project funded by Innovate UK, the UK’s official technology strategy board, to improve resource efficiency in the sector precisely through digital twinning.
“Innovate UK recognises that there is a need for this and [to] create something that is not just for high-end, but is more accessible in general terms,” said Emma Gollub, head of partnership at Manufacture 2030’s parent 2degrees.
Aoife Connaughton, senior manager at Deloitte, has worked in developing robotic process automation and says that digital twinning does have an application there.
“It wouldn’t be a traditional way, but it can be done, and it’s very attractive: it’s low cost, it’s scalable and you can change things,” she said.
How it will play out remains unknown, as few companies have yet adopted it in a big way, though the signs are that they will, and it takes different forms in different applications.
“A lot of stuff like this isn’t used because it’s not on people’s radars,” she said.
“It’s probably genuinely exciting in finance but the textbook digital twinning application is in manufacturing,” said Connaughton.
“It has nice software that wraps it up together. The end result is the same as modelling, but I suppose it’s modelling that is turbocharged by IoT data rather than [merely] sample data or hypotheses.
“We’re seeing a lot of big regulatory changes, Brexit for example is really challenging for clients, so it could be a good way to go back to a regulator and say ‘this is how this change will affect us’ and so on.”
In the end, what seems inevitable is that the data being produced, whether by machine parts or people’s actions, will be fed back into industrial processes in one way or another, and that requires some kind of analytics.
“Data proliferation is a thing: there is just a shed load of data out there and companies are [now] only scratching the surface of it,” she said.