Edge Computing

Enterprises tap edge computing for IoT analytics

IoT deployments fuelling investments in edge gateways and HCI

26 September 2019

IoT needs edge computing. The world is on pace to hit 41.6 billion connected IoT devices generating 79.4 zettabytes of data in 2025, according to research firm IDC. To make the most of that data, enterprises are investing in compute, storage and networking gear at the edge, including IoT gateways and hyper-converged infrastructure (HCI).

Moving processing and analysis to the edge can enable new IoT capabilities (by reducing latency for critical applications, for example), improve the speed of alerts and ease network loads.

We talked to IoT early adopters in three different industries to find out how they are advancing their IoT deployments by building up their edge computing infrastructure. Here is what we learned.

 


Managed service provides benefits of edge computing, reduced load on IT staff

SugarCreek is preparing for the next generation of food manufacturing, where high-definition cameras and analytics can work together to quickly mitigate contamination or other processing issues. The only way to handle that automation in a timely fashion, though, is to beef up the company’s edge computing, according to SugarCreek CIO Todd Pugh.

Putting analytics, servers and storage together at the edge to process data from the cameras and IoT sensors on the equipment eliminates the need “to send command and control to the cloud or a centralised data centre,” which can take 40 milliseconds to get from one spot to another, Pugh says. “That’s too long to interpret the data and then do something about it without impacting production.” That type of decision-making needs to happen in real time, he says.

Edge computing can be taxing on an IT department, though, with resources distributed across sites. In SugarCreek’s case, six manufacturing plants span the Midwest U.S. SugarCreek plans to move from its internally managed Lenovo edge-computing infrastructure to the recently launched VMware Cloud on Dell EMC managed service. SugarCreek beta-tested the service for Dell EMC and VMware when it was code-named Project Dimension.

SugarCreek already uses edge computing for local access to file and print services and Microsoft Active Directory; to store video from indoor and outdoor surveillance cameras; and to aggregate temperature and humidity sensors to assess how well a machine is running.

Keeping this data at the edge, rather than interacting with the data centre in real time (an approach Pugh calls “financially impractical”), reduces overall bandwidth demands. For instance, the company can store 30 days’ worth of high-definition video without chewing up limited bandwidth. Other data, such as that generated by the sensors, is gathered locally and forwarded to the data centre at regular intervals.
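That pattern, buffering readings locally and forwarding them upstream in batches, is simple to sketch. The snippet below is purely illustrative; the five-minute interval, the EdgeBuffer class and the injected forward callable are assumptions, not SugarCreek’s actual software.

```python
import time
from collections import deque

BATCH_INTERVAL_S = 300  # forward buffered readings every five minutes (assumed)

class EdgeBuffer:
    """Buffer sensor readings at the edge; forward them upstream in batches."""

    def __init__(self, forward):
        self.readings = deque()
        self.forward = forward              # callable that ships one batch upstream
        self.last_flush = time.monotonic()

    def record(self, sensor_id, value):
        """Store a reading locally; flush once the batch interval has elapsed."""
        self.readings.append((time.time(), sensor_id, value))
        if time.monotonic() - self.last_flush >= BATCH_INTERVAL_S:
            self.flush()

    def flush(self):
        """Send everything buffered so far as a single upstream transfer."""
        batch = list(self.readings)
        self.readings.clear()
        self.last_flush = time.monotonic()
        if batch:
            self.forward(batch)             # one call per interval, not per reading
```

The point of the pattern is in that last line: upstream traffic scales with the flush interval, not with the number of readings.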

The managed service will ready SugarCreek for its more advanced surveillance and analytics plans. VMware Cloud on Dell EMC includes an on-premises Dell EMC VxRail HCI, VMware vSphere, vSAN, and NSX SD-WAN.

“The cloud service is fully managed by VMware and if a hard drive fails, Dell comes in and takes care of it” rather than having specialised IT at each site or making an IT team member travel in case of issues, Pugh says, which helps when working with “pretty tight resources.”

Implementing edge computing in this way will also enable the team to do at the edge anything it can do at the main data centres. “We’ll be able to secure the edge and, using micro-segmentation, treat it like it’s just another data centre,” he says. “Switching to a managed service at the edge will allow my people to concentrate on making bacon faster and better rather than worrying about compute power and maintenance.”

Homegrown hyper-converged infrastructure keeps IoT systems on track

Edge computing is helping keep the Wabtec Corp. fleet of 18,000 locomotives on track.

A network of IoT sensors, software embedded in more than 20 computers used to control the locomotive, and human-machine interfaces all send information to be processed in an onboard “mini data centre” that handles data acquisition, algorithms and storage. The thousands of messages that come from each locomotive assist the company in “getting ahead of and [gaining] visibility into 80 percent of failures that occur,” according to Glenn Shaffer, prognostics leader for Wabtec Global Services. That has led him to refer to the edge as “a diagnostic utopia.”

Wabtec (which recently merged with GE Transportation) is not new to data aggregation using wireless sensors, though. The rail transport company first started using a version of IoT (before it was called IoT) on its locomotives in 2000 but found capabilities constrained by the expense of satellite communications, which at the time was the only option to transmit information back and forth to the data centre. Also, trains travel through a multitude of climates, terrain and obstructions (such as tunnels), making connections unreliable.

With edge computing, though, the information generated onboard can now be analysed, acted upon and stored within the confines of the locomotive, without exhausting costly bandwidth. Wabtec’s homegrown rugged mini data centre can sense critical failures and respond in real time.

For example, the custom infrastructure monitors parts such as cylinders, gauges their wear, maps that against the demands of upcoming routes such as an intense climb up a mountain, and schedules maintenance before the part has a chance to fail, according to John Reece, Wabtec Freight Global Services CIO.
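In outline, that decision reduces to projecting wear over the next route and flagging parts that would cross a limit before they get there. The sketch below uses a deliberately naive linear wear model with invented numbers; Wabtec’s actual prognostics are proprietary and far more sophisticated.

```python
WEAR_LIMIT = 0.80   # fraction of rated life at which maintenance is booked (assumed)

def projected_wear(current_wear: float, route_demand: float,
                   wear_per_demand_unit: float = 0.01) -> float:
    """Project part wear after the upcoming route (simple linear model, assumed)."""
    return current_wear + route_demand * wear_per_demand_unit

def needs_maintenance(current_wear: float, route_demand: float) -> bool:
    """Flag the part if the upcoming route would push it past the limit."""
    return projected_wear(current_wear, route_demand) >= WEAR_LIMIT

# A cylinder at 72% of rated life facing an intense mountain climb
# (demand score 10) is flagged before departure: 0.72 + 10 * 0.01 = 0.82.
print(needs_maintenance(0.72, 10))   # True
```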

Similarly, if the onboard mini data centre receives an alert that an axle is beginning to lock up, torque can automatically be redistributed to the other axles, preventing a costly breakdown that would require a crane to be brought in to move the vehicle. “Some things fail fast on a locomotive, requiring decisions that run at the millisecond level, so we have to act quickly,” Reece says.
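What such a fast-fail rule might look like can be sketched as below. The lock-up test (an axle turning markedly slower than the average) and the even redistribution policy are assumptions for illustration; Wabtec’s real control logic is not public.

```python
def rebalance_torque(torque, axle_speeds, lockup_ratio=0.9):
    """Shift torque away from any axle whose speed lags the average (locking up)."""
    avg = sum(axle_speeds) / len(axle_speeds)
    locking = [i for i, s in enumerate(axle_speeds) if s < lockup_ratio * avg]
    healthy = [i for i in range(len(torque)) if i not in locking]
    if not locking or not healthy:
        return list(torque)                 # nothing to shed, or nowhere to shift it
    shed = sum(torque[i] for i in locking)  # torque taken off the locking axles
    new = list(torque)
    for i in locking:
        new[i] = 0.0
    for i in healthy:
        new[i] += shed / len(healthy)       # spread it evenly (assumed policy)
    return new

# Axle 4, turning far slower than the rest, is unloaded in one pass:
print(rebalance_torque([25, 25, 25, 25], [100, 100, 100, 40]))
# [33.33..., 33.33..., 33.33..., 0.0]
```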

While edge computing is perfectly suited to such ‘fast fails’, Wabtec also relies on cloud resources for more comprehensive monitoring of the locomotive environment, Reece says. For instance, once failing parts are detected and mitigated on board, maintenance shops are alerted via an edge-attached cell modem so they can order parts and schedule appropriate technicians to perform the repair work. Logistics teams receive a heads-up so they can alert customers to delays, conduct re-routes or assign replacement locomotives.

Maintenance shops, which Shaffer considers part of Wabtec’s edge-computing strategy as well because of the computing capacity placed there, also serve as great outposts for full-volume data uploads. Once a locomotive pulls in, techs link the mini data centre to the cloud via a high-speed connection and upload all the stored data. That data is used to conduct fuel performance analyses, life-cycle management and to develop predictive/prescriptive analytics via a big-data platform.

The Wabtec team is careful not to overload the onboard system with unnecessary data, minimising the number of sensors and leaving some insight, such as the status of windshield wipers, to humans. Even as high-speed 5G connections come into play and autonomous trains emerge, Reece says it will be important to be discriminating about where sensors are placed, what data is collected on board, and how it is processed at the edge. IT already operates on a philosophy of specifying compute power at 10 times the current state, “and it still gets obsolete quickly.” Storage, he finds, has the same issue. “Connectivity along the routes will never be 100 percent reliable, and there’s a risk associated with bogging down the system at the edge where these decisions get made,” he says.

Edge computing complements public cloud resources

Evoqua Water Technologies, a provider of mission-critical water-treatment solutions, is a veteran of IoT technology. For more than a decade it has relied on sensors attached to and embedded in its equipment to remotely monitor its purifying and filtration systems, collect data, and then leverage any insights internally and externally for customers.

“Data transmission was very, very expensive, leading us to only send what was important,” says Scott Branum, senior manager of digital solutions at Evoqua. If the equipment was running correctly, data from the sensors would be sent only once a day; if an alarm went off, all relevant data would be relayed to the data centre immediately. That approach kept Evoqua’s cellular costs in check.

More recently, Evoqua has migrated to edge computing, attaching a small Linux-based gateway device from Digi International to its water treatment systems. While data generated by sensors and other inputs eventually flows from that compute-and-storage gateway over cellular connectivity to a data-processing platform in the Microsoft Azure cloud, some business logic is executed at the edge.

“We are taking various points of data and aggregating them via proprietary algorithms so business rules can be triggered as necessary,” Branum says. For instance, if a catastrophic incident is detected, analytics at the edge instruct the system to shut itself down based on predefined rules. “There are some things that happen where we can’t wait to take action, and we certainly can’t wait until data is transmitted once a day and then analysed,” he says.
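Conceptually this is a small rule engine running on the gateway, evaluating aggregated readings locally so a shutdown never waits on a cellular round trip. The sketch below is a minimal illustration; the rule names, thresholds and stand-in actuator calls are hypothetical, not Evoqua’s proprietary algorithms.

```python
# Hypothetical rules: (name, condition over the latest aggregated readings, action)
RULES = [
    ("catastrophic_pressure", lambda r: r["pressure_bar"] > 12.0, "shutdown"),
    ("tank_overflow",         lambda r: r["level_pct"] > 98.0,    "shutdown"),
    ("filter_degraded",       lambda r: r["flow_lpm"] < 5.0,      "alert"),
]

def stop_equipment(reason: str) -> None:
    print(f"EMERGENCY STOP: {reason}")      # stand-in for a local actuator call

def queue_alert(reason: str) -> None:
    print(f"ALERT queued: {reason}")        # stand-in for store-and-forward upload

def evaluate(readings: dict) -> None:
    """Apply every rule at the edge; act immediately, report to the cloud later."""
    for name, condition, action in RULES:
        if condition(readings):
            if action == "shutdown":
                stop_equipment(name)
            else:
                queue_alert(name)

evaluate({"pressure_bar": 13.2, "level_pct": 55.0, "flow_lpm": 9.1})
# EMERGENCY STOP: catastrophic_pressure
```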

The edge computing setup is also programmed to detect anomalies in equipment performance, pinpoint the issue, and alert an off-site tech team, all without involving the data centre. “We’re not just sending out labour to check on a piece of equipment and see how it’s running; we’re getting business intelligence noting a vibration was detected in a specific pump as well as possible solutions,” Branum says. Not only is a technician’s time put to better use on value-added activities, but the appropriate skill set can also be deployed based on the issue at hand.
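One common way to generate that kind of targeted alert at the edge is to keep a rolling statistical baseline per signal and flag outliers along with a diagnostic payload. The snippet below sketches a z-score check on a pump’s vibration reading; the asset identifier, window size and suggested actions are illustrative assumptions, not Evoqua’s method.

```python
from statistics import mean, stdev

def vibration_alert(history, latest, z_threshold=3.0):
    """Return a diagnostic alert if the latest reading is a statistical outlier."""
    if len(history) < 30:
        return None                         # too little data for a stable baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return None                         # flat signal, z-score undefined
    z = (latest - mu) / sigma
    if abs(z) < z_threshold:
        return None                         # within normal variation
    return {
        "component": "pump-7",              # hypothetical asset identifier
        "signal": "vibration",
        "z_score": round(z, 2),
        "suggested_actions": ["check bearing wear", "verify mounting bolts"],
    }
```

The returned dictionary is what turns a bare alarm into business intelligence: which pump, how abnormal, and what to bring along.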

Branum’s team keeps a close eye on inputs and fine-tunes sensor data to avoid false alarms. “We spend so much time on the front end thinking how we are going to use the edge,” he says. “There hasn’t been a deployment yet that we haven’t had to refine. If the market – our customers – tells us there is no value in certain functionality and we are only creating nuisance alarms, we change it.”
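One generic technique for suppressing nuisance alarms, offered here as a sketch rather than a description of Evoqua’s tuning, is to require several consecutive out-of-range readings before alerting, so a single noisy sample never pages anyone.

```python
class DebouncedAlarm:
    """Raise an alarm only after several consecutive out-of-range readings."""

    def __init__(self, limit: float, required_hits: int = 3):
        self.limit = limit
        self.required_hits = required_hits  # consecutive exceedances needed (assumed)
        self.hits = 0

    def update(self, value: float) -> bool:
        """Count consecutive exceedances; one in-range reading resets the count."""
        self.hits = self.hits + 1 if value > self.limit else 0
        return self.hits >= self.required_hits

alarm = DebouncedAlarm(limit=8.5)
print([alarm.update(v) for v in [9.0, 7.0, 9.1, 9.2, 9.3]])
# [False, False, False, False, True]
```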

Outside of immediate decisions that need to be made at the edge, data is sent to the data centre for deeper analysis such as cost–benefit reports and life-cycle planning. Branum says using the public cloud rather than a private data centre has helped reduce development costs and keep up with industry standards on security. “We are trying to design an edge with Digi International and a centralised data processing platform with Azure that can scale together over time,” he says.

IDG News Service
