Edge computing needs mission-critical approach

As the edge becomes application- and workload-specific, it must also be treated as mission-critical, says Schneider Electric
Kevin Brown, Schneider Electric (Image: Mediateam)

12 April 2019

Edge computing is becoming an increasingly important architecture option in the enterprise portfolio, and will continue to do so, but it must be considered as mission-critical as any centralised data centre facility.

This was a key message at Schneider Electric’s Life at the Edge event, where the company set out its evolving strategy for supporting edge computing, with compelling examples from education, healthcare and research.

According to Gartner, by 2025 some 75% of enterprise data is expected to be created and processed at the edge.

However, as Kevin Brown, SVP of Innovation and CTO, Secure Power Division, pointed out, what has evolved into edge computing has in the past tended to be ad hoc, physically insecure and treated as less than mission-critical. Where edge-based capability was present at all, it was often organised to a lower standard than equipment deployed within the data centre.

Brown argues that edge computing needs to be treated the same as mission-critical equipment and infrastructure in the data centre: standardised, physically secure, resilient and properly managed with intelligent, remote tools.

Schneider Electric says that in the future, new architectures will accommodate different types of data centres, with edge capability as part of that. Brown said that centralised data centres will house massive compute and storage capacity located in remote areas. Regional edge centres will have large compute and storage capacity located in central or urban areas, while local edge centres will feature compute and storage capacity where data is generated and consumed.

Brown highlighted that while these facilities might vary in availability and uptime, from Tier III-type facilities down to the local edge, a consistent approach to availability must be maintained. He gave the example of a centralised data centre where availability might be 99.982%, the equivalent of just 1.6 hours of downtime per year. This contrasts with an edge facility where availability might be 99.65%, or 30.7 hours of downtime per year. He argued that where an edge computing centre is supporting research, medical or retail services, this downtime is simply not acceptable.
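The relationship between an availability percentage and annual downtime is simple arithmetic. The sketch below is our own illustration of the figures Brown quoted, not taken from Schneider Electric material, and assumes an 8,760-hour year:

```python
# Convert an availability percentage into expected annual downtime.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours; leap years ignored

def annual_downtime_hours(availability_pct: float) -> float:
    """Hours per year a facility is expected to be unavailable."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for label, pct in [("Centralised (Tier III)", 99.982), ("Local edge", 99.65)]:
    print(f"{label}: {pct}% -> {annual_downtime_hours(pct):.1f} hours/year")

# Centralised (Tier III): 99.982% -> 1.6 hours/year
# Local edge: 99.65% -> 30.7 hours/year
```

The gap is stark: what looks like a fraction of a percentage point of availability is the difference between well under a working day of downtime and nearly a full working week.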

Citing a white paper (Schneider Electric WP 256) which he co-authored with Wendy Torell, Brown said that “prefabricated micro data centres are a simple way to ensure a secure, highly available environment at the edge. Best practices such as redundant UPSs, a secure organised rack, proper cable management and airflow practices, remote monitoring, and dual network connectivity ensure the highest-criticality sites can achieve the availability they require.”

Brown emphasised that ensuring this level of availability is not about taking Tier III methodologies and simply applying them to edge situations; rather, there is a need for better resilience, manageability and reliability to ensure availability commensurate with the task.

To accomplish this, Schneider Electric has worked with a robust ecosystem of partners, such as Cisco, HPE, Microsoft, NetApp, Scale Computing, SureStor and Dell EMC, to provide standardised, preconfigured designs, secure enclosures and cloud-based intelligent monitoring systems to allow organisations to easily deploy, manage and leverage edge computing capabilities.

Brown cited the example of a hospital group whose systems were spread across 51 hospitals and hundreds of clinics in five states. Initially, the group had no enterprise monitoring tool, leaving it without visibility across its disparate networks, and no integrated service agreement to cover the infrastructure.

The solution was to deploy a monitoring system that gave a consolidated view across all networks, allowing devices to be deployed, configured and given firmware upgrades remotely, saving time and personnel resources. Alarms and alerts were consolidated through a management system, which also increased productivity by reducing ‘alarm fatigue’.

In the higher education space, a university with 20,000 enrolments had some 400 devices in closets around the campus and more than 200 in its data centre, but no management tools that could reach the local closets. The university required a consolidated view of its physical infrastructure, centrally as well as on mobile.

“My edge is focusing on the sequencers,” Simon Binley, Sanger Institute, Wellcome Trust (Image: Mediateam)

A pilot with just 50 devices proved successful, demonstrating ease of deployment and providing the required visibility. The facilities director received a mobile dashboard and alert notifications anytime, anywhere, allowing them to contact the network operations centre (NOC) for updates on progress.

Beyond these examples, other areas where edge computing is likely to be a key technology are in the commercial space, for the likes of retail, healthcare, finance and education; in the industrial space, for oil and gas, mining, automotive and manufacturing; and in the telco space, for cell towers, base stations and remote infrastructure.

The Sanger Institute, part of the Wellcome Trust, is a centre for genomic research, conducting large-scale, high-throughput genomic studies that enable researchers to participate in national and international projects in areas such as cancer, infectious disease, human epidemiology and developmental disorders. The institute makes this research available free to researchers and clinicians around the world.

“The data centre is critical for facilitating the gathering, analysis and distribution of data to the research community,” said Simon Binley, data centre manager, Sanger Institute.

To accomplish this, the institute’s data infrastructure must be highly resilient, highly available and able to provide significant compute and storage capacity close to the sequencing and analysis machines. This is done through more than 36,000 compute cores in a massively parallel server farm, serving 25 gene sequencers that run 24 x 7.

An output of 2TB per day per genome, says Binley, requires storage and compute to be in close proximity. Genomic sequencing has generated 55PB of data, and the institute holds more than 150PB in total. This data, he said, can never be deleted, and must be available 24 x 7.
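Those volumes follow from straightforward arithmetic. As a rough sketch of our own, assuming the quoted 2TB-per-day rate applies to each of the 25 sequencers (an assumption, since the article gives the figure per genome), accumulation reaches tens of petabytes within a few years:

```python
# Rough estimate of sequencing data accumulation at the quoted rates.
SEQUENCERS = 25         # gene sequencers running 24 x 7
TB_PER_DAY_EACH = 2.0   # assumption: the quoted 2TB/day applied per machine

daily_tb = SEQUENCERS * TB_PER_DAY_EACH   # 50 TB/day
yearly_pb = daily_tb * 365 / 1000         # ~18.2 PB/year
print(f"~{daily_tb:.0f} TB/day, ~{yearly_pb:.1f} PB/year")
print(f"~{55 / yearly_pb:.1f} years to accumulate the quoted 55PB")

# ~50 TB/day, ~18.2 PB/year
# ~3.0 years to accumulate the quoted 55PB
```

At rates like these, and with a policy of never deleting data, the case for keeping large compute and storage capacity at the point of generation rather than shipping everything to a remote facility makes itself.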

Visibility of this estate is critical, as is the ability to tackle and remediate issues without having to dispatch personnel. The on-campus data centre is a 4MW facility, sized to meet the requirements of genomic sequencing.

According to Binley, a holistic approach is taken to efficiency. “Top quality management tools enable Wellcome Trust to deliver efficiency, releasing funds into the business for its primary purpose — scientific research,” he said. This also enables a small team to run data centre services, he added.

Management software provides visibility of infrastructure, energy use and service requirements, and has also helped validate the data centre development. Management information can now be delivered via a mobile app. An efficiency monitor delivers a quarterly report giving visibility of core assets, performance and energy use, with flags where use is too high. It also provides visibility of maintenance requirements and predictive assessment of end-of-life (EoL) needs for equipment and components such as batteries.

“My edge is focusing on the sequencers,” said Binley.

Working with its services partner EfficiencyIT, the institute came to the realisation that edge computing was the way to achieve the platform necessary for its work.

“We were built and tasked to sequence the human genome, and we have evolved over the last 25 years. Once we had completed that project, we moved on, broadening our vision to focus on other things, so that it is not just about the traditional things of bricks-and-mortar data centres, power and cooling, etc.”

“It is important for me,” said Binley, “that the data coming off the sequencers is uninterrupted, in whatever form that is.”

That is where the edge paradigm came in, he said.

Schneider Electric’s Brown said this trend is likely to characterise edge computing for the near future. As edge computing becomes more standardised, he asserts, workloads and requirements will drive application- and use-case-specific developments, ensuring that deployments are right-sized for maximum efficiency.

TechCentral Reporters
