Vertiv experts anticipate self-sufficient, self-healing edge in service of IoT, 5G
The edge of the network continues to be the epicenter of innovation in the data center space as the calendar turns to 2019, with activity focusing on increased intelligence designed to simplify operations, enable remote management and service, and bridge a widening skills gap.
This increasing sophistication of the edge is among the data center trends to watch in 2019 as identified by Vertiv experts from around the globe.
“Today’s edge plays a critical role in data center and network operation and in the delivery of important consumer services,” said Vertiv CEO Rob Johnson. “This is a dramatic and fundamental change to the way we think about computing and data management. It should come as no surprise that activity in the data center space in 2019 will be focused squarely on innovation at the edge.”
“The drivers behind edge computing are increasingly high-demand, low-latency applications such as artificial intelligence and advanced data analytics,” said Robert Linsdell, managing director Australia and New Zealand, Vertiv.
“As always, Australia and New Zealand’s need to get the edge right is greater than most – we have a highly dispersed geography, and many of our primary and resurgent industries, such as mining and manufacturing, don’t operate in cities with access to centralised data centers. They’re in far-flung locations where even basic connectivity can be poor or absent.
“The only way to harness the power and benefits of IoT and smart city applications in these areas is through edge computing and we need further investment in this area to make sure technology expectations among customers, staff and businesses are met across the region.”
- Simplifying the Edge: A smarter, simpler, more self-sufficient edge of the network is converging with broader industry and consumer trends, including the Internet of Things (IoT) and the looming rollout of 5G networks, to drive powerful, low-latency computing closer to the end-user.
For many businesses, the edge has become the most mission critical part of their digital ecosystem. Intelligent infrastructure systems with machine learning capabilities working in tandem with cloud-based analytics are fundamentally changing the way we think about edge computing and edge services. The result will be a more robust, efficient edge of the network with enhanced visibility and self-healing capabilities requiring limited active management.
“In Asia, the edge is no longer just a buzzword but a reality, and many organisations are realising the value of a strong core-to-edge ecosystem to support high-compute, low-latency demands,” said Anand Sanghi, president, Asia and India, Vertiv. “As the edge becomes a critical and strategic part of many organisations, it’s no longer about simply having availability, but about protecting and optimising the edge with the right infrastructure to deliver the best customer experience.”
- Workforce Revolution: A workforce aging into retirement and training programs lagging behind the data center and edge evolution are creating staffing challenges for data centers around the globe. This will trigger parallel actions in 2019. First, organisations will begin to change the way they hire data center personnel, moving away from traditional training programs toward more agile, job-specific instruction with an eye toward the edge. More training will happen in-house. And second, businesses will turn to intelligent systems and machine learning to simplify operations, preserve institutional knowledge, and enable more predictive and efficient service and maintenance.
- Smarter, More Efficient UPS Systems: New battery alternatives will present opportunities for the broad adoption of UPS systems capable of more elegant interactions with the grid. In the short term, this will manifest in load management and peak shaving features. Eventually, we will see organisations using some of the stored energy in their UPS systems to help the utility operate the electric grid. The static storage of all of that energy has long been seen as a revenue-generator waiting to happen. We are moving closer to mainstream applications.
- Pursuing Normalisation: The data center, even in the age of modular and prefabricated design, remains far too complex to expect full-fledged standardisation of equipment. However, there is interest on two fronts: standardisation of equipment components and normalisation across data center builds. The latter is manifesting in the use of consistent architectures and equipment types, with regional differences, to keep systems simple and costs down. In both cases, the goal is to reduce equipment costs, shorten delivery and deployment timelines, and simplify service and maintenance.
- High-Power Processors and Advanced Cooling: As processor utilisation rates increase to run advanced applications such as facial recognition or advanced data analytics, high-power processors create a need for innovative approaches to thermal management. Direct liquid cooling at the chip – meaning the processor or other components are partially or fully immersed in a liquid for heat dissipation – is becoming a viable solution. Although most commonly used in high-performance computing configurations, the benefits – including better server performance, improved efficiency at high densities, and reduced cooling costs – justify broader consideration. Another area of innovation in thermal management is extreme water-free cooling, an increasingly popular alternative to traditional chilled-water systems.