Friday, April 25, 2025

Edge


Iceotope cuts power for edge and data centre compute requirements
Iceotope has announced that its chassis-level cooling system is being demonstrated in the Intel Booth at HPE Discover 2022, the prestigious ‘Edge-to-cloud Conference’. Ku:l data centre is the product of a close collaboration between Iceotope, Intel and HPE and promises a faster path to net zero operations by reducing edge and data centre energy use by nearly a third.

Once the sole preserve of arcane, high performance computing applications, liquid cooling is increasingly seen as essential technology for the reliable and efficient operation of any IT load in any location. There is a pressing concern about sustainability impacts as distributed edge computing environments proliferate to meet the demand for data processing nearer the point of use, as well as growing facility power and cooling consumption driven by AI augmentation and hotter chips. Working together with Intel and HPE, Iceotope benchmarked the power consumption of a sample IT installation cooled with air and with precision immersion liquid cooling respectively. The results show a substantial advantage in favour of liquid cooling, reducing overall power use across IT and cooling infrastructure.

Putting Ku:l data centre to the test

To understand the operational advantages of Iceotope’s Precision Immersion Cooling system, Ku:l data centre was compared to a traditional air-cooled system using a 19.6kW load comprising 16x HPE ProLiant DL380 Gen10 servers under stress test conditions. Laboratory tests, using industry-standard high-performance computing benchmarks across a range of ambient temperatures, demonstrated that the Iceotope Precision Immersion Cooled system delivered a 4% increase in performance with zero throttling at higher ambient temperatures at server level, and consumed 1kW less energy at rack level than its air-cooled counterpart. This represents a 5% energy saving in the IT alone and a 30% saving at scale, based on a typical partial power usage effectiveness (pPUE) of 1.4 in air-cooled and 1.04 in liquid-cooled data centres.

Iceotope’s Ku:l data centre solution is being demonstrated at HPE Discover housed in a standard Schneider Electric NetShelter rack, with heat rejected to a Schneider heat removal unit (HRU). The integration with HPE ProLiant DL380 servers, as well as provision through channel partners, is supplied, supported, and warrantied by IT distribution giant Avnet Integrated. The three companies announced their partnership to provide liquid cooled data centre solutions in 2019.

Air cooling challenges overcome with Ku:l data centre

Air cooling cannot be used precisely or sustainably to cool high-power chips and processors, and it is commonly held that it is no longer a suitable approach for ensuring an appropriate operating environment for increasingly energy dense IT equipment. Not only is liquid a significantly more effective medium for heat removal than air, but each liquid-cooled chassis is 100% sealed, protecting the critical IT from the surrounding atmosphere and creating a stable operating environment. Isolating the IT from the external environment opens a world of potential facility sites and distributed IT locations that could not have been considered until now. Further benefits accrue from removing the server fans and the need for other air handling equipment from the data centre space. Energy use is significantly reduced, water consumption is virtually eliminated, and noise becomes a thing of the past.
Unlike air cooled infrastructure, Precision Immersion Cooling does not require rack depopulation at higher densities, enabling racks to run fully populated to accommodate more servers and storage devices, and/or denser IT loads. Additionally, Iceotope’s sealed liquid-cooled chassis enclosure converts off-the-shelf air-cooled servers to liquid-cooled servers with a few minor modifications, including removing the fans. This means that industry standard form factors, including edge and data centre racks, can be used to accommodate liquid-cooled IT, with maintenance and hot swapping carried out on site with familiar ease and no mess, without the need for heavy lifting gear or spill kits.

Iceotope Director of Product Strategy, Jason Matteson, says: “The processing requirements for ubiquitous AI and high-performance applications across the board are already creating a sustainability dilemma for operators. Accommodating a precipitous increase in chip power at the same time as lowering carbon emissions in distributed edge locations as well as data centres is problematic. Iceotope’s Ku:l data centre demonstrates a very practical response to an urgent need for a paradigm shift in data centre design.”

Jen Huffstetler, Chief Product Sustainability Officer at Intel, comments: “Today, sustainability calls for data centre cooling solutions to increase efficiency, flexibility and scalability, while also delivering the performance levels today’s computing demands. The new Ku:l data centre precision immersion environment enables predictable IT performance with precise cooling and higher space utilisation in a familiar format for today’s mission critical facilities.”

Phil Cutrone, Vice President and General Manager of Service Providers, OEM and Major Accounts at HPE, says: “There is a greater need for zero-touch edge-computing capabilities to ensure reliability at remote locations when in-person monitoring and maintenance is not always feasible. The combined solution enables customers to access high-density applications using precision immersion and liquid-cooled racks for instant deployment in any environment, whether it is on-premises in a data centre or at the edge.”

With rising demand for cloud storage solutions, savings of the calibre offered by the new Iceotope Ku:l data centre solution are becoming more critical as rack densities rise. According to Uptime Institute research, the average server rack density increased 15% from 7.3kW to 8.4kW between 2019 and 2020, and anticipated microprocessor introductions are likely to accelerate this trend.
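As a rough sanity check (a sketch only, not Iceotope’s published methodology), the quoted percentages can be reproduced from the figures given above: the 1kW rack-level saving against the 19.6kW benchmark load, and the stated pPUE values of 1.4 and 1.04.

```python
# Back-of-envelope check of the savings quoted above. Assumes the 1 kW
# rack-level saving applies directly to the 19.6 kW benchmark load and that
# pPUE simply scales IT power to total (IT + cooling) power.
AIR_IT_KW = 19.6                     # benchmark IT load, air cooled
LIQUID_IT_KW = AIR_IT_KW - 1.0       # same load drawing 1 kW less when liquid cooled
AIR_PPUE, LIQUID_PPUE = 1.40, 1.04   # quoted partial PUE figures

it_only_saving = 1.0 / AIR_IT_KW
print(f"IT-only saving: {it_only_saving:.1%}")               # ~5%

air_total = AIR_IT_KW * AIR_PPUE
liquid_total = LIQUID_IT_KW * LIQUID_PPUE
print(f"Saving at facility level: {(air_total - liquid_total) / air_total:.1%}")  # ~30%

# The rack-density figures quoted above: 7.3 kW to 8.4 kW
print(f"Rack density increase: {(8.4 - 7.3) / 7.3:.0%}")     # ~15%
```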

PacketFabric launches multi-cloud connectivity at 1623 Farnam
1623 Farnam has announced that PacketFabric has joined 1623’s rapidly growing ecosystem. This strategic match brings PacketFabric closer to its customers in the Midwest and enriches connectivity options for tenants at 1623 Farnam.

“Digital businesses are continuously looking for ways to improve user experience delivery, and latency matters,” comments Jezzibell Gilmore, Co-Founder and Chief Commercial Officer at PacketFabric. “We’re delighted to work with 1623 Farnam, offer our customers the opportunity to establish a presence in a high-quality data centre, and power private network solutions and emerging edge-based applications.”

“We are always growing our ecosystem to provide the greatest interconnection opportunities for our customers, and we are thrilled to welcome PacketFabric,” says Todd Cushing, President of 1623. “1623 Farnam is the ‘easy button’ for organisations to accelerate their digital transformation. Our customers have a high level of network assurance with easy access to 100G redundancy, enterprise-grade, carrier-class interconnection, very low latency, and full network resiliency. They do not have to compromise on anything anymore.”

PacketFabric and 1623 Farnam will co-host a webinar on June 16 to outline how the new partnership will enhance the rapidly growing 1623 ecosystem. The videocast discussion will focus on how the 1623-PacketFabric partnership will benefit businesses undergoing digital transformation, and how the PacketFabric platform will optimise their digital transformation experience. Webinar participants will learn how to build a highly interconnected edge presence with unparalleled speed and efficiency, a vital consideration for businesses embarking on their own digital transformations. The session will include a demonstration of PacketFabric’s agile, secure, on-demand interconnection platform, which enables users to connect with other data centres quickly and efficiently, effectively extending their reach to wherever they need their edge to be. One notable feature of the PacketFabric platform is the ability to burst up to 10G to accommodate periods of heavy traffic. Speakers at the webinar will include interconnection and edge experts from both PacketFabric and 1623. One-on-one consultations will be available after the conclusion of the webinar.

Award win for project underpinning vaccine, cell and gene therapy manufacturing capability
Keysource, in association with CENTIEL, have been named winners of the Edge Project of the Year at the DCS Awards 2022. The award winners were announced at a gala dinner held at the Leonardo Royal Hotel London St Pauls on 26 May 2022.

The teams’ work has been recognised following the completion of an installation of a truly modular, scalable, and highly efficient UPS and electrical infrastructure solution to help underpin growth in a UK Government effort to develop world-class innovative vaccine research and manufacturing capability. Scientists and researchers based in the centre will accelerate the time taken for new treatments to be delivered to patients by developing cutting-edge therapies to treat life-changing diseases. The edge data centre that supports the essential laboratory work is responsible for ensuring that samples and vaccines remain in optimal condition. The installation is now the first medical facility in the UK to take advantage of Li-ion batteries in combination with CENTIEL’s fourth generation three phase, true modular UPS, CumulusPower™, which offers industry leading 99.9999999% (nine nines) availability, translating to just milliseconds of downtime per year.

Louis McGarry, Sales and Marketing Director, CENTIEL UK, explains: “The facility runs off different energy sources, from the grid to sustainable power. This means the UPS is called on more often to provide a clean, continuous source of power to support the laboratory environment and ensure optimal conditions. Li-ion batteries offered the perfect solution as they are highly capable of cycling many times over, unlike traditional VRLA batteries, where cycling shortens their design life significantly.”

Richard Clifford, Head of Solutions at Keysource, comments: “At Keysource, we deliver the finest and most efficient facilities, utilising the latest in critical power and cooling technologies. Our solutions support the latest generation of high-performance, high-density computing and have been internationally recognised as examples of best practice. We are proud to support clients that lead the development of new, innovative technologies to bolster the UK’s medical research capability. This project is a perfect example of why investing in critical infrastructure, and its protection, is essential to ensure these vital services continue to operate, now and in the future.”
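For context on the availability claim, “nine nines” does indeed translate to only tens of milliseconds of downtime per year; a minimal back-of-envelope check:

```python
# What 99.9999999% ("nine nines") availability implies in annual downtime.
# A back-of-envelope illustration, not a CENTIEL specification.
availability = 0.999999999
seconds_per_year = 365.25 * 24 * 3600

downtime_ms = (1 - availability) * seconds_per_year * 1000
print(f"Expected downtime: {downtime_ms:.1f} ms per year")   # ~31.6 ms
```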

EkkoSense extends data centre optimisation support to edge sites
EkkoSense has extended its EkkoSoft Critical solution to ensure full support for edge sites. This new capability will provide data centre operators with the widest possible view of their critical facilities’ performance - from the smallest server room through to the largest rooms. This means that, for the first time, operations teams can gain real-time access to power, cooling and space optimisation data from across their entire data centre estate.

EkkoSense’s new edge site monitoring and optimisation solution integrates previously unused data sources to support edge facilities that were either left unmonitored or only tracked by generalised Building Management Systems (BMS). Edge sites now covered by EkkoSoft Critical include single or smaller server rooms, hub sites and telecom equipment rooms. Edge site data can now be viewed, analysed and optimised using EkkoSoft Critical’s intuitive single pane of glass enterprise estate performance visualisations.

“With analyst firms now estimating that soon more than half of enterprise-generated data will be either created or processed outside of the data centre or the cloud, it’s essential for organisations to have much greater insight into their growing number of edge facilities,” says Paul Milburn, EkkoSense’s Chief Product Officer. “Now we’re able to not only support operations teams with enterprise-wide visibility, but also deliver the thermal, power and capacity management support they need to run remote sites more efficiently while also supporting greater IT loads.”

Unlike traditional remote BMS solutions that can only respond to hard faults, the EkkoSense solution features more flexible alerting with user permission configuration. This effectively delivers a more comprehensive ‘mini-BMS’ alternative at around a tenth of the cost of more complex BMS solutions. EkkoSense software for edge is particularly easy to deploy, with a starter kit package of wireless sensors and the EkkoHub Wireless Data Receiver that supports self-installation by non-IT professionals. Each EkkoHub can support up to 200 wireless sensors and up to 20 Modbus/SNMP devices in direct mode, ensuring support and scalability for a broad range of edge sites, from single server rooms through to larger hub sites or equipment rooms.
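As an illustration of how those per-hub limits translate into hardware counts (the estate figures below are hypothetical, and the sketch ignores radio range and per-site placement):

```python
# Minimal sizing sketch using the per-EkkoHub limits quoted above:
# up to 200 wireless sensors and up to 20 Modbus/SNMP devices in direct mode.
# The example estate numbers are invented for illustration.
import math

SENSORS_PER_HUB = 200
MODBUS_PER_HUB = 20

def hubs_needed(wireless_sensors: int, modbus_devices: int) -> int:
    """Minimum hub count needed to stay within both published limits."""
    return max(math.ceil(wireless_sensors / SENSORS_PER_HUB),
               math.ceil(modbus_devices / MODBUS_PER_HUB))

# Hypothetical estate: 35 edge rooms with ~12 sensors and 2 Modbus devices each
print(hubs_needed(wireless_sensors=35 * 12, modbus_devices=35 * 2))  # 4
```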

An edge explosion?
Andy Connor, EMEA Channel Director, Subzero Engineering, outlines this edge explosion and examines the crucial role of the modular, micro data centre in delivering digital transformation.

The idea of delivering IT resources close to the point of use is not a new one. However, where once the required data centre and IT infrastructure resources were relatively inflexible, slow and expensive to build out and run, today’s digital solutions provide the necessary mix of scalability, agility, flexibility, speed and cost-effectiveness to make the edge a transformational reality. The Linux Foundation’s recent ‘State of the Edge 2021’ market report suggests that, between 2019 and 2028, approximately $800 billion will be spent on new and replacement IT server equipment and edge computing facilities. At the same time, the global IT power footprint for infrastructure edge deployments is forecast to increase from 1 GW to over 40 GW. Valuates Reports predicts that the global edge computing market alone will reach $55,930 million by 2028, from $8,237 million last year.

The growth of edge

Edge growth will be witnessed across almost every industry sector, including transport and logistics, manufacturing, energy and utilities, healthcare, smart cities and retail. This growth can be divided into two subcategories: edge devices – all manner of (IoT-enabled) sensors and handheld devices which will leverage artificial intelligence and machine learning to generate, process and act upon data locally; and edge infrastructure – the networks (including 5G) and data centre infrastructure required to support the ‘local’ applications and to house the servers which will collect, process and store the likely zettabytes of data and images these applications generate via the devices.

Industry 4.0 promises to revolutionise the manufacturing industry, with more and more intelligence and automation being implemented to optimise product design and testing as well as actual production processes. Edge sensors, devices and infrastructure can be installed retrospectively to upgrade existing manufacturing facilities – indeed many organisations have already embarked on this process. Native edge applications which incorporate advanced digital operations, such as autonomous mobile robots, will be a major feature of new greenfield factories, constructed, for example, to produce the autonomous vehicles and alternative, sustainable energy infrastructures of the future.

Edge and affected industries

Autonomous vehicles are, perhaps, the best example of just how prevalent edge computing will become. Not only will all manner of IoT sensors and devices be deployed in the manufacturing supply chain, as well as during vehicle production; there will also be the need for edge data centre infrastructure to be created within the factories, to ensure near real-time communications between the computers, the networks and the data storage. The autonomous vehicles themselves will incorporate a wide range of edge sensors and devices, ranging from those required for driving, control and safety functions to what might be termed ‘passenger interaction’ technology – everything from entertainment to safety instructions, route and time details. There will need to be a massive build-out of edge infrastructure across the road network.
This will serve three main functions: traffic safety at the very local level, ensuring there are no crashes or accidents with other road users or pedestrians; traffic management across specific zones, most notably in busy urban environments; and regional or national strategic data collection, processing and reporting, to enable analysis of traffic trends and patterns, with a view to making improvements. More widely, autonomous vehicles, transport systems and traffic management are a key component of the rapidly developing Smart City concept. Road pricing and travel ticketing tariffs both rely on edge devices and infrastructure to respond to varying demand. Security and surveillance are key digital services within metropolitan areas and will increasingly rely on edge-enabled real-time reporting to flag potential issues more quickly and more accurately than human beings scanning multiple security screens.

And the same is true in the retail sector – although, as many shops are located in city locations, one could argue that smart retailing is almost a sub-sector of the smart city. Point of sale (PoS) payment systems have existed for many years, relying on both edge devices and, increasingly, relatively local edge data centres, to ensure transaction speed and reliability, as well as efficient stock management. Customer loyalty programmes are another example of edge technology already at work.

Smart transport, smart cities, smart retail. To this list can be added smart homes, smart healthcare and smart energy – almost every activity can benefit from the addition of some kind of intelligence and/or automation. Our daily domestic and working lives, which increasingly intersect thanks to the digital transformation accelerated by the pandemic, will feature literally hundreds of edge interactions.

Infrastructure – the edge explosion bottleneck?

Talking about edge applications is a great deal easier than implementing them. If autonomous vehicles, smart cities and smart retail are to become an everyday, reliable reality, then there needs to be a major build-out of the edge infrastructure required to make them happen. Look beneath the surface of almost any edge application and the data centre is the crucial foundation upon which to build. Right now, large, centralised data centre facilities are the norm. But this is beginning to change with the realisation that the local, real-time requirements of so many edge applications require a small, local, fast, agile, flexible and, importantly, scalable data centre to match.

What’s needed is a micro data centre that is as dynamic as the customer’s edge application. Customers should have the flexibility to utilise their choice of best-in-class data centre components, including the IT stack, the uninterruptible power supply (UPS), cooling architecture, racks, cabling, or fire suppression system. So, by taking an infrastructure-agnostic approach, customers have the ability to define their edge, and use resilient, standardised, and scalable infrastructure in a way that’s truly beneficial to their business. Furthermore, by adopting a modular architecture, users can scale as demands require, without the need to deploy additional containerised systems. This approach alone offers significant benefits, including a 20-30% cost saving compared with conventional ‘pre-integrated’ micro data centre designs.
For too long now, our industry has been shaped by vendors that have forced customers to base decisions on systems which are constrained by the solutions they offer. Now is the time to disrupt the market, eliminate this misalignment, and enable customers to define their edge as they go. By providing customers with the physical data centre infrastructure they need, no matter their requirements, you can help them plan for tomorrow. Disruptive edge applications require disruptive, proven micro data centre solutions.
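As a rough check on the scale of growth implied by the Valuates Reports forecast quoted earlier (assuming a 2021 base year and a seven-year horizon to 2028), the figures work out to an annual growth rate of a little over 30%:

```python
# Implied compound annual growth rate from the market figures cited above.
# Assumes the $8,237M figure is the 2021 base and the horizon is 2028.
base_2021_usd_m = 8_237
forecast_2028_usd_m = 55_930
years = 2028 - 2021

cagr = (forecast_2028_usd_m / base_2021_usd_m) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 31-32% per year
```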

New research uncovers edge computing challenges
Schneider Electric has unveiled findings from a newly commissioned IDC White Paper entitled ‘Succeeding at Digital First Connected Operations’, which highlights the power of edge computing in enabling the shift to a digital-first world. The white paper details responses from over 1,000 IT and operations professionals across industrial, healthcare, education, and other verticals, as well as a series of in-depth interviews with industrial enterprises. Respondents were global, representing firms in the United States, China, Japan, Germany, the United Kingdom, India, and Ireland. The organisations ranged in size from 100 to more than 1,000 employees. Responses provided insights about the factors driving edge investments, the challenges firms faced while deploying to the edge, obstacles to continued investment, and strategic recommendations to future-proof edge capabilities.

“As organisations seek to create new or improved experiences for customers and to become more operationally efficient, improve safety and security, and become more sustainable, they are leaning more on digital technologies. The white paper examines the crucial role that edge computing and edge deployments play in enabling digital-first, connected operations,” says Chris Hanley, SVP, Commercial Operations & Global Channels, Leading Edge Commercial Strategy, Schneider Electric. “It highlights strategies that IT professionals and decision makers can adopt to future-proof their edge computing capabilities to support remote, connected, secure, reliable, resilient, and sustainable operations.”

Edge computing is one of the major enablers of a digital-first paradigm. In fact, the most common use cases of edge infrastructure include cybersecurity systems to monitor the operational network locally, as well as storing and processing operational data to bring it to the cloud. Further, when organisations were asked why they were investing in edge computing to support these workloads, respondents cited ‘improve cybersecurity’ (50%) and ‘systems resiliency and reliability’ (44%). Yet there are various challenges that organisations must overcome to ensure their edge infrastructure, and thus their connected operations, are resilient and reliable. Despite the promise of the edge, many organisations report connectivity and power outage concerns. In fact, 32% of respondents have experienced a ‘lack of connectivity or slow connectivity’ with their edge deployments. Further, 31% have experienced a ‘utility power outage or power surge lasting more than 60 seconds.’

Challenges to overcome when transitioning to digital-first connected operations:

• Security. Physical security and cybersecurity concerns are high when connecting operations. This concern will require systems and processes that are tailored for this new paradigm. Yet, once connected to the cloud, the power of operational data can be harnessed to drive a host of new and enhanced use cases. Such data can enhance collaboration in the enterprise and enable remote operations capabilities that result in labour efficiencies while ensuring companies have resilient, remote operations capabilities.

• Skills. The workforce needs to have the right skills to execute across technology settings and to be able to build alignment internally to drive change. This focus will require companies to engage with new ecosystem partners inside and outside of their organisation.

• Reliability. As more of the local operations capabilities are directly supported remotely through the connected edge, reliability is a critical concern.

“Resilient edge resources are the foundation for shifting to digital-first, connected operations,” says Jennifer Cooke, Research Director, Edge Strategies, IDC. “Organisations will become vulnerable if and when their technology fails. To future-proof edge deployments, leaders must develop a strategy that addresses concerns, such as cybersecurity and connectivity issues, and ensures access to the skills required to maintain resilient edge infrastructures.”

How organisations can future-proof edge capabilities to support their transition to digital-first connected operations:

• Resilient, secure, sustainable power and connectivity resources: By including resilient power and connectivity resources early in the edge planning phases, companies can reduce the risk of downtime.

• Remote monitoring and management of edge resources at large scale: The ongoing management of edge infrastructure at scale will challenge all organisations. Having the right skills in the right place at the right time will be difficult if not impossible. Ensure that your edge resources are equipped to support continuous remote monitoring and autonomous operation.

• Trusted partners that can provide the necessary skills for the above edge resources: Consider trusted partners to provide industry best practices and service in situations or locations where it is not economically or physically feasible to do it yourself. Trusted service partners can often predict problems before they occur. Further, look for partners that also have a commitment to sustainability, since among those surveyed, 82% cited commitment to sustainability as a selection criterion for edge solution providers.

As a trusted partner and full solution provider, Schneider Electric works closely with customers in designing their strategies to ensure certainty, resiliency, security and sustainability throughout design, deployment and management at the edge via:

• Resilient, secure, connected and sustainable physical infrastructure solutions for any edge environment – delivering certainty in a connected world.

• A cloud-based monitoring and management platform, EcoStruxure IT, providing remote visibility including security, data-driven insights and recommendations, reporting capabilities and digital service capabilities.

• An integrated ecosystem composed of IT technology alliances, a global network of trusted, experienced channel partners and service engineers, as well as rules-based design tools.

DataQube to supply Edge Centres with data centre modules for US deployment
DataQube is actively supporting Edge Centres with its US expansion plans. The Australian firm has placed an order for 20 DataQube pods to provide an edge data centre offering, with further orders expected over the forthcoming months. DataQube’s unique solution will be integral to Edge Centres’ ambitious rollout plans by enabling the company to deploy multiple edge data centre and colocation facilities quickly and at scale.

DataQube has been selected as the preferred solution because of its short deployment times, its compelling price point, and its green credentials. DataQube’s design architecture removes the need for expensive property refurbishments to accommodate specialist HPC and associated cooling equipment. All IT, storage and power infrastructure is housed within secure and sterile units that satisfy all current building regulations and LEED standards. As such, DataQube installs can be fully operational within a nine-month timeframe and for 50% less upfront investment compared to conventional data centre build projects.

“We needed a partner that we could work with globally,” says Jon Eaves, CEO at Edge Centres. “DataQube and Edge Centres are aligned on a global roll-out plan starting in the US.”

The outer and inner structures of DataQube’s novel offering are manufactured from lightweight materials for portability and ease of assembly. The units are also supplied flatpack, permitting transportation in bulk. Moreover, the solution’s person-free layout enables optimal use of IT, reducing energy consumption and CO2 emissions by over 50% because the energy transfer is dedicated solely to powering computers. This equates to a PUE of less than 1.05, the lowest in the industry.

“We are delighted to be working with Edge Centres on this exciting new venture,” comments David Keegan, CEO of DataQube Global. “Deploying our podular data centres instead of commissioning a data centre from scratch not only makes commercial sense; DataQube’s green credentials will also help data centres of the future to become more sustainable by reducing energy consumption, not just switching source.”

DataQube has already set up manufacturing facilities in the US as part of its ESG and sustainability strategies.
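To put the PUE claim in perspective, the sketch below compares the non-IT overhead implied by a PUE of 1.05 with that of an assumed conventional facility at 1.5 (the comparison figure is an illustrative assumption, not from the announcement):

```python
# What a PUE below 1.05 means for non-IT overhead, illustrated against an
# assumed conventional PUE of 1.5 (that comparison value is not from the
# announcement). PUE = total facility power / IT power.
IT_LOAD_KW = 100.0    # hypothetical IT load

def overhead_kw(pue: float, it_kw: float = IT_LOAD_KW) -> float:
    """Cooling, power distribution and other non-IT load implied by a PUE."""
    return it_kw * (pue - 1.0)

print(f"Overhead at PUE 1.50: {overhead_kw(1.50):.0f} kW")   # 50 kW
print(f"Overhead at PUE 1.05: {overhead_kw(1.05):.0f} kW")   # 5 kW
```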


Aruba ESP delivers services for the protection of edge-to-cloud networks
Aruba has announced significant advancements to Aruba ESP (Edge Services Platform), with new functionality in Aruba Central to enable organisations to keep pace with rapidly changing business requirements. The new Aruba Central NetConductor allows enterprises to centralise the management of distributed networks with cloud-native services that simplify policy provisioning and automate network configurations in wired, wireless, and WAN infrastructures. Central NetConductor enables a more agile network while enforcing Zero Trust and SASE security policies. Aruba also revealed the industry’s first self-locating indoor access points (APs) with built-in GPS receivers and Open Locate, a proposed new industry standard for sharing location information from an AP to a device.

Digital acceleration driven by remote/hybrid work, new business models, and the demand for improved user experiences highlights the need for a more agile, flexible network. Aruba provides a comprehensive set of cloud-native services to deal with the complexity of multi-generational architectures and their attendant operations and security challenges. Traditional VLAN-based architectures require significant manual configuration and integration, are slow to adapt to new business connectivity requirements, and introduce potential security gaps. A modern, agile network employs a network “overlay” that seamlessly stitches together existing VLAN segments with cloud-native policy and configuration services, enabling users and devices to make secure and reliable connections from anywhere.

To help customers accelerate their digital transformation initiatives, Central NetConductor uses AI for management and optimisation, implements business-intent workflows to automate network configuration, and extends Aruba’s built-in security with cloud-native Network Access Control (NAC) and Dynamic Segmentation for fabric-wide enforcement. Because Central NetConductor is based on widely accepted protocols such as EVPN, VXLAN and BGP, it can be adopted seamlessly, preserving investments through its ability to operate with existing Aruba networks and third-party vendor infrastructures.

“In today’s business world, flexibility is paramount. Enterprises need to be able to shift gears, turn up new services and offerings, and serve new customers seemingly overnight. Because the network underpins everything, enabling critical connectivity and data-driven intelligence, it’s got to have the flexibility built-in,” says Maribel Lopez, founder of Lopez Research.
“Organisations today should look for standards-based solutions that give them technical flexibility and the ability to protect their investments and adopt new technologies at their own pace, but also options when it comes to consumption models.”

Three key principles of network modernisation

Static networks no longer meet growing business demands or support changing security requirements; therefore, organisations must be in a process of continuous network modernisation based on three main principles:

• Automation: Simplified workflows and AI-powered automation to reduce the time and resources required to plan, deploy, and manage networks that support remote, branch, campus, and cloud connectivity

• Security: Increased threat detection and protection with built-in identity-based access control and Dynamic Segmentation that are the foundation for Zero Trust and SASE frameworks

• Agility: Unified, cloud-native, standards-based architecture for investment protection and ease of adoption with NaaS consumption models to optimise budget and staff resources

Aruba Central NetConductor accelerates the deployment, management, and protection of modern, fabric-based networks by mapping capabilities to the three network modernisation principles:

• Automation: Intent-based workflows with “one-button” connectivity and security policy orchestration

• Security: Pervasive role-based access control extends Dynamic Segmentation for built-in Zero Trust and SASE security policy enforcement

• Agility: Cloud-native services for a single point of visibility and control, standards-based for ease of migration and adoption to preserve existing investments

Innovations in indoor location services

WLAN AP installation remains a manual process which is time-consuming, prone to error, and results in an unreliable reference for location-aware applications. To address this, Aruba has introduced the industry’s first self-locating indoor APs to simplify how organisations capture indoor location data and communicate information over the air to any mobile device or application. Aruba Wi-Fi 6 and Wi-Fi 6E APs use a combination of built-in GPS receivers, Wi-Fi Location support for fine time measurement and intelligent software to enable highly accurate, automated WLAN deployments. Aruba’s self-locating WLAN APs provide zero-touch determination of AP location, continuously validate and update location, and provide a set of universal coordinates that may be transposed on any building floor map or web mapping platform.

Accurate location of the WLAN infrastructure creates an anchored reference that is shared using Open Locate. Businesses can use the universal coordinates and anchored reference of Aruba’s self-locating indoor APs to easily develop or enhance asset tracking, safety/compliance, facility planning, venue experience apps or other location-aware services.

“Location is core to many app experiences and accurate indoor location unlocks many new and innovative enterprise use cases,” says Sean Ginevan, head of Global Strategy and Digital Partnerships for Android Enterprise at Google. “With Android 10, Google was first to fully support Wi-Fi RTT to enable precise indoor location on mobile devices. Aruba’s self-locating network infrastructure and the Open Locate initiative will help realize the vision of accurate, indoor location for our developer community and make it much easier to deploy these networks at scale. We can’t wait to see what developers build.”

“Enterprises have shown tremendous resiliency in the face of major disruptions and tectonic shifts within their businesses over the past two years, and it’s become clear that business agility is now top-of-mind for our customers,” comments David Hughes, Chief Technology and Product Officer at Aruba. “The advancements introduced today will help customers evolve their approach to a ‘services orientation’ using AI-powered solutions, strengthening security and accelerating the move to a cloud-centric network architecture, which are all hallmarks of a modern network.”
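For readers unfamiliar with fine time measurement, the principle behind Wi-Fi RTT ranging is simple: distance is inferred from the radio round-trip time. A generic sketch of that conversion (not Aruba’s or Android’s implementation) is below:

```python
# Generic illustration of Wi-Fi RTT / fine time measurement ranging:
# the AP-to-client distance is inferred from the measured round-trip time.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_m(round_trip_time_ns: float) -> float:
    """One-way distance implied by an RF round-trip time (ignores processing delay)."""
    return SPEED_OF_LIGHT_M_PER_S * (round_trip_time_ns * 1e-9) / 2

print(f"{distance_m(66.7):.1f} m")   # a ~67 ns round trip corresponds to ~10 m
```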

How to keep networks running as demand places pressure on the edge
By Alan Stewart-Brown, VP EMEA, Opengear

The traditional data centre has been a mainstay of computing and connectivity networks for decades, with most transaction processing carried out in a centralised core. Although core networks are essentially the backbone of any network, mobility, technological advancements and user demands have increased the need to add edge elements to the core. Gradual but growing adoption of new generation data-rich applications and IoT technologies has increased the demand for deployment of IT infrastructure closer to the end user. The move to remote working that we have seen since the pandemic began has, in turn, helped boost the move to the edge.

Edge computing is a distributed, open IT architecture that features decentralised processing power. Instead of transferring data to a data centre, IoT devices transfer it to a local connection point. The data is processed by a local computer, or server, at this edge location.

Nearer to the source

The advantage of this model is that, since the edge is specifically designed to be located closer to the user, it can provide much faster services and minimise latency by enabling real-time processing of large quantities of data that then communicate across a much shorter distance. At these edge compute sites, the most commonly found devices are network switches, routers, security appliances, storage and local compute devices. Unlike origin or cloud servers, which are usually located far from the devices that are communicating with them, the edge is located closer to the user for optimal data processing, processing power application or content delivery.

Edge computing brings data processing and information delivery functionality closer to the data’s source. It is the next generation of infrastructure for the internet and the cloud – and it is experiencing rapidly accelerated growth. We’ve already seen a massive migration to the edge during the pandemic and it is now widely reported that by 2025, 75% of all data will be processed there. COVID has boosted edge computing in other ways, of course. We have seen a boom in people moving away from shopping in big city high streets and prioritising convenience stores in their local area. We have also seen the growth of video streaming and an ongoing rise in online gaming. All of this has led to an increase in demand for computing power at the edge to drive these kinds of activities, which are increasingly happening in remote locations.

Moreover, edge computing processes data locally, which brings many benefits to a wide variety of industries. In the case of healthcare, edge computing allows organisations to access critical patient information in real-time rather than through an incomplete and slow database, while in retail, edge computing helps to improve customer experiences, increase operational efficiency, and strengthen security measures.

Finding a way forward

For all the reasons highlighted above, we are seeing computing power transitioning to the edge, and to edge data centres. But with this power comes an element of vulnerability. As consumers continue to demand faster, more efficient services and more IoT devices are added, a greater strain is put on organisations’ distributed IT networks, thereby increasing the likelihood of outages.
To keep edge data centres up and running, there is a clear need for organisations and service providers to put in place proactive monitoring and alerting, to ensure they can remediate networks without the need for truck rolls to send an engineer on site. Smart Out-of-Band (OOB) management tools can be used to diagnose a problem and remediate it, even when the main network is congested due to a network disruption or is down completely.

Failover to Cellular (F2C) provides continued internet connectivity for remote LANs and equipment over high-speed 4G Long Term Evolution (LTE) when the primary link is unavailable. Easily integrating with existing IT systems and network infrastructure, F2C restores WAN connectivity without the need for manual intervention. Organisations are also using a combination of automation and network operations (NetOps) for zero touch provisioning, effectively getting the network provisioned and up and running without having to do anything manually. Often, they will want to ‘zero touch provision’ their own devices. They will also want to use this technology for the orchestration of maintenance tasks and to automatically deliver remediation in the event of an equipment failure or other technical problem.

That effectively means organisations can ship new or replacement equipment to site and, using Smart OOB, quickly bring the site up via a secure cellular connection, allowing for the remote provisioning and configuration of the equipment in situ without having to send a skilled network engineer to site. This can deliver huge cost savings for many companies implementing new edge deployments, especially those trying to do so at pace across multiple geographies. Then, following deployment, if a problem develops that results in a loss of connectivity to the production network and cannot be resolved immediately, business continuity can be maintained, with organisations continuing to pass mission critical network traffic across the secure OOB LTE cellular connection.

Edge computing is poised to transform the data centre landscape and is already influencing network strategies. The concepts around the edge are not necessarily new but are increasingly relevant as IoT-connected systems continue to scale. Organisations are realising that relying on centralised data centres for the large amounts of sensor and endpoint data being collected simply isn’t realistic or cost effective.

What the future may bring

As cloud service offerings increase, content streaming grows and more IoT is integrated, organisations are challenged with diversifying their network initiatives. The more applications and devices that use an edge network, the greater the strain. As companies and organisations move more and more of their compute load from large data centres to edge compute locations, they must adjust their network management processes to ensure they continue delivering the always-on uptime that customers expect. To do this, they must use hybrid solutions that leverage internet and cloud-based connectivity, as well as physical infrastructure. A combination of NetOps and Smart OOB management ensures that organisations have always-on network access to deliver the network resilience needed for fast-evolving edge computing.
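The failover logic described above can be illustrated with a minimal sketch: probe the primary WAN link and switch to the cellular path when it is unreachable. The interface names and probe host are assumptions, and real Smart OOB/F2C appliances implement this in the device itself rather than in a script:

```python
# Minimal, illustrative failover check: prefer the primary WAN link and fall
# back to cellular when it is unreachable. Interface names and the probe host
# are hypothetical; production failover-to-cellular happens in the OOB device.
import subprocess

PRIMARY_IF, CELLULAR_IF = "eth0", "wwan0"   # assumed interface names
PROBE_HOST = "8.8.8.8"                      # any reliably reachable host

def link_is_up(interface: str) -> bool:
    """Send a single ping out of the given interface and report success."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", "-I", interface, PROBE_HOST],
        capture_output=True,
    )
    return result.returncode == 0

def choose_uplink() -> str:
    """Route via the primary link if healthy, otherwise fail over to cellular."""
    return PRIMARY_IF if link_is_up(PRIMARY_IF) else CELLULAR_IF

if __name__ == "__main__":
    print(f"Routing traffic via: {choose_uplink()}")
```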


