Edge Computing in Modern Data Centre Operations


DataQube to supply Edge Centres with 20 x edge data centre modules
DataQube is actively supporting Edge Centres with its US expansion plans. The Australian firm has placed an order for 20 x DataQube pods to provide an edge data centre offering, with further orders expected over the coming months. DataQube’s unique solution will be integral to Edge Centres’ ambitious rollout plans, enabling the company to deploy multiple edge data centre and colocation facilities quickly and at scale. DataQube was selected as the preferred solution because of its short deployment times, its compelling price point and its green credentials.

DataQube’s breakthrough design architecture removes the need for expensive property refurbishments to accommodate specialist HPC and associated cooling equipment. All IT, storage and power infrastructure is housed within secure, sterile units that satisfy all current building regulations and LEED standards. As such, DataQube installs can be fully operational within a nine-month timeframe and for 50% less upfront investment compared with conventional data centre build projects.

The outer and inner structures of DataQube’s novel offering are manufactured from lightweight materials for portability and easy assembly, and the units are supplied flatpack, permitting transportation in bulk. Moreover, the solution’s person-free layout enables optimal use of IT, reducing energy consumption and CO2 emissions by over 50% because the energy transfer is dedicated solely to powering computers. This equates to a PUE of less than 1.05, which the company says is the lowest in the industry.

“We needed a partner that we could work with globally,” says Jon Eaves, CEO at Edge Centres. “DataQube and Edge Centres are aligned on a global roll-out plan starting in the US.”

“We are delighted to be working with Edge Centres on this exciting new venture,” says David Keegan, CEO of DataQube Global. “Deploying our popular data centres instead of commissioning a data centre from scratch not only makes commercial sense; DataQube’s green credentials will help data centres of the future to become more sustainable by reducing energy consumption, not just switching source.”

DataQube has already set up manufacturing facilities in the US as part of its ESG and sustainability strategies.
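For readers unfamiliar with the metric, PUE (power usage effectiveness) is simply the ratio of total facility energy to the energy delivered to IT equipment, with 1.0 as the theoretical ideal. A minimal worked example is sketched below in Python; the figures are illustrative assumptions only, not DataQube measurements.

    # Illustrative PUE calculation (hypothetical figures, not DataQube data).
    # PUE = total facility energy / IT equipment energy; 1.0 is the ideal.
    it_energy_kwh = 1_000_000        # annual energy consumed by IT equipment
    overhead_energy_kwh = 45_000     # cooling, power distribution and other overheads
    total_facility_energy_kwh = it_energy_kwh + overhead_energy_kwh

    pue = total_facility_energy_kwh / it_energy_kwh
    print(f"PUE = {pue:.3f}")        # 1.045, i.e. below the 1.05 cited above

The closer the overheads shrink towards zero, the closer PUE approaches 1.0, which is why a layout that dedicates energy almost entirely to the IT load yields the low figure quoted.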

Aruba ESP delivers services for the protection of edge-to-cloud networks
Aruba has announced significant advancements to Aruba ESP (Edge Services Platform), with new functionality in Aruba Central to enable organisations to keep pace with rapidly changing business requirements. The new Aruba Central NetConductor allows enterprises to centralise the management of distributed networks with cloud-native services that simplify policy provisioning and automate network configurations in wired, wireless and WAN infrastructures. Central NetConductor enables a more agile network while enforcing Zero Trust and SASE security policies. Aruba also revealed the industry’s first self-locating indoor access points (APs) with built-in GPS receivers and Open Locate, a proposed new industry standard for sharing location information from an AP to a device.

Digital acceleration driven by remote/hybrid work, new business models and the demand for improved user experiences highlights the need for a more agile, flexible network. Aruba provides a comprehensive set of cloud-native services to deal with the complexity of multi-generational architectures and their attendant operations and security challenges. Traditional VLAN-based architectures require significant manual configuration and integration, are slow to adapt to new business connectivity requirements and introduce potential security gaps. A modern, agile network employs a network “overlay” that seamlessly stitches together existing VLAN segments with cloud-native policy and configuration services, enabling users and devices to make secure and reliable connections from anywhere.

To help customers accelerate their digital transformation initiatives, Central NetConductor uses AI for management and optimisation, implements business-intent workflows to automate network configuration, and extends Aruba’s built-in security with cloud-native Network Access Control (NAC) and Dynamic Segmentation for fabric-wide enforcement. Because Central NetConductor is based on widely accepted protocols such as EVPN, VXLAN and BGP, it can be adopted seamlessly, preserving existing investments through its ability to operate with existing Aruba networks and third-party vendor infrastructures.

“In today’s business world, flexibility is paramount. Enterprises need to be able to shift gears, turn up new services and offerings, and serve new customers seemingly overnight. Because the network underpins everything, enabling critical connectivity and data-driven intelligence, it’s got to have the flexibility built-in,” says Maribel Lopez, founder of Lopez Research.
“Organisations today should look for standards-based solutions that give them technical flexibility and the ability to protect their investments and adopt new technologies at their own pace, but also options when it comes to consumption models.”

Three key principles of network modernisation

Static networks no longer meet growing business demands or support changing security requirements; therefore, organisations must be in a process of continuous network modernisation based on three main principles:

Automation: Simplified workflows and AI-powered automation to reduce the time and resources required to plan, deploy and manage networks that support remote, branch, campus and cloud connectivity
Security: Increased threat detection and protection with built-in identity-based access control and Dynamic Segmentation that are the foundation for Zero Trust and SASE frameworks
Agility: Unified, cloud-native, standards-based architecture for investment protection and ease of adoption, with NaaS consumption models to optimise budget and staff resources

Aruba Central NetConductor accelerates the deployment, management and protection of modern, fabric-based networks by mapping capabilities to the three network modernisation principles:

Automation: Intent-based workflows with “one-button” connectivity and security policy orchestration
Security: Pervasive role-based access control that extends Dynamic Segmentation for built-in Zero Trust and SASE security policy enforcement
Agility: Cloud-native services for a single point of visibility and control, standards-based for ease of migration and adoption to preserve existing investments

Innovations in indoor location services

WLAN AP installation remains a manual process that is time-consuming, prone to error and results in an unreliable reference for location-aware applications. To address this, Aruba has introduced the industry’s first self-locating indoor APs to simplify how organisations capture indoor location data and communicate information over the air to any mobile device or application. Aruba Wi-Fi 6 and Wi-Fi 6E APs use a combination of built-in GPS receivers, Wi-Fi Location support for fine time measurement and intelligent software to enable highly accurate, automated WLAN deployments (a simple sketch of the fine time measurement arithmetic appears at the end of this article). Aruba’s self-locating WLAN APs provide zero-touch determination of AP location, continuously validate and update that location, and provide a set of universal coordinates that can be transposed onto any building floor map or web mapping platform.

Accurate location of the WLAN infrastructure creates an anchored reference that is shared using Open Locate. Businesses can use the universal coordinates and anchored reference of Aruba’s self-locating indoor APs to easily develop or enhance asset tracking, safety/compliance, facility planning, venue experience apps or other location-aware services.

“Location is core to many app experiences and accurate indoor location unlocks many new and innovative enterprise use cases,” says Sean Ginevan, Head of Global Strategy and Digital Partnerships for Android Enterprise at Google. “With Android 10, Google was first to fully support Wi-Fi RTT to enable precise indoor location on mobile devices. Aruba’s self-locating network infrastructure and the Open Locate initiative will help realize the vision of accurate, indoor location for our developer community and make it much easier to deploy these networks at scale.
We can’t wait to see what developers build.”

“Enterprises have shown tremendous resiliency in the face of major disruptions and tectonic shifts within their businesses over the past two years, and it’s become clear that business agility is now top-of-mind for our customers,” comments David Hughes, Chief Technology and Product Officer at Aruba. “The advancements introduced today will help customers evolve their approach to a ‘services orientation’ using AI-powered solutions, strengthening security and accelerating the move to a cloud-centric network architecture, which are all hallmarks of a modern network.”
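As a rough illustration of the fine time measurement (Wi-Fi RTT) ranging referenced above: the standard (IEEE 802.11mc) estimates the distance between an AP and a device from the round-trip time of a timestamped frame exchange. The sketch below shows only the underlying arithmetic with made-up timestamps; it is not Aruba's or Android's API, and the two clocks need not be synchronised because only differences within each device matter.

    # Rough sketch of 802.11mc fine time measurement (FTM) ranging arithmetic.
    # Timestamps are hypothetical; real deployments use the platform Wi-Fi RTT APIs.
    C = 299_792_458  # speed of light in m/s

    def ftm_distance_m(t1_ns, t2_ns, t3_ns, t4_ns):
        """t1: AP sends FTM frame, t2: device receives it,
        t3: device sends ACK, t4: AP receives ACK (all in nanoseconds)."""
        round_trip_ns = (t4_ns - t1_ns) - (t3_ns - t2_ns)  # subtract device turnaround
        one_way_s = (round_trip_ns / 2) / 1e9
        return one_way_s * C

    # Example: a ~67 ns one-way flight time corresponds to roughly 20 m.
    print(ftm_distance_m(t1_ns=0, t2_ns=67, t3_ns=10_067, t4_ns=10_134))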

How to keep networks running as demand places pressure on the edge
By Alan Stewart-Brown, VP EMEA, Opengear

The traditional data centre has been a mainstay of computing and connectivity networks for decades, with most processing carried out in a centralised core. Although core networks are essentially the backbone of any network, mobility, technological advances and user demands have increased the need to add edge elements to the core. Gradual but growing adoption of new-generation, data-rich applications and IoT technologies has increased demand for IT infrastructure deployed closer to the end user. The move to remote working that we have seen since the pandemic began has, in turn, helped accelerate the move to the edge.

Edge computing is a distributed, open IT architecture that features decentralised processing power. Instead of transferring data to a distant data centre, IoT devices transfer it to a local connection point, where it is processed by a local computer or server at the edge location.

Nearer to the source

Because the edge is specifically designed to be located closer to the user, it can provide much faster services and minimise latency by enabling real-time processing of large quantities of data that travel across a much shorter distance. At these edge compute sites, the most commonly found devices are network switches, routers, security appliances, storage and local compute devices. Unlike origin or cloud servers, which are usually located far from the devices communicating with them, the edge sits close to the user for optimal data processing, compute and content delivery. Edge computing brings data processing and information delivery closer to the data’s source. It is the next generation of infrastructure for the internet and the cloud, and it is experiencing rapidly accelerating growth.

We have already seen a massive migration to the edge during the pandemic, and it is now widely reported that by 2025, 75% of all data will be processed there. COVID has boosted edge computing in other ways, of course. We have seen a boom in people moving away from shopping in big-city high streets and prioritising convenience stores in their local area. We have also seen the growth of video streaming and an ongoing rise in online gaming. All of this has led to an increase in demand for compute power at the edge to drive these kinds of activities, which are increasingly happening in remote locations.

Moreover, edge computing processes data locally, which brings benefits to a wide variety of industries. In healthcare, edge computing allows organisations to access critical patient information in real time rather than through an incomplete and slow database, while in retail it helps to improve customer experiences, increase operational efficiency and strengthen security measures.

Finding a way forward

For all the reasons highlighted above, we are seeing computing power transition to the edge and to edge data centres. But with this power comes an element of vulnerability. As consumers continue to demand faster, more efficient services and more IoT devices are added, greater strain is put on organisations’ distributed IT networks, increasing the likelihood of outages.
To keep edge data centres up and running, there is a clear need for organisations and service providers to put in place proactive monitoring and alerting, so they can remediate networks without truck rolls to send an engineer on site. Smart Out-of-Band (OOB) management tools can be used to diagnose and remediate problems even when the main network is degraded or congested due to a disruption, or is down completely.

Failover to Cellular (F2C) provides continued internet connectivity for remote LANs and equipment over high-speed 4G Long Term Evolution (LTE) when the primary link is unavailable. Easily integrating with existing IT systems and network infrastructure, F2C restores WAN connectivity without the need for manual intervention.

Organisations are also using a combination of automation and network operations (NetOps) for zero-touch provisioning, effectively getting the network provisioned and up and running without having to do anything manually. Often they will want to ‘zero touch provision’ their own devices. They will also want to use this technology to orchestrate maintenance tasks and automatically deliver remediation in the event of an equipment failure or other technical problem. That effectively means organisations can ship new or replacement equipment to site and, using Smart OOB, quickly bring the site up via a secure cellular connection, allowing the equipment to be provisioned and configured remotely in situ without having to send a skilled network engineer to site. This can deliver huge cost savings for companies implementing new edge deployments, especially those trying to do so at pace across multiple geographies.

Following deployment, if a problem develops that results in a loss of connectivity to the production network and cannot be resolved immediately, business continuity can be maintained by continuing to pass mission-critical network traffic across the secure OOB LTE cellular connection.

Edge computing is poised to transform the data centre landscape and is already influencing network strategies. The concepts around the edge are not necessarily new, but they are increasingly relevant as IoT-connected systems continue to scale. Organisations are realising that relying on centralised data centres for the large amounts of sensor and endpoint data being collected simply isn’t realistic or cost-effective.

What the future may bring

As cloud service offerings increase, content streaming grows and more IoT is integrated, organisations are challenged to diversify their network initiatives. The more applications and devices that use an edge network, the greater the strain. As companies and organisations move more of their compute load from large data centres to edge compute locations, they must adjust their network management processes to ensure they continue delivering the always-on uptime that customers expect. To do this, they must use hybrid solutions that leverage internet and cloud-based connectivity as well as physical infrastructure. A combination of NetOps and Smart OOB management ensures that organisations have always-on network access to deliver the network resilience needed for fast-evolving edge computing.
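To make the failover-to-cellular behaviour described above concrete, the sketch below shows a minimal watchdog that probes the primary WAN link and swings the default route to a cellular interface when it fails, then fails back once the primary recovers. It is a conceptual illustration only, not Opengear's implementation; the interface names, probe host and use of Linux iproute2 are assumptions, and the commands would need root privileges.

    # Conceptual failover-to-cellular watchdog; not vendor code.
    # Interface names and probe target are placeholders; requires Linux and root.
    import subprocess, time

    PRIMARY_IF, CELLULAR_IF = "eth0", "wwan0"   # hypothetical interface names
    PROBE_HOST = "8.8.8.8"                      # any reliably reachable host

    def link_up(interface: str) -> bool:
        """Ping the probe host out of a specific interface; True if it answers."""
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", "-I", interface, PROBE_HOST],
            capture_output=True,
        )
        return result.returncode == 0

    def set_default_route(interface: str) -> None:
        """Point the default route at the chosen interface (iproute2)."""
        subprocess.run(["ip", "route", "replace", "default", "dev", interface], check=False)

    active = PRIMARY_IF
    while True:
        if active == PRIMARY_IF and not link_up(PRIMARY_IF):
            set_default_route(CELLULAR_IF)      # fail over to 4G/LTE
            active = CELLULAR_IF
        elif active == CELLULAR_IF and link_up(PRIMARY_IF):
            set_default_route(PRIMARY_IF)       # fail back once the primary recovers
            active = PRIMARY_IF
        time.sleep(10)

A production out-of-band appliance does considerably more (console access, congestion detection, secure tunnelling), but the probe-and-reroute loop captures the basic idea of keeping a site reachable without a truck roll.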

Why Secure Access Service Edge is key for a distributed workforce
Written by Daniel Blackwell, Product Manager, Network and Security at Pulsant, on using edge to transform networks.

The huge shift to remote working and the increased sophistication of SaaS applications used by employees on vastly extended networks present significant security challenges. Supercharged by the pandemic, these major trends have left many businesses struggling to address the long-term security risks generated by such an expanded attack surface.

The problem is that thousands of employees are now working from uncontrolled environments, frequently using their own devices and almost certainly relying on domestic networks. Personal devices and home broadband lack the security protocols and controls that apply to corporate devices and networks, making them more vulnerable to cyberattacks. Internet access is often shared with other devices, while home networks either have weak passwords or none at all, and are generally configured without encryption. All these vulnerabilities provide multiple avenues of attack on a corporate network, many of which are easier to exploit than other methods employed by criminals or activist hackers.

The picture for IT chiefs is further complicated by the use of multiple cloud vendors and the steadily growing adoption of hybrid infrastructures for sound business reasons. This further compounds the vulnerabilities of an expanded surface, with multiple ingress points to distributed business information and systems, all of which need to be controlled and monitored.

Removing the IT headache

For IT teams these developments are problematic. Applying security policies to each employee working remotely can be complex and costly. For example, applying the same policies and controls could require deploying a firewall at each employee’s home, which is expensive and generates huge management overheads. The alternative of providing each employee with a remote VPN connection back to a central office location goes against the flow of what businesses need today for increased agility and cost-effectiveness. As organisations increasingly move to decentralised services employing SaaS applications and public cloud, there is little sense in routing traffic back through an office location.

The role of SASE

Secure Access Service Edge (SASE) is increasingly emerging as a solution to most of these difficulties, enabling organisations to apply security policies to employees wherever they are working, using a centralised management policy. Adoption of SASE remains cautious, however, largely because there is no settled definition of what it is, nor has it been standardised, causing significant confusion about the benefits it can bring. Depending on who you want to believe, SASE comprises all or most of the following technologies: secure web gateways (SWGs); web filtering; cloud access security brokers (CASBs); firewall-as-a-service (FWaaS); and Zero Trust Network Access (ZTNA). Many organisations will already have some of these applications in place, but not in a unified, cloud-based solution that provides genuine control, visibility and management, removing the drudgery and cost of overseeing and administering them separately.

Gartner defines SASE as an extension of SD-WAN to include other network security controls and services that can be centrally managed through the same SD-WAN management plane. This covers the essential elements of network and application optimisation, access control and the vital requirement for the IT team to have full visibility.
With these capabilities, troubleshooting becomes much quicker and more effective.

Unfortunately, many vendors have jumped on the SASE bandwagon in what are often little more than rebranding exercises. They slap the SASE label on cloud-based security solutions that are not managed through a single dashboard and still involve multiple separate products. Others claim to provide SASE without an SD-WAN offering, while yet more offer elements of SASE but not the full product range. In the current market, very few vendors provide SASE matching Gartner’s full definition. This does not mean, however, that SASE is something organisations should disregard; instead it should be seen as a framework for building a solution that helps solve the security complexities introduced by modern working.

Zero trust and the edge

SASE is fundamentally about the application and the user. With SD-WAN, the primary purpose is to have control over the application and apply routing policies to ensure the right applications obtain the best possible path. This optimises performance for the end user and enables organisations to upgrade or implement new applications efficiently and quickly. True SASE means applying the same principles of efficiency and agility to security controls. The application and the user are still considered, but more specifically it is about ensuring the right user has access to the right applications, and only those applications. This implementation of the zero-trust approach can be broken down further: to the right device, at the right time of day, from the right network, with access to applications and web services restricted based on the security posture of the user, device and destination.

The physical location of the SASE ‘engine’ should also be considered. The term cloud implies that something is located everywhere, while in the UK it typically means it is hosted in one location. By having regional points of presence, the enforcement of security policies is distributed closer to each user wherever they are working. Using this approach, organisations can stop employees from accessing known bad web services, regardless of location, removing the risk of downloading malicious files or applications. If malware does get through and a device is breached, access can be revoked, preventing attackers from gaining access to applications or services.

Securing the edge

Genuine SASE forms a comprehensive package that combines a variety of solutions, and as organisations move towards distributed and decentralised applications, SASE and SD-WAN provide agile and flexible central controls. These are vital attributes. Remote working policies are now permanent and widespread, and before too long SASE and SD-WAN will enable IT and security teams alike to bring security protocols closer to users. The outcome will be a highly resilient network that optimises the edge, truly supports its users and protects them from emerging and increasingly sophisticated cyber threats, whether they are at home, on the road, in a branch office or at headquarters.
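The "right user, right device, right time, right network" logic described above can be pictured as a single policy decision evaluated for every access request. The sketch below is a deliberately simplified illustration of that idea, not any vendor's SASE engine; the attribute names, roles and working-hours rule are invented for the example.

    # Simplified zero-trust access decision; attribute names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        user_role: str          # identity, e.g. "finance"
        device_managed: bool    # corporate-managed and patched?
        hour: int               # local hour of day, 0-23
        network_trusted: bool   # corporate or otherwise known-good network?
        application: str        # requested application

    # Which roles may reach which applications (least privilege).
    APP_POLICY = {
        "finance": {"erp", "payroll"},
        "engineering": {"git", "ci"},
    }

    def allow(req: AccessRequest) -> bool:
        """Grant access only when identity, device posture, time and network all check out."""
        return (
            req.application in APP_POLICY.get(req.user_role, set())
            and req.device_managed
            and 7 <= req.hour <= 19        # example working-hours constraint
            and req.network_trusted
        )

    print(allow(AccessRequest("finance", True, 10, True, "payroll")))  # True
    print(allow(AccessRequest("finance", True, 10, True, "git")))      # False: not their app

In a real SASE deployment the same decision would also weigh device posture signals and destination risk, and would be enforced at a regional point of presence close to the user rather than in application code.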

The benefits of application-aware networks and their link to edge computing
By Daniel Blackwell, Pulsant

The migration to hybrid working in thousands of organisations is set to have many consequences, including a surge in the use of SaaS, cloud services and distributed applications. The adoption of a mixture of office and remote working may once have looked ephemeral, but a McKinsey global survey of senior executives in large corporations found nine in 10 intend to continue with a combination of on-site and remote working beyond the pandemic. Businesses that had maintained connectivity and facilitated ‘microtransactions’ between employees through the most trying times were found by McKinsey to have sustained higher levels of productivity.

Such a major change in the way enterprises function is only made possible by the user-friendly effectiveness of today’s ever-expanding galaxy of big-name business applications and the increased adoption of collaboration tools such as Slack, Dropbox, Zapier and Trello. All these applications depend on fast, high-bandwidth networks that are resilient and available.

You need to see the applications on your network

As a result of this embedding of hybrid work practices, it is now more important than ever that organisations know where applications are moving throughout their infrastructure and how best to manage and control them to deliver optimal performance. The reliance on applications is increasing pressure to ensure performance, reliability and security. This means focusing attention on networks to avoid lacklustre performance. For network operators, this dictates a shift towards application-aware networks that provide detailed reporting and intelligence to route applications down the best path.

In the digital economy, application experience can make or break a business. Yet achieving visibility over applications isn’t easy. It often takes far too long to troubleshoot and identify the root cause of a latency or performance problem and develop a resolution. Greater visibility from an application-aware network allows businesses to understand and fix application issues faster, saving them time and the cost of traditionally complex troubleshooting processes.

Security is a concern too

Security is also a major concern along an extended attack surface that may include hundreds of connections to employees’ homes. Many home networks use easily guessed or default passwords, or may be configured without encryption, providing a far easier avenue for an attacker to gain access to a corporate network. Applying security policies to each remote worker can be complex and expensive. For example, applying the same policies and controls could require deploying a firewall at each employee’s home, which is not only costly but creates substantial management overheads. Alternatively, each employee could be provided with a remote VPN connection back to a central office location, but as organisations increasingly move to decentralised services with SaaS and public cloud, it doesn’t make sense to route traffic back through an office location.

The role of SD-WAN

Organisations now need to resolve these difficulties through the implementation of application-aware networks. Their primary route is through SD-WAN technology (software-defined networking in a wide area network), which gives visibility over applications and enables organisations to control and direct traffic intelligently and securely from a central location across the WAN.
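To make the idea of application-aware routing concrete, the sketch below scores candidate WAN paths against per-application requirements and picks the cheapest compliant path. It is a simplified illustration of the principle, not a Pulsant product or any SD-WAN vendor API; the application classes, path names and thresholds are assumptions for the example.

    # Simplified application-aware path selection; figures and names are illustrative.
    APP_REQUIREMENTS = {
        "voice":  {"max_latency_ms": 50,  "max_loss_pct": 1.0},   # business-critical, delay-sensitive
        "backup": {"max_latency_ms": 500, "max_loss_pct": 5.0},   # bulk, delay-tolerant
    }

    PATHS = {
        "mpls":      {"latency_ms": 20,  "loss_pct": 0.1, "cost": 3},
        "broadband": {"latency_ms": 60,  "loss_pct": 0.8, "cost": 1},
        "lte":       {"latency_ms": 120, "loss_pct": 2.0, "cost": 2},
    }

    def best_path(app: str) -> str:
        """Choose the cheapest path that still meets the application's requirements."""
        req = APP_REQUIREMENTS[app]
        candidates = [
            name for name, p in PATHS.items()
            if p["latency_ms"] <= req["max_latency_ms"] and p["loss_pct"] <= req["max_loss_pct"]
        ]
        return min(candidates, key=lambda name: PATHS[name]["cost"]) if candidates else "mpls"

    print(best_path("voice"))   # mpls: the only path inside the 50 ms / 1% envelope
    print(best_path("backup"))  # broadband: cheapest path that still qualifies

A real SD-WAN controller measures latency, jitter and loss continuously and re-evaluates these decisions per flow, but the policy shape, match the application, then steer it to the best compliant path, is the same.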
Unlike traditional WAN architectures, which lack the central visibility and control required for distributed IT environments, SD-WAN delivers a step change for businesses, providing the agility to configure and make changes to multiple devices at the push of a button, saving time and increasing efficiency. Organisations can enforce policy based on user experience, with network priority given to the most business-critical applications so they avoid problems such as jitter, lag or brownouts. And because they can reduce the time required for configuration and troubleshooting, businesses employing SD-WAN benefit from significant operational cost savings. Rolling out new applications becomes quicker and less costly across multiple sites. As more organisations adopt SaaS and cloud-based services, SD-WAN and application-aware networking are therefore becoming business-critical necessities.

The role of edge computing

SD-WAN is the cornerstone of the application-aware network. By understanding what applications are used across the network, organisations can classify them and apply appropriate application tuning to ensure optimum performance for each user. However, application-aware networking can also work alongside an edge computing strategy to drive further efficiencies. Edge computing is the confluence of cloud and physical data, which exists wherever the digital and physical worlds intersect, and it enables data to be collected, generated and processed close to the end user to create new value.

Whereas it would previously have been impossible to sustain the high-speed data transfers necessary for AI-driven applications across almost all of the UK, edge data centres can now run analytics locally once models have been trained on masses of data in the public cloud. These advanced capabilities open the door to industrial IoT applications such as digital twin technologies that reshape manufacturing and logistics operations, or advanced automation that transforms the efficiency of manufacturing, extraction and refining processes, even at isolated sites.

Edge can work independently of SD-WAN and application-aware networking or in conjunction with them, enabling organisations to identify and prioritise application traffic. This has proved well suited to the multicloud environment that many large enterprises increasingly adopt. SD-WAN in the core network of an edge platform, and at the on-ramp to the public cloud, will underpin high application performance for an organisation regardless of its location, overcoming potential latency or congestion problems with data that must be backhauled to a hyperscaler’s hub for processing. Security is significantly strengthened through monitoring and encryption between different sites.

The advantages of application-aware networks have become obvious

Triggered by the pandemic, the expansion of hybrid working has made the gains of application-aware networks obvious. As networks become increasingly software-defined and edge computing platforms expand and become fully operational, businesses benefiting from SD-WAN have access to far greater levels of application intelligence to improve connectivity, efficiency and performance. They enjoy faster resolution of the network problems hindering application performance and a reduction in the strain placed on their workloads.
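The "train centrally, run analytics locally" pattern described above can be reduced to a simple shape: heavy training happens in the public cloud, and only the resulting model parameters travel to the edge site, where inference runs against local data with no round trip. The sketch below uses a deliberately trivial threshold model to illustrate the data flow; it is an assumption-laden illustration, not a description of any particular analytics stack.

    # Minimal "train in the cloud, infer at the edge" sketch with a trivial anomaly model.
    from statistics import mean, stdev

    def train_in_cloud(history: list[float]) -> dict:
        """Cloud side: fit a simple threshold model on a large historical dataset."""
        mu, sigma = mean(history), stdev(history)
        return {"mean": mu, "stdev": sigma}     # only these few values travel to the edge

    def infer_at_edge(model: dict, reading: float) -> bool:
        """Edge site: flag a sensor reading locally, with no round trip to the cloud."""
        return abs(reading - model["mean"]) > 3 * model["stdev"]

    model = train_in_cloud([20.1, 19.8, 20.4, 20.0, 19.9, 20.2])   # hypothetical sensor history
    print(infer_at_edge(model, 20.3))   # False: within normal range
    print(infer_at_edge(model, 27.5))   # True: anomalous, caught locally in real time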
By combining the operational and visibility benefits of application-aware networks and SD-WAN, with the low latency and high bandwidth of edge computing, businesses can offer new levels of customer experience and service. This becomes possible almost regardless of the strength of their network connection. They can deploy powerful new applications quickly and with full confidence in their performance and resilience. This is a major advantage, freeing almost everyone in an organisation to focus on creating value and advancing digital transformation.


