Monday, March 10, 2025

Designing, planning and testing edge data centres

Author: Beatrice

By Carsten Ludwig, Market Manager, Reichle & De-Massari

Edge data centres provide computing power on the periphery of cloud and Wide Area Networks, relieving those networks and improving performance. They are located as closely as possible to points where data is aggregated, analysed, or processed in real time. Popular content and applications, for example, can be cached closer to less densely networked markets, improving performance and user experience. Let’s examine some considerations when designing, planning and testing edge data centres.

Location

Edge providers may operate dozens or hundreds of edge data centres concurrently across urban and suburban locations, which can be hard to reach and work in, so edge data centres need to be exceptionally robust and secure. Proximity or direct connection to fibre optic links and network node points is imperative. Edge data centres need redundant, synchronous fibre hyperconnectivity in all directions: to the cloud, cellular phone networks, neighbouring data centres and users. These factors pose a significant challenge for planners and design engineers.

For planning purposes, edge providers require a tool that reflects all the above-mentioned demands and preconditions. Planning quality improves if drawings and data for subsequent material sourcing come from a single tool. Testing the fibre links is recommended at this stage, to ensure they work correctly and deliver the promised performance. The tested quality of the components used determines the performance and functional reliability of the optical links.

Technical requirements

Edge data centres often have to cope with a lack of space and harsh environmental conditions. They need to be positioned in protected, discreet, dry places and the following must be provided:

  • Interruption-free power supply
  • Fire protection
  • Air-conditioning and cooling
  • Sound, dust and vibration protection
  • Locking and access control

A professional approach to securing high performance from the outset is to use preconfigured, pre-assembled modular systems. These could consist of pre-terminated panels, sub-racks or complete racks, if logistics and site design allow. Preconfigured equipment can be delivered by the OEM with relevant test certificates, ensuring a high level of quality and vastly simplifying installation, as no testing is required on site. This approach does, however, require professional installation capability from the service team. Properly configured and tested modules increase quality, significantly reduce the risk of failure and reduce the workload on site.

High density and port capacity

AFCOM’s ‘2022 State of the Data Centre’ study noted a significant density increase at the edge. In 2021, respondents implementing or planning edge locations reported an estimated mean power density of 7kW; in 2022, this had risen to 8.3kW. Edge data centre fibre hyperconnectivity requires space for high-count fibre cables under floors and in cable ducts.

For edge networks moving content such as HDTV programmes closer to the end user, a density of more than 100 ports per rack unit is essential. Current high-density fibre solutions for data centres generally offer up to 72 LC duplex ports per rack unit, which won’t suffice, and even that density can introduce management difficulties.
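To make the space impact of port density concrete, the following sketch compares the rack units consumed at the conventional 72-port density against a higher, VSFF-class density. The fibre count and the 144-port figure are illustrative assumptions, not vendor specifications.

```python
# Illustrative sketch: rack units consumed at different patching densities.
# The 864-port requirement and 144-port VSFF-class density are assumptions.
import math

def rack_units_needed(duplex_ports_required: int, ports_per_ru: int) -> int:
    """Rack units needed to present the required number of duplex ports."""
    return math.ceil(duplex_ports_required / ports_per_ru)

legacy = rack_units_needed(864, 72)   # conventional LC duplex density
dense = rack_units_needed(864, 144)   # assumed VSFF-class density

print(legacy, dense)  # → 12 6
```

Halving the rack space needed for patching matters most precisely at the edge, where floor space and duct capacity are scarcest.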

Pretermination by the OEM would be ideal. On-site testing remains possible, though test equipment needs adapters to serve newer connectivity solutions such as the VSFF connector family. Connectivity can also be secured using intelligent AIM systems for monitoring layer-one performance. Besides the connectivity check ‘outside of the data stream’, edge providers also gain an overview of what’s happening within connectivity ‘inside of the data stream’. There are several ways of realising this, from a low-budget approach using TAP modules to high-performing 24/7 signal analysers. Each edge location has a unique design and service to deliver, so the approach has to be selected accordingly.

Testing

To ensure quality and performance levels, testing is essential. In Reichle & De-Massari’s experience, new data centre builds rarely go according to schedule. If part of the process is pulled forward or delayed, it introduces challenges related to component quality and performance. The installation of sensitive equipment, such as fibre connectivity that needs to be 100% clean, might have to take place in an environment insufficiently free of dust and moisture, for example. It’s important to determine what tests can be done up front to avoid hassle on site. Optical connectors and adapters can be checked for insertion loss and other standard KPIs before delivery by the OEM. Even if equipment has been preconfigured, testing on site in the event of schedule changes isn’t just smart, it should be mandatory. That avoids issues, and with them delays and finger-pointing between the parties involved.
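Insertion-loss screening of the kind described above lends itself to simple automation once measurements are exported from the test set. The sketch below is a hypothetical example; the 0.35 dB per-connector budget and the link IDs are illustrative assumptions, not values from any particular standard or vendor.

```python
# Hypothetical sketch: screening exported insertion-loss measurements
# against a link budget. The 0.35 dB budget and link IDs are assumptions.

CONNECTOR_IL_BUDGET_DB = 0.35  # assumed pass/fail threshold per connection

def screen_links(measurements: dict[str, float]) -> list[str]:
    """Return IDs of links whose insertion loss exceeds the budget."""
    return [link for link, il_db in measurements.items()
            if il_db > CONNECTOR_IL_BUDGET_DB]

measured = {"A01": 0.21, "A02": 0.48, "A03": 0.33, "A04": 0.70}
failed = screen_links(measured)
print(failed)  # → ['A02', 'A04']  links to re-inspect, re-clean or re-terminate
```

Running such a check against OEM test certificates before delivery, and again after any on-site rework, keeps the pass/fail criteria identical at both stages.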

Management

Cable management is key. Double-check measurements, make sure terminations are top quality, test wherever necessary, label and colour-code, watch out for cramped conduits and make absolutely sure no cables or bundles rest upon others. Bad cable management can result in signal interference and crosstalk, damage and failure, leading to data transmission errors, performance issues and downtime. Introducing operation management systems provides seamless 24/7 performance status for each location. As these locations are distributed in line with the nature of the new network architecture, performance management should not only focus on standard applications such as power, cooling and access reports: every aspect of data connectivity needs to be covered. Solutions that monitor data flow (such as TAP modules) are mandatory.

Because an edge provider’s service team is not permanently on site, remote control of all relevant aspects at each location is mandatory and a precondition for customer-relevant performance. Remote control needs to cover all edge locations in one system. On one hand, this helps monitor the status of all relevant dimensions such as power supply, temperature conditions, data access, data flow and security. On the other hand, current and upcoming installations at each edge site are monitored by a single system, giving insight for asset and capacity management and serving as a basis for further extensions and new or changing customers at each site.
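The single-system view described above can be sketched as a simple aggregation: each site reports a status record, and the management layer flags any dimension that is out of range. All thresholds, site names and status fields below are illustrative assumptions, not part of any real monitoring product.

```python
# Illustrative sketch: aggregating status reports from many edge sites
# into one fleet view. Thresholds, field names and sites are assumptions.

TEMP_MAX_C = 27.0    # assumed upper bound for inlet temperature
UPS_MIN_PCT = 20.0   # assumed minimum UPS battery reserve

def alarms(site_status: dict) -> list[str]:
    """Return the out-of-range dimensions for one site's status report."""
    issues = []
    if site_status["temp_c"] > TEMP_MAX_C:
        issues.append("temperature")
    if site_status["ups_pct"] < UPS_MIN_PCT:
        issues.append("ups")
    if not site_status["door_secure"]:
        issues.append("access")
    return issues

fleet = {
    "edge-site-01": {"temp_c": 24.5, "ups_pct": 95.0, "door_secure": True},
    "edge-site-02": {"temp_c": 29.1, "ups_pct": 15.0, "door_secure": True},
}

report = {site: alarms(status) for site, status in fleet.items()
          if alarms(status)}
print(report)  # → {'edge-site-02': ['temperature', 'ups']}
```

Because every site feeds the same aggregation, the same report also doubles as an asset and capacity view: a site that never alarms but runs close to its thresholds is a candidate for extension.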

www.rdm.com


