Aside from lockdowns and vaccines, distance might be one of the defining themes of the COVID-19 pandemic. Keeping our distance from one another has fundamentally changed the way we interact – in our personal lives and in our day-to-day work.
Social distancing requirements in the workplace have led companies to implement what we might call ‘office distancing’ – a combination of increased work-from-home (WFH) and remote office schemes in place of the traditional headquarters environment. As a result, organisations have in turn needed to address ‘data distance’: ensuring remote workers can access files and other necessary data with the same level of performance and security they would normally enjoy in the office.
The Challenges of Data Distancing
Large-scale data distancing across a globally distributed workforce results in a number of IT challenges:
Data security
Easy access to file data is crucial for employee productivity, but it should not come at the expense of security. Exposing an organisation’s global data fabric to remote devices creates a security challenge, as these edge locations by their nature lack strong physical security. It is vital for an organisation’s chosen file system to rigorously control the data that can be accessed at each remote node or endpoint. VPNs can meet the security requirement, but they are notoriously clunky. What is needed is a way to deliver both: extending corporate file systems to remote users securely without adversely impacting the user experience.
Overcoming network latency
Remote working models require enterprise IT teams to provide high-performance, interactive data services across greater distances than ever before. If your file storage is consolidated in a single, centralised datacentre, it is difficult to provide a high-speed user experience. This is due to network latency, which is a direct function of distance: light in optical fibre travels at roughly two-thirds of its speed in a vacuum, so a round trip between, say, London and Sydney costs well over 150ms before any server processing even begins.
Traditional file storage solutions were not built to handle the latencies and connectivity issues stemming from wider enterprise data topologies. Thus, file data should be located near the users to ensure a “local” file access experience. This can be achieved by manually moving the files, or preferably, by strategically deploying caching devices (more on this in a bit).
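To make the caching idea concrete, here is a minimal sketch of a read-through edge cache in Python. It is illustrative only: origin_fetch stands in for whatever high-latency call retrieves a file from the central datacentre, and the class name and TTL policy are assumptions, not any particular vendor’s API.

```python
import time

class EdgeFileCache:
    """Minimal read-through cache sketch: serve hot files locally, paying
    the WAN round trip to the central datacentre only on a miss."""

    def __init__(self, origin_fetch, ttl_seconds=300):
        self.origin_fetch = origin_fetch  # callable: path -> bytes (high-latency WAN call)
        self.ttl = ttl_seconds            # how long a cached copy is considered fresh
        self._cache = {}                  # path -> (data, fetched_at)

    def read(self, path):
        entry = self._cache.get(path)
        if entry is not None:
            data, fetched_at = entry
            if time.time() - fetched_at < self.ttl:
                return data               # cache hit: LAN-speed access
        data = self.origin_fetch(path)    # cache miss: one WAN round trip
        self._cache[path] = (data, time.time())
        return data
```

After the first access, subsequent reads of the same file are served at LAN speed; only the initial read pays the long-haul latency.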
Balancing consistency and availability
Distributed data fabrics vary in their levels of consistency and in how they deal with inconsistencies, such as two users concurrently editing the same file. One approach is to use an “eventual consistency” model and handle inconsistencies by creating a conflict file. Other solutions implement strict global locking, at the cost of availability and latency; a global locking service often becomes a single point of contention and is unreachable during network disconnections. This trade-off follows from the CAP theorem, which states that when a network partition (P) occurs, a distributed storage system must sacrifice either consistency (C) or availability (A) – it cannot guarantee both.
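To illustrate the eventual-consistency approach, here is a hedged sketch of how a sync engine might reconcile two replicas of one file, keeping a conflict copy rather than blocking writers behind a global lock. The version-numbering scheme and function names are invented for illustration, not drawn from any specific product.

```python
from dataclasses import dataclass

@dataclass
class FileVersion:
    content: bytes
    parent: int    # version this edit was based on
    version: int   # version assigned to this edit

def reconcile(path, local, remote, store):
    """Merge two replicas of a file. If only one side changed, adopt it;
    if both diverged from a common ancestor, keep the remote copy and
    preserve the local edit as a conflict file so nothing is lost."""
    if local.parent == remote.version:
        store[path] = local              # remote unchanged: local simply wins
    elif remote.parent == local.version:
        store[path] = remote             # local unchanged: fast-forward to remote
    else:
        store[path] = remote             # concurrent edits: a true conflict
        store[path + ".conflict"] = local  # users merge the copies manually
```

Availability is preserved because neither writer ever waits on the other; the price is an occasional conflict file to resolve after the fact.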
Migration
Migration from legacy systems to modern file solutions is one of the most significant challenges for any enterprise organisation. To ease migration issues, choose a solution with strong migration tools that retain security settings such as Windows ACLs, and that offers backward compatibility with existing filers by exposing data over the ubiquitous SMB and NFS protocols.
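As a concrete example of ACL retention, a lift-and-shift from a legacy Windows filer can carry the security descriptors along with the data. The sketch below assumes a Windows environment and simply wraps the standard robocopy tool; the staging paths shown are hypothetical.

```python
import subprocess

def migrate_share(src, dst):
    """Mirror a legacy SMB share while retaining NTFS security settings.
    /MIR mirrors the directory tree; /COPYALL copies data, attributes,
    timestamps, ACLs, owner and auditing info; /R and /W bound retries."""
    result = subprocess.run(
        ["robocopy", src, dst, "/MIR", "/COPYALL", "/R:2", "/W:5"]
    )
    # robocopy exit codes 0-7 indicate degrees of success; 8+ are failures
    if result.returncode >= 8:
        raise RuntimeError(f"migration of {src} failed ({result.returncode})")

# Hypothetical staging paths:
# migrate_share(r"\\legacy-filer\projects", r"\\edge-gateway\projects")
```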
5 top tips for implementing data distancing
Enterprises should keep the following in mind as they tackle these data distancing challenges:
1. Anywhere Availability:
Making data accessible to authorised users from anywhere – at HQ, branch offices or home – by using a global file system is increasingly becoming a necessity. In a global file system, files are cached at the edge (either on the endpoint or on regional caching nodes) to ensure low-latency access from anywhere. Caching also provides partition safety, allowing nodes to keep working offline if connectivity is lost and to re-synchronise once it is restored. This synchronisation ensures business continuity and facilitates global collaboration among remote users.
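The offline-and-resync behaviour can be sketched as a simple journal of pending changes, replayed when the connection returns. This is a toy model under assumed names (upload stands in for whatever call pushes a change to the global file system), not a real sync protocol.

```python
class OfflineAwareNode:
    """Toy model of an edge node that journals changes while disconnected
    and replays them against the global file system on reconnect."""

    def __init__(self, upload):
        self.upload = upload    # callable: (path, data) -> None; raises ConnectionError offline
        self.pending = []       # ordered journal of changes made while offline

    def write(self, path, data):
        try:
            self.upload(path, data)            # online: push the change immediately
        except ConnectionError:
            self.pending.append((path, data))  # offline: journal it locally

    def resync(self):
        """Replay journalled changes in order once connectivity returns."""
        while self.pending:
            path, data = self.pending[0]
            self.upload(path, data)            # raises again if still offline
            self.pending.pop(0)
```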
2. Security:
A zero-trust approach should be employed, in which remote nodes and endpoints can only access a strictly controlled subset of corporate information with explicit permission, rather than being granted access to the entire infrastructure.
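In practice this means default-deny policies evaluated per node. The fragment below sketches that shape; the node names, share paths and policy table are all hypothetical.

```python
# Hypothetical per-node export policy: each edge node or endpoint is granted
# an explicit subset of the namespace, never the whole infrastructure.
NODE_POLICY = {
    "branch-london":  {"/projects/emea", "/shared/templates"},
    "laptop-wfh-042": {"/home/abrand"},
}

def authorise(node_id, path):
    """Default deny: allow access only under an explicitly granted prefix."""
    granted = NODE_POLICY.get(node_id, set())
    return any(path == p or path.startswith(p + "/") for p in granted)

# authorise("branch-london", "/projects/emea/q3.xlsx")  -> True
# authorise("branch-london", "/finance/payroll.xlsx")   -> False
```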
3. Cloud Bursting:
Cloud bursting is a popular use case for organisations seeking to expand on-premises storage capacity without deploying additional on-premises infrastructure. Cloud bursting for compute also helps overcome data distancing challenges by letting heavy data crunching run in the cloud, away from the edge device, thereby improving local performance.
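The placement decision itself can be as simple as a threshold rule. The following toy function is an illustration of the idea rather than a real scheduler; the parameter names and threshold are assumptions.

```python
def place_job(local_queue_depth, dataset_in_cloud, burst_threshold=10):
    """Toy placement rule: burst heavy data crunching to the cloud when the
    edge device is saturated, or when the data already lives in the cloud,
    keeping the edge responsive for interactive file access."""
    if dataset_in_cloud or local_queue_depth > burst_threshold:
        return "cloud"   # crunch near the data, away from the edge device
    return "edge"
```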
4. Dark Data:
It is imperative that enterprises corral the dark data scattered across unknown bring-your-own and work-from-home devices, so that these invisible silos become a thing of the past. Enterprise data should be at the fingertips of all employees, everywhere, at all times.
5. Agility:
Solutions that enable agile collaboration on data between remote workers are vital, so that users at the organisation’s headquarters and in remote branches across the world can access the same file shares quickly and efficiently.
Remote work models are set to persist beyond the pandemic. With the right tools and technologies, data distancing can help enterprises keep their remote and distributed workforces productive while maintaining performance requirements and existing access control and security models.
By Aron Brand, CTO of CTERA