Tuesday, April 29, 2025

Cloud


Navigating the enterprise approach to public, hybrid and private clouds
By Adriaan Oosthoek, Chairman at Portus Data Centers

With the rise of public cloud services offered by industry giants like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), many organisations have migrated, or are considering migrating, their workloads to these platforms. However, the decision is not always straightforward, as factors like cost, performance, security, vendor lock-in and compliance come into play. Increasingly, enterprises must think about which workloads belong in the public cloud, the costs involved, and when a private cloud or hybrid cloud setup might be a better fit.

Perceived benefits of public cloud

Enterprises view the public cloud as a versatile solution offering scalability, flexibility and accessibility. Workloads with variable demand patterns, such as certain web applications, mobile apps and development environments, are well suited to the public cloud. The ability to quickly provision resources and pay only for what is used can make it an attractive option for some businesses and applications. Public cloud platforms also typically provide a vast array of managed services, including databases, analytics, machine learning and AI, enabling enterprises to innovate rapidly without the burden of managing underlying infrastructure. This is a key selling point of public cloud offerings.

But how have enterprises' real-life experiences of public cloud set-ups compared with these expectations? Many have found the 'pay-as-you-go' pricing model to be very expensive and to have led to unexpected cost increases, particularly when workloads and usage spike unexpectedly or customer packages have been provisioned inefficiently. If not carefully managed, the costs of public cloud services tend to balloon quickly. Public cloud providers, and enterprises that have adopted public cloud strategies, are naturally seeking to address these concerns.
Enterprises are increasingly adopting cloud cost management strategies, including using cost estimation tools, implementing resource tagging for better visibility, optimising instance sizes, and utilising reserved instances or savings plans to reduce costs. Cloud providers offer pricing calculators and cost optimisation recommendations to help enterprises forecast expenses and increase efficiency. Despite these efforts, the public cloud has proved far more expensive for many organisations than originally envisaged, and managing costs effectively in public cloud set-ups requires considerable oversight, ongoing vigilance and continual optimisation.

When private clouds make sense

There are numerous situations where a private cloud environment is the more suitable and cost-effective option. Workloads with stringent security and compliance requirements, such as those in regulated industries like finance, healthcare or government, often necessitate the control and isolation provided by a private cloud environment, hosted in a local data centre on servers owned by the user. Many workloads with predictable and steady resource demands, such as legacy applications or mission-critical systems, may not need the flexibility of the public cloud and could incur much higher costs there over time. In such cases, a private cloud infrastructure offers far greater predictability and cost control, allowing enterprises to optimise resources for their specific requirements. And last but not least, once workloads are in the public cloud, vendor lock-in occurs: it is notoriously expensive to repatriate workloads back out of the public cloud, mainly due to excessive data egress costs.

Hybrid cloud

It is becoming increasingly clear that most organisations will benefit from a hybrid cloud setup. Simply put, 'horses for courses'.
Only put those workloads that will benefit from its specific advantages into the public cloud, and keep the other workloads under their own control in a private environment. Retaining a private environment does not require an enterprise to own or run its own data centre. Rather, it can take capacity in a professionally managed, third-party colocation data centre located in the vicinity of its own premises. Capacity in a colocation facility will generally be more resilient, efficient, sustainable and cost-effective than operating an in-house facility. The private cloud infrastructure itself can also be outsourced as a private instance. This is where regional and edge data centre operators such as Portus Data Centers come to the fore.

In most cases, larger organisations will end up with a hybrid cloud IT architecture to benefit from the best of both worlds. This will require careful consideration of how to seamlessly pull those workloads together through smart networking. Regional data centres with strong network and connectivity options will be crucial to serving this demand for local IT infrastructure housing.

The era of enterprises going all-in on the cloud is over. While the public cloud offers scalability, flexibility and access to cutting-edge technologies, concerns about cost, security, vendor lock-in and compliance persist. To mitigate them, enterprises must carefully evaluate their workloads and determine the most appropriate hosting environment for each.
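The pricing trade-off at the heart of the cost discussion above, pay-as-you-go versus a reserved commitment, comes down to a simple break-even calculation. The sketch below uses invented rates purely for illustration; real cloud prices vary by provider, region and instance type.

```python
# Break-even between pay-as-you-go and a reserved commitment.
# All rates are hypothetical, for illustration only.

ON_DEMAND_PER_HOUR = 0.10   # $/hour, billed only while running
RESERVED_PER_HOUR = 0.06    # $/hour effective rate, billed around the clock
HOURS_PER_MONTH = 730

def monthly_cost(utilisation: float) -> tuple[float, float]:
    """Return (on_demand, reserved) monthly cost for a utilisation in [0, 1]."""
    on_demand = ON_DEMAND_PER_HOUR * HOURS_PER_MONTH * utilisation
    reserved = RESERVED_PER_HOUR * HOURS_PER_MONTH  # paid whether used or not
    return on_demand, reserved

def break_even_utilisation() -> float:
    """Utilisation above which the reserved rate becomes cheaper."""
    return RESERVED_PER_HOUR / ON_DEMAND_PER_HOUR
```

At these invented rates, reserved capacity only pays off above 60% utilisation, which is why steady, predictable workloads suit commitments (or private infrastructure) while spiky ones suit pay-as-you-go.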

Own Company empowers customers to capture value from their data
Own Company, a SaaS data platform, has announced a new product, Own Discover, that reflects the company's commitment to empowering every company operating in the cloud to own its own data. With Own Discover, the company is expanding its product portfolio beyond its backup and recovery, data archiving, seeding, and security solutions to help customers activate their data and amplify their business. Businesses will be able to use their historical SaaS data to unlock insights, accelerate AI innovation, and more, in an easy and intuitive way.

Own Discover is part of the Own Data Platform, giving customers quick and easy access to all of their backed-up data in a time-series format so they can:

- Analyse their historical SaaS data to identify trends and uncover hidden insights
- Train machine learning models faster, enabling AI-driven decisions and actions
- Integrate SaaS data with external systems while maintaining security and governance

"For the first time, customers can easily access all of their historical SaaS data to understand their businesses better, and I'm excited to see our customers unleash the potential of their backups and activate their data as a strategic asset," says Adrian Kunzle, Chief Technology Officer at Own. "Own Discover goes beyond data protection to active data analysis and insights and provides a secure, fast way for customers to learn from the past and inform new business strategies and growth."

Vultr announces new CDN in race to be the next hyperscaler
Vultr, a privately held cloud computing platform, has announced the launch of Vultr CDN. This content delivery service pushes content closer to the edge without compromising security. Vultr now enables global content and media caching, empowering its worldwide community with services for scaling websites and web applications.

Traditional content delivery networks are incredibly complex, leaving businesses and web developers struggling to configure, manage, and optimise infrastructure cost-effectively and in a timely manner. They require immediate access to a powerful, scalable, and global content delivery network to accelerate digital content distribution and keep up with customer demand.

The launch of Vultr CDN marks the next phase of the company's growth as a leading cloud computing platform. By adding global content caching and delivery to Vultr's existing cloud infrastructure, the service simplifies infrastructure operations with unbeatable price-to-performance starting at $10/month, with the industry's lowest bandwidth costs. For those requiring the highest-performance CPUs, Vultr also offers unique high-frequency plans powered by high clock speed CPUs and NVMe local storage, optimised for websites and content management systems.

Purpose-built for performance-driven businesses, Vultr CDN delivers a network for fast, secure, and reliable content distribution and is optimised for content acceleration, API caching, image optimisation and more. Seamless integration with Vultr Cloud Compute enables it to scale automatically and intelligently by selecting the best location for content delivery, optimising user requests to save time and money. Vultr CDN is now available as a beta service, with a full release in February.
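"Selecting the best location for content delivery" can be as simple as steering each request to the nearest point of presence. A minimal sketch with invented PoP coordinates (this is not Vultr's actual routing logic; production CDNs typically use anycast or DNS-based steering):

```python
import math

# Hypothetical PoP locations as (latitude, longitude); invented for illustration.
POPS = {
    "ams": (52.37, 4.90),     # Amsterdam
    "nyc": (40.71, -74.01),   # New York
    "sgp": (1.35, 103.82),    # Singapore
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_pop(client):
    """Return the PoP geographically closest to the client's coordinates."""
    return min(POPS, key=lambda p: haversine_km(client, POPS[p]))
```

A client in London would be served from the Amsterdam PoP, minimising round-trip latency for cached content.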

Aruba combines cloud potential with electric mobility
Aruba has announced that it is ready to enter the FIM Enel MotoE World Championship with the arrival of the Aruba Cloud MotoE Team. As both manager and title sponsor of the team, this marks a new journey for Aruba into the world of sport. The project runs parallel to one undertaken with customers in the construction of a new cloud platform, which is now complete.

Several challenges unite cloud technologies and the motor industry. First and foremost is sustainability, a key topic for cloud technologies as businesses look for more innovative and environmentally friendly products. The virtualisation of computational resources that underlies cloud computing, for example, allows for a reduction in the number of servers in use, and therefore a reduction in emissions or, when clean energy is used, a saving of natural resources.

Furthermore, the continuous search for performance optimisation also unites the two industries. Cloud technologies are crucial across all spheres, both at a business level and in everyday life. For this reason, cloud developers are always looking to save energy by increasing the efficiency of infrastructure and optimising the use of resources. Similarly, the MotoE team is a starting point from which Ducati can experiment with and develop technologies that could, in the future, be used on road motorbikes and offer customers increasingly sustainable and clean vehicles.

The international dimension of the project is also particularly exciting: over the years, Aruba Cloud has consolidated a significant international presence, becoming a player with more than 200,000 customers served in over 150 countries. Thanks to continuous investment in the innovation of its technology stack, Aruba Cloud is also distributed across its European data centre network.
The riders of the Aruba Cloud MotoE Team will be Chaz Davies and Armando Pontone. Davies, after retiring from Superbike in 2021, joined the Ducati ERC team in the Endurance World Championship while also acting as coach for the Aruba riders in Superbike and Supersport; Pontone, after a stint in the Moto3 category, won the National Trophy SS600 in 2021. The team's official presentation will be held on 7 March at the Aruba Auditorium in Ponte San Pietro.

Amidata implements Quantum ActiveScale to launch new cloud storage service
Quantum Corporation has announced that Amidata has implemented Quantum ActiveScale Object Storage as the foundation for its new Amidata Secure Cloud Storage service. Amidata has deployed ActiveScale to build a secure, resilient set of cloud storage services accessible from across Australia, where the company is based. In this way, the company achieves operational simplicity, seamless scalability, and the ability to address customer needs across a wide range of use cases, workflows, and price points.

Amidata's adoption of object storage also aligns with current IT trends. "More and more organisations are looking at object storage to create secure and massively scalable hybrid clouds," says Michael Whelan, Managing Director, Amidata. "ActiveScale provides a durable, cost-effective approach for backing up and archiving fast-growing data volumes while also protecting data from ransomware attacks. Plus, by deploying the ActiveScale Cold Storage feature, we are delivering multiple storage classes as part of our service offerings, allowing us to target a wider set of customers and use cases. With our Secure Cloud cold storage option, customers can retain data longer and at a lower cost; that's useful for offsite copies, data compliance, and increasingly, for storing the growing data sets that are fuelling AI-driven business analytics and insights."

ActiveScale supports multiple S3-compatible storage classes using flash, disk, and tape media, providing a seamless environment that can flexibly grow capacity and performance to any scale. Cold Storage, a key feature, integrates Quantum Scalar tape libraries as a lower-cost storage class to efficiently store cold and archived data sets. Quantum's tape libraries act as nearline storage, from which customers can easily access and retrieve cold or less-used data with slightly longer latency (minutes instead of seconds) but at a low cost, leveraging the same infrastructure used by the major hyperscalers.
ActiveScale intelligently stores and protects data across all storage resources using Quantum's patented two-dimensional erasure coding to achieve extreme data durability, performance, availability, and storage efficiency. A video case study of Amidata's implementation of ActiveScale is available from Quantum.
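Quantum's patented scheme is far more sophisticated, but the underlying intuition of two-dimensional erasure coding can be sketched with simple XOR parity: each row and each column of a grid of equal-sized data blocks gets a parity block, so any single lost block can be rebuilt from the survivors in its row (or column).

```python
from functools import reduce

def xor_blocks(blocks):
    """XOR a sequence of equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def encode(grid):
    """Compute a parity block for every row and every column of the grid."""
    row_parity = [xor_blocks(row) for row in grid]
    col_parity = [xor_blocks(col) for col in zip(*grid)]
    return row_parity, col_parity

def recover(grid, row_parity, r, c):
    """Rebuild the lost block at (r, c) from the surviving blocks in row r."""
    survivors = [blk for j, blk in enumerate(grid[r]) if j != c]
    return xor_blocks(survivors + [row_parity[r]])
```

Real systems use codes such as Reed-Solomon that tolerate multiple simultaneous failures per row and column; XOR parity here just shows why spreading parity across two dimensions raises durability without duplicating every block.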

VAST Data forms strategic partnership with Genesis Cloud
VAST Data has announced a strategic partnership with Genesis Cloud. Together, VAST and Genesis Cloud aim to make AI and accelerated cloud computing more efficient, scalable and accessible to organisations across the globe. Genesis Cloud helps businesses optimise their AI training and inference pipelines by offering performance and capacity for AI projects at scale while providing enterprise-grade features. The company is using the VAST Data Platform to build a comprehensive set of AI data services. With VAST, Genesis Cloud will support a new generation of AI initiatives and Large Language Model (LLM) development by delivering highly automated infrastructure with exceptional performance and hyperscaler efficiency.

"To complement Genesis Cloud's market-leading compute services, we needed a world-class partner at the data layer that could withstand the rigours of data-intensive AI workloads across multiple geographies," says Dr Stefan Schiefer, CEO at Genesis Cloud. "The VAST Data Platform was the obvious choice, bringing performance, scalability and simplicity paired with rich enterprise features and functionality. Throughout our assessment, we were incredibly impressed not just with VAST's capabilities and product roadmap, but also their enthusiasm around the opportunity for co-development on future solutions."

Key benefits for Genesis Cloud with the VAST Data Platform include:

- Multitenancy enabling concurrent users across the public cloud: VAST allows multiple, disparate organisations to share access to the VAST DataStore, enabling Genesis Cloud to allocate capacity as needed while delivering unparalleled performance.
- Enhanced security in cloud environments: By implementing a zero trust security strategy, the VAST Data Platform provides superior security for AI/ML and analytics workloads with Genesis Cloud customers, helping organisations achieve regulatory compliance and maintain the security of their most sensitive data in the cloud.
- Simplified workflows: Managing the data required to train LLMs is a complex data science process. Using the VAST Data Platform's high-performance, single-tier and feature-rich capabilities, Genesis Cloud is delivering data services that simplify and streamline data set preparation to better facilitate model training.
- Quick and easy deployment: The intuitive design of the VAST Data Platform simplifies the complexities traditionally associated with other data management offerings, providing Genesis Cloud with a seamless and efficient deployment experience.
- Improved GPU utilisation: By providing fast, real-time access to data across public and private clouds, VAST eliminates data loading bottlenecks to ensure high GPU utilisation, better efficiency and ultimately lower costs for the end customer.
- Future-proof investment with robust enterprise features: The VAST Data Platform consolidates storage, database, and global namespace capabilities that offer unique productisation opportunities for service providers.

Digital Realty to expand its service orchestration platform
Digital Realty has announced the continued momentum of ServiceFabric, its service orchestration platform that seamlessly interconnects workflow participants, applications, clouds and ecosystems on PlatformDIGITAL, its global data centre platform. Following the recent introduction of Service Directory, a central marketplace that allows Digital Realty partners to highlight their offerings, over 70 members have joined the directory and listed more than 100 services, including secure and direct connections to over 200 global cloud on-ramps, creating a vibrant ecosystem for seamless interconnection and collaboration. Service Directory is a core component of the ServiceFabric product family that underpins the organisation's vision for interconnecting global data communities on PlatformDIGITAL and enabling customers to tackle the challenges of data gravity head-on.

Chris Sharp, Chief Technology Officer, Digital Realty, says, "ServiceFabric is redefining the way customers and partners interact with our global data centre platform. By fostering an open and collaborative environment, we're empowering businesses to build and orchestrate their ideal solutions with unparalleled ease and efficiency."

The need for an open interconnection and orchestration platform is critical as an enabler for artificial intelligence (AI) and high-performance computing (HPC), especially as enterprises increasingly deploy private AI applications, which rely on the low-latency, private exchange of data between many members of an ecosystem. PlatformDIGITAL has been chosen as the home of many ground-breaking AI and HPC workloads, and ServiceFabric was designed with the needs of cutting-edge applications in mind. A key differentiator is Service Directory's 'click-to-connect' capability, which allows customers to orchestrate and automate on-demand connections to the services they need, significantly streamlining workflows and removing manual configuration steps.
With 'click-to-connect', users can:

- Generate secure service keys, granting controlled access to resources and partners with customisable security parameters.
- Automate approval workflows and facilitate connections to Service Directory, paving the way for seamless interconnectivity.
- Initiate service connections, significantly streamlining workflows between partners and customers.
- Integrate seamlessly with Service Directory, creating a unified experience for discovery, connection, and orchestration.
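The "secure service keys" described above suggest a familiar pattern: a signed, expiring token scoped to one service. A hypothetical sketch of that pattern (the claim names and signing scheme here are invented and are not Digital Realty's actual API):

```python
import base64
import hashlib
import hmac
import json
import time

# Per-tenant signing secret; in practice this would be provisioned securely.
SECRET = b"provider-signing-secret"

def issue_key(service_id: str, ttl_seconds: int, now=None) -> str:
    """Issue a key scoped to one service, valid for ttl_seconds."""
    now = time.time() if now is None else now
    claims = {"svc": service_id, "exp": now + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"

def verify_key(key: str, now=None):
    """Return the claims if the signature is valid and unexpired, else None."""
    now = time.time() if now is None else now
    payload, sig = key.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or foreign key
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims if claims["exp"] > now else None
</br>```

The HMAC signature lets the platform hand keys to partners without a database lookup on every connection attempt, while the expiry bounds the blast radius of a leaked key.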

Aruba and University of Pisa team up to optimise cloud resources
Aruba has announced a new collaboration with the Department of Information Engineering at the University of Pisa. With mounting economic and environmental pressures, and regulations like the European Commission's Energy Efficiency Directive (EED) setting the bar, energy efficiency is rapidly becoming a cornerstone of effective service design. It is with these priorities in mind that Aruba and the Department of Information Engineering at the University of Pisa have reached this two-year framework agreement, under which an experimental project applying machine learning and AI to cloud computing has been developed.

The project aims to develop an integrated solution for load management on cloud platforms, based on predicting the resources used by virtual machines (VMs). Predictions will be based on the analysis of historical VM data, and through the development of specific algorithms the project will aim to optimise the energy consumption of hardware while guaranteeing the requirements of VM users. Since cloud environments are generally used dynamically and flexibly, their cost is influenced by energy consumption. Being able to optimise the use of these resources, for example by predictively adjusting the amount of hardware to the specific needs of customers, can reduce consumption when capacity is not needed and, as a result, allow the service to be offered at a better cost.

Two different algorithms will be developed through the joint project:

- A dynamic VM profiling algorithm that outlines profiles based on the resources each VM has used historically.
- A VM management algorithm that exploits these profiles to manage VM execution across the different hardware making up the cloud platform, optimising energy consumption while guaranteeing performance.
The project will therefore make it possible to develop an integrated solution for virtual machine management on a cloud platform based on load prediction, and to implement a proof of concept based on OpenStack for field experimentation through application cases. In detail, the main benefits of the project include:

- The ability to move load between OpenStack nodes on a predictive, historical basis to optimise the use of resources on the nodes.
- The ability to guarantee adequate resources for client requests, optimising the use of servers dedicated to the service without creating artificial limitations.
- The ability to hold stand-by compute nodes in OpenStack clusters, to be activated according to load distribution needs.

"We are proud to announce our new collaboration with a prestigious institute such as the University of Pisa, a significant step towards innovation in the practical application of machine learning within the cloud ecosystem," comments Daniele Migliorini, Head of Engineering at Aruba. "This partnership reflects our ongoing commitment to technology collaboration with Italian academic institutions of excellence, in order to offer cutting-edge solutions and meet the rapidly evolving needs of the market. We are confident that the synergy between our experience in the sector and the expertise of the University of Pisa will result in solutions that will shape the future of the cloud and encourage the benefits that can be derived from artificial intelligence, optimising the use of energy in the data centre sector with a view to long-term sustainability."

"Our department has a long tradition of dialogue and work alongside companies," comments the Director of the Department of Information Engineering, Andrea Caiti. "We have several active laboratories dedicated to Industry 4.0 and 5.0 research, which have now acquired not only local but also national and international relevance.
We receive numerous requests for collaboration from businesses for training courses, co-design of solutions, use of the cutting-edge tools in our laboratories for product studies, and the setting up of joint research laboratories. This openness to collaboration has enabled us to help bridge the gap that has always existed between research and enterprise, literally bringing two worlds that usually don't speak to each other to the same table."

"The opportunity to collaborate with a large company like Aruba allows our department to work on frontier topics in cloud computing technologies. This gives us the opportunity to work on innovative solutions, with a potentially significant impact on areas such as energy efficiency and environmental sustainability, which are crucial to the development of the cloud of the future," concludes the scientific head of the collaboration, Carlo Vallati.
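The two algorithms described above can be caricatured in a few lines: predict each VM's load from its history, then pack the predicted loads onto as few hosts as possible so that idle hosts can be powered down. This is a toy sketch with invented data and a deliberately naive predictor (the historical mean); the project's actual algorithms will be far more sophisticated.

```python
from statistics import mean

def profile(history: dict[str, list[float]]) -> dict[str, float]:
    """Naive VM profiling: predict each VM's load as its historical mean."""
    return {vm: mean(samples) for vm, samples in history.items()}

def pack(predicted: dict[str, float], host_capacity: float) -> list[list[str]]:
    """First-fit-decreasing bin packing of predicted loads onto hosts."""
    hosts: list[tuple[float, list[str]]] = []  # (remaining capacity, VMs)
    for vm, load in sorted(predicted.items(), key=lambda kv: -kv[1]):
        for i, (free, vms) in enumerate(hosts):
            if load <= free:
                hosts[i] = (free - load, vms + [vm])
                break
        else:
            # No existing host fits: bring one more host out of stand-by.
            hosts.append((host_capacity - load, [vm]))
    return [vms for _, vms in hosts]
```

Every host absent from the returned placement can stay in stand-by, which is exactly the energy saving the project targets, subject to the prediction being accurate enough not to violate VM performance guarantees.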

Nasuni named a top enterprise cloud-based NAS consolidation solution
Nasuni has announced that analyst firm DCIG has named the Nasuni File Data Platform a 'Top Five Storage Solution' in the 2024-25 DCIG Top Five Enterprise Cloud-based NAS Consolidation Solutions report.

As the pace of data growth continues to accelerate, organisations face increasing hardware, software, management, and maintenance expenses. Additionally, data is scattered across silos, with no global visibility into the file data held on multiple network attached storage (NAS) devices and file servers. Based on enterprise-class software-defined storage, cloud-based NAS consolidation migrates file data from multiple file servers and NAS devices into a cloud-based storage platform.

"Natively built for the cloud, the Nasuni File Data Platform places the object store at the centre of its software-defined architecture," says Todd Dorsey, DCIG Senior Storage Analyst and author of the report. "Enterprises can replace legacy file infrastructure consisting of multiple file servers, NAS, data protection, and management toolsets with a single global file system."

DCIG evaluated 31 software-defined storage solutions for consolidated cloud-based NAS and named Nasuni one of its top five solutions for three key capabilities:

- Flexible, cloud-based architecture: Nasuni integrates with all popular cloud storage providers and leading private cloud storage solutions, providing the flexibility to use the cloud services best suited to each organisation. The patented UniFS Global File System organises file data, metadata, and snapshots within cloud storage, while the Nasuni Orchestration Center (NOC) serves as the control plane, providing file synchronisation, monitoring, analysis, and tuning of an organisation's file platform.
- Edge performance: Organisations deploy Nasuni Edge instances as VMs that replace traditional file servers and NAS. Each edge instance serves as a lightweight access point to cloud storage, supporting popular hypervisors such as VMware ESXi, Microsoft Hyper-V, and Nutanix AHV. Nasuni dynamically caches active files for users and applications for fast access, which removes the problems of cloud latency and egress fees while reducing the local storage footprint.
- Data security and protection: Nasuni encrypts all data, metadata, and snapshots in transit and at rest with AES-256 encryption, and utilises a zero trust security framework. Nasuni protects data from ransomware attacks with continuous versioning, which removes the need for separate backup processes and allows organisations to recover a previous file, folder, volume, or system state from before an attack occurred, with minute-level granularity.

"The demands of today's leading enterprises require a scalable cloud-native solution to replace traditional network attached storage, reduce risk, and optimise cloud spending," says David Grant, President at Nasuni. "The recognition of Nasuni by DCIG as a top five cloud-based NAS consolidation solution is a testament to our company vision and the value our platform provides." The report is available to download from DCIG.
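Continuous versioning of the kind described boils down to an append-only version log per file, with a restore replaying state "as of" a moment before the attack. A minimal sketch of the idea (not Nasuni's actual implementation):

```python
class VersionedStore:
    """Append-only version log per path; restore = newest write at or before ts."""

    def __init__(self):
        # path -> list of (timestamp, data) tuples, appended in write order
        self._log = {}

    def write(self, path, data, ts):
        """Record a new version; old versions are never overwritten."""
        self._log.setdefault(path, []).append((ts, data))

    def read_as_of(self, path, ts):
        """Return the newest version of `path` written at or before `ts`."""
        best = None
        for t, data in self._log.get(path, []):
            if t <= ts:
                best = data
        return best
```

Because every write is retained, recovering from ransomware is just a read with a timestamp before the encryption started; no separate backup copy needs to exist or be restored.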

Macquarie Cloud launches global-first Azure SaaS
Macquarie Cloud Services, part of Macquarie Technology Group, has launched a global-first Software-as-a-Service (SaaS) offering on the Microsoft Marketplace, unlocking previously unavailable services, benefits, and value for Australian organisations transacting directly with Microsoft on an enterprise agreement.

Macquarie Guard is a fully turnkey SaaS solution that automates practical guardrails into Azure services, enabling continuous cost optimisation and governance, accelerated development, and greater speed to market. It will also allow Azure customers on Microsoft agreements to draw down on their Microsoft Azure Consumption Commitment (MACC), unlocking credits and incentives much faster and helping CIOs make the most of their public cloud investment. It also features Macquarie Cloud Services' managed Azure intellectual property (IP) and associated services architecture, as well as more than 50 unique function applications with enhanced monitoring, reporting, and security capabilities.

"Macquarie Guard is the culmination of the unique IP across our 120-strong team and our experience providing leading Azure cloud services to Australian organisations," says Macquarie Cloud Services' Head of Azure, Naran McClung. "Our purpose is to help customers that are underserved and overcharged succeed. We're ready to bring that expertise and our NPS-proven service to a whole new segment, right as organisations are increasing their cloud footprint and looking to technologies like AI and machine learning."

Macquarie Guard's 24x7 automated operations deliver auto-remediation, auto-healing, continuous cost optimisation, compliance and governance, advanced logging, alerting, monitoring, ticketing, and reporting. "Macquarie Guard's AI-powered guardrails help organisations liberate their teams from burdensome resource and infrastructure management and spend more time on meaningful and secure change," says McClung.
The launch follows Microsoft's recent announcement that it would invest A$5bn over the next two years to bolster Australian cyber security, cloud, and AI capabilities.

"Customers are truly seeing what real transformation means by leading with securing their data," says Microsoft Chief Partner Officer Australia and New Zealand, Vanessa Sorenson. "Macquarie Cloud Services is part of this transformation journey Microsoft is on. Together we can provide a secure lens and guardrails around customers' cloud environments, safeguard their data, and serve what they truly need to transform."
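Automated guardrails with auto-remediation typically follow a check-and-remediate loop: scan resources against policy rules, fix what can be fixed in place, and report findings for follow-up. A toy illustration with an invented resource model and invented rules (this is not Macquarie Guard's actual policy set):

```python
def check_and_remediate(resources: list[dict]) -> list[str]:
    """Scan resources, fix violations in place, and return the findings."""
    findings = []
    for res in resources:
        # Rule 1: storage must never be publicly accessible.
        if res.get("public", False):
            findings.append(f"{res['id']}: storage publicly accessible")
            res["public"] = False  # auto-remediate: make private
        # Rule 2: every resource needs cost-allocation tags.
        if not res.get("tags"):
            findings.append(f"{res['id']}: missing cost-allocation tags")
            res["tags"] = {"owner": "unassigned"}  # flag for human follow-up
    return findings
```

Running such a loop continuously (rather than at audit time) is what turns governance from a periodic report into the always-on cost and compliance control the article describes.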


