
Operations


Duos Edge AI to deploy edge data centres in Corpus Christi
Duos Technologies Group, through its operating subsidiary Duos Edge AI, a provider of edge data centre (EDC) solutions, has announced the upcoming deployment of two new EDCs in Corpus Christi, Texas, USA.

Scheduled for delivery at the end of July 2025, the Corpus Christi EDCs will serve as central communications hubs for carriers delivering services to mobile operators, enterprises, local education, healthcare, and digital economy sectors, while driving growth across the local market. In line with Duos Edge AI’s strategy to expand digital infrastructure in underserved and high-growth markets, with carrier integration and uninterrupted service, the initiative aims to remove key hurdles to edge connectivity while accelerating service readiness for regional partners.

“Our Corpus Christi project highlights the speed, precision, and value of our Edge AI model,” claims Doug Recker, President and Founder of Duos Edge AI. “We’re delivering high-availability, localised computing power that enables fibre and network providers to scale efficiently and meet increasing demand at the edge.

“We are bringing a state-of-the-art EDC solution to Corpus Christi to enable the major communications carriers to have an even more robust solution for the Corpus Christi market.”

The Corpus Christi deployment is part of Duos Edge AI’s 2025 plan to deploy 15 EDCs across the US, incorporating modular design, rapid deployment, and a focus on bridging the digital divide.

For more from Duos Edge AI, click here.

Macquarie and CareSuper join forces
Macquarie Cloud Services, an Australian cloud services provider for business and government and part of Macquarie Technology Group, has been appointed by CareSuper to lead a major cloud transformation program, marking a high-profile shift away from VMware Cloud on AWS towards a more modern Azure environment.

The agreement sees Macquarie Cloud migrate and recalibrate CareSuper’s VMware Cloud on AWS (VMC) environment – made up of hundreds of applications and petabytes of data – into a Managed Edge Azure Local offering.

“Our goal is to optimise every part of our operation so we can deliver long-term value to our members,” states Simon Reiter, Chief Technology Officer at CareSuper. “Cloud decisions must serve that mission – not just today, but five years from now. Macquarie Cloud Services stood out as a partner who could deliver both the technical transformation and the ongoing managed service maturity required.”

Macquarie’s Azure-led approach consolidates CareSuper’s technology estate into a unified platform. The engagement includes migrating workloads from VMware Cloud on AWS into a new Azure landing zone, modernising databases, and implementing platform-as-a-service (PaaS) offerings with the aim of streamlining performance and efficiency for the fund.

“We’re seeing a wave of repatriation from AWS,” comments Naran McClung, Head of Azure at Macquarie Cloud Services. “For many organisations, rising costs and architectural limitations have made them re-evaluate. But it’s not just about moving away; it’s about moving forward. That’s where our team adds value.”

Macquarie has assumed the risk of the migration project, delivering the transformation with zero upfront cost to CareSuper and full accountability for outcomes.

“What we’ve found in partnering with Macquarie Cloud Services is a team of experts who can transform, refactor, migrate, and ensure we get the best operational value from our cloud environment. That the company backs itself by taking on the cost risk of the migration phase is telling of its capabilities and commitment to putting customers first,” continues Simon.

Four years as an Azure Expert MSP

Macquarie Cloud Services is one of only a handful of partners across Asia Pacific to retain its Microsoft Azure Expert Managed Services Provider (MSP) status for four consecutive years.

“We’ve seen our Azure team and business expand by about 20% every year since we set it up in 2020,” claims Naran. “Becoming an Azure Expert MSP is not a lifetime achievement; it takes incredible dedication, assessments requiring dozens of the team to come together, and – most importantly – an ability to deliver value to customers time and time again.”

For more from Macquarie, click here.

Siemens enters collaboration with Microsoft
Siemens Smart Infrastructure, a digital infrastructure division of German conglomerate Siemens, today announced a collaboration agreement with Microsoft to transform access to Internet of Things (IoT) data for buildings.

The collaboration will enable interoperability between Siemens' digital building platform, Building X, and Microsoft Azure IoT Operations, a component of Microsoft's adaptive cloud approach that provides tools and infrastructure to connect edge devices while integrating data.

The interoperability of Building X and Azure IoT Operations seeks to make IoT-based data more accessible for large enterprise customers across commercial buildings, data centres, and higher education facilities, and to provide them with the information needed to enhance sustainability and operations. It enables automatic onboarding and monitoring by bringing datapoints such as temperature, pressure, or indoor air quality to the cloud for assets like heating, ventilation, and air conditioning (HVAC) systems, valves, and actuators. The system should also allow customers to develop their own in-house use cases, such as energy monitoring and space optimisation.

The collaboration leverages known and established open industry standards, including the World Wide Web Consortium (W3C) Web of Things (WoT), which describes the metadata and interfaces of hardware and software, as well as Open Platform Communications Unified Architecture (OPC UA) for communication of data to the cloud. Both Siemens and Microsoft are members of the W3C and the OPC Foundation, which develop standards and guidelines that help build an industry based on accessibility, interoperability, privacy, and security.

“This collaboration with Microsoft reflects our shared vision of enabling customers to harness the full potential of IoT through open standards and interoperability,” claims Susanne Seitz, CEO, Siemens Smart Infrastructure Buildings. “The improved data access will provide portfolio managers with granular visibility into critical metrics such as energy efficiency and consumption. With IoT data often being siloed, this level of transparency is a game-changer for an industry seeking to optimise building operations and meet sustainability targets.”

“Siemens shares Microsoft’s focus on interoperability and open IoT standards. This collaboration is a significant step forward in making IoT data more actionable,” argues Erich Barnstedt, Senior Director & Architect, Corporate Standards Group, Microsoft. “Microsoft’s strategy underscores our commitment to partnering with industry leaders to empower customers with greater choice and control over their IoT solutions.”

The interoperability between Siemens’ Building X and Azure IoT Operations will be available on the market from the second half of 2025.

For more from Siemens, click here.
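For readers unfamiliar with the standards mentioned above, the sketch below shows what a minimal W3C Web of Things (WoT) Thing Description for a single building datapoint might look like. It is an illustrative example only, not taken from Building X or Azure IoT Operations documentation; the asset name, endpoint URL, and property are hypothetical.

```python
import json

# Minimal, illustrative W3C WoT Thing Description (JSON-LD) for a hypothetical
# HVAC supply-air temperature sensor. Field names follow the W3C WoT Thing
# Description 1.1 vocabulary; the device, URL, and property values are made up.
thing_description = {
    "@context": "https://www.w3.org/2022/wot/td/v1.1",
    "title": "AHU-01 Supply Air Temperature",  # hypothetical asset name
    "securityDefinitions": {"basic_sc": {"scheme": "basic", "in": "header"}},
    "security": ["basic_sc"],
    "properties": {
        "temperature": {
            "type": "number",
            "unit": "degree Celsius",
            "readOnly": True,
            "forms": [{
                "href": "https://building.example.com/ahu-01/temperature",  # placeholder endpoint
                "op": ["readproperty"]
            }]
        }
    }
}

print(json.dumps(thing_description, indent=2))
```

A consuming platform can parse a description like this to discover the datapoint and its interface automatically, which is the kind of automatic onboarding the article refers to.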

Industry analysts urge data trust over AI hype
As organisations increasingly embrace AI across their operations, industry analysts at the Gartner Data & Analytics Summit in Sydney issued a critical reminder: without trustworthy data, even the most advanced AI systems can lead businesses astray.

Amid rising interest in generative AI and autonomous agents, business leaders are being reminded that flashy AI capabilities are meaningless if built on unreliable data. According to a 2024 survey by Gartner, the information technology research and advisory company, data quality and availability remain the biggest barriers to effective AI implementation. If the foundation is flawed, so is the intelligence built on top of it.

While achieving perfect data governance is an admirable goal, it is often impractical in fast-moving business environments. Instead, analysts recommend implementing "trust models" that assess the reliability of data based on its origin, lineage, and level of curation. These models enable more nuanced, risk-aware decision-making and can prevent the misuse of data without stalling innovation.

Richard Bovey, Chief for Data at AND Digital, comments, "Trust in data isn't just a technical challenge, it's deeply cultural and organisational. While advanced tools and trust models can help address the reliability of data, true confidence in data quality comes from clear ownership, clear practices, and a company-wide commitment to transparency.

"Too often, organisations are rushing into AI initiatives without fixing the basics. According to our research, 56% of businesses are implementing AI despite knowing their data may not be accurate, in order to avoid falling behind their competitors.

"Businesses must consider taking a data and AI approach to their technical operations to build trust, cross-functional collaboration, and ongoing education. Only then can AI initiatives truly succeed."

At the summit, autonomy was a central theme. AI systems may act independently in low-risk or time-sensitive situations, but full autonomy still raises concerns: while users accept AI advice, they are still adjusting to autonomous AI decision-making.

Stuart Harvey, CEO of Datactics, argues, "One of the biggest misconceptions we see is the belief that AI performance is purely a function of the model itself, when in reality, it all starts with data. Without well-governed, high-quality data, even the most sophisticated AI systems will produce inconsistent or misleading results.

"Organisations often underestimate the foundational role of data management, but these aren't back-office tasks; they're strategic enablers of trustworthy AI, and those businesses that rush into AI without addressing fragmented or unverified data sources put themselves at significant risk. Strong data foundations aren't just nice to have in today's technical landscape; they're essential for reliable, ethical, and scalable AI adoption."

Gartner predicts that by 2027, 20% of business processes will be fully managed by autonomous analytics, and that these "perceptive" systems will move beyond dashboards, offering proactive, embedded insights. The company also believes that by 2030, AI agents will replace 30% of SaaS interfaces, turning apps into intelligent data platforms. To thrive, data leaders should therefore prioritise trust, influence, and organisational impact, or risk being sidelined.

For more from Gartner, click here.
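The "trust models" described above can be implemented quite simply: score each dataset on a few weighted provenance dimensions and gate how it may be used accordingly. The sketch below is a minimal, hypothetical illustration of that idea; the dimensions, weights, and thresholds are assumptions, not Gartner's methodology.

```python
from dataclasses import dataclass

@dataclass
class DatasetProfile:
    """Attributes a data team might record about a dataset's provenance."""
    origin_verified: bool      # came from a known, authoritative source system
    lineage_documented: bool   # transformations are tracked end to end
    curation_level: int        # 0 = raw, 1 = cleansed, 2 = certified by a data steward

def trust_score(profile: DatasetProfile) -> float:
    """Weighted score in [0, 1]; the weights here are illustrative assumptions."""
    score = 0.0
    score += 0.4 if profile.origin_verified else 0.0
    score += 0.3 if profile.lineage_documented else 0.0
    score += 0.3 * (profile.curation_level / 2)
    return round(score, 2)

def usage_tier(score: float) -> str:
    """Map a trust score to an allowed use, rather than blocking the data outright."""
    if score >= 0.8:
        return "approved for automated decisions"
    if score >= 0.5:
        return "approved for analytics with human review"
    return "exploration only"

# Example: verified origin, documented lineage, but only cleansed (not certified) data.
profile = DatasetProfile(origin_verified=True, lineage_documented=True, curation_level=1)
score = trust_score(profile)
print(score, "->", usage_tier(score))  # 0.85 -> approved for automated decisions
```

The point of the tiered output is the "risk-aware" part: lower-trust data is not discarded, it is simply kept away from fully automated decisions.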

Microchip enhances TrustMANAGER platform
International cybersecurity regulations continue to adapt to meet the evolving threat landscape. One major focus is on outdated firmware in IoT devices, which can present significant security vulnerabilities.

To address these challenges, Microchip Technology, an American semiconductor manufacturer, is enhancing its TrustMANAGER platform to include secure code signing and Firmware Over-the-Air (FOTA) update delivery, as well as remote management of firmware images, cryptographic keys, and digital certificates. These advancements support compliance with the European Cyber Resilience Act (CRA), which mandates strong cybersecurity measures for digital products sold in the European Union (EU). Aligned with standards like the European Telecommunications Standards Institute (ETSI) EN 303 645 baseline requirements for cybersecurity of consumer IoT and the International Society of Automation (ISA)/International Electrotechnical Commission (IEC) 62443 standards for the security of industrial automation and control systems, the CRA sets a precedent that is expected to influence regulations worldwide.

Microchip’s ECC608 TrustMANAGER leverages Kudelski IoT’s keySTREAM Software as a Service (SaaS) to deliver a secure authentication Integrated Circuit (IC) that is designed to store, protect, and manage cryptographic keys and certificates. With the addition of FOTA services, the platform helps customers securely deploy real-time firmware updates to remotely patch vulnerabilities and comply with cybersecurity regulations.

“As evolving cybersecurity regulations require connected device manufacturers to prioritise the implementation of mechanisms for secure firmware updates, lifecycle credential management, and effective fleet deployment, the addition of FOTA services to Microchip’s TrustMANAGER platform offers a scalable solution that removes the need for manual, expensive, static infrastructure security updates,” says Nuri Dagdeviren, Corporate Vice President of Microchip’s Security Products Business Unit. “FOTA updates allow customers to save resources while fulfilling compliance requirements and helping to future-proof their products against emerging threats and evolving regulations.”

Further enhancing cybersecurity compliance, the Microchip WINCS02PC Wi-Fi network controller module used in the TrustMANAGER development kit is now certified against the Radio Equipment Directive (RED) for secure and reliable cloud connectivity. RED establishes strict standards for radio devices in the EU, focusing on network security, data protection, and fraud prevention. Beginning 1 August 2025, all wireless devices sold in the EU market must adhere to RED cybersecurity provisions.

By incorporating these additional services, TrustMANAGER – governed by keySTREAM – tackles key challenges in IoT security, regulatory compliance, device lifecycle management, and fleet management. This solution is designed to serve IoT device manufacturers and industrial automation providers.

For more from Microchip, click here.
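To make the secure code signing and FOTA concepts above concrete, here is a minimal, generic sketch of the verification step a device performs before applying an update: the new image is accepted only if its signature checks out against a trusted public key. This is not Microchip's or Kudelski IoT's API; on a real ECC608-based design the keys live inside the secure element and the flow is managed by keySTREAM, whereas this sketch uses the standard Python `cryptography` library purely for illustration.

```python
# Generic illustration of signature-checked firmware updates. Assumptions:
# ECDSA over P-256 with SHA-256, and a vendor public key provisioned on the device.
import hashlib
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

def verify_firmware(image: bytes, signature: bytes, public_key_pem: bytes) -> bool:
    """Return True only if the image was signed by the holder of the private key."""
    public_key = serialization.load_pem_public_key(public_key_pem)
    try:
        public_key.verify(signature, image, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# --- Demo with a locally generated key pair standing in for the vendor's signing key ---
private_key = ec.generate_private_key(ec.SECP256R1())
firmware_image = b"\x00" * 1024  # placeholder firmware blob
signature = private_key.sign(firmware_image, ec.ECDSA(hashes.SHA256()))
public_pem = private_key.public_key().public_bytes(
    serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo
)

if verify_firmware(firmware_image, signature, public_pem):
    print("Signature valid: apply update", hashlib.sha256(firmware_image).hexdigest()[:16])
else:
    print("Signature invalid: reject update")
```

Rejecting any image that fails verification is what stops an attacker from pushing tampered firmware over the air, which is the core requirement the CRA and RED provisions are driving at.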

'More than a third of UK businesses unprepared for AI risks'
Despite recognising artificial intelligence (AI) as a major threat, with nearly a third (30%) of UK organisations surveyed naming it among their top three risks, many remain significantly unprepared to manage AI risk.

Recent research from CyXcel, a global cyber security consultancy, highlights a concerning gap: nearly a third (29%) of UK businesses surveyed have only just implemented their first AI risk strategy, and 31% don’t have any AI governance policy in place. This critical gap exposes organisations to substantial risks including data breaches, regulatory fines, reputational harm, and critical operational disruptions, especially as AI threats continue to grow and rapidly evolve.

CyXcel’s research shows that nearly a fifth (18%) of UK and US companies surveyed are still not prepared for AI data poisoning, a type of cyberattack that targets the training datasets of AI and machine learning (ML) models, or for a deepfake or cloning security incident (16%).

Responding to these mounting threats and geopolitical challenges, CyXcel has launched its Digital Risk Management (DRM) platform, which aims to provide businesses with insight into evolving AI risks across major sectors, regardless of business size or jurisdiction. The DRM seeks to help organisations identify risks and implement the right policies and governance to mitigate them.

Megha Kumar, Chief Product Officer and Head of Geopolitical Risk at CyXcel, comments, “Organisations want to use AI but are worried about risks – especially as many do not have a policy and governance process in place. The CyXcel DRM provides clients across all sectors, especially those that have limited technological resources in house, with a robust tool to proactively manage digital risk and harness AI confidently and safely.”

Edward Lewis, CEO of CyXcel, adds, “The cybersecurity regulatory landscape is rapidly evolving and becoming more complex, especially for multinational organisations. Governments worldwide are enhancing protections for critical infrastructure and sensitive data through legislation like the EU’s Cyber Resilience Act, which mandates security measures such as automatic updates and incident reporting. Similarly, new laws are likely to arrive in the UK next year which introduce mandatory ransomware reporting and stronger regulatory powers. With new standards and controls continually emerging, staying current is essential.”
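The data-poisoning risk mentioned above is, at its core, an attack on training-data integrity. One basic, widely applicable mitigation is to snapshot hashes of curated training files and re-verify them before every training run, so silent tampering is caught early. The sketch below is a generic illustration of that idea with hypothetical file names and paths; it has no connection to CyXcel's DRM platform.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a training-data file so later tampering can be detected."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def build_manifest(data_dir: Path) -> dict[str, str]:
    """Record a trusted snapshot of every training file's hash."""
    return {p.name: sha256_of(p) for p in sorted(data_dir.glob("*.csv"))}

def verify_manifest(data_dir: Path, manifest: dict[str, str]) -> list[str]:
    """Return the names of files that are missing or no longer match the snapshot."""
    suspect = []
    for name, expected in manifest.items():
        path = data_dir / name
        if not path.exists() or sha256_of(path) != expected:
            suspect.append(name)
    return suspect

# Usage sketch (paths are hypothetical): snapshot the dataset when it is curated...
# manifest = build_manifest(Path("training_data"))
# Path("manifest.json").write_text(json.dumps(manifest, indent=2))
# ...and re-verify it before every training run.
# tampered = verify_manifest(Path("training_data"),
#                            json.loads(Path("manifest.json").read_text()))
# if tampered:
#     raise RuntimeError(f"Possible data poisoning or corruption in: {tampered}")
```

Integrity checks like this do not stop poisoned data entering at the original collection stage, but they do ensure the curated set that was approved is the set actually used for training.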

'AI is the new oil—and data centres are the refineries'
With AI adoption reshaping global industries, Straightline Consulting’s Managing Director, Craig Eadie, shares his insights on how data centres are powering the GenAI revolution:

"The age of AI is here. Generative artificial intelligence (GenAI) is rewriting the rulebook when it comes to everything from software development and call centre productivity to copywriting — boosting efficiency and, depending on who you ask, on track to raise the GDP of industrialised nations by 10-15% over the next decade.

"The impact of AI will reshape the global economy over the coming years, consolidating value among the companies that successfully capitalise on this moment — and disrupting those that don’t. The 'arms race' to develop the next generation of AI technologies — like Google’s new Veo 3 video generation tool, released at the start of June, which is already making headlines for its ability to allow anyone willing to pay $249 per month to create hauntingly lifelike, realistic videos of everything from kittens playing to election fraud — is accelerating as well. AI has become the new oil: the global fuel for economic growth. Unlike oil, however, GenAI alone isn’t valuable. Rather, its power lies in the ability to apply GenAI models to data. That process, akin to refining crude into petroleum, happens in the data centre.

"Productivity is far from the only thing GenAI is turbocharging. This rush to build, train, and operate new GenAI models is also accelerating the race to build the digital infrastructure that houses them. Goldman Sachs predicts that global power demand from data centres will increase 50% by 2027 and by as much as 165% by the end of the decade, largely driven by GenAI adoption.

"As someone working in the data centre commissioning sector, it’s impossible to overstate the impact that GenAI is having, and will continue to have, on our industry. GenAI has exploded our predictions. It’s even bigger than anyone anticipated. The money, the scale, the speed — demand is growing even faster than the most optimistic projections pre-2023. By the end of 2025, almost half of all the power data centres consume globally could be used to power AI systems.

"The data centre commissioning space we’re operating in today has transformed dramatically. On the construction and design side, huge changes, not just in how buildings are constructed but in the technology inside those buildings, are reshaping how we commission them.

"The battle to capitalise on the GenAI boom is a battle to overcome three challenges: access to power, materials, and talent.

"GenAI requires an order of magnitude more power than traditional colocation or cloud workloads. As a result, there are serious concerns about power availability across Europe, especially in the UK. We can’t build the data centres we need to capitalise on the GenAI boom because there’s just not enough power. There are some encouraging signs that governments are taking this challenge seriously. For example, the UK government has responded by creating 'AI Growth Zones' to unlock investment in AI-enabled data centres by improving access to power and providing planning support in some areas of the country. The European Union’s AI Continent Plan also includes plans to build large-scale AI data and computing infrastructures, including at least 13 operational 'AI factories' by 2026 and up to five 'gigafactories' at some point after that.

"However, power constraints and baroque planning and approvals processes threaten to undermine these efforts. Multiple data centre markets are already facing pushback from local councils and communities against new infrastructure because of its effect on power grids and local water supplies. Dublin and Amsterdam had already stymied new builds even before the GenAI boom. This comes with risk, because AI engines can be built anywhere. GDPR means data must be housed in-country, but if Europe and the UK don’t move faster, large US AI firms will resort to building their massive centres stateside and deploying the tech across the Atlantic later. Once an AI engine is trained, it can run on less demanding infrastructure. We risk stifling the AI industry in Europe and the UK if we don’t start building faster and making more power available today.

"The other key constraints are access to raw materials and components. Global supply chain challenges have spiked the cost of construction materials, and the lead times for data-centre-specific components like cooling equipment can be as much as six months, further complicating the process of building new infrastructure.

"Access to talent is another pain point that threatens to slow the industry at a time when it should be speeding up. Commissioning is a vital part of the data centre design, construction, and approvals process, and our sector is facing a generational talent crisis. There isn’t enough young talent coming into the sector. That has to change across the board — not just in commissioning, but for project managers, consultants, everyone, everywhere. The pain point is particularly acute in commissioning, however, because of the sector’s relatively niche pipeline and stringent requirements. You can’t just walk in off the street and become a commissioning engineer. The field demands a solid background in electrical or mechanical engineering, or an equivalent trade. Right now, the pipeline producing the next generation of data centre commissioning professionals just isn’t delivering the number of new hires the industry needs.

"This obviously affects all data centre commissioning, not just AI. The scale of demand and the speed at which the industry is moving mean this risks becoming a serious pinch point not too far down the line.

"Looking at the next few years, it’s impossible to say exactly where we’re headed, but it’s clear that, unless Europe and the UK can secure access to reliable, affordable energy, as well as clear the way for data centre approvals to move quickly, pain points like the industry talent shortage and rising materials costs (not to mention lead times) threaten to leave the region behind in the race to capture, refine, and capitalise on the new oil: GenAI."

EDGNEX announces $2.3 billion data centre in Jakarta
EDGNEX Data Centers by DAMAC, a global digital infrastructure company backed by a global conglomerate headquartered in Dubai, today announced the development of a 'next-generation', AI-powered data centre in Jakarta, Indonesia - its second in the market. This project marks one of Southeast Asia’s largest AI-dedicated developments, with a future projected capacity of 144 MW and a total investment of $2.3 billion.

Following the land acquisition completed in March 2025 by DAMAC, the site has entered early construction phases, with the facility’s first phase expected to be ready for service by December 2026. The Jakarta facility will deploy high-density AI racks and is hoped to be a factor in accelerating the country’s transition from an analogue base to an AI-powered digital economy.

Indonesia remains a high-potential Southeast Asian market, yet faces digital infrastructure gaps, limited hyperscale readiness, and rising latency challenges. With AI adoption accelerating across sectors, this project seeks to respond to the nation’s growing demand for scalable, energy-efficient infrastructure.

“This is our second project in Indonesia, and this development reinforces our commitment to bridging the digital divide in fast-growing markets across Southeast Asia (SEA), such as Indonesia,” says Hussain Sajwani, Founder of DAMAC Group. “We are proud to build what will become one of Southeast Asia’s most advanced, sustainable data centres to power the next wave of innovation and digital growth. The scale of AI workloads demands a new class of infrastructure. This project is part of our broader push across SEA, where we have committed over $3 billion in digital infrastructure investments to date.”

The new facility will target a Power Usage Effectiveness (PUE) of 1.32 and builds on EDGNEX’s growing presence in Thailand, Malaysia, and other key SEA markets. In 2024, the company announced its first data centre in Indonesia, a planned 19.2 MW facility to be built at MT Haryono in Jakarta. It aims to address growing demand from cloud service providers, edge nodes, and potential artificial intelligence deployments, with its first phase scheduled for completion in the third quarter of 2026. EDGNEX’s regional goal for SEA is over 300 MW of operational capacity by 2026.

For more from EDGNEX, click here.
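For context on the efficiency target quoted above, Power Usage Effectiveness (PUE) is the ratio of a facility's total power draw to the power delivered to IT equipment, so a PUE of 1.32 means roughly 0.32 W of overhead (cooling, power conversion, lighting) for every watt of IT load. The short sketch below works through that arithmetic; the IT load figure is an assumption for illustration, not EDGNEX's design data.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_load_kw

# Hypothetical example: 10 MW of IT load running at the target PUE of 1.32
it_load_kw = 10_000
target_pue = 1.32
total_kw = it_load_kw * target_pue      # 13,200 kW drawn by the whole facility
overhead_kw = total_kw - it_load_kw     # 3,200 kW for cooling, conversion, lighting

print(f"PUE check: {pue(total_kw, it_load_kw):.2f}")    # 1.32
print(f"Overhead power: {overhead_kw:,.0f} kW")          # 3,200 kW
```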

UAE-IX now powered by DE-CIX
DE-CIX, an Internet Exchange (IX) operator, and its partner Datamena, Du’s carrier-neutral data centre and connectivity platform based in the UAE and serving the Middle East and Africa (MEA) region, today announced the upgrade of the UAE-IX to offer 400 GE access. Connected customer capacity on the exchange has soared over the last year, growing by two terabits, or 30%, in twelve months.

The UAE-IX is the largest IX in the Middle East, based on both connected networks and peak traffic, and is now the only IX in the region to offer 400 GE access. Established in 2012 and operated by DE-CIX on behalf of partner Datamena, the IX today has over six terabits of connected capacity and connects close to 110 internet service providers (ISPs), carriers, cloud, content, and application providers, and global enterprises. It also provides enterprise-grade interconnection services, such as a Cloud Exchange, cloud routing, and application connectivity like the Microsoft Azure Peering Service (MAPS).

“The UAE-IX today stands as a global internet hub, bringing together the network operators, content, applications, and cloud services to serve the entire GCC region with resilient and low-latency connectivity,” claims Ivo Ivanov, CEO of DE-CIX. “This upgrade further reinforces the importance of the UAE-IX, now ready to serve the rising demand for everything digital. The excellent collaboration with our partner Datamena has enabled the UAE-IX powered by DE-CIX to shine as the most important aggregation point for network interconnection in the Middle East. I look forward to a bright future working together for the next decade of digital development.”

Karim Benkirane, Chief Commercial Officer, Du, comments, "We are proud to partner with DE-CIX in leading digital growth in the Middle East with the upgrade of the UAE-IX powered by DE-CIX to 400 GE access. It is our vision to foster a seamlessly interconnected landscape where businesses and consumers alike can benefit from unparalleled internet exchange capabilities, heightened performance, and robust security. This milestone aligns with our commitment to maintaining the UAE-IX as a pioneer in interconnection and marks a transformative leap for regional digital ecosystems."

DE-CIX has been active in the Middle East for over a decade and now operates IXs in multiple countries in the region: Iraq, Jordan, Qatar, the UAE, and Turkey. The UAE-IX in Dubai is operated under the DE-CIX as a Service (DaaS) model. The DaaS program includes a set of services – such as installation, maintenance, provisioning, and marketing and sales support – designed for carriers, data centre operators, or other third parties to create their own IX and interconnection platform operated by DE-CIX.

For more from DE-CIX, click here.

‘Businesses sleepwalking into cyber catastrophe’
Security leaders have warned that ‘businesses are sleepwalking into a cyber catastrophe’ due to the rapid adoption of AI tools alongside a lack of privacy and ethics controls, amid a wave of recent high-profile cyber-attacks and data leaks.

Arkadiy Ukolov, Co-Founder and CEO of Ulla Technology, a global HR platform, cautioned that many businesses are putting their data at risk by rushing to use third-party AI tools as the main system for streamlining operations. The ongoing fallout from the M&S cyber-attack, alongside other major hits against Co-op, Dior, and Harrods, has highlighted the severity of data risks and the importance of how data is protected, forcing security teams to re-evaluate their protocols.

Speaking from the Viva Technology event in Paris, Arkadiy says, “Data breaches and cyber threats are relentless, so it’s vital that industries such as HR, law, government, and beyond are securing every aspect of their technology stack to protect their data. Unfortunately, the speed of AI adoption means that many businesses are sleepwalking into a cyber catastrophe, leaving critical gaps in their data protection processes and putting both sensitive internal and customer data at risk.

“Even in an area such as meeting transcripts, there are sensitive conversations around company financials or workplace policy updates that cannot be exposed, requiring privacy-first collection and storage methods for data to protect against a breach. Understanding the risks and putting in place enterprise-grade security and data privacy can help businesses better guard against these risks, even with the added exposure from AI.”

Viva Technology, hosted this year between 11 and 14 June in Paris, is Europe’s largest startup and technology event, attracting over 150,000 attendees and 11,000 startups each year. Key themes this year include the pace of AI innovation, regulation, the importance of human control, vertical industry applications for AI, and data security.


