Monday, March 10, 2025

Data


New collaboration seeks to simplify media asset management
VAST Data, the AI data platform company, has announced a partnership with Dalet, a technology and service provider for media-rich organisations. The collaboration is designed to simplify media asset management, offering more flexibility and increased productivity and efficiency for media and data-centric organisations. It integrates the VAST Data Platform with Dalet’s media workflow platform, empowering joint customers to manage all of their video and product assets in a unified experience. The partnership seeks to deliver a cost-effective, high-performance, future-proofed media asset management offering that helps media and broadcast organisations streamline media workflows, increase productivity, reduce the time-to-market for high-quality content, and lower the total cost of ownership, all while fortifying security and data resiliency. VAST is certified with Dalet platforms, enabling customers to manage their media with multi-protocol file and object access without the need for additional data copies, ensuring non-stop access to their valuable assets even as they scale and upgrade their systems.

Together, VAST Data and Dalet deliver Media and Entertainment (M&E) customers the following benefits:

- Enhanced performance and efficiency. The integration promises to significantly improve the speed and efficiency of media operations, allowing for AI-enabled workflows that harness the power of a single, cloud-native technology stack to facilitate the production, distribution, archiving and monetisation of news, sports and entertainment across all channels.
- Scalability for future growth. Customers will benefit from a highly scalable solution that accommodates the burgeoning volumes of high-resolution content, ensuring that media data sets can grow without constraints.
- Cost reductions. VAST Data is one of the first vendors approved by Dalet to utilise its deduplication technology. By unifying a multi-modal data management environment, organisations can avoid excess infrastructure costs, leveraging VAST’s Similarity-based data reduction to shrink stored data and eliminate cross-protocol data sprawl. Combined with Dalet’s smart render technology, which intelligently conserves data by avoiding unnecessary decode/encode passes on untouched frames, customers can expect substantial savings.
- Data reliability and security. With VAST Data’s robust innovations, customers can rely on data durability and security, ensuring that valuable media assets are protected and always accessible. VAST Data eliminates many attack vectors in a multi-tenant environment by hosting industry-standard network-attached, object and database services behind standard client protocols such as NFS, SMB, S3 and Apache Arrow.

“Success in media and broadcasting requires the agility to respond to an ever-changing landscape of consumer tastes and evolving media technologies,” says John Mao, Vice President, Technology Alliances at VAST Data. “This partnership is poised to redefine what’s possible in the media and entertainment industry, driving innovation and offering unparalleled efficiencies to our customers. The ability to seamlessly transition between Dalet Galaxy five and Dalet Pyramid without compromising data integrity or incurring additional storage costs represents a significant leap forward for the M&E industry.”
By harnessing the VAST Data Platform, VAST believes that Dalet can now offer its customers a competitive edge, while enabling them to leverage advanced capabilities for media production, distribution, and monetisation. “Designed for mission-critical media workflows, Dalet’s unified platform for asset management, content production and orchestration maximises return on investment (ROI) globally through industry-leading automation, full elasticity, and AI-enabled applications,” notes Aaron Kroger, Product Marketing Lead at Dalet. “Leveraging VAST’s AI-driven infrastructure allows us to provide our users with cutting-edge capabilities to manage and monetise their content more effectively than ever before.” For more from VAST Data, click here.
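The multi-protocol access described above (one copy of the data reachable both as files and as S3 objects) can be sketched in a few lines. The example below is illustrative only, not VAST or Dalet code: the NFS mount path, S3 endpoint, bucket and object key are all invented for the sketch.

```python
# Hypothetical illustration of multi-protocol access: the same media asset
# read once as a POSIX file (e.g. over an NFS mount) and once as an S3
# object. Mount path, endpoint, bucket and key are invented for this sketch.
import boto3

ASSET_KEY = "projects/news/clip-0142.mxf"

# 1) File access: the storage namespace exposed as a filesystem mount.
with open(f"/mnt/media/{ASSET_KEY}", "rb") as f:
    file_bytes = f.read(1024)  # read the first 1 KiB of the clip

# 2) Object access: the same namespace exposed over the S3 protocol.
s3 = boto3.client("s3", endpoint_url="https://storage.example.internal")
obj = s3.get_object(Bucket="media-assets", Key=ASSET_KEY)
object_bytes = obj["Body"].read(1024)

# On a multi-protocol platform both reads hit a single copy of the data,
# so the bytes match and no copy or synchronisation step is needed.
assert file_bytes == object_bytes
```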

New integrated clock chip for AI data centres
SiTime Corporation, a precision timing company, today introduced its Chorus family of clock generators for AI data centre applications. This new MEMS-based clock-system-on-a-chip (ClkSoC) family offers ten times higher performance in half the size, compared to standalone oscillators and clocks. Chorus’ new approach combines clock, oscillator and resonator technologies in a single integrated chip, simplifying system clock architecture and accelerating design time by up to six weeks. Chorus, combined with recently acquired timing products from Aura Semiconductor, builds on SiTime's strategy to offer a complete portfolio of highly differentiated innovations. According to a recent Bloomberg Intelligence report, Generative AI to Become a $1.3 Trillion Market by 2032, “the AI data centre hardware market is surging by an estimated 33% annually and projected to reach approximately $200 billion [£160 bn] by 2027.” Rapid upgrade cycles for AI hardware will be essential to running data and compute-intensive AI workloads. “AI is driving tremendous needs for higher data throughput in data centres and lower power consumption, and SiTime is uniquely positioned to help address these issues,” says Piyush Sevalia, Executive Vice President of Marketing at SiTime. “Before Chorus, hardware designers had to use discrete product types, such as clocks, oscillators and resonators, which resulted in performance compromises. Chorus delivers integrated clock generators to solve these problems and is yet another example of how we are transforming the timing market with our unique approach.” Chorus, with its integrated MEMS resonator, addresses the limitations of legacy clock generators, eliminating problems such as noise and the need to match the resonator's impedance to the clock. Chorus can also reduce the board area devoted to timing by up to 50% by replacing up to four standalone oscillators. Data centre equipment such as servers, switches, acceleration cards and smart network interface cards (NICs) are ideal applications for Chorus. “SiTime continues to solve the electronics industry’s toughest timing challenges with advances in silicon MEMS timing technology,” says Dave Altavilla, Co-Founder, President and Principal Analyst at HotTech Vision & Analysis. “SiTime's new MEMS-based family of clock generators represents a significant leap forward, offering enhanced performance, reliability and integration essential for the evolving needs of big iron AI data centres.”

New regional data storage option in Australia
Autodesk, a provider of software offerings for the architecture, engineering, construction and manufacturing industries, has announced that customers now have the choice to store their project data primarily in the Australia region. As of today, a new data storage location for select Autodesk Construction Cloud products is operational in Australia. “The launch of the Australia region data storage location supports Autodesk’s regionalisation strategy for our cloud offerings,” says Sumit Oberoi, Industry Strategy Manager, Asia Pacific at Autodesk Construction Solutions. “Having a local data storage option for select Autodesk Construction Cloud services gives our customers more choice with respect to the primary storage of their project data and helps them meet their data residency preferences.” Data plays a crucial role in the construction industry. It enables better planning, efficient execution, and effective management of construction projects. With accurate data, companies can make better-informed decisions about resource allocation, cost estimation, time management, and risk assessment. By providing its customers with more choice for primary storage of their project data, Autodesk is enabling customers to get a better handle on their data strategies. Steven Bloomer, Regional Information Management Lead at GHD (an Autodesk customer), adds, “Hosting Autodesk Construction Cloud in Australia supports our strong focus on data. Secure and effective management of data is a priority for every client we work with and where that project data is stored is critical. This can impact how we enable a collaborative and interconnected common data environment for Australian-based projects. Autodesk establishing Australia as a storage region for project data gives us the option of a local native solution for the Autodesk product stack used in our project delivery.” The new regional data storage in Australia aims to empower customers with choice: with three global data storage locations available for select Autodesk Construction Cloud products, customers can choose where to primarily store their project data, prioritising trust and control. It also offers reduced latency: with the Australian server, customers in the region can enjoy optimised performance when working within the region for these select services. Autodesk says that its launch of a data storage option in Australia underscores its commitment to supporting customers in the region, offering them greater control of their project data and helping optimise performance for their projects. For further information about Autodesk’s new regional offering in Australia, click here.

DigiCert and Deutsche Telekom to redefine digital trust in Europe
DigiCert, a global provider of digital trust, has announced a strategic collaboration with Deutsche Telekom to enhance its digital certificates and identity management offerings. Leveraging the expertise of DigiCert, Deutsche Telekom claims that it will deliver comprehensive solutions tailored to the diverse needs of European customers. The partnership with DigiCert equips Deutsche Telekom to address a wide spectrum of requirements, ranging from public key infrastructure and identity and access management to digital certificates and hardware security solutions for various devices, including smartphones and computers. With a single-source approach, Deutsche Telekom aims to cater to the demands of both its customers and their subsidiaries, ensuring seamless integration. "This strategic move aligns with our commitment to maintain certified trust service provision in geo-redundant data centres, ensuring compliance with European legal standards,” says Andreas Brasching, Head of Trust Centre and Identity Security, Deutsche Telekom Security. “By joining forces, we fortify our position as a leader in the digital security landscape, ensuring that our offerings continue to meet the evolving needs of our customers while upholding the highest standards of data integrity and sovereignty." With an eye toward future advancements, Deutsche Telekom aims to enhance flexibility and scalability in its service offerings through collaboration with DigiCert. In an era characterised by rapid IoT proliferation and expanding attack surfaces, the partnership enables Deutsche Telekom to deliver agile, scalable platforms capable of meeting evolving security needs. Additionally, seamless identity management across diverse applications and device instances will enrich Deutsche Telekom's security portfolio, empowering customers with greater control and agility in their digital ecosystems. "We are thrilled to collaborate with Deutsche Telekom, a renowned leader in digital innovation and trust services,” says Stuart Schielack, Vice President, Global Channels and Alliances at DigiCert. "This partnership underscores our shared commitment to delivering cutting-edge security solutions that empower businesses and individuals to navigate today's complex digital landscape with confidence. Together, we will leverage our combined expertise to drive innovation and set new standards for digital trust and security in Europe and beyond."

The Data Lab appoints Heather Thomson to Interim CEO position
The Data Lab has announced that its CEO, Brian Hills, has stepped down from his role. Following its successful application to the Scottish Funding Council to be funded as part of a new 10-year Innovation Infrastructure programme, Brian is taking a new opportunity in the private sector. In his place, Scotland’s innovation centre for data and AI has appointed Heather Thomson as Interim CEO. She will take up the position with immediate effect. Les Bayne, Chair of The Data Lab, says, “Brian has been integral to the evolution and growth of The Data Lab over the last nine years. Since joining the centre in 2015 and taking up the CEO role in 2021, he has played a major part in the creation of a wealth of successful programmes. As a result, these programmes have generated £200 million in additional revenue for Scotland’s data and AI sector, as well as forming and safeguarding more than 1,350 jobs in the sector. We would like to extend our thanks to Brian for his leadership and contribution to The Data Lab and wish him all the best for the future.” Since joining The Data Lab in 2018, Heather has led the £8 million data and AI skills programme, helping to create a highly skilled workforce and a closely connected business, academic and public sector community whilst addressing the challenges and opportunities arising from a changing world of work. She was appointed to the senior leadership team in 2021 as Director of Skills and Education, playing a fundamental role in developing strategy, delivery and culture across the wider organisation. Heather will now lead the transition from existing funding to the new Innovation Infrastructure funding. Les Bayne adds, “Heather’s appointment marks a significant milestone in the future of The Data Lab, as we celebrate 10 years since the centre’s inception. Heather starts this role as the next funding chapter kicks off and the opportunity for data and AI continues to grow, presenting a huge opportunity for Scotland which The Data Lab will be at the heart of.” The Data Lab will appoint a permanent CEO in 2025, following the launch of the new Innovation Infrastructure programme.

AI and sustainability: Is there a problem? 
By Patrick Smith, Field CTO EMEA, Pure Storage

AI can do more and more. Think of any topic and an AI or genAI tool can effortlessly generate an image, video or text. Yet the environmental impact of, say, generating a video by AI is often forgotten. For example, generating one image by AI consumes about the same amount of power as charging a mobile phone. That is a relevant fact when you consider that more and more organisations are betting on AI. After all, training AI models requires huge amounts of data, and massive data centres are needed to store all this data. In fact, there are estimates that AI servers (in an average scenario) could consume in the range of 85 to 134 TWh of power annually by 2027, equivalent to the total amount of energy consumed in the Netherlands in a year. The message is clear: AI consumes a lot of energy and will, therefore, have a clear impact on the environment.

Does AI have a sustainability problem?

To create a useful AI model, a number of things are needed, including training data, sufficient storage space and GPUs. Each component consumes energy, but GPUs consume by far the largest amount of power. According to researchers at OpenAI, the amount of computing power used has been doubling every 3.4 months since 2012. This is a huge increase that is likely to continue, given the popularity of various AI applications, and it is having a growing impact on the environment.

Organisations wishing to adopt AI should therefore carefully weigh its added value against its environmental impact. While it is unlikely that a decision maker would shelve a project or initiative on these grounds alone, this is about having your cake and eating it: looking at the bigger picture and picking technology that meets both AI and sustainability goals. In addition, the underlying infrastructure and the GPUs themselves need to become more energy-efficient. At its recent GTC user conference, NVIDIA highlighted exactly this, paving the way for more to be achieved with each GPU at greater efficiency.

Reducing the impact of AI on the environment

Three industries matter most in the process of training and deploying an AI model: the storage industry, the data centre industry and the semiconductor industry. To reduce AI's impact on the environment, each of these sectors needs to take steps to improve sustainability.

The storage industry and the role of flash storage

In the storage industry, concrete steps can be taken to reduce the environmental impact of AI. All-flash storage solutions, for example, are significantly more energy-efficient than traditional disk-based (HDD) storage; in some cases, all-flash solutions can deliver a 69% reduction in energy consumption compared with HDD. Some vendors are even going beyond off-the-shelf SSDs and developing their own flash modules, allowing the array's software to communicate directly with the flash. This makes it possible to maximise the capabilities of the flash and achieve even better performance, energy usage and efficiency, meaning data centres require less power, space and cooling.

Data centre power efficiency

Data centres can take a sustainability leap with better, more efficient cooling techniques and by making use of renewable energy. Many organisations, including the EU, are looking at Power Usage Effectiveness (PUE) as a metric: the ratio of the total power delivered to a data centre to the power actually consumed by the IT equipment inside. While reducing PUE is a good thing, it is a blunt and basic tool which does not account for, or reward, the efficiency of the technology installed within the data centre.
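As a quick worked illustration of the metric (figures invented for the example): a facility drawing 1.5 MW in total while its IT equipment consumes 1.2 MW has a PUE of 1.5 / 1.2 = 1.25, meaning 25% overhead on every watt of useful IT load. The snippet below performs the same calculation.

```python
# Worked PUE example; the figures are invented for illustration.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal; everything above it is cooling,
    power-conversion and other facility overhead.
    """
    return total_facility_kw / it_equipment_kw

print(pue(1500.0, 1200.0))  # -> 1.25, i.e. 25% overhead per watt of IT load
```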
Semiconductor industry

The demand for energy is insatiable, not least because semiconductor manufacturers, especially those making the GPUs that form the basis of many AI systems, are making their chips increasingly powerful. Twenty-five years ago, a GPU contained one million transistors, measured around 100mm² and did not use much power. The GPUs announced most recently contain 208 billion transistors and consume 1,200W each. The semiconductor industry needs to become more energy-efficient. This is already happening, as highlighted at the recent NVIDIA GTC conference, with CEO Jensen Huang saying that, thanks to advances in the chip manufacturing process, GPUs are doing more work and so are more efficient despite the increased power consumption.

Conclusion

It has been clear for years that AI consumes huge amounts of energy and can therefore have a negative environmental impact. Demand for AI-generated programmes, projects, videos and more will keep growing in the coming years, so organisations embarking on an AI initiative need to measure the impact of their activities carefully. Especially with increased scrutiny on emissions and ESG reporting, it is vital to understand the repercussions of AI's energy consumption in detail and mitigate wherever possible. Initiatives such as moving to more energy-efficient technology, including flash storage, or improving data centre capabilities can reduce the impact. Every sector involved in AI can and should take concrete steps towards a more sustainable course. It is important to keep investing in the right areas to combat climate change!

Mathpix joins DataVerge colocation facility to support AI workflows
Mathpix, an AI-powered document automation and scientific communication company, has joined DataVerge, a carrier-neutral interconnection facility in Brooklyn. DataVerge surpassed Mathpix’s criteria, which included robust and redundant power, a fast connection to AWS US-East-1, scalability and proximity to its Brooklyn, New York headquarters, making it the colocation facility of choice. As more companies rely on AI for their business, colocation and data centres must deliver greater than ever levels of uninterrupted power and connectivity to support high-density AI workloads. Though many companies with a thriving presence in the New York metropolitan area are now seeking to reap the benefits of AI, few New York-area data centres are equipped with the abundance of power required to meet their AI needs. In addition to power, DataVerge, which is the largest interconnection facility in Brooklyn, New York, offers more than 50,000 ft² of customisable data centre space, along with secure IT infrastructure, rapid deployment, connection to more than 30 carriers, and 24/7 access and support. Mathpix enables organisations to quickly and accurately convert PDFs and other types of documents, including handwritten text and images, into searchable, exportable and machine-readable text used in large language models (LLMs) and other applications. According to Nico Jimenez, CEO of Mathpix, “DataVerge enables us to colocate our own servers, which are equipped with our own GPUs. This setup provides us the opportunity to select the hardware we need to build and configure our servers so that they significantly reduce latency, and at a considerably lower price point than what the hyperscalers charge for colocation.” “AI is essential to how many New York area companies run their businesses,” says Jay Bedovoy, CTO of DataVerge. “Our colocation and data centres provide them with uninterrupted power and connectivity, as well as the scalability, high performance and proximity needed to avoid latency issues, which is critical for AI and other real-time, interactive decision-making applications.”
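As an illustration of the kind of API call such a document-conversion workflow involves, here is a minimal sketch. It is modelled on Mathpix's publicly documented v3 text endpoint, but the exact field names should be treated as assumptions, and the credentials and image URL below are placeholders.

```python
# Hypothetical sketch of document-to-text conversion via an OCR API of the
# kind Mathpix provides. Endpoint shape and field names are assumptions
# modelled on Mathpix's public v3 API; credentials are placeholders.
import requests

resp = requests.post(
    "https://api.mathpix.com/v3/text",
    headers={"app_id": "YOUR_APP_ID", "app_key": "YOUR_APP_KEY"},
    json={
        # A scanned page, equation image or handwritten note to convert.
        "src": "https://example.com/scanned-page.png",
        "formats": ["text"],  # plain text suitable for feeding an LLM
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("text", ""))  # machine-readable text output
```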

UK company solves Microsoft 365’s biggest backup challenge
Channel-first cloud and disaster recovery specialist virtualDCS has launched a comprehensive Azure backup service, which protects more than 250 configurable items in an established Microsoft 365 estate. Known as CloudCover Guardian for Azure, the new service forms part of virtualDCS’s CloudCover 365 solution, which offers complete Microsoft 365 backup and recovery. In another major development, virtualDCS has also launched a unique ‘clean room’ service for organisations and users that need to restore their systems in a sterile and isolated environment in the event of a ransomware attack. Configurations covered by CloudCover Guardian for Azure span user accounts, access privileges and unique security groups across popular applications such as SharePoint, Teams, OneDrive, Exchange and Entra ID (formerly Azure Active Directory). Crucially, Microsoft does not currently offer a backup service for these configurations. Richard May, CEO at virtualDCS, says, “Backing up the data without your Azure configurations will not provide you with the glue to reassemble your data following a disaster, as the combination of Microsoft 365 data and Azure configurations is what defines your complete Microsoft 365 tenancy. It’s the configurations that effectively govern how Microsoft 365 is used, and any lost or amended configurations can cause irreparable damage to the productivity, data security and reputation of an organisation.” Created by virtualDCS, CloudCover Guardian for Azure redefines what’s possible when it comes to backing up and restoring a Microsoft 365 estate’s configurable items. The system captures every vital configuration, meaning a Microsoft 365 environment can be proactively managed from a single console for a complete, clean recovery after an incident. As a result, IT teams and individuals restoring their systems are no longer trying to remember settings and manually reconfigure which groups users were in, what they had licences for and what they had access to. Importantly, it is also a vital security tool: because it runs twice a day, it alerts administrators by email to any changes on their system that could be caused by insider threats, viruses or anything else, within hours of them happening. In addition to offering a 360-degree view of an estate and fast restoration, CloudCover Guardian for Azure also enables organisations to maintain a clean blueprint of their desired Azure configurations. Whether it is the accidental deletion of a user or the suspicious amendment of a security policy, organisations can quickly roll back to a safe state. After recreating and restoring an environment, users can then simply reintroduce data back into the environment. The system also enables unique Azure configurations to be easily replicated across new tenancies. This is all complemented by the company’s new ‘clean room’ offering: rather than taking hours or days to restore systems after a ransomware attack, organisations can now transfer data into a cloud-adjacent environment. This significantly improves recovery time objectives (RTOs) by giving organisations immediate access in an isolated environment, helping them avoid ransomware loops and ensure everything is healthy before data is reintroduced into the business. The ‘clean room’ can be used in a number of invocation scenarios, such as restoring data as it exists to a sterile clean room environment for monitoring and data assessment.
Once the recovery is confirmed clean, organisations can access data from the recovery room until their infrastructure is stabilised.
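To make the idea of configuration backup (as opposed to data backup) concrete, here is a minimal, hypothetical sketch. It is not virtualDCS code: it simply snapshots Microsoft 365 group definitions to a JSON blueprint via the standard Microsoft Graph API, assuming a valid OAuth access token with the right permissions is already in hand.

```python
# Hypothetical illustration of configuration backup (not virtualDCS code):
# snapshot Microsoft 365 group definitions to a JSON blueprint using the
# standard Microsoft Graph endpoint. Assumes you already hold a valid
# OAuth 2.0 access token with Group.Read.All permission.
import json
import requests

TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

resp = requests.get(
    "https://graph.microsoft.com/v1.0/groups",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Keep the fields that describe how the tenancy is configured, so a later
# restore can recreate groups from the blueprint rather than guess at them.
blueprint = [
    {
        "id": g["id"],
        "displayName": g["displayName"],
        "securityEnabled": g.get("securityEnabled"),
        "groupTypes": g.get("groupTypes", []),
    }
    for g in resp.json()["value"]
]

with open("group-blueprint.json", "w") as f:
    json.dump(blueprint, f, indent=2)
```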

Pedab and Object First join forces for advanced Veeam data protection
Pedab and Object First have announced a partnership to provide Veeam customers with advanced data protection to reduce the risk of cyber threats like ransomware. The partnership brings together two companies with complementary expertise: Pedab's strong knowledge of IT infrastructure and software solutions, and Object First's innovative approach to immutable backup storage for Veeam. Object First claims that it is known for offering the best storage for Veeam, provided as an out-of-the-box immutable backup storage appliance. It offers the last line of defence against disasters and ransomware, ensuring that data can be recovered quickly and completely, with minimal downtime. Pedab will introduce Object First’s solution, Ootbi (Out-of-the-Box Immutability), to the Northern European market. Ootbi can be deployed in 15 minutes and supports up to 80 VMs running locally, powered by Veeam Instant Recovery on a four-node configuration. The solution’s power is matched by its security: it offers object-based immutability with zero access to root or the hardened Linux operating system by default, meaning data cannot be accessed, changed, or deleted. Ootbi's storage environment is completely secured, validated, and third-party tested, and comes preconfigured out of the box – no security expertise required. With Ootbi, businesses can safeguard their data with confidence, knowing that critical information remains secure and unchangeable, even in the face of evolving cyber threats. “Ootbi by Object First is an important enhancement to Pedab’s portfolio of carefully selected providers. It supports Pedab’s mission to continuously evolve our full-service solution offering to partners. With Ootbi, we can now help Veeam partners provide peace of mind to their customers with immutable onsite backup storage,” says Jesper Bartholdson, CEO of Pedab Group. “We are excited to announce Pedab as our distributor in Northern Europe. With their extensive expertise in the IT industry, partnering with Pedab was an easy choice,” says Mark Haddleton, EMEA Channel Sales Director at Object First. “Together, we are committed to delivering secure, simple, and powerful backup storage for Veeam users, helping businesses stay one step ahead of ransomware threats.”
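Object-based immutability of the sort described can be illustrated generically. The sketch below is not Ootbi-specific code: it uses the standard S3 Object Lock API, which immutable Veeam backup targets commonly expose, with the endpoint, bucket and key invented for the example.

```python
# Hypothetical illustration of object-level immutability using the standard
# S3 Object Lock API. Not Ootbi-specific code; bucket, key and endpoint are
# invented for this sketch, and the bucket must have been created with
# Object Lock enabled for these calls to succeed.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3", endpoint_url="https://backup-target.example.internal")

# Write a backup object that cannot be modified or deleted until the
# retention date passes, even by an attacker holding the credentials.
with open("2025-03-10.vbk", "rb") as backup_file:
    s3.put_object(
        Bucket="veeam-backups",
        Key="jobs/daily/2025-03-10.vbk",
        Body=backup_file,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc)
        + timedelta(days=30),
    )

# Any attempt to overwrite the locked version in place or delete it now
# fails until the retention window expires: the "last line of defence".
```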

Keepit provides fast targeted restore time after ransomware attack
Keepit, a global provider of cloud backup and recovery solutions, has announced study results quantifying the benefits available to organisations leveraging Keepit SaaS data protection. The results show the Keepit solution gives organisations the ability to restore and recover backed-up data after a ransomware attack quickly, efficiently and accurately. These findings stem from a recent study conducted by Forrester Consulting, which quantifies the value of the world’s sole vendor-independent cloud dedicated to SaaS data protection. Keepit commissioned Forrester Consulting to conduct a Total Economic Impact (TEI) study examining the potential return on investment (ROI) that enterprises may realise by deploying Keepit SaaS data protection. The purpose of the study was to provide a framework to evaluate the potential financial impact of Keepit SaaS data protection on end users. To better understand the benefits, costs, and risks associated with this investment, Forrester interviewed four representatives with experience using Keepit and aggregated the interviewees’ experiences into a single composite organisation, represented as a manufacturing organisation with $2bn in annual revenue. While ransomware attacks have become increasingly common and organisations face the risk of losing critical data, Keepit claims to offer strong protection, providing a crucial lifeline for recovering user data after cyber attacks or other events. With Keepit, information technology (IT) administrators can quickly find, restore, and save data. The study also shows that the solution helps prevent the negative effects of ransomware attacks and saves time and money for IT teams, resulting in smoother, more time-efficient SaaS backup operations.

Benefit worth $819,100

According to the study, the Keepit solution limits the impact of a ransomware attack for the composite organisation by allowing it to recover and restore data quickly, preventing data loss and reducing downtime. This benefit is worth $819,100. The study also finds that “the time needed to restore the tier-one users is 90% lower than the time the composite would spend restoring its data without Keepit, which has been identified by interviewees to be at least three weeks.” (A 90% reduction on a three-week baseline means restoring in roughly two days rather than 21.) “We can prove it’s possible to significantly lower downtime during recovery from a ransomware attack. Lowering downtime is a sure-fire way to maximise return on your investment in a backup solution. Because it’s not a question of if an attack will happen; the question is how to bounce back when it does,” says Paul Robichaux, Microsoft MVP and Senior Director of Product at Keepit. Three quarters of security decision-makers suffered a breach in the last 12 months, and in the study Forrester recommends, “Backups are the best insurance policy against an attack, but to be effective they need to be part of a planned and tested backup and recovery process.” The study participants noted that their organisations planned for a disaster scenario in the event of a ransomware attack and were aware of the exposure risk and potential losses they could suffer as a result. As a follow-up to the Forrester study, Keepit will be holding a webinar at 2pm EDT on 26 March, titled ‘The ROI of Ransomware Recovery’. Hosted by Keepit’s Paul Robichaux, the webinar will feature guest speakers Brent Ellis, Forrester Senior Analyst, and Elia Gollini, Forrester Associate Consultant.
Key takeaways from the webinar will include:

- Actionable insights and recommendations on how to address current gaps in disaster recovery planning
- Insight into best practices for ensuring business continuity during a ransomware attack
- Information about the return on investment (ROI) organisations have realised when deploying Keepit for ransomware recovery

To register for the webinar, visit the sign-up page here.


