AI and machine learning: data centres need to differentiate to survive
By Peter Ruffley, CEO, Zizo
The promise of AI
At present, the IT industry is doing itself no favours by promising the earth with emerging technologies without the ability to fully deliver on them; Hadoop’s story with big data is a case in point, and look where that is now. There is also a growing need to dispel some of the myths surrounding the capabilities of AI and data-led applications, myths that often sit within the C-suite: the belief that investment will deliver the equivalent of the ship’s computer from Star Trek, or the answer to the question ‘how can I grow the business?’ As part of any AI strategy, it’s imperative that businesses, from the board down, have a true understanding of the use cases of AI and where the value lies.
If there is a clear business need and an outcome in mind, then AI can be the right tool. But it won’t do everything for you: the bulk of the work still has to be done somewhere, whether in the machine learning phase or in data preparation.
AI ready vs. AI reality
With IoT, many organisations are chasing the mythical concept of ‘let’s have every device under management’. But why? What’s the real benefit of doing that? All they are doing is creating an overwhelming amount of low-value data and expecting data warehouses to store it. If a business keeps data from a device showing that it pinged every 30 seconds rather than every minute, that’s just keeping data for the sake of it. There’s no strategy there. The ‘store everything’ mentality needs to change.
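As a minimal sketch of the alternative, the snippet below thins redundant heartbeat pings before they ever reach the warehouse, keeping at most one record per device per minute. The record format, device names and one-minute window are all invented for illustration, not taken from any particular IoT platform.

    from datetime import datetime, timedelta

    # Hypothetical heartbeat records (device_id, timestamp), arriving every
    # 30 seconds. Names and values are invented purely for illustration.
    pings = [
        ("sensor-01", datetime(2021, 1, 6, 9, 0, 0)),
        ("sensor-01", datetime(2021, 1, 6, 9, 0, 30)),
        ("sensor-01", datetime(2021, 1, 6, 9, 1, 0)),
        ("sensor-01", datetime(2021, 1, 6, 9, 1, 30)),
    ]

    def thin_pings(records, min_gap=timedelta(minutes=1)):
        # Keep only the first ping per device within each min_gap window,
        # discarding the redundant in-between heartbeats before storage.
        last_kept = {}
        kept = []
        for device_id, ts in sorted(records, key=lambda r: r[1]):
            if device_id not in last_kept or ts - last_kept[device_id] >= min_gap:
                last_kept[device_id] = ts
                kept.append((device_id, ts))
        return kept

    print(thin_pings(pings))  # keeps 09:00:00 and 09:01:00; half the rows gone

Run against the four sample pings, this keeps two rows and discards two; at warehouse scale, that is the difference between storing data with intent and storing it for its own sake.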
One of the main barriers to implementing AI is the challenge of making data available and preparing it. A business cannot become data-driven if it doesn’t understand the information it holds, and the concept of ‘garbage in, garbage out’ is especially true when it comes to the data used for AI.
With many organisations still on the starting blocks, or having not yet finished their journey to becoming data-driven, there appears to be a misplaced assumption that they can quickly and easily leap from preparing their data to implementing AI and ML. Realistically, that won’t work. To step successfully into the world of AI, businesses first need to ensure the data they are using is good enough.
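As an illustration of what ‘good enough’ might mean in practice, here is a minimal data-quality gate written in plain Python. The field names, checks and sample rows are all hypothetical; the point is simply that data should pass explicit checks for completeness and duplication before it is fed into any ML pipeline.

    def quality_report(rows, required_fields=("device_id", "timestamp", "reading")):
        # Count the basic defects that make data unfit for AI: rows missing
        # whole fields, rows with null values, and duplicate records.
        issues = {"missing_fields": 0, "null_values": 0, "duplicates": 0}
        seen = set()
        for row in rows:
            if any(f not in row for f in required_fields):
                issues["missing_fields"] += 1
                continue
            if any(row[f] is None for f in required_fields):
                issues["null_values"] += 1
            key = (row["device_id"], row["timestamp"])
            if key in seen:
                issues["duplicates"] += 1
            seen.add(key)
        issues["total_rows"] = len(rows)
        return issues

    # Invented sample rows: one duplicate and one null reading.
    rows = [
        {"device_id": "s1", "timestamp": 1, "reading": 21.5},
        {"device_id": "s1", "timestamp": 1, "reading": 21.5},
        {"device_id": "s2", "timestamp": 2, "reading": None},
    ]
    print(quality_report(rows))
    # {'missing_fields': 0, 'null_values': 1, 'duplicates': 1, 'total_rows': 3}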
AI in the data centre
Over the coming years, we are going to see tremendous investment in large-scale and High-Performance Computing (HPC) systems being installed within organisations to support data analytics and AI. At the same time, there will be an onus on data centre providers to supply these systems without necessarily understanding the infrastructure required to deliver them, or the software and business outcomes needed to get value from them. We saw this in the realm of big data, when everyone tried to cobble together some kind of big data solution and it was all too easy to say ‘we’ll use Hadoop to build this giant system’. If we’re not careful, the same could happen with AI. There have been plenty of conversations about the fact that, if we were to peel back the layers of many AI solutions, we’d find a lot of people still putting a lot of hard work into them; when it comes to automating processes, we aren’t quite in that space yet. AI solutions are currently very resource-heavy.
There’s no denying that the majority of data centres are now being asked how they provide AI solutions and how they can assist organisations on their AI journey. Whilst organisations might assume that data centres have everything to do with AI tied up, is this really the case? Yes, there is a realisation of the benefits of AI, but exactly how it is best implemented, and by whom, to get the right results hasn’t been fully decided.
Solutions for improving the performance of large-scale application systems are being created, whether through better processes, better hardware, or reducing the cost of running them via improved cooling or heat-exchange systems. But data centre providers have to be able to combine these infrastructure elements with a deeper understanding of business processes, and that is something very few providers, Managed Service Providers (MSPs) or Cloud Service Providers (CSPs) are currently doing. It’s great to have the kit and to use submerged cooling systems and advanced power mechanisms, but what does that give the customer? How can providers help customers understand what more can be done with their data systems? How do providers differentiate themselves, and how can they say they harness these new technologies to do something different? It’s easy to go down the route of promoting ‘we can save you X, Y, Z’, but it means more to be able to say ‘what we can achieve with AI is X, Y, Z’. Data centre providers need to move away from trying to win customers over on monetary terms alone.
Education and collaboration
When it comes to AI, there has to be an understanding of the whole strategic vision: where value can be delivered, and how a return on investment (ROI) is achieved. What needs to happen is for data centre providers to work towards educating customers on what can be done to secure quick wins.
Additionally, sustainability is riding high on the business agenda, and this is something providers need to take into consideration. How can the infrastructure needed for emerging technologies work better? Perhaps it lies in sharing data across the industry and working together to analyse it; in such cases, the whole may be greater than the sum of its parts. The hard part will be convincing people to relinquish control of their data. Can the industry move the conversation on from the purely technical question of how much power and how many kilowatts are being used, to how this helps our corporate social responsibility and green credentials?
There are some fascinating innovations already happening from which lessons can be learnt. In Scandinavia, for example, carbon-neutral data centres are being built that are completely air cooled, powered sustainably by solar, with cooling drawn through the building by, in essence, opening the windows. There are also water-cooled data centres out there under the ocean.
Conclusion
We saw a lot of organisations and data centres jump in head first with the explosion of big data and come out without any tangible results; we could be on the road to seeing history repeat itself. If we’re not careful, AI could just become another IT bubble.
There is still time to turn things around. As we move into a world of ever-increasing data volumes, we are constantly searching for the value hidden within the low-value data produced by IoT, smartphone apps and the edge. As global energy costs rise, and the number of HPC clusters powering AI to drive our next-generation technologies grows, new technologies have to be found that lower the cost of running the data centre beyond standard air cooling.
It’s great to see people thinking outside the box on this, with submerged HPC systems and fully, naturally aerated data centres, but more will have to be done (and fast) to keep up with global data growth. The appetite for AI is undoubtedly there, but for it to be deployed at scale, and for enterprises to see real value, ROI and new business opportunities from it, data centres need to move the conversation on, work together and individually utilise AI in the best way possible, or risk losing out to the competition.