More power, less waste – how AI is changing data center management

Data centers are the foundations of digitization – just as you would not build a house without the right footings, no organization can truly evolve without the right data centers in place. The likes of artificial intelligence, the Internet of Things, and virtual and augmented reality are all putting new pressures on enterprise infrastructure, as are more established technologies, such as cloud computing and native applications. According to IDC, by 2020, the demands of next-generation applications and new IT architectures will force 55 percent of enterprises to either update existing data centers or deploy new ones.

This increase in workload demand is pushing up the amount of power that data centers consume, which in turn leads to spiraling cooling costs. Gartner estimates that ongoing power costs are increasing at least 10 percent per year, driven by rises in the cost per kilowatt-hour (kWh) and by underlying demand.

It all adds up to the modern data center being more complex than ever before, while still needing to be cost effective. Do current workforces have the capacity to manage, optimize and efficiently run them? The short answer is no, probably not. Operating in the digital era means being able to move rapidly and decisively, and that extends to the data center as well. To do that with a human-led approach simply isn’t cost effective.

So how can organizations both manage complex data centers and fully secure them without being limited by their workforces? The answer may lie, ironically, in one of the emerging technologies putting increased strain on the data centers in the first place – AI.

From energy to utilization, accurate planning to enabling decentralized data analysis and actioning, AI has the potential to turn the data center into an intelligent, autonomous enabler of digital business.

The energy efficient data center

Approximately 10 percent of data center operating expenditure (opex) goes on power, with Gartner estimating that this will rise to about 15 percent by 2021. All that power generates heat, which in turn demands significant resources for cooling. In the past, keeping the stacks cool has led businesses to try floating data centers out at sea or choosing locations with lower ambient temperatures. Now, however, AI can help reduce the amount of energy used for cooling, irrespective of location. Take, for example, one of the largest data center operators on the planet – Google. Through machine learning, it reduced the amount of energy it uses for cooling by up to 40 percent. It did this by taking historical data that sensors already collected (such as temperatures, power draw and pump speeds) and using it, with DeepMind, to train the neural networks that made up the AI system.

Specifically, it trained the networks on the average future PUE (Power Usage Effectiveness), the ratio of the total building energy usage to that of the IT equipment alone. Two other networks were then trained to predict the temperature and pressure of the data center over the next hour. Together, these simulations recommended actions to keep the PUE at an optimal ratio while ensuring operating constraints were not breached.
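As a concrete illustration of the metric itself, PUE can be computed in a couple of lines. This is a minimal sketch, and the energy figures below are invented for the example:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by IT energy.

    1.0 is the theoretical ideal, meaning zero cooling/overhead energy.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example: 1,200 kWh drawn by the whole facility, 1,000 kWh of it by IT
# equipment, gives a PUE of 1.2 - i.e. 0.2 kWh of overhead per useful kWh.
print(round(pue(1200, 1000), 2))
```

Lowering cooling energy is attractive precisely because it shrinks the numerator of this ratio without touching the useful IT work in the denominator.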

By doing this, the system could then predict how different actions would affect energy consumption at any one time, feeding that back to the data center’s control system, which implemented the required changes accordingly. Dan Fuenffinger, a data center operator at Google, said “We wanted to achieve energy savings with less operator overhead. Automating the system enabled us to implement more granular actions at greater frequency, while making fewer mistakes.”
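The predict-then-act loop described above can be sketched in a few lines. The two `predict_*` functions here are hypothetical stand-ins for Google's trained networks, and all the coefficients and thresholds are invented for illustration:

```python
MAX_SAFE_TEMP_C = 27.0  # invented operating constraint

def predict_pue(action: dict) -> float:
    # Stand-in for the network trained on historical sensor data:
    # assumes faster pumps draw more power, nudging PUE up.
    return 1.25 + 0.02 * action["pump_speed"]

def predict_temp(action: dict) -> float:
    # Stand-in for the network predicting temperature over the next hour:
    # assumes faster pumps cool the facility more.
    return 30.0 - 1.5 * action["pump_speed"]

def choose_action(candidates: list) -> dict:
    """Discard actions that would breach constraints, then pick the
    remaining action with the lowest predicted PUE."""
    safe = [a for a in candidates if predict_temp(a) <= MAX_SAFE_TEMP_C]
    return min(safe, key=predict_pue)

actions = [{"pump_speed": s} for s in (1.0, 2.0, 3.0)]
best = choose_action(actions)  # picks the slowest pump that still stays cool
```

The recommended `best` action would then be handed to the building's control system, exactly the feedback loop the article describes.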

Full to capacity

Keeping data centers cool is critical. However, it is also vital to ensure that the infrastructure is being used properly. Deploying AI to analyze data captured from IoT sensors in the data center can help keep it fully operational and effectively utilized. As Ravin Mehta, Managing Director of Orange-owned The unbelievable Machine Company, wrote in a post on the company’s blog, “if [sensor] data is merged with log data, values recorded by data sensors in the system, and empirical values, and fed into self-learning, deep neural networks, AI analyses will be able to provide the information needed to maintain the system at 100 percent availability at a significantly reduced cost.” This means predictive, rather than scheduled, maintenance, which cuts costs by not requiring servers to be offline unnecessarily or spare parts to be kept on hand just in case – much like the approach manufacturers deploy in Industrial IoT-driven smart factories.
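A toy version of this idea flags hardware for inspection when its sensor readings drift away from their historical norm. A simple z-score check stands in here for the deep neural networks described above, and the fan-speed figures are invented:

```python
from statistics import mean, stdev

def needs_maintenance(history: list, recent: list, threshold: float = 3.0) -> bool:
    """Flag when the recent average deviates more than `threshold`
    standard deviations from the historical baseline."""
    mu, sigma = mean(history), stdev(history)
    return abs(mean(recent) - mu) > threshold * sigma

# Invented fan-speed telemetry (RPM) for one server.
fan_rpm_history = [4800, 4950, 5000, 5100, 4900, 5050]

needs_maintenance(fan_rpm_history, [4980, 5020])  # normal: no flag
needs_maintenance(fan_rpm_history, [6200, 6400])  # sharp drift: flag for inspection
```

A real deployment would learn far subtler patterns across many correlated sensors, but the principle is the same: intervene when the data says so, not when the calendar does.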

Twinned with digital

Another use for AI that data centers can draw from manufacturing is the idea of the digital twin. In Industry 4.0 factories, machines have a digital form as well as a physical one, and the former allows the latter to be controlled from a central point. It also aids in the planning of machine placement and use – rather than waste time and energy trying things out in real life, digital twins allow factories to run tests on virtual approaches, before physically implementing the best.

Going one step further, building information modeling (BIM), a construction approach, allows the creation of highly detailed virtual walk-throughs of entire buildings. AI algorithms can automate the whole process – whether digital twinning, planning or BIM – constantly finding the optimal variation and implementing it accordingly. That might mean altering the physical stacks, or it might mean creating detailed plans when data centers need to be physically moved or built.
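In spirit, the digital-twin approach amounts to scoring candidate configurations in software before touching any hardware. A minimal sketch, with an invented cost model standing in for a real facility simulation:

```python
def simulate_cooling_cost(layout: list) -> float:
    # Stand-in for a physics or ML model of the facility; this toy model
    # simply assumes denser racks are disproportionately harder to cool.
    return sum(rack["density"] ** 2 for rack in layout)

# Two invented candidate rack layouts carrying the same total load (8 units).
candidate_layouts = {
    "dense": [{"density": 4}, {"density": 4}],
    "spread": [{"density": 2}, {"density": 2}, {"density": 2}, {"density": 2}],
}

# Evaluate every candidate virtually and only implement the cheapest.
best = min(candidate_layouts, key=lambda name: simulate_cooling_cost(candidate_layouts[name]))
```

Here the virtual trial picks the spread-out layout; the point is that the expensive experimentation happens in the twin, not on the data center floor.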

The robot-run data center

According to CBRE, a typical data center may employ 30 people. Other estimates put the figure closer to 100 for larger data centers, but the picture is the same – even the largest of sites do not require huge workforces to run. That said, those employees still need to cover everything, from optimizing stacks, maintenance and security to keeping the site clean. Again, manufacturing provides inspiration for how data centers can incorporate AI to streamline workflows and optimize their assets, both machine and human.

Take, for instance, car manufacturing, which has increasingly incorporated robotics into production lines. In recent years, these robots have gone from performing pre-programmed maneuvers to increasingly taking on what were once human roles, even down to inspecting paint finish quality. This has all been enabled by AI, with robots trained to understand what they are trying to achieve, either through specific programming or by observing humans at work. Once trained, the AI-powered robots then have the scope to further optimize the process in ways human workers may not be able to. In data centers, this could be applied to the most effective alignment of assets, for instance based on the workloads on any given day. This is AI right through the chain – from the digital twinning of stacks and the understanding of what works best when, to the actual physical movement of hardware, with AI helping robots adapt to their environments.

AI and the decentralized data center

The applications of AI covered so far are most likely to be found in large-scale sites covering hundreds of acres. However, as the creation of and demands on data increase exponentially, organizations are realizing that the centralized management of data is both time consuming and, fundamentally, at odds with the speed and agility of digital operations today. This has led to the rise of edge computing, which involves processing, analyzing and actioning data closer to its source (the network edge). That means having the compute power available at these locations, whether a bank branch, a retail outlet, a wind farm in the North Sea or a self-driving car.

In effect, these are mini data centers in themselves. It is simply not feasible, nor logical, to have human workforces operating them. AI offers organizations a way of implementing, managing and optimizing a huge array of mini data centers across a decentralized network, each working independently to process the right data for its location, identify the actions to take and implement them accordingly. That might be hyper-local offers on digital points of sale in a selection of shops; it might be adjusting the speed of a wind turbine. Whatever it is, everything is done quickly, making the most of a window of opportunity to deliver better results.

The nerve center of all activity

Being able to effectively manage and protect complex technology is at the heart of successful digital transformation. The appeal of AI is that it can help manage data centers effectively, keep them secure from evolving threats, and ensure that the right technology infrastructure is in place to support digital business ambitions. Mehta summarizes it by saying, “AI is not going anywhere – in fact, it will affect all medium-sized and large businesses. For many companies, the challenge will be how to exploit this new technology in the most profitable way and use it to digitize their business. The IT systems deployed to implement the technology, with the data center as the nerve center of all activity, are inextricably linked to this process.”

Discover how AI can impact our daily lives and watch Anne-Sophie Lotgering, Chief Digital Officer for Orange Business discussing AI-based business models at the Gartner Symposium/ITxpo 2018. Read the GlobalData analysis of our global data center and cloud services capabilities.

Josh Turner

I am a technology writer with a decade of experience in business, technology and logistics. From starting off my career writing questions for a TV quiz show, I’m now spending my time looking at how the world of business is going digital and transforming a variety of sectors and industries.