Telehouse for Technophiles: The Brain in the Machine
December 8, 2016
How Data Centers Are Using Deep Learning
Machine learning and deep learning emerged from Artificial Intelligence (AI), the theory and development of computer systems that can perform tasks normally requiring human intelligence. Deep learning mimics the activity in layers of neurons in the neocortex, the area of the brain where thinking occurs, and its software can be trained to recognize patterns in digital representations of sounds, images and other data. Machine intelligence is transforming the future of everything from communications to healthcare, and from manufacturing and transportation to advanced robotics. Writers and filmmakers such as Arthur C. Clarke and Steven Spielberg have foretold a brave new world where AI will one day influence every waking aspect of our personal and professional lives.
Science fiction aside, machine learning is already well established in our everyday world, from your faithful companion Siri to facial recognition to language translation. But it can also help tackle some of the world’s most challenging industrial problems, such as the rampant energy consumption that adversely impacts the environment. Large-scale commercial systems, including high-performance data centers, consume enormous amounts of energy, and while much has been done to mitigate energy usage in enterprise and colocation facilities, deep learning can do far more to manage the world’s increasing need for high-performance computing power.
Using a system of neural networks focused on different operating scenarios and parameters within their data centers, some owner-operators have created a more efficient, adaptive framework for understanding data center dynamics and optimizing efficiency. One program that used an AI system to manage power usage in a company’s data centers achieved a 40 percent reduction in the amount of electricity needed for cooling across its facilities.
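A minimal sketch of what such a model might look like, using scikit-learn’s MLPRegressor as a stand-in for whatever network the operators actually trained; the telemetry features, readings and candidate setpoints below are invented for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Historical telemetry rows: [outside_air_temp_c, it_load_kw, chiller_setpoint_c]
# (hypothetical values, not from any real facility)
X = np.array([
    [18.0, 450.0, 7.0],
    [24.0, 520.0, 6.5],
    [30.0, 610.0, 6.0],
    [21.0, 480.0, 7.5],
])
# Measured cooling energy (kWh) under each operating state
y = np.array([120.0, 155.0, 210.0, 130.0])

# Small neural network that learns the relationship between
# operating parameters and cooling energy
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)

# Score two candidate chiller setpoints for forecast conditions and
# prefer the one the model predicts will need the least cooling energy
candidates = np.array([[26.0, 550.0, 6.0],
                       [26.0, 550.0, 7.0]])
predictions = model.predict(candidates)
print("predicted kWh:", predictions, "-> choose row", predictions.argmin())
```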
In another application of deep learning, an enterprise facility planned the most efficient methods of cooling by analyzing data from sensors strategically located among the server racks, including information on inlet temperatures and cooling pump speeds.
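In code, that sensor-driven analysis can reduce to something as simple as the following pandas sketch; the rack IDs, readings and temperature threshold are assumptions, not data from any real facility:

```python
import pandas as pd

# Simulated readings from sensors placed among the server racks
readings = pd.DataFrame({
    "rack_id":        ["A1", "A1", "B2", "B2", "C3", "C3"],
    "inlet_temp_c":   [22.1, 23.4, 26.8, 27.2, 21.5, 22.0],
    "pump_speed_rpm": [1450, 1500, 1800, 1850, 1400, 1420],
})

# Average conditions per rack
summary = readings.groupby("rack_id").mean()

# Flag racks running above an illustrative inlet-temperature threshold,
# i.e. the spots where the cooling plan needs the most attention
HOT_INLET_C = 25.0
print(summary[summary["inlet_temp_c"] > HOT_INLET_C])
```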
A deep learning system developed by Google’s machine intelligence research program was shown 10 million images sourced from YouTube videos and proved almost twice as effective as any previous image recognition system at identifying objects such as cats. Why is identifying cats pertinent to data centers? Because the same techniques can be adapted into security access controls for facilities, applying facial recognition to CCTV footage.
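As a rough illustration of how such a facial-recognition access check might work, here is a sketch built on the open-source face_recognition library; the article names no specific tool, so this library choice, the file paths and the enrolled person are all assumptions:

```python
import face_recognition

# Enroll an authorized employee from a reference photo (hypothetical path)
authorized_image = face_recognition.load_image_file("staff/alice.jpg")
authorized_encoding = face_recognition.face_encodings(authorized_image)[0]

# Check every face found in a frame captured by the entrance CCTV camera
frame = face_recognition.load_image_file("cctv/entrance_frame.jpg")
for encoding in face_recognition.face_encodings(frame):
    match = face_recognition.compare_faces([authorized_encoding], encoding)[0]
    print("access granted" if match else "access denied")
```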
Simply put, machine learning algorithms can analyze in milliseconds what would take humans days to accomplish. This evolution of data center automation makes it possible for IT professionals to monitor and analyze everything in real time: operations teams can accurately detect anomalies across their applications, networks and infrastructure. The result? Better outage prevention, improved incident management and less downtime.
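One common way to implement that kind of anomaly detection is sketched below with scikit-learn’s IsolationForest; the metric names and values are illustrative only, not from any real monitoring feed:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Baseline metrics sampled every minute:
# [cpu_util_pct, network_mbps, disk_latency_ms]
rng = np.random.RandomState(0)
normal = rng.normal(loc=[40.0, 200.0, 5.0], scale=[5.0, 20.0, 1.0],
                    size=(500, 3))

# Train on normal behavior; strong deviations from it get flagged
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Fresh samples scored in real time: predict() returns -1 for anomalies
fresh = np.array([[42.0, 210.0, 5.2],    # ordinary load
                  [95.0, 900.0, 48.0]])  # likely incident
print(detector.predict(fresh))  # e.g. [ 1 -1 ]
```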
Given that the latest study from Emerson Network Power and the Ponemon Institute found that the average data center outage costs businesses approximately three-quarters of a million dollars, machine learning’s ability to help organizations architect and manage their IT infrastructure to reduce costly interruptions is a welcome advance over strictly human intervention.
Telehouse provides a wide portfolio of energy-efficient hardware in its data centers and utilizes energy monitoring systems to track usage and flag deviations from the norm.