6 Things To Know About Data Centers and Artificial Intelligence (AI)
January 5, 2018

Brief History of Modern AI
Artificial intelligence is not a modern concept. Ramon Llull, a thirteenth-century Franciscan philosopher, described a mechanical method of reasoning in his book Ars Magna. But modern AI research began in 1956 at a Dartmouth College workshop organized by John McCarthy together with Marvin Minsky and others. Initially, the pioneers of AI had lofty goals: they wanted to create machines that could learn like human beings.
The trailblazers of AI soon realized that the path to human-like machine learning wasn't going to be easy. They made some progress with projects like ELIZA, a program that could hold simple conversations in English, and another program that could solve algebra word problems. But none of these achievements could live up to the public's imagination.
For the next few decades, AI research continued to make progress but stayed out of sight and mind of the general population. The 1990s saw a resurgence of interest in AI as search engines, robotics, and speech recognition software started to gain popularity.
AI was also making breakthroughs in a subset of its techniques called machine learning. The spread of the internet and the rise of powerful personal computers made rapid progress possible. Besides machine learning, another AI approach, deep learning, was gaining momentum. Today, machine learning and deep learning are helping create applications that can learn autonomously and solve complex problems.
Basic Concepts and Vocabulary for AI
Anyone investigating the state of today's AI technology needs a basic understanding of machine learning and deep learning.
Both technologies share the same fundamental principles. But they differ in the way models are developed. As a first step, it’s helpful to remember that machine learning is a subset of artificial intelligence and deep learning is a subset of machine learning.
In machine learning, humans develop a model and then present that model, along with training data, to the machine. The machine makes predictions based on the provided model and fine-tunes the model's parameters to align its predictions with the training data. In other words: create a model and let the algorithm improve it using known data.
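To make this concrete, here is a minimal sketch of that workflow in Python using scikit-learn. The article names no specific tools, so the library choice and the toy data below are our own illustration:

```python
# Minimal sketch of the machine learning workflow described above.
# Library choice (scikit-learn) and data are illustrative assumptions.
from sklearn.linear_model import LinearRegression

# Humans pick the model family up front: here, a simple linear model.
model = LinearRegression()

# Hypothetical training data: input measurements -> observed outcomes.
X_train = [[1.0], [2.0], [3.0], [4.0]]
y_train = [2.1, 3.9, 6.2, 8.1]

# The algorithm fine-tunes the model's parameters to fit the known data.
model.fit(X_train, y_train)

# The tuned model can now make predictions on unseen inputs.
print(model.predict([[5.0]]))  # roughly 10
```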
Deep learning algorithms create layered, nested neural networks based on the training data. The input data is broken into a hierarchy of concepts: deep learning takes a complex idea and keeps breaking it down into simpler and simpler concepts until each node in the neural network represents a simple mathematical operation. In that respect, the network loosely resembles a decision tree. Instead of using a provided model, deep learning builds its model from its own analysis of the training data.
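By contrast with the sketch above, a deep learning model stacks layers of simple operations and learns all of its parameters from the data itself. Below is a minimal sketch using PyTorch; the framework choice and the random toy data are our own assumptions for illustration:

```python
# Minimal sketch of a deep learning model: stacked ("nested") layers whose
# weights are learned entirely from training data, with no hand-built model.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(4, 8),   # learn intermediate features from 4 raw inputs
    nn.ReLU(),         # simple non-linear operation at each node
    nn.Linear(8, 1),   # combine the features into a final prediction
)

x = torch.randn(16, 4)  # hypothetical batch of training inputs
y = torch.randn(16, 1)  # hypothetical training targets

optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(net(x), y)  # how far predictions are from targets
    loss.backward()            # trace the error back through the layers
    optimizer.step()           # nudge every weight to reduce the error
```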
In machine learning, humans are more hands-on and provide the initial model; in deep learning, the system develops the neural network itself through abstraction of concepts. Voice assistants like Siri and Cortana use machine learning. DeepMind's AlphaGo, which defeated the world's top Go player in 2017, is the most famous example of deep learning, and it was a milestone in AI development.
Because deep learning has to figure out solutions from the ground up without human guidance, it requires far more training data than machine learning. It also requires high-end GPUs to build its neural networks, so deep learning projects typically rely on a data center or colocation service provider for the computational power. Machine learning projects can run on low-end machines, but in practice they often use data center resources too, for the better performance on offer.
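In practice, deep learning code typically probes for a CUDA-capable GPU at startup and falls back to the CPU when none is available. A minimal sketch, again assuming PyTorch:

```python
# Select a GPU if one is present; otherwise fall back to the CPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

# Both the model and its data must live on the selected device.
model = torch.nn.Linear(4, 1).to(device)
x = torch.randn(8, 4, device=device)
print(model(x).shape)  # torch.Size([8, 1])
```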
Data Centers and Artificial Intelligence
The rise of artificial intelligence is affecting global data centers in two ways:
- AI applications need global data centers to provide the necessary computational power.
- AI applications are being developed to improve the data centers themselves.
So, data centers are both serving and being served by artificial intelligence. Here are some important topics regarding the interaction between data centers and AI technologies:
- Need for Global Data Centers with GPUs
The demand for microprocessors and servers has skyrocketed due to machine learning and deep learning. Deep learning applications like voice search and image recognition require high-end GPUs, which is one of the reasons for the dramatic rise of Nvidia's stock price (Nvidia GPUs are also in high demand for blockchain processing). A Tier 4 data center with support for GPU-based processing is ideal for deep learning applications. Given the apparent business opportunity, companies are looking into building data centers that cater specifically to the needs of machine learning and deep learning.
- Artificial Intelligence Helping Data Centers Become Energy Efficient
As the owner of a data center in New York, Telehouse understands the effect of high electricity prices. Recent developments in AI can significantly improve the energy efficiency of data centers. Google has reported that deep learning allowed it to cut the energy used for cooling its data centers by 40%. Its DeepMind AI adjusts around 120 data center variables, such as fans, cooling systems, and windows. Google used 4,402,836 MWh of power in 2014, so savings on that scale could be worth millions of dollars a year. It's likely that more data centers will adopt similar AI-based solutions to save on energy.
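A rough back-of-the-envelope calculation shows why such savings matter. The cooling share of total load and the electricity price below are our assumptions for illustration, not figures from Google's report:

```python
# Rough estimate of the dollar value of DeepMind-style cooling savings.
annual_usage_mwh = 4_402_836  # Google's reported 2014 power usage
cooling_share = 0.40          # assumed share of power spent on cooling
cooling_savings = 0.40        # reported 40% cut in cooling energy
price_per_mwh = 70            # assumed $70/MWh (i.e., $0.07/kWh)

saved_mwh = annual_usage_mwh * cooling_share * cooling_savings
print(f"~{saved_mwh:,.0f} MWh saved, worth ~${saved_mwh * price_per_mwh:,.0f} per year")
```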
- Using Artificial Intelligence for Server Optimization
AI-based solutions are moving into other areas too. Data centers have to maintain physical servers and storage equipment, and inefficiencies in server usage mean leaving money on the table. AI-based predictive analysis can help data centers distribute workloads across their servers: the latest load-balancing tools with built-in AI capabilities learn from past data and distribute load more efficiently. With AI-based monitoring, companies can better track server performance, disk utilization, and network congestion.
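As a toy illustration of the idea (not a sketch of any particular product), a scheduler might forecast each server's near-term load from its recent history and route new work to the machine expected to be least busy:

```python
# Naive predictive load distribution: forecast each server's load as the
# moving average of its recent samples, then dispatch to the least busy.
from statistics import mean

# Hypothetical recent CPU-load samples per server, in percent.
history = {
    "server-a": [62, 70, 75, 80],
    "server-b": [40, 42, 38, 41],
    "server-c": [55, 50, 48, 45],
}

def predicted_load(samples: list, window: int = 3) -> float:
    """Forecast the next load as the average of the last `window` samples."""
    return mean(samples[-window:])

target = min(history, key=lambda s: predicted_load(history[s]))
print(f"Dispatch next workload to {target}")  # server-b
```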
- Using AI for Data Center Security
Data centers have to be prepared for cyber attacks and threats, but the cybersecurity landscape is ever-changing, and it's difficult for human beings to stay up to date on all of it. As a colocation service provider with our own Internet Exchange Point, Telehouse has to be vigilant about cyber threats, and monitoring and managing cybersecurity issues takes a great deal of work and human-hours. Machine learning and deep learning applications can help data centers adapt to changing requirements faster. The British company Darktrace uses machine learning to define normal network behavior and then detect threats as deviations from that norm. Traditionally, data centers have dealt with threats by restricting access and building impenetrable walls, but with a constant flux of users, restricting access has never been enough to ensure security. The more dynamic approach of AI-based systems can help data centers stay secure without imposing stringent rules on their users.
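Darktrace's actual techniques are proprietary, but the general pattern of learning a behavioral baseline and flagging deviations can be sketched with scikit-learn's IsolationForest. The traffic features and numbers below are hypothetical:

```python
# Learn what "normal" traffic looks like, then flag deviations from it.
from sklearn.ensemble import IsolationForest

# Hypothetical per-connection features: [packets/sec, bytes/sec, distinct ports]
normal_traffic = [
    [100, 50_000, 3], [110, 52_000, 2], [95, 48_000, 3],
    [105, 51_000, 4], [98, 49_500, 3], [102, 50_500, 2],
]

detector = IsolationForest(contamination=0.1, random_state=0)
detector.fit(normal_traffic)  # establish the behavioral baseline

# New observations: one ordinary, one wildly out of profile.
new_traffic = [[101, 50_200, 3], [5_000, 2_000_000, 120]]
print(detector.predict(new_traffic))  # 1 = normal, -1 = flagged as anomalous
```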
- Future Data Centers with AI Operators and Robots
The immense scale of modern data centers is mind-boggling. Telehouse's data center in New York offers 162,000 square feet of colocation space with a connection to our own Internet Exchange Point, and maintaining such a facility requires a lot of resources. AI-based solutions can help in this area too. Litbit is developing Dac, an AI-based application with data center skills. Dac will be able to use strategically placed Internet of Things (IoT) sensors to detect loose electrical wires in server rooms or water leaks in a cooling system, and it will use ultrasound hearing to find power supplies at risk of failing. Another company, Wave2Wave, is combining AI with robotic automation to create robots that help data centers maintain physical equipment.
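Litbit has not published Dac's internals, but sensor-driven monitoring of this kind often comes down to comparing live IoT readings against a learned baseline. A hypothetical sketch, with invented sensor names, readings, and thresholds:

```python
# Alert when a sensor reading drifts outside its learned normal band.
from statistics import mean, stdev

baseline = [21.0, 21.4, 20.8, 21.2, 21.1, 20.9]  # normal coolant temps, °C
mu, sigma = mean(baseline), stdev(baseline)

def check_reading(sensor_id: str, value: float, k: float = 3.0) -> None:
    """Alert if the reading is more than k standard deviations from normal."""
    if abs(value - mu) > k * sigma:
        print(f"ALERT: {sensor_id} reads {value}°C, outside the normal range")
    else:
        print(f"{sensor_id}: OK ({value}°C)")

check_reading("coolant-loop-3", 21.2)  # OK
check_reading("coolant-loop-3", 27.5)  # ALERT: possible cooling problem
```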
- Better AI-based DCIM Tools in the Future
Data Center Infrastructure Management (DCIM) solutions are popular with data centers and colocation service providers. They help monitor temperature, floor security, equipment status, fire hazards, cooling systems, ventilation, and more. But there are too many factors for humans to watch on their own. Future DCIM solutions can hand much of this monitoring over to artificial intelligence, leaving humans free to concentrate on the most critical and creative aspects of running an efficient data center. These intelligent DCIM systems will proactively assist during disaster recovery and help data centers comply with HIPAA, PCI DSS, SOC, and other regulations.
Last Words
At the moment, artificial intelligence looks promising for the data center industry. The rise of AI-based applications will increase demand for colocation service providers, and global data centers and colocation providers will step up their game to meet it. In turn, AI-based applications will help these data centers run efficiently and provide better service to their customers. If you are developing an AI application, it's important to choose global data centers and colocation service providers that can help you deliver cost-efficient services through the latest technology for energy efficiency, optimization, security, compliance, and disaster recovery.