What Is the Future of Data Centres?
In today's digital world, data is one of a company's most important assets. Every time we order something online or create an account on a social media site, we generate data. The more information we produce, the more vital data centres become.
The mega data centre market was worth around $18.82 billion in 2017 and is projected to grow to around $24.25 billion by 2023. But, beyond the numbers, what else does the future hold for data centres?
Edge Computing
Edge computing refers to a distributed computing model that pushes core processing functions to, or near, the source of the data. Instead of transporting data back to a centralised cloud, it allows users to gather and analyse data in place.
To place compute capability where it is needed, leading organisations are investing heavily in technologies that drive demand both for a new kind of edge data centre and for continued growth of centralised cloud data centre services. The first commercial edge deployments may mark the next stage in the maturation of the data centre industry.
Operators can optimise response times by placing processing and analysis close to the data source. Edge computing drives more efficient bandwidth utilisation and minimises the physical distance and number of connections that introduce latency into the infrastructure. To gain an advantage, more and more competitors will rely on edge computing systems that can intelligently process data where it is generated.
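To make the idea concrete, here is a minimal Python sketch of edge-side aggregation. The sensor source and the upload function are hypothetical stand-ins, not any particular product's API; the point is simply that only a compact summary, rather than every raw reading, has to travel back to the central cloud.

```python
# Minimal sketch of edge-side processing (illustrative only).
# Instead of streaming every raw reading to a central cloud, the edge node
# summarises data locally and forwards only the compact result.

import statistics
import time

def read_sensor() -> float:
    """Stand-in for a local data source (e.g. a temperature probe)."""
    return 20.0 + (time.time() % 1)  # dummy value

def upload_summary(summary: dict) -> None:
    """Stand-in for a call to a central cloud API."""
    print("sending to cloud:", summary)

def run_edge_loop(window: int = 100) -> None:
    readings = [read_sensor() for _ in range(window)]
    # Process the data "in place": only the aggregate leaves the edge site.
    summary = {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }
    upload_summary(summary)

if __name__ == "__main__":
    run_edge_loop()
```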
Lights Out
A few issues have bubbled up to the top as a consequence of the coronavirus pandemic and lockdowns:
- Businesses that relied on on-site support staff realised they had little or no visibility into their data centre operations once those staff could no longer monitor the facility in person.
- Cloud migration projects that were labelled “low priority” suddenly became a “top of the list” priority.
- Many businesses that had deployed remote-capable data centre infrastructure management (DCIM) tools found gaps in their coverage.
To ensure they won’t be caught off guard again, more companies than ever are opting for “lights-out” data centres. A lights-out data centre is essentially an unstaffed facility that is physically or geographically sealed off from the rest of the building.
Thanks to advances in resource management software and remote access hardware, the operator of a lights-out data centre can handle troubleshooting, maintenance, and other tasks relatively easily and efficiently. Moreover, automation allows a single sysadmin to manage thousands of servers.
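As an illustration of what that automation can look like, here is a minimal Python sketch that sweeps a fleet of hosts and flags any that stop responding on a management port. The host names, port, and timeout are hypothetical placeholders, not any particular vendor's tooling.

```python
# Minimal sketch of the kind of automation that lets a small remote team
# watch over a lights-out facility: check every host in the fleet and
# flag any that stop answering on their management port.

import socket
from concurrent.futures import ThreadPoolExecutor

# Hypothetical fleet: 4 racks of 10 nodes each.
HOSTS = [f"rack{r:02d}-node{n:02d}.example.internal"
         for r in range(1, 5) for n in range(1, 11)]
MGMT_PORT = 22          # assumed management port
TIMEOUT_SECONDS = 2.0

def is_reachable(host: str) -> bool:
    """Return True if the host accepts a TCP connection on the management port."""
    try:
        with socket.create_connection((host, MGMT_PORT), timeout=TIMEOUT_SECONDS):
            return True
    except OSError:
        return False

def sweep_fleet() -> list[str]:
    """Check every host in parallel and return the ones that need attention."""
    with ThreadPoolExecutor(max_workers=32) as pool:
        results = pool.map(is_reachable, HOSTS)
    return [host for host, ok in zip(HOSTS, results) if not ok]

if __name__ == "__main__":
    for host in sweep_fleet():
        print("ALERT: no response from", host)
```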
In fact, keeping a data centre safe and running at its best is often easier when the facility is lights-out. The manager doesn’t have to worry that someone will forget to lock the server room door or spill soda on a power supply.
The AI Arms Race
In all of its forms, AI (artificial intelligence) is a hardware-intensive technology that analyses data near and far. As such, it will play a major role in the future of data centres.
AI includes everything from inference engines running on smartphones to algorithm training at cloud campuses. Its major purpose is to make services and products smarter. So, it’s no surprise artificial intelligence will become a strategic priority for many businesses—if it hasn’t already.
Artificial intelligence is driving demand for faster and more efficient computing hardware. The hardware arms race has already started, and we can expect a cluster of AI hardware startups to join the likes of AMD, NVIDIA, and Intel very soon.
Early examples include Groq and Cerebras Systems, whose hardware boasts eye-popping specs. For data centres, those specs imply higher rack densities and heavier cooling loads, so expect more immersion and liquid-to-the-chip cooling solutions to be deployed for the new AI gear.
Climate Risk and On-Site Power Generation
Climate change is affecting the availability and cost of power, and that is already impacting the data centre industry. Big tech companies are getting flak for using unsustainable energy to power their data centres. To address these issues, both Facebook and Apple have built data centres near hydropower resources.
To supply its data centre in Central Oregon with clean power, Apple bought a small hydroelectric plant near the facility. Facebook has built a data centre right next to a hydroelectric plant in Luleå, Sweden.
We can also expect more tech companies to build data centres in cooler climates. To save energy on cooling, enterprises will site facilities near the Arctic Circle. Google, for instance, has bought 109 hectares of land in rural Sweden as part of its data centre strategy.
More and more businesses will also turn to on-site power generation. As transmission costs continue to climb, data centres will be pushed to generate power locally. The growing complexity of global energy delivery, along with climate risk, will prompt many data centre operators to pursue non-traditional utility power.
Takeaway
To keep up with networking trends, data centres around the globe are evolving rapidly. To capitalise on future data-driven opportunities, more and more enterprises are turning to edge computing, on-site power generation, and lights-out facilities. To take advantage of these developments, companies need to understand how the market leaders are moving forward.