
Preparing for the Future of Blockchain

A digital twin should capture the data center in its entirety – including IT assets, racks, power networks and cooling distribution – and enable users to test what-if scenarios.

Industry Perspectives

August 29, 2018


Amy Miller is Product Marketing Manager for Future Facilities.

Blockchain is big – it’s predicted to grow from a $708 million market in 2017 to $60.7 billion by 2024. And it’s expected to dramatically change the global data center industry over the next 10 to 20 years. The rising popularity of blockchain stems from the mining of cryptocurrencies, which relies on an ever-growing number of servers for storage and significant power to operate and cool those servers.
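As a rough check on the scale of that forecast, the cited figures imply a compound annual growth rate of close to 90 percent per year. The short Python sketch below works through the arithmetic; the dollar figures come from the forecast above, and the calculation itself is only illustrative.

```python
# Back-of-the-envelope check of the growth implied by the cited forecast:
# $708 million in 2017 to $60.7 billion by 2024, i.e. seven years of compounding.
start_value = 708e6   # 2017 market size, USD (from the forecast above)
end_value = 60.7e9    # 2024 forecast, USD
years = 2024 - 2017

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")  # roughly 89% per year
```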

Although there is broad consensus that decentralized networks such as blockchain will disrupt nearly every industry – from tech to banking to security – there is uncertainty about how the technology will be used, when it will be regulated, and how it will affect physical infrastructure. These concerns create a challenge for data center owners and operators who want to prepare for these changes by making the decisions and investments necessary to support future growth.

For now, we can predict that the most significant effects of this growth in demand on data centers will be the strain it places on capacity management and the push toward high-density computing.

Capacity Planning

It used to be that managing data center capacity was simply about making sure that each server had the space to manage its workload and was working as designed. Today, capacity management means much more. It is about understanding the physical and virtual infrastructure, such as local servers and equipment, cloud and hybrid solutions, virtualization, and colocation partnerships; analyzing where specific workloads should be placed for optimum performance; and evaluating and forecasting demand and setting aside capacity to deal with it.

As the pressure to increase capacity utilization grows, the tools used to manage it must evolve as well. Standard best practices and spreadsheet-style asset management won't address the needs of IT at this scale. Instead, data center managers will need intelligent tools that look beyond asset location and space utilization to variables such as cooling, weight and power.
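To illustrate what looking beyond space utilization means in practice, here is a minimal, hypothetical sketch of the kind of multi-constraint check such a tool performs before approving a placement. The cabinet and server attributes and the limits are invented for the example and are not drawn from any particular product.

```python
from dataclasses import dataclass

# Hypothetical cabinet and server models, for illustration only. Real
# capacity-planning tools track far more (redundancy, breaker hierarchy,
# airflow paths), but the principle is the same: a placement must pass
# every constraint, not just "is there an empty U slot?"

@dataclass
class Cabinet:
    free_u: int               # rack units still available
    power_budget_kw: float    # remaining power allowance
    weight_budget_kg: float   # remaining floor/frame weight allowance
    cooling_budget_kw: float  # heat the cooling system can still absorb here

@dataclass
class Server:
    size_u: int
    power_kw: float
    weight_kg: float

def can_place(cabinet: Cabinet, server: Server) -> bool:
    """Return True only if space, power, weight and cooling all allow the placement."""
    return (
        server.size_u <= cabinet.free_u
        and server.power_kw <= cabinet.power_budget_kw
        and server.weight_kg <= cabinet.weight_budget_kg
        and server.power_kw <= cabinet.cooling_budget_kw  # heat out roughly equals power in
    )

# Example: the cabinet has plenty of space, but cooling is the limiting factor.
rack = Cabinet(free_u=10, power_budget_kw=4.0, weight_budget_kg=300, cooling_budget_kw=2.5)
node = Server(size_u=2, power_kw=3.0, weight_kg=25)
print(can_place(rack, node))  # False -- blocked by the cooling budget, not by space
```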

High-density Computing

High-density computing – the use of server-dense racks and virtualized server management – can help increase capacity utilization, but it also puts significant strain on power and weight limits. While high-density servers pack more capacity into the same or a smaller physical footprint, they also run hotter than traditional servers. That demands a new approach to cooling, which might mean switching from air to liquid cooling, for example. In a 2018 survey, Uptime Institute found that traditional data center cooling systems are often unequipped to accommodate high-density equipment, causing issues for operators as cabinet densities increase. Whether a facility converts entirely or devotes only a portion of its current architecture to high density, this equipment requires significantly more cooling, as well as more power, to run safely.
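To put a number on that cooling strain, the short sketch below estimates the airflow needed to carry away a rack's heat using the standard sensible-heat relation for air (flow = power / (density × specific heat × temperature rise)). The 5 kW and 20 kW rack loads and the 11 K air temperature rise are illustrative assumptions.

```python
# Rough airflow needed to remove a rack's heat with air cooling.
# Sensible-heat relation: volumetric flow = P / (rho * cp * delta_T).
AIR_DENSITY = 1.2          # kg/m^3, near sea level
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)
CFM_PER_M3S = 2118.88      # conversion from m^3/s to cubic feet per minute

def required_airflow_cfm(rack_power_w: float, delta_t_k: float) -> float:
    """Airflow (CFM) needed to absorb rack_power_w with a delta_t_k air temperature rise."""
    m3_per_s = rack_power_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)
    return m3_per_s * CFM_PER_M3S

# Illustrative rack powers; an 11 K rise is roughly a 20 F inlet-to-outlet spread.
for label, watts in [("traditional 5 kW rack", 5_000), ("high-density 20 kW rack", 20_000)]:
    print(f"{label}: ~{required_airflow_cfm(watts, delta_t_k=11):,.0f} CFM")
# Prints roughly 800 CFM for the traditional rack and roughly 3,200 CFM for the
# high-density rack -- four times the air through the same footprint.
```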

As businesses slowly transition to this new blockchain-style architecture, the best way that data center managers can stay ahead is to remain flexible by preparing for the eventual impact on operations. The key to this flexibility is the ability to test potential scenarios or failures without causing risk to existing IT. The only way to achieve this is through the use of a digital twin.

The Digital Twin

Digital twins “provide a software representation of a physical asset,” such as a data center, and allow companies to “better understand, predict, and optimize the performance of each unique asset [within].” A digital twin should capture the data center in its entirety – including IT assets, racks, power networks and cooling distribution – and enable users to test what-if scenarios.

An ideal digital twin should be able to accurately predict airflow and temperature distribution in a data center using computational fluid dynamics (CFD), and analyze power failure scenarios using power system simulation (PSS). Predicting both power and cooling behavior within the space is critical when deploying high-density equipment and understanding potential capacity losses.
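At a far simpler level than true CFD or power system simulation, the toy what-if below shows the shape of question such a twin answers: if one of two redundant feeds fails, does the surviving feed stay within its rating? The feed rating and rack loads are hypothetical values chosen for the example.

```python
# Toy power-failure what-if: two redundant feeds (A/B), each normally carrying
# about half the dual-corded load. A digital twin answers this at far greater
# fidelity; here we only check whether the surviving feed stays within rating.
FEED_RATING_KW = 40.0                       # hypothetical rating per feed
rack_loads_kw = [3.5, 4.0, 6.0, 8.0, 7.5]   # hypothetical dual-corded rack loads

total_load = sum(rack_loads_kw)
surviving_feed_load = total_load            # after a feed failure, one feed carries everything

print(f"Total IT load: {total_load:.1f} kW")
print(f"Load on surviving feed after failure: {surviving_feed_load:.1f} kW "
      f"({surviving_feed_load / FEED_RATING_KW:.0%} of rating)")
if surviving_feed_load > FEED_RATING_KW:
    print("This scenario overloads the surviving feed -- capacity must be held in reserve.")
else:
    print("The surviving feed stays within its rating for this scenario.")
```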

Furthermore, the digital twin enables you to design and test any scenario or change to your operations, without the risk of physical implementation in a production environment. Using a digital twin to test high-density deployments and capacity-saving scenarios enables you to make informed decisions when preparing for blockchain’s future impact on your data center. This, in turn, creates the opportunity to optimize administrative processes and account for space availability, capacity utilization, asset requirements, power needs and future costs.

Though the full transition to blockchain-style decentralization may take time, especially in more conservative or regulated industries, it has the potential for enormous impact on how businesses operate. Prepare for the evolution of the data center industry by virtualizing your facility today.


Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.
