
An Industry of Innovation

ICT is now the technology that drives scientific discovery, powers financial innovation, and shapes the latest in product design. And the application of information technology is showing no signs of slowing down.

Tate Cantrell is CTO of Verne Global.

Let’s break down the acronym ICT: Information and Communication Technology. ICT as a sector and as an industry was created to enable the efficient sharing of information. Improved communication between the workers who make up our economies has generated huge returns in terms of GDP growth around the world. But ICT is changing. No longer is the technology stack merely enabling information sharing; it is now creating the information. In terms of work product, ICT has moved from the front office to the factory floor and is now a key raw material in the makeup of every durable good and service the world over.

Data centers that were built to support the emerging ICT industry were focused, as you would expect, on the centers of communication. Business models were built around making the interconnection between content and carriers seamless and lightning fast. This ensured that the role of interconnecting information networks would continue to enable the product development workforce. Enterprises kept their applications within their own buildings and campuses and used traditional carriers to connect into the ICT industry networks. Enterprises maintained physical control of the assets that ran financial systems, human resources systems, and general office communications systems. Forward-thinking companies at the time differentiated through emerging technologies like virtualization to ensure that they could deploy new applications quickly and easily.

Artificial Intelligence

Two events have forced a change in the enterprise data center. First, advancements in computational methodologies have dramatically increased the value generation potential of data. Deep Learning, the latest set of applications built on Artificial Intelligence (AI) technologies such as Machine Learning, has emerged as a new arms race. Companies compete to gain access to the best sources of data and then race to extract the most value from that data.

AI is driving change in the way that companies use compute and leverage data. Second, enterprises have dramatically increased their uptake of scientific tools in the product development toolkit. PhD dissertations are no longer sitting on the shelf in obscurity collecting dust. Enterprises are instead seeking out the latest technology and racing to integrate these novel developments into new methods of value generation. Most of these scientific advancements leverage scientific computing methods in what is referred to as HPC, or High Performance Computing. The increased infrastructure demands of deep learning workloads and HPC applications have simply outstripped the capabilities of the enterprise-class data center.

AI is certainly not new to the technology landscape. The digital concept of applying neural networks to extract patterns from seemingly random data sources was first taking shape around the time that Turing was helping to unwind Enigma. The ability to apply these concepts at scale has created an entire generation of applications that are shaping the way we value data and the information we are able to extract from it.
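
To make that idea concrete, here is a minimal sketch of what "extracting patterns from seemingly random data" means in practice. Everything in it (the data, the network size, the training loop) is an assumption chosen for illustration, not anything from a particular product: a tiny two-layer neural network, in plain NumPy, learns to recover a clean signal from noisy samples by gradient descent. Today's Deep Learning frameworks scale this same loop to billions of parameters across thousands of servers.

```python
# Illustrative only: a tiny two-layer neural network (NumPy) that learns to
# recover a simple pattern -- y = sin(x) -- from noisy samples.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(x) + rng.normal(0, 0.1, size=x.shape)  # "seemingly random" data

# One hidden layer of 32 tanh units, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    h = np.tanh(x @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # mean-squared-error gradient
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backpropagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", float((err ** 2).mean()))  # approaches the 0.01 noise floor
```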

Today’s enterprises have been empowered to move beyond the management of traditional ICT infrastructure, freeing CIOs for the next phase: brokering access to the data sources that will help drive innovation and, ultimately, enterprise value. And to ensure that scaling remains an option for the long term, CIOs are choosing to buy specialized hardware deployed in specialized data centers, or simply bypassing the data center altogether and purchasing AI services via a cloud model. The enterprise data center is not the place to capture the future returns that AI offers.

High Performance Computing

HPC is also not a new concept. Computer-Aided Engineering, or CAE, was conceived in the late '70s, and using scientific models to solve complex differential equations was one of the first applications put onto the bulky mainframes of decades past. But today, access to scientific computing is no longer limited to PhDs in white coats. Enterprises are using the advancements in multi-core CPU servers to bring CAE and scientific simulations into the design rooms and onto the factory floor. In the case of the automotive industry, HPC has enabled the creation of lighter cars made of carbon-reinforced plastics while ensuring that the complex structural models are appropriately analyzed to comply with increasing safety regulations, even while carrying half a ton of lithium-ion batteries.
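
The kernel of those simulations is straightforward to sketch, even though production CAE codes are vastly more complex. As a purely illustrative example (the equation, grid, and constants below are assumptions chosen for the sketch, not taken from any real structural or crash code), here is an explicit finite-difference solver for 1D heat diffusion: the kind of differential-equation time-stepping that HPC clusters parallelize across thousands of cores.

```python
# Illustrative only: discretize a differential equation and march it forward
# in time. This toy solver handles 1D heat diffusion, du/dt = a * d2u/dx2.
import numpy as np

a, nx, nt = 0.01, 101, 2000          # diffusivity, grid points, time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx ** 2 / a               # within the explicit stability limit

u = np.zeros(nx)
u[nx // 2] = 1.0                     # initial heat spike in the middle

for _ in range(nt):
    # second spatial derivative via central differences (interior points)
    d2u = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2
    u[1:-1] += dt * a * d2u          # forward-Euler time step
    u[0] = u[-1] = 0.0               # fixed-temperature boundaries

print("peak temperature after diffusion:", u.max())
```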

HPC is eliminating the need for traditional tools like the crash test dummy and the wind tunnel, and in turn is accelerating the pace of innovation not only in manufacturing but in many other industries as well. And as the pace of development surges forward, enterprises are requiring an exponential increase in HPC capacity. As the requirements increase, so do the hardware specifications and the power requirements for the servers that compute the HPC workloads.

Enterprises that use their HPC applications as a differentiator in time to market and quality of design procure and manage their own infrastructure, typically in outsourced data centers that are specifically suited to handle high-intensity compute. But more and more, enterprises are requesting HPC on-demand services that allow their innovators to access applications without requiring the enterprise to bear the balance-sheet burden of the latest servers.

ICT has graduated from its roots in simple communications. Emerging technologies such as Blockchain stand to further disrupt the way we think about ICT. Not only does Blockchain have the potential to democratize scientific computing and data access, with public ledgers that record our collective progress on the toughest scientific challenges; it is also pushing our industry many years forward in how we approach data center design. Because Blockchain is built and rewarded on the concept of competitive computational proof of work, it has pushed data center design fully in the direction of the commodity.
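
For readers unfamiliar with the mechanism, proof of work reduces to a brute-force search that rewards whoever can compute the most hashes per second, which is exactly why it favors commodity-scale compute. A minimal sketch follows (the block data and difficulty are hypothetical; real Blockchain networks add far more structure):

```python
# Illustrative only: miners race to find a nonce whose hash meets a
# difficulty target. More hardware means more guesses per second.
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Find a nonce so that sha256(block_data + nonce) starts with
    `difficulty` hex zeros. Higher difficulty -> exponentially more work."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("example block header")
print(f"nonce={nonce} hash={digest}")
```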

Both AI and HPC are ripe for leveraging such commodities to reduce recurring costs and increase the overall return on investment. We believe that the future of high-intensity computing for AI and HPC lies with on-demand services run in data centers designed and operated like the industrial data center factories that power today’s most advanced Blockchain computing. With a sustainable approach to energy management, we will move beyond ICT. If not already, then someday soon, we will all work as part of the IOI: the Industry of Information, or perhaps more simply, the Industry of Innovation.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
