
Prepping Your Data Center for the Internet of Things

Open source-based solutions have the functionality, scalability and reliability to enable businesses to successfully cope with the explosion of data associated with the IoT, predicted to grow to 44 zettabytes by 2020.

Industry Perspectives

August 13, 2015


Pierre Fricke is the Vice President of Product Marketing at EnterpriseDB.

By now, most IT professionals and CIOs are aware of the astounding statistics surrounding the Internet of Things (IoT), especially as it relates to the amount of data being collected by the growing network of connected devices. In fact, according to Cisco, there will be well over 50 billion connected devices by 2020.

This is largely powered by the fact that more than 82 percent of businesses will be using IoT applications by 2017, according to Forrester. The rapidly growing interest in harnessing the power of the IoT is driving massive increases in spending. A recent study from Tata Consultancy Services found that 26 global companies plan to spend at least $1 billion each on IoT in the coming year. Given the tremendous cost of capturing this exponentially growing data, and the competitive advantage such data can provide, companies that cannot afford the investment risk being paralyzed.

Enterprises are becoming keenly aware that remaining competitive in their market depends on their ability to collect, manage and analyze data. But a key element many companies overlook in the cost-benefit equation is the investment required at the back end of an IoT infrastructure. For companies that rely on low-cost devices – ranging from smartphones to facility sensors – to collect information, it will become increasingly cost prohibitive to purchase expensive software licenses and hardware to manage the data once it’s acquired. Open source software offers an innovative way to control these rising costs.

Today, open source-based solutions have the functionality, scalability and reliability to enable businesses to successfully cope with the explosion of data associated with the IoT. As Gartner noted in its April 2015 report, State of Relational Open Source RDBMSs 2015, “Open-source RDBMSs have matured and today can be considered by information leaders, DBAs and application development management as a standard infrastructure choice for a large majority of new enterprise applications.” The report also stated: “Information leaders who opt for an open-source DBMS (OSDBMS) licensing model can benefit from much lower costs than using a commercial model, even with today’s hosted cloud and database platform as a service (dbPaaS) offerings.”

With these benefits in mind, open source databases are playing an increasingly vital role in the IoT-enabled data center, by helping CIOs free up the resources and budget that will help support the IoT and its applications.

Keeping the IT 'Lights' On Could be Destroying Your Business

ABI Research estimates there may be as much as 44 zettabytes of data by 2020. To put that into perspective, through 2013 humanity had produced only 2.7 zettabytes. The exponentially growing volume of data that devices are capturing is staggering, and as a result, it’s becoming increasingly difficult for businesses to store it. For many CIOs, an initial reaction to this challenge would be to buy more hardware and software, or to create clusters to manage and store the data. However, CIOs can often overlook the costly investment requirements at the back end of an IoT infrastructure. CIOs would do well to consider open source solutions such as Postgres, which are far more cost-effective and can perform just as well as proprietary IT systems.

By implementing leading open source database Postgres, CIOs can divert budget away from operational expenses and leverage the resulting cost savings to fund new and more strategic opportunities that can improve business performance. For example, companies could invest these new-found funds into the marketing department’s customer engagement initiatives or new mobile technologies.

Postgres and Integrating Structured and Unstructured Data

Another consideration for CIOs implementing a data center infrastructure strategy with the IoT in mind is the variety of data types produced by new applications and devices. Integrating unstructured and semi-structured data with relational tables of structured data is a major challenge. Many organizations today run a patchwork of applications, and developers have turned to NoSQL-only solutions to address specific workloads. However, these systems can result in numerous data silos, which cause major headaches and huge expense, and make the collected data much harder to manage.
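One way Postgres bridges this gap is its JSONB column type, which stores semi-structured documents directly alongside structured relational columns and lets both be queried with ordinary SQL. A minimal sketch (the table and field names here are illustrative, not from the article):

```sql
-- Hypothetical table mixing structured columns with a semi-structured payload
CREATE TABLE sensor_readings (
    device_id   TEXT        NOT NULL,
    recorded_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    payload     JSONB       NOT NULL   -- raw device data, schema may vary per device
);

-- The ->> operator extracts a JSON field as text, so semi-structured
-- data participates in ordinary relational queries
SELECT device_id, payload->>'temperature' AS temperature
FROM   sensor_readings
WHERE  payload->>'status' = 'alert';
```

Because the JSON lives inside the relational database, it is covered by the same transactions, backups and access controls as the structured data, rather than sitting in a separate NoSQL silo.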

Postgres has a solution for this. A feature called Foreign Data Wrappers (FDWs) allows seamless integration of data from disparate sources such as MongoDB, Hadoop and MySQL into a common model under the Postgres database. FDWs link Postgres to these external data stores so DBAs can access, manage and, most importantly, manipulate data from foreign sources as if it were part of the native Postgres database. With FDWs in place, Postgres acts as the central hub for data, enabling IT teams to ensure the integrity of their data regardless of its type or origin.
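In practice, wiring up an FDW is a three-step declaration: define the remote server, map local users to remote credentials, and expose the remote table as a foreign table. The sketch below assumes the mysql_fdw extension is installed, and the host, credentials, and table names are hypothetical:

```sql
-- Assumes the mysql_fdw extension has been installed on this server
CREATE EXTENSION mysql_fdw;

-- 1. Describe the remote MySQL server
CREATE SERVER mysql_server
    FOREIGN DATA WRAPPER mysql_fdw
    OPTIONS (host 'mysql.example.com', port '3306');

-- 2. Map the local role to remote credentials
CREATE USER MAPPING FOR CURRENT_USER
    SERVER mysql_server
    OPTIONS (username 'app_user', password 'secret');

-- 3. Expose a remote table as if it were local
CREATE FOREIGN TABLE legacy_orders (
    order_id  INTEGER,
    device_id TEXT,
    total     NUMERIC
)
SERVER mysql_server
OPTIONS (dbname 'shop', table_name 'orders');

-- Foreign data now joins against native Postgres tables in one query
SELECT * FROM legacy_orders WHERE total > 100;
```

Similar wrappers exist for MongoDB and Hadoop, each with its own connection options, which is how Postgres can serve as the single query surface over an otherwise siloed estate.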

Developing Your Database Infrastructure Strategy

As CIOs develop their organization’s IoT infrastructure strategy, there are several important decisions to consider. One is whether to collect and store data locally at the device for future use or to move it to a centralized management system. The argument for keeping data local is that it speeds up data access and can therefore deliver actionable insights faster. However, the economics of keeping data local could be burdensome, given the number of individual database instances required and the costs of integrating data from such a distributed architecture.

Unlike other SQL databases, Postgres was designed to be extensible, easily incorporating new data types, indexing schemes and languages without compromising other features of the database. For CIOs, this extensibility allows them to centralize company data in a reliable and scalable way, at a much lower cost than traditional commercial databases.
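That extensibility shows up directly in the SQL surface: new index types and new data types are added with ordinary DDL statements rather than engine modifications. A small sketch, using a hypothetical events table:

```sql
-- Hypothetical table of device events with a semi-structured payload
CREATE TABLE device_events (
    id      BIGSERIAL PRIMARY KEY,
    payload JSONB NOT NULL
);

-- New indexing scheme: a GIN index accelerates containment queries
-- (e.g. WHERE payload @> '{"status": "alert"}') without touching the engine
CREATE INDEX device_events_payload_idx
    ON device_events USING GIN (payload);

-- New data type: composite types extend the type system in place
CREATE TYPE geo_point AS (lat DOUBLE PRECISION, lon DOUBLE PRECISION);
ALTER TABLE device_events ADD COLUMN location geo_point;
```

The same mechanism underlies procedural-language extensions (PL/Python, PL/Perl and others), which is what lets a centralized Postgres hub absorb new IoT data shapes as devices evolve.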

The IoT promises to arm CIOs and IT professionals with a much deeper understanding of the business and customer environments by providing real-time data. However, in order to reap these benefits, executives need to understand the significant economic impact that storing and managing this data could have on their data center investments. Open source systems like Postgres provide a viable solution to the challenges presented by the IoT.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.
