Redefining System Architecture with Data at the Core

To meet the new demands of business, IT can no longer count on the tried and true, writes Momchil Michailov of Sanbolic. What today’s IT teams need is a modern system architecture that has data at its core.

Momchil Michailov is the CEO and Co-Founder of Sanbolic.

Data is the ultimate corporate asset. Yet most IT architecture starts with a discussion of hardware.

Data has many different forms, lifecycles, values and uses, and collecting, protecting, analyzing and making it available to support business processes is the core value of IT systems.

According to a CSC report, by 2020 over one-third of all data will live in or pass through the cloud, and data production will be 44 times greater than it was in 2009. Given that growth, IT systems need to be designed around the data, with performance, level of data protection and access profile tuned for each workload. System architecture in today’s cloud era should be defined by the data it contains rather than by the hardware that stores and serves it.

Software-defined data platforms are drastically and rapidly changing the IT model. Because they abstract the underlying hardware and allow data management and access to be defined workload by workload, data characteristics now define the infrastructure used, rather than vice versa. IT investment can therefore be matched more closely to the value the data holds for the business, while flexibility and responsiveness increase.
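
To make that concrete, here is a minimal sketch in Python of per-workload data policies driving provisioning decisions. The class, field names and policy values are hypothetical, not the API of any particular product:

    from dataclasses import dataclass

    @dataclass
    class DataPolicy:
        workload: str
        performance: str   # e.g. "flash", "hybrid", "capacity"
        protection: str    # e.g. "3-way-replica", "erasure-coded"
        access: str        # "block" or "file"

    POLICIES = [
        DataPolicy("oltp-db", performance="flash", protection="3-way-replica", access="block"),
        DataPolicy("analytics", performance="hybrid", protection="erasure-coded", access="file"),
        DataPolicy("archive", performance="capacity", protection="erasure-coded", access="file"),
    ]

    def provision(policy: DataPolicy) -> None:
        # In a real platform this call would carve capacity out of a shared
        # pool; here we only show the data policy, not the hardware,
        # driving the decision.
        print(f"provisioning {policy.workload}: "
              f"{policy.performance} media, {policy.protection}, {policy.access} access")

    for p in POLICIES:
        provision(p)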

The thorny road to software-defined infrastructure

There is no doubt that companies today need to run IT systems and applications to enable critical processes and business intelligence. Unfortunately, those outcomes are delivered from data held in applications running on obsolete storage systems. Storage devices currently represent the biggest portion of the IT budget and are also the most limiting factor in how IT is run: storage interoperability, shared visibility and management remain key constraints.

Placing data in storage silos segments both the data and the servers that access it, hampering IT’s ability to run flexible, agile operations. Couple those challenges with the fact that today’s CIOs are under continued budget pressure while being expected to deliver and maintain more applications and services, ten times faster than before, and we can see why the promise of software-defined data platforms is enticing, yet still elusive for most enterprises.

A software-defined data platform: the building blocks of success

What today’s IT needs to focus on is ensuring the availability and scalability of the data placed on storage arrays, and the infrastructure’s ability to provide workload elasticity with enterprise-grade service levels. The building blocks are:

Storage as a single unit: A key capability of storage and data management solutions going forward will be their ability to meet the storage needs of dynamic applications, placing growing data on the appropriate storage media while delivering the performance applications require in both physical and virtual environments. Storage is the vehicle that delivers the data, just as servers are the vehicles that deliver the applications.

Public cloud providers have redefined the need to run storage the traditional way. Instead, they use a converged model in which server and storage form a single unit that can scale data out to meet application demands. This model is called hyper-converged, or Server SAN.

There are a few key attributes of this new hyper-converged storage worth noting (a minimal sketch of the model follows the list):

  1. Better application performance, since the CPU sits close to the disk or flash for much faster input/output (I/O);
  2. Dramatically reduced storage cost;
  3. Elimination of the proprietary, cost-intensive licensing of traditional storage and data management tools; and
  4. Server and application administrators in charge of the complete IT stack for their applications.
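
As referenced above, here is a minimal sketch of the hyper-converged idea: each server contributes compute and storage to one shared pool, so both scale out together. The classes and numbers are purely illustrative:

    from dataclasses import dataclass

    @dataclass
    class Node:
        name: str
        cpu_cores: int
        storage_tb: float

    class HyperConvergedCluster:
        # Each node contributes compute AND storage to one shared pool.
        def __init__(self) -> None:
            self.nodes: list[Node] = []

        def add_node(self, node: Node) -> None:
            # Scaling out the cluster grows both resources in a single step.
            self.nodes.append(node)

        def pooled_storage_tb(self) -> float:
            return sum(n.storage_tb for n in self.nodes)

        def pooled_cores(self) -> int:
            return sum(n.cpu_cores for n in self.nodes)

    cluster = HyperConvergedCluster()
    cluster.add_node(Node("node-1", cpu_cores=32, storage_tb=20.0))
    cluster.add_node(Node("node-2", cpu_cores=32, storage_tb=20.0))
    print(cluster.pooled_cores(), cluster.pooled_storage_tb())  # 64 40.0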

Software-defined storage management: This hyper-converged infrastructure needs to be enabled by a complete software stack so administrators can deploy storage and CPU capacity modularly. That eliminates the “forklift” storage upgrade: the major overhaul customers traditionally face when adopting new infrastructure. Nimble software management tools will allow individual servers and storage components to be swapped non-disruptively, on an as-needed basis.
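
A minimal sketch of such a non-disruptive swap, assuming a hypothetical management API that drains a node before replacing it; real tools differ in the details:

    class StorageManager:
        # name -> terabytes of data currently placed on the node
        def __init__(self) -> None:
            self.nodes = {"node-1": 10.0, "node-2": 10.0, "node-3": 10.0}

        def drain(self, name: str) -> None:
            # Migrate the node's data to its peers before removal, so
            # applications never lose access to the pool.
            moved = self.nodes.pop(name)
            share = moved / len(self.nodes)
            for peer in self.nodes:
                self.nodes[peer] += share

        def add(self, name: str) -> None:
            # A new node joins empty; background rebalancing fills it later.
            self.nodes[name] = 0.0

    mgr = StorageManager()
    mgr.drain("node-2")    # swap out the old component
    mgr.add("node-2-new")  # swap in the replacement, no downtime
    print(mgr.nodes)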

Dynamic scale-out of storage resources: Today, the performance of storage, the services it provides and its cost govern which applications can run on it. Leveraging hyper-converged commodity infrastructure and layering advanced storage and data management services on top of it creates a new, shared infrastructure and eliminates the large upfront storage cost. Enterprises can pick the right storage medium (flash, SSD, HDD) and, through software-provisioned storage volumes based on SLAs, offer both file and block access to avoid infrastructure silos and storage islands. Just as hypervisors allowed us to migrate applications and workloads to circumvent server tie-down, software-defined storage services decouple data from the underlying storage devices.
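For illustration, SLA-driven volume provisioning might look like the following sketch; the SLA names, media mapping and function are hypothetical:

    # Map an SLA tier to a storage medium; software, not hardware, decides.
    SLA_TO_MEDIA = {"gold": "flash", "silver": "ssd", "bronze": "hdd"}

    def create_volume(name: str, size_gb: int, sla: str, access: str) -> dict:
        if access not in ("block", "file"):
            raise ValueError("access must be 'block' or 'file'")
        return {"name": name, "size_gb": size_gb,
                "media": SLA_TO_MEDIA[sla], "access": access}

    # The same pool serves block volumes for databases and file shares alike.
    db_vol = create_volume("db-log", size_gb=500, sla="gold", access="block")
    nas_vol = create_volume("shares", size_gb=8000, sla="bronze", access="file")
    print(db_vol, nas_vol)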

Seamless orchestration: True hyper-converged infrastructure provides storage and compute; however, to be fully operational, the data center of tomorrow also needs an orchestration layer.

A sophisticated orchestration layer allows organizations to migrate workloads across physical and virtual machines and to place data on the right medium in the right location. By controlling and enabling infrastructure through software layers and orchestration, IT can focus on the economics, performance, SLAs and availability of its workloads. The end result is reduced hardware cost, non-disruptive hardware upgrades, lower management cost, tier-one capability, and the ability to span data centers, on-premises and cloud infrastructure alike.
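
A minimal sketch of the placement decision such a layer makes, choosing the cheapest host that still meets a workload’s storage SLA; the hosts, tiers and costs are invented for illustration:

    # Candidate hosts: physical and virtual, on-prem and cloud.
    HOSTS = [
        {"name": "phys-1", "tier": "flash", "cost": 3.0},
        {"name": "vm-pool", "tier": "ssd", "cost": 2.0},
        {"name": "cloud-1", "tier": "hdd", "cost": 1.0},
    ]
    TIER_RANK = {"hdd": 0, "ssd": 1, "flash": 2}

    def place(workload: str, min_tier: str) -> dict:
        # Keep only hosts whose storage tier meets the workload's SLA...
        ok = [h for h in HOSTS if TIER_RANK[h["tier"]] >= TIER_RANK[min_tier]]
        # ...then let economics pick among them.
        return min(ok, key=lambda h: h["cost"])

    print(place("oltp-db", min_tier="flash"))  # -> phys-1
    print(place("archive", min_tier="hdd"))    # -> cloud-1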

However, the most important benefit is the ability to scale out workloads and harness data, the most valuable business asset.

A modern system architecture

To meet the new demands of business, IT can no longer count on the tried and true. What today’s IT teams need is a modern system architecture that has data at its core. The next major frontier in IT will be the adoption of nimble platforms that will redefine how IT gets designed and delivered. We are on the brink of a major storage evolution that will transform how IT enables the business.
