
The Fruits of Innovation: Top 10 IT Trends in 2014

The changes we’re due to see in 2014 start with the way people think. A good bit of the change going on in IT is about the maturity of its business leaders and their business planning skills associated with all of those changes, writes Mark Harris of Nlyte. He outlines a number of changes coming in the new calendar year.

Mark Harris is the vice president of marketing and data center strategy at Nlyte Software, with more than 30 years of experience in product and channel marketing, sales, and corporate strategy. Nlyte Software is the independent provider of Data Center Infrastructure Management (DCIM) solutions.

MARK HARRIS

Nlyte Software

The IT industry and its data centers are going through change today at a breakneck pace. Changes are underway to the very fundamentals of how we create IT, how we leverage IT, and how we innovate in IT. Information Technology has always been about making changes that stretch the limits of creativity, but when so many core components change at the same time, it becomes both exciting and challenging even for the most astute of us IT professionals.

The changes we’re due to see in 2014 start with the way people think. A good bit of the change going on in IT is about the maturity of its business leaders and their business planning skills associated with all of those changes. In the end, these leaders are now tasked to accurately manage, predict, execute and justify. Hence, the CIO’s role will evolve. Previously, CIOs were mostly technologists who were measured almost exclusively by availability and uptime. The CIO’s job was all about crafting a level of IT services that the company could count on, and the budgeting process needed to do so was mostly a formality.

Best Qualities in a CIO

The most effective CIOs in 2014 will be business managers who understand the wealth of technology options now available, the costs associated with each, and the business value of each of the various services they are chartered to deliver. He or she will map out a plan that delivers just the right amount of service within an agreed business plan. Email, for instance, may have an entirely different value to a company than its online store, so the means to deliver these diverse services will need to be different. It is the 2014 CIO’s job to empower their organizations to deliver just the right services at just the right cost.

For technologists, 2014 will be a banner year for change at a nearly unprecedented rate. If we look back at the IT industry’s history overall, it really started about 60 years ago with IBM’s first commercial release of the mainframe, became a distributed computing world in the late 1970s, transitioned to an Internet-connected world in the mid-1990s, and then exploded into the current generation of dynamic abstracted computing beginning in the mid-2000s. This new approach to computing puts a tremendous emphasis on back-end data center services rather than the capability of the end-user’s device. After a few false starts (like the netbook), the mature web-based, handheld, mobile and VDI revolution has become a cornerstone of computing, and it has become a race to put all of the actual computing back into the data center and do so in a modular fashion. That said, most existing data centers pre-date this dynamic period, and hence their entire supporting infrastructure is mostly ill-prepared to handle this dramatic shift toward cost-oriented data center services.

What Lies Ahead?

In 2014, we will see much of the fruit of this race for innovation. These trends for 2014 are not just niche possibilities or proofs-of-concept; they are the ones that will see rapid production-level adoption.

1. Big Data is finding its footing as the initial hype has settled and its commercial applicability has been demonstrated across all major industries. Big Data has become one of the biggest topics for the Enterprise. The premise is simple: put all of the historical transactional and business knowledge in one place, and then use various analytic means to extract actionable inferences. The keys to Big Data’s promise are how to put ALL of a company’s data logically in one place, and then, once this massive data set has grown beyond 2TB (a threshold commonly used to define Big Data), how to retrieve various forms of meaningful and actionable knowledge from the dataset interactively. This reality is now possible, and in 2014 many end-users will move their Big Data experimental pilots into full production. The Big Data vendors will also keep progressing with support for even larger memory-based data structures, real-time streamed data and even more advanced data mining techniques based upon the maturity of their own unique technology. Expect to see many of the world’s largest organizations looking to consolidate their many discrete data stores into one logical Big Data structure, and then finding some pretty innovative ways to turn this wealth of knowledge into impressive new business drivers.
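
As a concrete illustration of “extracting actionable inferences,” here is a minimal sketch in plain Python of the kind of aggregate-then-flag analysis a Big Data platform would distribute across many nodes. The records, regions and decline rule are all hypothetical stand-ins, not any vendor’s product behavior.

```python
from collections import defaultdict

# Hypothetical transaction records, standing in for a consolidated data store.
transactions = [
    {"region": "EMEA", "quarter": "Q3", "revenue": 120000},
    {"region": "EMEA", "quarter": "Q4", "revenue": 95000},
    {"region": "APAC", "quarter": "Q3", "revenue": 80000},
    {"region": "APAC", "quarter": "Q4", "revenue": 110000},
]

# Aggregate revenue per region and quarter -- the "reduce" step a Big Data
# platform would distribute across many worker nodes.
totals = defaultdict(dict)
for t in transactions:
    q = t["quarter"]
    totals[t["region"]][q] = totals[t["region"]].get(q, 0) + t["revenue"]

# Extract an actionable inference: flag regions with quarter-over-quarter decline.
for region, by_quarter in totals.items():
    if by_quarter.get("Q4", 0) < by_quarter.get("Q3", 0):
        print(f"{region}: revenue declined Q3 -> Q4, investigate")
```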

2. The Software Defined Networking (SDN) market will continue to be slightly fragmented, but the adoption rate for SDN will grow. In 2014 we will see many networking vendors attempting to differentiate themselves in the SDN space. It is important to remember that Software Defined Networking has two key components: the intelligent controller and the physical switches. That said, there is more than one technical approach to these SDN systems, so the end-user community will be faced with making some choices. The largest SDN vendors will each refine and articulate their specific approach to SDN and attempt to help their prospects understand why the combination of their robust switching and controllers has the most value. The common premise of these vendors will be that SDN allows better value through economies of scale and choice in a multi-vendor interoperable world. Many SDN vendors will focus on differentiation by demonstrating overall performance and/or capacity. Appliance-level switch and controller comparisons and performance benchmarks will emerge (much like the gigabit switching revolution 10 years ago), and interoperability and scale will be a hot topic. In 2014 the rate at which SDN is adopted by the end-user community will increase significantly, as all of the mainstream players have now announced and delivered compelling components in the SDN space.
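
To make the controller/switch split concrete, here is a toy Python sketch of the SDN model described above: a central controller pushes flow rules down to simple switches, which then forward only what their flow tables allow. The class names and rule format are illustrative assumptions, not any vendor’s actual API.

```python
# Toy illustration of the SDN split: the controller is the central brain,
# the switches only match traffic against the flow rules installed in them.

class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # (src, dst) -> forwarding action

    def install_flow(self, match, action):
        self.flow_table[match] = action

    def forward(self, src, dst):
        # In a real SDN, unknown traffic is punted up to the controller.
        return self.flow_table.get((src, dst), "send_to_controller")

class Controller:
    """Pushes one consistent policy to every switch it manages."""
    def __init__(self, switches):
        self.switches = switches

    def apply_policy(self, match, action):
        for sw in self.switches:
            sw.install_flow(match, action)

switches = [Switch("edge-1"), Switch("edge-2")]
ctrl = Controller(switches)
ctrl.apply_policy(("10.0.0.5", "10.0.1.9"), "out_port_3")
print(switches[1].forward("10.0.0.5", "10.0.1.9"))  # -> out_port_3
```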

3. Asset lifecycle management will be adopted to control costs. The traditional goal of IT was to assure that applications worked as needed and remained available to their users. The cost of providing this was viewed mostly as a purchasing department’s challenge: shave a few percentage points off of the purchase of servers or switches, and everyone considered the job well done. As a result, it has been commonplace to leave equipment in service for years and years and ignore the economic impacts of doing so, an “if it didn’t break, don’t fix it” mentality. In 2014 we will see an increasing rate of adoption of innovative new tools that manage these physical devices as business assets, with standard accounting processes and costs being addressed. Thanks in part to the abstractions now available through software-defined approaches, physical devices are no longer tied so tightly to their installed applications, so the complexity of making hardware changes (including those demanded by technology refresh cycles) has been greatly reduced. It is becoming very apparent that servers or switches that exceed their depreciation and warranty schedules can easily be replaced to reduce operating costs, so the adoption of new tools to manage all of this change will become a higher priority. Data Center Infrastructure Management (DCIM) tools will take their place as the strategic business management solution for managing all of the costs associated with this change.
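
The replacement logic described here is straightforward to picture. Below is a minimal Python sketch of the kind of check a DCIM tool might run against its asset records; the asset data, schedules and cutoff rule are hypothetical.

```python
from datetime import date

# Hypothetical asset records -- the kind of data a DCIM tool tracks per device.
assets = [
    {"name": "web-srv-01", "installed": date(2009, 5, 1),
     "warranty_years": 3, "depreciation_years": 4},
    {"name": "db-srv-02", "installed": date(2012, 8, 15),
     "warranty_years": 3, "depreciation_years": 4},
]

def years_in_service(asset, today=date(2014, 1, 1)):
    return (today - asset["installed"]).days / 365.25

# Flag devices past both their depreciation and warranty schedules:
# prime candidates for a refresh that lowers operating cost.
for a in assets:
    age = years_in_service(a)
    if age > a["warranty_years"] and age > a["depreciation_years"]:
        print(f"{a['name']}: {age:.1f} years in service -- replacement candidate")
```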

4. The server is being redefined. For the past 20 years, a server was generally defined as an x86-compatible chip housed in a 19-inch rack-mounted package. Every server in the data center fit this definition. Sure, there were many differences in how fast each model of server was, or in the amount of memory, disk or I/O capacity, but everything was essentially the same server architecture at the core. In 2014, we see dramatic alternatives. The major server vendors and a handful of startups will offer both low-power x86 designs as well as even lower-power ARM and Power chipsets. The cost of electricity is a major component of operating costs, so innovative ways of processing data on a lower power budget just make sense. It turns out many applications are perfectly suited for these new low-power CPU options, and in 2014 we will see these applications being touted as wild successes. Surprisingly, some of these new low-power CPU designs rival the performance of their x86 cousins! In addition to the CPU chip changes, we will see form-factors shipping in volume that are intended for dense applications. Traditionally we had just two form-factor choices: a proprietary chassis or the standard 19-inch rack-mount. In 2014 we will see an onslaught of half-width servers, and we will also see the first versions of an alternative server packaging scheme referred to as Open Compute. Open Compute, pioneered by companies like Facebook, is a standardized ‘open’ form-factor. Think of this design as the open-sourced version of server hardware. In 2014 we will see initial adoption of Open Compute by a much wider commercial audience.
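
The electricity argument is easy to quantify with back-of-the-envelope arithmetic. The Python sketch below compares the annual power cost of a traditional server against a hypothetical low-power design; every figure (wattages, electricity price, PUE) is an illustrative assumption.

```python
# Rough annual electricity cost for one server running 24x7.

def annual_power_cost(watts, price_per_kwh=0.10, pue=1.8):
    """Annual power cost for a server.

    PUE (Power Usage Effectiveness) accounts for cooling and facility
    overhead on top of the IT load itself. Both defaults are assumptions.
    """
    kwh_per_year = watts / 1000 * 24 * 365
    return kwh_per_year * pue * price_per_kwh

legacy_x86 = annual_power_cost(400)   # a typical dual-socket rack server
low_power = annual_power_cost(150)    # a hypothetical ARM/low-power x86 node

print(f"legacy: ${legacy_x86:,.0f}/yr, low-power: ${low_power:,.0f}/yr")
print(f"savings per server: ${legacy_x86 - low_power:,.0f}/yr")
```

Multiplied across thousands of servers, that per-node difference is exactly why the low-power designs are getting a serious look.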

5. The Public Cloud grows up. Once viewed as a platform for only a limited set of specialized applications, and as a bit of a non-critical platform, the Public Cloud has struggled to take a primary spot at ALL of the IT tables. In fact, the Public Cloud industry’s report card for the last few years would reveal a mixed bag of successes and failures, which has stunted its growth. At critical points in the adoption curve, we have seen high-profile failures limiting its strategic adoption. Leveraging Public Cloud technology also requires some fundamental re-education and new economic modeling, all of which is new to most IT organizations. In 2014, we will see the largest Public Cloud players proactively discuss these failures and provide detailed insight into how these issues will be addressed to assure they do not happen in the future. In 2014 the mainstream virtualization vendors will extend their capacity and migration tools to span from in-house to Public Cloud-based instances. As computing capacity is used up internally, it will become commonplace to simply expand to the Public Cloud for demand peaks. In addition, the Open Source world of public cloud computing (dominated by OpenStack) will see its newest release (“Havana”) open some eyes to what is possible in a multi-vendor Cloud world. OpenStack’s Havana release now supports OpenFlow controllers (one of the main multi-vendor approaches to SDN), Open vSwitch and VMware’s NSX. Havana also delivers support for orchestration, including direct support for Amazon’s CloudFormation templates. Finally, significant user-level accounting will appear, making OpenStack a business-grade option in a multi-tenant world. OpenStack and the tools that are built for it will likely prove to be one of the biggest enablers in creating a multi-vendor Public Cloud.
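
The “expand to the Public Cloud for demand peaks” pattern, often called cloud bursting, reduces to a simple placement decision. Here is a minimal Python sketch of that logic; the capacity figures and function names are hypothetical, and real migration tools layer networking, data locality and cost policy on top of this.

```python
# Minimal cloud-bursting decision: fill internal capacity first, then
# place the overflow in a public cloud. All numbers are illustrative.

INTERNAL_CAPACITY = 100  # VM slots available in the private data center

def place_workloads(demand):
    """Return (internal, public_cloud) VM counts for a given demand peak."""
    internal = min(demand, INTERNAL_CAPACITY)
    burst = max(0, demand - INTERNAL_CAPACITY)
    return internal, burst

for demand in (60, 100, 140):
    internal, burst = place_workloads(demand)
    print(f"demand={demand}: {internal} internal, {burst} burst to public cloud")
```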

6. Desktop virtualization continues to grow. Related to the tablet and mobile revolutions, desktop virtualization is a set of approaches that gives end users access to core business applications in a pseudo-desktop experience, without the need for expensive and hard-to-maintain end-point hardware. The promise of desktop virtualization has been around for years, with X11 software implementations in the 1980s and thin clients in the early 1990s. In 2014 we finally see the maturation of the client-side hardware and software required for a satisfactory remote desktop experience, with high-quality “thin” user devices and rich application-delivery technologies such as HTML5. While several technical catalysts for desktop virtualization exist today, the main catalyst in 2014 is economics; there is simply no better time to capitalize on the economies of scale by putting computing back in the data center. Data centers can manage applications better. They can deliver economies of scale and assure business continuity without relying on thousands of individually managed machines.
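
Since the argument here is economic, a rough cost model helps. The Python sketch below compares supporting traditional desktops against a centralized VDI approach; every cost figure is an illustrative assumption, not survey data.

```python
# Rough cost comparison behind the VDI economics argument: thousands of
# individually administered desktops vs. centralized delivery from the
# data center. Every figure below is an illustrative assumption.

USERS = 2000
desktop_support = 300  # $/yr to patch, secure and support one full desktop
thin_endpoint = 60     # $/yr to support one thin client or BYOD endpoint
vdi_backend = 150      # $/yr per user for the centralized VDI infrastructure

traditional = USERS * desktop_support
centralized = USERS * (thin_endpoint + vdi_backend)

print(f"traditional: ${traditional:,}/yr, VDI: ${centralized:,}/yr")
print(f"annual difference: ${traditional - centralized:,} across {USERS} users")
```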

7. The Enterprise will embrace the opportunity available with BYOD strategies. In 2014 we will see the number of handheld platform choices stabilizing at three: Apple’s iOS, Google’s Android and Microsoft’s Windows 8. To some degree we had to break a few eggs to get to the point where these three platforms would earn the respect of users, the IT organization and application developers alike. Previous offerings in the handheld market had crazy form-factors, spotty security profiles, difficult-to-support operating systems and limited corporate applications. Consequently, Enterprise IT organizations had been reticent to allow these devices into their officially supported infrastructure. These devices are now business-grade, can be easily secured, and can access nearly any business application. With core computing migrating back to the data center, these handheld devices are tremendous ‘viewing portals’ into the application environment hosted in the data center or the cloud. In 2014 expect to see all other handheld offerings vanish, and the Enterprise best-practice manuals to officially support BYOD-style remote and handheld access to core business applications.

8. Store more, store smarter, grow modularly. The rate of growth in storage requirements is mind-boggling. Everything we do in IT takes up storage space, and it has become very clear that simply spinning more platters is a costly and losing battle. For years it has been intuitively obvious that storage is one of the biggest opportunities for innovation in the data center, and yet technology had not caught up enough to be truly smarter. Concepts such as de-duplication and hybrid migration across multiple media types were considered too risky for the mainstream Enterprise, and modularity has always come with a list of caveats. In 2014, we will see the emergence of low-risk, production-grade, smarter and more modular storage. Real-time, in-line de-duplication has now been demonstrated by the largest and smallest storage vendors alike, gaining credibility in the process. Fabric-based backup is now commonplace, as is storage that appears as a single device yet includes a combination of solid-state and spinning disks, with automatic migration of data based upon access demand. Add to these new capabilities the ability to create storage arrays that are modular in nature and self-configure as additional storage is added or removed. All of the biggest storage vendors are now delivering these smart modular offerings, and more than a hundred startups exist to address this massive opportunity.
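
De-duplication itself is conceptually simple. Here is a minimal Python sketch of content-based block de-duplication: hash each block and store each unique block only once. Production systems add variable-size chunking, in-line hardware acceleration and stronger collision handling; the block size and sample data here are illustrative.

```python
import hashlib

BLOCK_SIZE = 4096
block_store = {}  # sha256 digest -> block bytes (each unique block stored once)

def write_deduplicated(data):
    """Store data as a list of block references; duplicates cost only a ref."""
    refs = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in block_store:
            block_store[digest] = block  # first time we've seen this content
        refs.append(digest)
    return refs

file_a = b"A" * 8192 + b"B" * 4096
file_b = b"A" * 8192 + b"C" * 4096  # shares its first two blocks with file_a

refs_a = write_deduplicated(file_a)
refs_b = write_deduplicated(file_b)
print(f"logical blocks: {len(refs_a) + len(refs_b)}, stored blocks: {len(block_store)}")
# -> logical blocks: 6, stored blocks: 3 (the repeated "A" blocks collapse too)
```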

9. The modular data center facility can be found in many forms. The idea of modularity for computing simply makes sense. Modularity has always been an ongoing goal for the active computing devices themselves, but little consideration was given to making the structure that houses all of this equipment modular as well. Whereas in years past large monolithic data centers were conceived, built and justified based upon their fully loaded capacity, it was intuitively obvious that the cost of the first square foot of occupied space was enormous, while the cost of the last square foot was essentially free. This ‘build big once’ model ignored the timeframes associated with occupying all of those square feet. A long time coming, the modular data center was born and has been seen in one shape or fashion for a dozen years already: first ISO shipping containers, then factory-built, purpose-designed modules, and extending all the way up to concrete-and-steel structures designed to be expanded one chunk at a time. In 2014, all three of these styles of modular data center will take center stage through significant efforts by the vendors involved, and each vendor will seize the opportunity to re-introduce its value in a market now looking for modular building blocks. The timing is perfect, as the economics of computing require incremental investment strategies. Large, well-known Enterprises will include one or more forms of modular data center in their go-forward plans.

10. Primary infrastructure via wireless is a reality. The pace at which the wireless industry is introducing new capabilities is awe-inspiring. Whereas many of the early 802 networking standards took years and years to ratify, we are now seeing production-grade wireless standards that are dramatically more functional than their predecessors emerge before the previous technology can even be fully deployed. Twenty years ago we saw 1Mbps wireless; a handful of years later we got “11b” at 11Mbps. After a longer quiet period, production “11g” arrived at 54Mbps; a few more years brought “11n” at hundreds of Mbps; and now in 2014 (before the ink on 11n has even dried) the industry is beginning to ship 802.11ac with reliable speeds greater than 1Gbps. Better yet, this new standard isn’t just about raw bandwidth; it allows intelligent and adaptive networks to be built. In 2014, the largest wireless vendors will deliver 802.11ac devices that include application awareness and link-layer bandwidth conservation to balance users across access points, essentially making everyone’s connected experience better. For the first time since “WiFi” became commonplace, deploying commercial-grade, solid wireless networks as the primary user connectivity for new build-outs in an Enterprise will be very real and cost effective. (And keep an eye on the fast-shaping 802.11ad, which is already promising five times the performance of 802.11ac and will see some commercial shipments in 2014!)
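
The “balance users across access points” behavior can be pictured as a simple least-loaded assignment. The Python toy below steers each client to the least-busy access point it can hear; real controllers also weigh signal strength, band and airtime, and all names here are hypothetical.

```python
# Toy version of client load-balancing across wireless access points.

access_points = {"ap-lobby": [], "ap-floor2": [], "ap-floor3": []}

def associate(client, reachable_aps):
    """Attach the client to the least-loaded AP among those in range."""
    target = min(reachable_aps, key=lambda ap: len(access_points[ap]))
    access_points[target].append(client)
    return target

for i in range(9):
    # Assume every client can hear all three APs in this toy example.
    associate(f"client-{i}", list(access_points))

for ap, clients in access_points.items():
    print(f"{ap}: {len(clients)} clients")  # evenly spread, 3 each
```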

In 2014, we are seeing the fruits of almost 10 years of innovation and imagination coming together at the same point in time. Dynamic and abstracted computing models can now be deployed in a low-risk and coordinated fashion across the entire data center. Servers, storage and networks can all be implemented in a software-defined manner, enabling business applications to be deployed flexibly in the data center. Users have a great number of choices for accessing these business applications, and the requirement for each end-user to act as an individual system administrator for desktop and/or handheld devices has all but vanished. As the cliché goes, “the stars have aligned,” and 2014 is poised to be a banner year across all facets of the IT industry.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
