Jeff Rauscher, Director of Solutions Design for Redwood Software, has more than 31 years of diversified MIS/IT experience working with a wide variety of technologies including SAP, HP, IBM, and many others.
No matter what services or products your organization provides, you constantly need timely and accurate data to stay ahead and keep moving forward. But gathering and processing strategic data has always been a challenge. It takes time and often a great deal of human effort.
For more than half a century, companies have relied on IT to reduce volumes of business data down to human size. In fact, this year marks the 50th anniversary of the IBM System/360, the world’s first commercially available mainframe computer. Today, the mainframe is everywhere in business and continues to serve as the backbone for many of the world’s corporate data centers.
According to SHARE Inc., an independent organization of IT professionals, 71 percent of the global Fortune 500, 96 percent of the world’s top 100 banks, and 9 out of 10 of the world’s largest insurance companies use IBM’s System z. Contemporary mainframes have evolved and grown in tandem with other technologies, becoming considerably faster and more energy efficient, with significantly greater capacity than their predecessors.
All business IT today owes its success, as well as much of its structure, to mainframe design. The three-tier architecture of database, application and user interface developed into the client-server architecture that is now standard for business software. This structure, along with the mainframe itself, ushered in another technology that businesses depend on every day: ERP.
Gartner first coined the acronym “ERP,” for “Enterprise Resource Planning,” in 1990, but the concept had been in practice for decades before that. This broad category of business management software, built from groups of integrated applications, spread quickly. Today virtually every business process has at least one ERP component offered by any number of global software providers.
Along with the mainframe and ERP, companies continue to add new technologies to enhance their data management and analytics capabilities. They rely on distributed computing as well as Software-as-a-Service (SaaS), virtual machines and cloud-based applications. At this point in the continuing history of IT and business data, executives and the people who provide them information are at the crossroads of an old problem: extracting actionable insights from their data.
Time for Big Data
From the earliest days of business computing, companies have used machines to cope with large volumes of data. The important difference now is the speed and performance of modern computers, which can gather and analyze huge amounts of information very quickly. While the scope of this available data is new, the challenge of Big Data is old and familiar: convert that data into actionable, meaningful information.
That’s why more companies are investing in Big Data technology to support market intelligence and predictive analytics. Wikibon measured 58 percent growth in the market between 2012 and 2013. Just last year Gartner predicted that “By 2016, 70 percent of the most profitable companies will manage their processes using real-time predictive analytics or extreme collaboration.” But the question of exactly how this will happen still remains for many organizations.
With decades of investment in layers of technology, how can business and IT most effectively leverage everything they have, from the trusted mainframe to the latest Big Data cruncher? The answer lies in automation.
Automate for excellence
Recent research from The Hackett Group found that top-performing IT organizations focus on automation and complexity reduction as part of essential IT strategy. These top companies implement 80 percent higher levels of business process automation. Using automation, these firms have 70 percent less complexity in their technology platforms and spend 25 percent less on labor. By automating processes across diverse technologies, platforms and locations, IT organizations can begin to see the real benefits of decades of technological advances.
For example, one company that analyzes data from more than 400 million customers worldwide automated and coordinated its application, data and infrastructure processes so that they could be repeated and expanded with minimal user interaction each time a new client was added.
The company can now link a standard set of automated data management processes to maximize the value of client data in a rapid, repeatable and consistent way for every customer. IT staff no longer have to complete a long list of repetitive manual tasks to set up new accounts that bridge information from diverse technologies across the enterprise. Instead, they focus on data analysis.
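In outline, that kind of repeatable onboarding automation amounts to running the same ordered set of steps for every new client. The sketch below is purely illustrative; the step names and client fields are assumptions, not the company’s actual process or tooling.

```python
# Hypothetical sketch of a repeatable client-onboarding pipeline.
# Step names and client fields are illustrative assumptions only.

def provision_storage(client):
    # Derive a storage volume name from the client id.
    return {**client, "storage": f"vol-{client['id']}"}

def load_reference_data(client):
    # Stand-in for loading the client's reference data sets.
    return {**client, "reference_data_loaded": True}

def schedule_jobs(client):
    # Stand-in for registering the client's recurring data jobs.
    return {**client, "jobs_scheduled": True}

# The same ordered steps run for every new client, so setup is
# consistent and replaces a manual checklist.
ONBOARDING_STEPS = [provision_storage, load_reference_data, schedule_jobs]

def onboard(client):
    for step in ONBOARDING_STEPS:
        client = step(client)
    return client

result = onboard({"id": "acme-001", "name": "Acme Corp"})
print(result["storage"])         # vol-acme-001
print(result["jobs_scheduled"])  # True
```

The point of the pattern is that adding a client means supplying data, not repeating work: every account passes through the same audited steps, which is what makes the process rapid, repeatable and consistent.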
The speed at which companies acquire data continues to escalate. At the same time, businesses continue to demand more analytics. IT and knowledge workers use a mix of legacy technologies and the latest innovations in a complex landscape to manage new challenges. But, in the end, it all comes down to execution.
Never have accuracy and speed been more important for knowledge workers. Just as the original industrial revolution introduced automated manufacturing, automation provides answers for the data center, too. The only way to dramatically improve process quality and speed simultaneously, at significantly lower cost and across every part of the enterprise, is to automate. Automated processes guarantee a level of consistency and accuracy in the complex enterprise that’s impossible through manual effort alone.
Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.