Photo: Sean Gallup/Getty Images

Measure Twice: Testing Data Center Storage With New WorkloadWisdom

“Once storage is shared, even changes that look quite minor could have noticeable effects on service quality.”

One of the key tenets of modern data center systems design is automation. From infrastructure to software, everything is automated. It’s the only way to work effectively at scale, building out physical and virtual infrastructures across software-defined fabrics of compute, networking, and storage. With Virtual Instruments’ new WorkloadWisdom, that automation can extend to workload testing, validating infrastructure changes before making them in production.

As Bryan Betts, principal analyst at Freeform Dynamics, told Data Center Knowledge, “One of the dirty secrets of the storage business is just how much the performance of storage can vary depending on the interplay between the storage device and the workload. Back in the days of direct-attached spinning disks, some of it was obvious: we knew that an application driving a lot of random access would hit the disk a lot harder than one doing mainly sequential [access], and that some disks would handle random better than others. But it all got a whole lot muddier as storage moved onto the network and became shared storage, and more importantly as it got more and more software-driven.”
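Betts’ point about access patterns is easy to see even on a single machine. The sketch below is a minimal illustration of sequential versus random reads against a scratch file; it has nothing to do with WorkloadWisdom itself, and the file size, block size, and read count are arbitrary assumptions.

```python
# Minimal sketch (not part of WorkloadWisdom): compare sequential vs random 4 KiB
# reads against a scratch file. File size, block size, and read count are arbitrary;
# a warm page cache narrows the gap, so the numbers are indicative only.
import os
import random
import tempfile
import time

BLOCK = 4096                    # 4 KiB I/O size
FILE_SIZE = 128 * 1024 * 1024   # 128 MiB scratch file
N_READS = 4096                  # reads per access pattern

# Create a scratch file filled with random bytes.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    f.write(os.urandom(FILE_SIZE))

def timed_reads(offsets):
    """Issue one pread per offset and return the elapsed time in seconds."""
    fd = os.open(path, os.O_RDONLY)
    start = time.perf_counter()
    for off in offsets:
        os.pread(fd, BLOCK, off)
    os.close(fd)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(N_READS)]
scattered = [random.randrange(0, FILE_SIZE - BLOCK) for _ in range(N_READS)]

print(f"sequential: {timed_reads(sequential):.3f}s")
print(f"random:     {timed_reads(scattered):.3f}s")

os.unlink(path)
```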

That complicates designing and testing modern storage architectures, which are increasingly dynamic, flexible, and tuned to specific workloads. Generating realistic test workloads is a particular problem: while it’s possible to construct synthetic traffic and data, it’s often hard to match them to real-world operations. Most test harnesses use scripted browsers to replay captured web traffic against applications, but that approach struggles at scale, because generating appropriate traffic levels from remote PCs is difficult.

As Betts noted, “The storage equivalent of a network packet generator is essential – one that can record workloads, tweak or multiply them up if you need, and then play them back on whatever you want to test. The problem is that once storage is shared, even changes that look quite minor could have noticeable effects on service quality.”

Virtual Instruments is best known for its performance management tools for assessing infrastructure operations, including VirtualWisdom, its application-centric data center infrastructure monitoring and analytics platform. WorkloadWisdom 6.0 takes those techniques and applies them to storage infrastructure testing and planning.

Based on the LoadDynamix storage validation tooling, the new version is designed to allow infrastructure teams to test and validate systems without needing to deploy applications, so they can build out infrastructures in parallel with application development. Validating storage performance reduces the risk of under- and over-provisioning, removing some of the guesswork associated with designing storage systems.

Using data captured from your live network, WorkloadWisdom delivers synthetic traffic to storage systems in a test and staging network, so you can see the impact of infrastructure changes on storage performance before making them on your production system. The system can test workloads over different network types, with support for Ethernet and Fibre Channel deployments, for both block and file storage. WorkloadWisdom can also support object stores using common protocols, including AWS’s S3 and OpenStack Swift.

You can use WorkloadWisdom to design tests that span both your network and any cloud storage elements, testing both on-premises and hybrid infrastructures. A single 2U appliance handles workload generation, using a custom Linux kernel to increase network efficiency. You can assign load generation ports, using captured data to simulate IOPS, latency, and throughput. Once you’ve built your simulated load, you can use it to test your planned storage array design, with tunable read and write block sizes, and you can choose write locations and hotspots to simulate more complex operations.
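WorkloadWisdom exposes those knobs through its own interface. As a rough illustration of what such a workload description contains, the sketch below is a hypothetical Python profile for a read-heavy block workload with tunable block sizes and an 80/20 hotspot; it is not the product’s actual configuration format, and all the numbers are invented.

```python
# Hypothetical workload profile (illustrative only, not WorkloadWisdom's format):
# a read-heavy block workload with tunable block sizes and an 80/20 hotspot,
# meaning 80% of I/O lands on 20% of the address space.
import random

PROFILE = {
    "read_pct": 70,                 # 70% reads, 30% writes
    "read_block_size": 8 * 1024,    # 8 KiB reads
    "write_block_size": 64 * 1024,  # 64 KiB writes
    "capacity": 1 * 1024**4,        # 1 TiB simulated LUN
    "hotspot_fraction": 0.2,        # hot region covers 20% of capacity...
    "hotspot_weight": 0.8,          # ...and receives 80% of the I/O
}

def next_io(profile):
    """Draw one synthetic I/O (operation, offset, size) from the profile."""
    op = "read" if random.random() < profile["read_pct"] / 100 else "write"
    size = profile[f"{op}_block_size"]
    hot_bytes = int(profile["capacity"] * profile["hotspot_fraction"])
    if random.random() < profile["hotspot_weight"]:
        offset = random.randrange(0, hot_bytes - size)                    # hot region
    else:
        offset = random.randrange(hot_bytes, profile["capacity"] - size)  # cold region
    return op, offset, size

# Example: peek at a few generated operations.
for _ in range(3):
    print(next_io(PROFILE))
```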

As well as synthesized data, WorkloadWisdom simulates metadata operations. That’s key to measuring performance, because it’s common for data reads and writes to account for less than 15 percent of traffic; the rest is RPC calls handling storage metadata.
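The 15 percent figure above comes from the article; the weights in the sketch below are hypothetical and simply illustrate what a metadata-heavy, NFS-style operation mix looks like when a generator draws operations statistically rather than replaying real traffic.

```python
# Illustrative only: a metadata-heavy operation mix in the spirit described above,
# where data reads and writes are a small fraction of total operations.
# The exact weights are invented, not measured figures.
import random
from collections import Counter

OP_WEIGHTS = {
    "GETATTR": 40,   # metadata
    "LOOKUP": 25,    # metadata
    "ACCESS": 15,    # metadata
    "READDIR": 5,    # metadata
    "READ": 10,      # data
    "WRITE": 5,      # data
}

ops = random.choices(list(OP_WEIGHTS), weights=list(OP_WEIGHTS.values()), k=100_000)
counts = Counter(ops)
data_pct = 100 * (counts["READ"] + counts["WRITE"]) / len(ops)
print(counts.most_common())
print(f"data operations: {data_pct:.1f}% of traffic")
```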

Synthetic transactions in WorkloadWisdom mimic the type and statistical frequency of data stored in a live application without needing the actual data, reducing the risk of data leakage by using random bytes or files that replicate the behavior of a live system. Using Virtual Instruments’ existing ProbeNAS monitoring capabilities, traffic can be generated based on the performance of both SMB and NFS storage networks, as well as iSCSI. Other workload data can be imported from most vendors’ arrays, including NetApp, HPE, and Dell EMC, and Virtual Instruments is developing a portal for sharing workloads.
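WorkloadWisdom builds those statistics from its own profiling; the sketch below only illustrates the underlying principle of matching a size distribution with random bytes instead of copying real files out of production. The log-normal parameters are made up for the example.

```python
# Illustrative principle only: generate random-byte payloads whose sizes follow
# a statistical distribution observed in production, so no real data is copied.
# The log-normal parameters here are invented, not taken from any real system.
import os
import random

def synthetic_payload(mu=10.0, sigma=1.5, max_bytes=1 * 1024 * 1024):
    """Return a blob of random bytes with a log-normally distributed size."""
    size = min(int(random.lognormvariate(mu, sigma)), max_bytes)
    return os.urandom(max(size, 1))

# Example: the payloads mimic real file sizes but contain no real content.
sizes = [len(synthetic_payload()) for _ in range(5)]
print(sizes)
```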

You can archive existing synthetic workloads to reuse when you need to test infrastructure changes, and update them as applications change. That’s not just useful for major changes like swapping out hardware or finding out whether moving from Fibre Channel to iSCSI will improve performance; it also lets you test the impact of changing a few settings as part of storage tuning, or of new firmware. That impact can be more significant than you think.

One Virtual Instruments customer, a payment processing organization, was advised by their storage vendor to upgrade the firmware on all their storage targets to get feature enhancements and improved performance. Instead, Virtual Instruments’ Mark Chauvin told us, the firmware updates changed some settings, and their production environment went down. “Now they have a test environment that mirrors their production environment, and any recommended change is tested to make sure it won't slow down or break things.”

As Betts pointed out, “Once storage is shared, even changes that look quite minor could have noticeable effects on service quality.” In the modern data center, that’s a key issue. A tool like WorkloadWisdom helps validate any changes to your storage network – even if you’re just tweaking a line or two in a configuration.
