Arvind Venugopal is Senior Product Manager at Cleo.
Amazon recently launched a service to literally drive a truck to your data center, load it up with all of your data, and drive it back to an Amazon server farm to plug it in and push it to the cloud. The rationale behind this offering stems from the idea that businesses looking to move massive amounts of data – terabytes and petabytes of information – to Amazon’s cloud don’t have a fast, affordable option to do so over the internet. But what if they did?
It’s hard to believe that we’ve advanced so far technologically that we now have to rely on “old school” methods to move and maintain digital data. But that’s what it’s come to, given most software companies’ inability to quickly and securely pipe massive amounts of information into and out of the enterprise.
Given these limitations, mega-companies like Amazon recognize the need to bring their own massive data centers – those comprising Amazon Web Services (AWS) – closer to the point of data generation. But this particular truck service, with its obvious security and risk issues, might be something that happens only at the largest companies and just once or twice a year. So what’s a business to do about the high-volume, large data set transfers it must facilitate on a daily basis?
Transferring petabytes of data anywhere – physically or digitally – still takes a considerable amount of time, and not every business can afford to summon an Amazon truck to back up its databases or move log files to the cloud. But the organization looking to replace unsecured physical data movement and affordably move large volumes of information while maintaining control of the endpoints will benefit from accelerated data transfer and governance capabilities, especially when those capabilities already fully integrate with its B2B systems, cloud solutions, and internal applications.
The Data Deluge
In today’s business ecosystem, as the size of data increases for all communications with customers, partners, distributors, and suppliers, as well as with internal applications and backup systems, the smooth exchange of data becomes much more difficult using traditional legacy file transfer tools. By 2020, experts estimate that 1.7 MB of new digital information will be generated every second for every person on Earth, and that number increases exponentially for businesses.
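To put those volumes in perspective, a quick back-of-envelope calculation (an illustration with assumed figures, not benchmarks from any vendor) shows why moving petabytes over an ordinary network link is painful:

```python
# Back-of-envelope: how long does it take to push 1 PB over a network link?
# Link speeds and the 80% utilization figure are illustrative assumptions.

def transfer_days(size_bytes: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to move size_bytes over a link_mbps link at the given utilization."""
    bits = size_bytes * 8
    seconds = bits / (link_mbps * 1_000_000 * efficiency)
    return seconds / 86_400  # seconds per day

PETABYTE = 10 ** 15  # bytes

for mbps in (100, 1_000, 10_000):  # 100 Mbps, 1 Gbps, 10 Gbps
    print(f"1 PB over {mbps:>6} Mbps: {transfer_days(PETABYTE, mbps):8.1f} days")
```

Even at a sustained 1 Gbps, a single petabyte ties up the link for months – which is exactly the gap that both the truck and accelerated transfer protocols aim to close.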
Organizations currently generate, capture, move, and store things like:
- Customer information
- Banking and other financial data
- Personal medical and genetic data
- Social media interactions and customer support logs
- Shipping records, customs documents, and logistics information
But businesses are using traditional protocols over high-latency networks – or, on a smaller non-Amazon-truck scale, are still shipping physical devices – to transfer these growing data sets internally and externally. Despite all of this digital innovation, data centers and organizations rely far too much on outdated methods simply to get the job done.
A high-speed data transfer protocol harnesses the power of lean file transfer and integration technology to accelerate the flow of large files and data sets, specifically over long distances between servers, data centers, and other destinations. Accelerated file transfer enables businesses to efficiently move data and maintain control and governance when:
- Moving large data sets into and out of data lakes
- Copying databases between data centers to create redundancy and backup
- Transferring and receiving huge data sets from partners
Recognizing the need for a high-speed file transfer solution now positions your business for a future of ever-growing data volumes, but knowing which capabilities such a solution should include can be less clear.
A next-generation accelerated file transfer solution should quickly, easily, and securely move extremely large files while also supporting blazing-fast transfers of smaller files, all while optimally using existing network bandwidth and resources.
While most high-speed data transfer solutions in the market are either hardware-based or based on lesser-used network technologies, an advanced software-based solution can be deployed as part of an enterprise shared-architecture model powered by the same engine that drives your routine business processes, all on a single platform.
Transfer speed obviously will be the priority, but an advanced high-speed solution also must enable organizations to monitor and track performance metrics from both an IT and a business perspective, as well as act on data via reports and dashboards based on business objectives.
In addition to the tracking, alerting, and authentication, a leading solution supports:
- External accelerated file transfer for files of all sizes with support for other protocols and big data connectors
- SSL/TLS encryption to secure data in motion
- Guaranteed delivery of transferred packets
- Feeding data directly to cloud architectures or big data storage mechanisms
- Metadata tagging to extend the technology into other applications
- Automatic checkpoint restart and data integrity checks
Amazon recognized that the way most companies currently move massive amounts of information isn’t good enough, and with data volumes only continuing to increase across the consumer and business spectrums, customers need solutions. Vendors extending high-speed software solutions to digitally exchange information will be ahead of the curve in serving their customers.
For the business that can’t simply order up a high-tech truck to take away its data, an advanced high-speed file transfer solution from one of these vendors provides an efficient and affordable way to securely move its mission-critical data.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Penton.