How to Move Bigger Data into the Cloud


Nicos Vekiarides, CEO of integrated cloud storage provider TwinStrata


It’s no secret that storage needs will continue to grow in the coming years. In fact, research from IDC projects installed raw storage capacity to exceed 7 zettabytes (each zettabyte is 1 million petabytes) by 2017, part of a staggering 16ZB digital universe. For IT organizations, this means increased strain in managing ever-growing storage capacities. Further exacerbating matters are regulatory requirements that extend data lifetimes to 10 years or more, effectively requiring data to remain accessible online for that entire period.

While cloud storage on its own does not necessarily address IT’s growing capacity needs, adding cloud-integrated storage to existing NAS or SAN environments makes it relatively easy to relocate data securely to a cloud provider, eliminating the need to maintain an ever-expanding on-premises storage infrastructure.

Offloading storage to the cloud may sound good on paper, but it presents at least one significant challenge when large amounts of data are involved: completing the initial upload to the cloud.

Importing Data to the Cloud

A little math shows how long it takes to upload a large amount of data across a WAN. For instance, with an uplink speed of 100Mbit/sec, you should be able to push nearly 1TB per day. Before getting too comfortable with that figure, remember that it is theoretical: it does not account for other users sharing the WAN link, hops and latencies, or other overhead that can slow throughput. Even at the theoretical maximum of 1TB/day, 100TB of data can take more than three months to upload – a long and cumbersome process.
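The back-of-the-envelope estimate above can be sketched in a few lines of Python. The link speed and efficiency figures are parameters you would substitute for your own environment, not measurements from any particular network:

```python
def upload_days(data_tb: float, link_mbit_per_s: float, efficiency: float = 1.0) -> float:
    """Estimate days needed to push `data_tb` terabytes over a WAN link.

    `efficiency` discounts the theoretical rate for shared links, latency,
    and protocol overhead (1.0 = ideal, 0.5 = half the theoretical rate).
    """
    bits = data_tb * 1e12 * 8                           # decimal TB -> bits
    seconds = bits / (link_mbit_per_s * 1e6 * efficiency)
    return seconds / 86_400                             # seconds per day

# 1 TB over a 100 Mbit/s link: roughly a day at the theoretical maximum
print(round(upload_days(1, 100), 2))    # ~0.93 days
# 100 TB: over three months, even in the ideal case
print(round(upload_days(100, 100), 1))  # ~92.6 days
```

Halving the effective rate to account for shared bandwidth and overhead pushes the 100TB upload past six months, which is why the article’s “over 3 months” figure is best read as a floor, not an estimate.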

When the network is not the best option for the initial upload, consider shipping your data to the cloud via a provider import service (as offered by Google and AWS). You ship disks containing your data directly to the cloud provider, which loads the data at one of its data centers or at a high-bandwidth access point, with no impact on your network. For large data sets, this can be the difference between weeks or months and days to get data into the cloud.
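A rough break-even check captures the decision. The default seven-day turnaround here is purely illustrative – actual shipping and provider-side loading times vary by service and are not quoted from any provider’s SLA:

```python
def faster_to_ship(data_tb: float, link_mbit_per_s: float, ship_days: float = 7.0) -> bool:
    """Return True when shipping disks beats uploading over the WAN.

    `ship_days` is an assumed end-to-end turnaround (packing, transit,
    provider-side loading); substitute your provider's actual estimate.
    """
    upload_days = data_tb * 8e12 / (link_mbit_per_s * 1e6) / 86_400
    return ship_days < upload_days

print(faster_to_ship(100, 100))  # True: ~93 days to upload vs ~a week to ship
print(faster_to_ship(1, 100))    # False: under a day to upload
```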

What about Security?

While best practices dictate that corporate data be encrypted at rest in the cloud, security is sometimes a forgotten aspect of the import process. Transporting unencrypted disks can easily become the weak link in an otherwise tightly secured cloud environment.

Ideally, a data import process ought to follow the same practice of encrypting data at rest prior to transporting it. Would you ever hand your unencrypted data to a stranger? Whether the data is handled by a “trusted” shipping company or the cloud provider itself, there is no reason to leave a window open for a breach by an unauthorized or unknown party.

Look for an import process that encrypts and encapsulates data into the object format stored in the cloud before the data ever leaves your premises. Following the same security practice used for storing data online in the cloud eliminates any security compromises during the import.
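The encrypt-then-encapsulate workflow might look like the following sketch. Everything here is hypothetical – the function names and object layout are not any provider’s API, and the SHA-256 counter-mode XOR keystream is a toy stand-in for illustration only. A real implementation should use a vetted authenticated cipher such as AES-GCM (for example, via a maintained cryptography library):

```python
import hashlib
import os

def _xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR `data` with a SHA-256 counter-mode keystream (toy cipher, illustration only)."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        counter = (offset // 32).to_bytes(8, "big")
        keystream = hashlib.sha256(key + nonce + counter).digest()
        out.extend(b ^ k for b, k in zip(data[offset:offset + 32], keystream))
    return bytes(out)

def toy_encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(16)                      # unique per object
    return nonce, _xor_stream(key, nonce, plaintext)

def toy_decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return _xor_stream(key, nonce, ciphertext)  # XOR stream: decrypt = encrypt

def encapsulate(key: bytes, name: str, data: bytes) -> dict:
    """Encrypt on-premises, then wrap into the object form that gets shipped."""
    nonce, ciphertext = toy_encrypt(key, data)
    return {
        "name": name,
        "nonce": nonce.hex(),
        "ciphertext": ciphertext.hex(),
        "sha256": hashlib.sha256(data).hexdigest(),  # verify integrity after import
    }

key = os.urandom(32)   # the key stays on-premises; it never travels with the disks
obj = encapsulate(key, "backups/archive.tar", b"sensitive corporate records")
print(sorted(obj))     # ['ciphertext', 'name', 'nonce', 'sha256']
```

The point of the sketch is the ordering: encryption happens before the object leaves your premises, so the shipped disks hold only ciphertext, and the key remains under your control throughout transit.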

The Bottom Line

Cloud storage has become a viable alternative for both storing data online and protecting data by copying it offsite. While the process of loading large data sets into the cloud may seem ambitious and cumbersome, cloud import processes can significantly reduce the time requirements as well as the potential network impact.

An import process that follows best practices around security can provide a rapid data upload with the appropriate level of security that your corporate data demands.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena.
