Will we all need semi trucks to move data to the cloud? | #reInvent


To demonstrate its commitment to moving vast amounts of data to the cloud, Amazon Web Services rolled out an actual 18-wheel truck, the Snowmobile, at this year’s AWS re:Invent conference. The point was that moving huge amounts of data around is not easy, but Amazon Web Services has the ways and means to accomplish it. Luckily for customers and vendors alike, there are some less labor-intensive methods available.

Giorgio Regni, CTO of Scality Inc., spoke about how the company is navigating data migration challenges. He told Stu Miniman (@stu), co-host of theCUBE*, from the SiliconANGLE Media team, that the distinction between testing and production is key. (*Disclosure below)

He said that for many companies, it makes sense to do DevOps and testing in the cloud and then run production on-premises. “As long as the security, the users, the protocol are the same, we can move from one to another seamlessly,” he said.

Shrinking data

Regni said tools like Amazon Elastic MapReduce for data mining allow companies to wrangle out just what they need.

“So what about if you just move to the cloud the subset of data you need for data mining and kill it when you’re done?” he said. “This way you keep your data on-prem and just send a little bit to the cloud, so you’re not sending petabytes; you’re sending just what you need.”
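The pattern Regni describes — select a subset on-prem, send only that to the cloud, and delete it when the job is done — can be sketched in a few lines. This is an illustrative sketch, not Scality’s implementation; the inventory keys and the filter pattern are made-up examples:

```python
# Sketch of the "burst only a subset to the cloud" idea.
# The key names and pattern below are hypothetical examples.
import fnmatch

def select_subset(keys, pattern):
    """Pick only the objects the data-mining job actually needs."""
    return [k for k in keys if fnmatch.fnmatch(k, pattern)]

# From an on-prem object inventory, pick one day's logs to send up.
inventory = [
    "logs/2016-11-28/app.log",
    "logs/2016-11-29/app.log",
    "images/raw/cam0.bin",
]
to_upload = select_subset(inventory, "logs/2016-11-29/*")
# Only this small subset would be copied to S3, mined (e.g., with
# Elastic MapReduce), then deleted — the petabytes stay on-prem.
```

The interesting part is what is *not* uploaded: the bulk of the data never leaves the data center, which is the cost and bandwidth win Regni is pointing at.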

Test-dev on demand

Continuing in the testing-before-production vein, Scality brought out its open-source S3 Server last June. It allows developers to write code against the S3 API and spin up on-prem instances for testing, all from a laptop.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of AWS re:Invent. (*Disclosure: AWS and other companies sponsor some AWS re:Invent segments on SiliconANGLE Media’s theCUBE. Neither AWS nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo by SiliconANGLE