

Microsoft Corp. has announced a major increase in the maximum object size on its Azure Blob Storage service, rising from 195 gigabytes to 4.77 terabytes.
The increase is designed to accommodate the scientific research community and other users who sometimes have enormous cloud storage needs for workloads, explained Michael Hauss, a program manager with the Azure Storage team. “The increased blob size better supports a diverse range of scenarios, from media companies storing and processing 4K and 8K videos to cancer researchers sequencing DNA,” Hauss wrote in a blog post.
In addition to the increased blob size, Microsoft has also boosted its maximum supported block size from 4 to 100 megabytes.
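The two limits are directly related: a block blob is assembled from at most 50,000 blocks, so raising the per-block ceiling from 4 MB to 100 MB is what lifts the maximum blob size from roughly 195 GB to roughly 4.77 TB. A quick back-of-the-envelope check in Python (the 50,000-block cap is Blob Storage's documented per-blob limit):

```python
MIB = 1024 ** 2
BLOCKS_PER_BLOB = 50_000  # Blob Storage's per-blob block limit

# Old limit: 50,000 blocks x 4 MiB each
old_max = BLOCKS_PER_BLOB * 4 * MIB
# New limit: 50,000 blocks x 100 MiB each
new_max = BLOCKS_PER_BLOB * 100 * MIB

print(f"old max blob size: {old_max / 1024**3:.0f} GB")   # ~195 GB
print(f"new max blob size: {new_max / 1024**4:.2f} TB")   # ~4.77 TB
```

The arithmetic reproduces both figures Microsoft cites.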
Hauss said that developers can use the .NET Client Library version 8 or the REST API version 2016-05-31 to take advantage of the new limits. He added that support for Java, Node.js and AzCopy will become available in the coming weeks.
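At the REST level, opting in simply means sending the newer `x-ms-version` header on a Put Block request. The sketch below assembles such a request without sending it; the account, container, and blob names are hypothetical, and a real call would also need an authorization header:

```python
import base64

# Hypothetical names, for illustration only.
account = "myaccount"
container = "videos"
blob = "raw-footage.mp4"

# Put Block: block IDs are base64-encoded strings of equal length.
block_id = base64.b64encode(b"block-000000").decode()
url = (f"https://{account}.blob.core.windows.net/"
       f"{container}/{blob}?comp=block&blockid={block_id}")

headers = {
    # Service version that raises the block limit to 100 MB.
    "x-ms-version": "2016-05-31",
    # Each block may now carry up to 100 MB of payload.
    "Content-Length": str(100 * 1024 * 1024),
}

print(url)
print(headers["x-ms-version"])
```

Requests sent with an older `x-ms-version` remain subject to the previous 4 MB block limit, which is why the version bump, rather than a new endpoint, is the opt-in mechanism.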
In a second announcement, Microsoft said it’s adding its Azure Import/Export service to the Azure Portal management hub. With the service, users can now transfer large amounts of data to and from Blob Storage in a similar fashion to how Amazon Web Services’ Snowball service works, by physically shipping their hard drives to Microsoft. That eliminates the need for expensive and time-consuming data transfers over the web. To help facilitate this, the company also updated WAImportExport, a tool that helps to simplify the process of loading data onto hard drives before shipping them out.
“You will no longer need to shard the data and figure out optimal placement of data across multiple disks,” Rena Shah, a Microsoft program manager, explained in a second blog post. “You can now copy data from multiple source directories in a single command line.”