UPDATED 09:03 EDT / JUNE 12 2018

BIG DATA

AtScale moves its business intelligence abstraction platform to the cloud

AtScale Inc. today is releasing a cloud version of its business intelligence abstraction platform, claiming to simplify the process of provisioning a large-scale analytics environment using data that resides in a single cloud, multiple clouds or a combination.

AtScale Cloud is available today in marketplaces on cloud platforms from Amazon Web Services Inc., Microsoft Corp. and Google LLC. The software now supports a variety of cloud databases, including AWS’s Elastic MapReduce and Redshift and Google’s BigQuery, as well as big data platforms from Cloudera Inc., Hortonworks Inc., MapR Technologies Inc. and Snowflake Computing Inc.

AtScale provides an abstraction layer for a Hadoop cluster or other back-end data store that enables it to be accessed by a wide range of business intelligence applications without the need for extensive data extract/transform/load procedures. “You don’t have to worry about where data is or how it’s stored,” said Dave Mariani (pictured), AtScale’s co-founder and chief executive.
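The idea of an abstraction layer can be illustrated with a short sketch. The class names, field mappings and SQL shape below are hypothetical, assumed for illustration only, and are not AtScale's actual API: a logical semantic model maps business terms to physical columns on a given back end, so a BI tool can ask for a measure by a dimension without knowing where or how the data is stored.

```python
# Toy sketch of a BI abstraction layer (illustrative only; not AtScale's API).
# A semantic model maps business terms to physical columns on a back end,
# so a BI tool can ask for "revenue by region" without knowing the schema.

from dataclasses import dataclass


@dataclass
class LogicalField:
    name: str      # business term the BI tool uses
    physical: str  # backing column in the data store


class SemanticModel:
    def __init__(self, backend: str, table: str):
        self.backend = backend  # e.g. "redshift", "bigquery"
        self.table = table
        self.fields: dict[str, LogicalField] = {}

    def add_field(self, name: str, physical: str) -> None:
        self.fields[name] = LogicalField(name, physical)

    def rewrite(self, measure: str, dimension: str) -> str:
        """Translate a logical (measure, dimension) request into backend SQL."""
        m = self.fields[measure].physical
        d = self.fields[dimension].physical
        return (f"-- routed to {self.backend}\n"
                f"SELECT {d}, SUM({m}) FROM {self.table} GROUP BY {d}")


model = SemanticModel("redshift", "fact_sales")
model.add_field("revenue", "amount")
model.add_field("region", "region_code")
print(model.rewrite("revenue", "region"))
```

Because the BI tool only ever sees the logical names, the same model could point "revenue" at a different back end without changing any reports.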

Models built in AtScale can run anywhere to connect any supported BI tool to any data platform. Performance is boosted by Adaptive Cache, the company’s query acceleration technology, which analyzes query patterns in real time to optimize response times.

This capability eliminates the need for data marts, which are small databases that contain extracted data from a data lake or warehouse. The abstraction layer supports one set of universal semantics for queries from tools ranging from Excel spreadsheets to Tableau Software Inc.’s visualization tools.

AtScale intercepts queries and rewrites them against a common metadata repository that includes a semantic model, governance rules and a data lineage map. The Adaptive Cache creates summary tables, and machine learning algorithms generate “smart aggregations” that anticipate future queries based upon historical activity.
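One way such smart aggregation could work is sketched below. This is an assumed, simplified illustration of the general technique, not AtScale's actual algorithm: count how often each (measure, dimension) pattern appears in the query history, then materialize a summary table for the most frequent pattern so matching queries can skip the raw-data scan.

```python
# Illustrative sketch of query-history-driven aggregation (not AtScale's
# actual algorithm): track which (measure, dimension) patterns recur, then
# pre-build a summary table for the most frequent one.

from collections import Counter


class AggregateAdvisor:
    def __init__(self):
        self.history = Counter()  # (measure, dimension) -> query count
        self.summaries = {}       # materialized aggregates

    def record(self, measure: str, dimension: str) -> None:
        """Log one observed query pattern."""
        self.history[(measure, dimension)] += 1

    def build_top_summary(self, raw_rows):
        """Materialize an aggregate for the most frequently queried pattern."""
        (measure, dim), _ = self.history.most_common(1)[0]
        summary = {}
        for row in raw_rows:
            summary[row[dim]] = summary.get(row[dim], 0) + row[measure]
        self.summaries[(measure, dim)] = summary
        return measure, dim


rows = [{"region": "east", "revenue": 10},
        {"region": "west", "revenue": 5},
        {"region": "east", "revenue": 7}]

advisor = AggregateAdvisor()
for _ in range(3):
    advisor.record("revenue", "region")  # simulated historical queries
advisor.record("revenue", "product")

measure, dim = advisor.build_top_summary(rows)
print(advisor.summaries[(measure, dim)])  # {'east': 17, 'west': 5}
```

A production system would also have to decide when a summary is stale and when the cost of maintaining it outweighs the speedup, which is where the machine learning the article mentions would come in.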

AtScale’s ability to query data in any location eliminates time-consuming data transfer and cleansing processes when moving data to the cloud, Mariani said.

“Previously, you would copy data to AWS S3, create a schema in Redshift, load data into Redshift, build a data model in Tableau, tune Redshift for performance, report and go back and repeat the data modeling process for another BI tool,” he said. With AtScale Cloud, “You copy data to S3, create a virtual cube in AtScale and report from any BI tool. You don’t have to load data into Redshift, but customers get the same performance as if the data were in Redshift.”

Photo: SiliconANGLE
