Large Data Analytics Made Easy with Google BigQuery


Data analysis in the era of Big Data is certainly not easy, but Google is determined to simplify it by adding new capabilities to Google BigQuery that give businesses new ways to work effectively with large amounts of data.

Google has introduced three major updates to BigQuery to help simplify large-scale data analytics for enterprises:

  • Big JOIN: use SQL-like queries to join very large datasets at interactive speeds;
  • Big Group Aggregations: perform groupings on large numbers of distinct values;
  • Timestamp: native support for importing and querying Timestamp data.
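To make the three features above concrete, here is a minimal Python sketch of what such a query computes: joining two tables on a key, grouping the result by a column, and treating timestamps as native values rather than strings. The table contents and column names are invented for illustration; in BigQuery this logic would be expressed in a single SQL-like query over terabyte-scale tables, not run locally.

```python
from collections import defaultdict
from datetime import datetime

# Toy stand-ins for two BigQuery tables. The equivalent SQL-like query
# would be roughly:
#   SELECT u.country, COUNT(*) FROM visits v JOIN users u
#   ON v.user_id = u.user_id GROUP BY u.country
visits = [
    {"user_id": 1, "url": "/home", "ts": "2013-03-14T09:30:00"},
    {"user_id": 2, "url": "/docs", "ts": "2013-03-14T10:05:00"},
    {"user_id": 1, "url": "/docs", "ts": "2013-03-14T11:00:00"},
]
users = [
    {"user_id": 1, "country": "US"},
    {"user_id": 2, "country": "DE"},
]

# Big JOIN: match each visit to its user row on user_id.
by_id = {u["user_id"]: u for u in users}
joined = [{**v, "country": by_id[v["user_id"]]["country"]} for v in visits]

# Big Group Aggregation: group the joined rows by country and count them.
counts = defaultdict(int)
for row in joined:
    counts[row["country"]] += 1

# Timestamp support: parse the ISO-8601 strings into real datetime values,
# so they can be compared and aggregated like any other typed column.
parsed = [datetime.fromisoformat(v["ts"]) for v in visits]

print(dict(counts))             # visit counts per country
print(min(parsed).isoformat())  # earliest visit timestamp
```

The point of the BigQuery updates is that this kind of join-then-aggregate logic, which previously required a MapReduce pipeline at scale, can now be expressed declaratively and run interactively.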

“Joining terabyte-sized tables has traditionally been a challenging task for data analysts, requiring sophisticated MapReduce development skills, powerful hardware, or a lot of time — often all three,” wrote Ju-kay Kwek, Google BigQuery product manager in a blog post. “Today with BigQuery you can get directly to business insights using SQL-like queries, with far less effort and far greater speed than you could before.”

The strategy behind improving BigQuery is to entice Hadoop users. Google also argues that BigQuery will save users money, because they pay only for the queries they run rather than for the computational cost of operating the individual components of a Hadoop stack. BigQuery debuted in 2010, and its latest capabilities are designed as a replacement for Hadoop's MapReduce.

Best of all, pricing remains the same: users pay only for the data their queries actually process.