At Oracle OpenWorld 2013, the annual event that kicked off yesterday in San Francisco, California, CEO Larry Ellison revealed details on the long-awaited in-memory offering, an update that received its fair share of hype in recent weeks (read our original coverage here, see highlights below).
“Virtually every existing application that runs on top of the Oracle database will run dramatically faster by simply turning on the new In-Memory feature. Our customers don’t have to make any changes to their applications whatsoever; they simply flip on the in-memory switch, and the Oracle database immediately starts scanning data at a rate of billions or tens of billions of rows per second,” Ellison stated.
Oracle OpenWorld 2013 is currently in full swing at the Moscone Center, but Ellison was quick to take center stage with a string of product updates and strategy plans for the company. Here are the highlights from Ellison’s keynote:
The SAP HANA Competitor
Dubbed simply the “In-Memory option,” it is the latest upgrade to Oracle’s database product, designed to handle both transactional and analytical workloads, and promises to deliver “ungodly” improvements in performance.
It is Oracle’s answer to SAP’s HANA in-memory database, intended to keep customers from jumping ship from Oracle to SAP. Oracle’s main concern is that many of its customers run SAP software alongside Oracle’s own, and with HANA’s speedy database technology, some were quick to abandon Oracle.
SAP has about 1,500 HANA clients and is growing rapidly, though compared to Oracle’s 30,000 or so clients that is quite minuscule. Nevertheless, Oracle can’t afford to let more customers slip from its grasp, and with its own in-memory offering the company hopes to keep customers from abandoning its databases altogether.
The In-Memory option will be introduced in Oracle’s 12c database, which was announced at last year’s event. It aims to speed up queries by a factor of 100.
“When you put data in memory, one of the reasons you do that is to make it go faster,” Ellison said. “Oracle had a goal of 100 times faster queries for analytics and a doubling in throughput for transaction processing with the in-memory option.”
On the hardware side, Ellison introduced the M6-32 Big Memory Machine and the Oracle Database Backup Logging Recovery Appliance.
The M6-32 comes with 32 terabytes of memory and is designed to run the 12c database with the In-Memory option. That much memory allows an enterprise client to hold all or part of a database in memory, making it faster by removing the “input/output” (I/O) bottleneck and eliminating the need to fetch data from separate storage drives attached to the main server.
As for the Oracle Database Backup Logging Recovery Appliance, it serves as the backup for the main database and, in the event of a disaster, lets the customer restore all of their data.
Backup and Recovery in the Cloud
For those who do not want to deal with appliances and hardware, Oracle now offers backup and recovery as a cloud service. Though many companies already offer similar services, Ellison stated that Oracle’s is different.
“We think by designing the software and hardware together, you get extreme performance so you need less machines, you spend less, need less floor space in the data center, use less electricity, less labor maintaining it,” Ellison explained.
Oracle is hoping that this week’s announcements will keep its customers happy and entice them to spend more on its services rather than jump to the competition.
MySQL Enterprise Monitor 3.0
Oracle also announced the availability of MySQL Enterprise Monitor 3.0, which features improved manageability along with new real-time monitoring and alerting capabilities, visual analysis tools, and better remote monitoring of MySQL databases in the cloud.
MySQL Enterprise Monitor is built for fast problem-solving: it continuously monitors MySQL databases, warns developers, database administrators (DBAs), and system administrators of potential problems before they impact the infrastructure, and recommends best practices to improve performance, security, and reliability.