Jeffrey Kelly
Latest from Jeffrey Kelly
Speed the Key to SAP HANA’s “Fast Data” Approach to Analytics
SAP may be pushing its new in-memory analytics appliance, called HANA, as the company’s response to the Era of Big Data, but processing and analyzing increasing data volumes is just one part of the story. The other, more important part for SAP, in my opinion, is speed. For those unfamiliar with it, HANA is an ...
Alpine Data Labs Offers Visualization Tools to Create In-Database Analytics Models
Alpine Data Labs, or ADL, didn’t invent the concept of in-database predictive analytics. SAS Institute has been collaborating with data warehouse vendors Netezza (now part of IBM) and Aster Data (since acquired by Teradata) to inject predictive modeling and analytics capabilities into their MPP analytic databases since last summer. But Alpine Data Labs, based in ...
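The idea behind in-database analytics is that the scoring logic runs inside the MPP warehouse itself rather than shipping data out to a separate analytics server. The sketch below is a generic illustration of that pattern, not Alpine's or SAS's actual tooling; the table, columns, model coefficients, and connection details are all hypothetical.

```python
# Generic illustration of "in-database" scoring: the model's logic is pushed
# into the warehouse as SQL, so the data never leaves the MPP database.
# Table name, columns, coefficients, and connection string are hypothetical.
import psycopg2  # Greenplum, Netezza, and Aster all speak a Postgres-like SQL dialect

SCORE_IN_DATABASE = """
    SELECT customer_id,
           -- logistic-regression score computed inside the database,
           -- using coefficients exported from a previously trained model
           1.0 / (1.0 + EXP(-(0.8 * recency + 0.3 * frequency - 2.1))) AS churn_score
    FROM customer_features;
"""

conn = psycopg2.connect("dbname=warehouse host=mpp-cluster user=analyst")
with conn, conn.cursor() as cur:
    cur.execute(SCORE_IN_DATABASE)
    for customer_id, churn_score in cur.fetchall():
        print(customer_id, round(churn_score, 3))
```

The point of the pattern is that only the small result set (IDs and scores) crosses the wire; the heavy lifting happens on the warehouse's parallel nodes.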
SAP Simplifies Application Integration with NetWeaver Gateway
SAP customers have long wanted an easy way to connect SAP applications to outside environments. Now they have one. SAP today announced SAP NetWeaver Gateway, a new framework that greatly simplifies the task of connecting SAP ERP, CRM, BI and other applications to non-SAP systems. It taps REST, SOAP and the Open Data Protocol, or ...
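To get a feel for what a REST/OData-style interface means in practice, here is a minimal sketch of a non-SAP client pulling a few records from a Gateway-style OData service. The host, service name, entity set, and credentials are hypothetical placeholders, not details from SAP's announcement.

```python
# Minimal sketch of a non-SAP client consuming a REST/OData-style service.
# The host, service path, entity set, and credentials below are hypothetical;
# they stand in for whatever a NetWeaver Gateway service would actually expose.
import requests

BASE_URL = "https://gateway.example.com/sap/opu/odata/sap/SALES_ORDER_SRV"

response = requests.get(
    f"{BASE_URL}/SalesOrders",               # hypothetical entity set
    params={"$top": 10, "$format": "json"},  # standard OData query options
    auth=("gateway_user", "secret"),         # placeholder credentials
    timeout=30,
)
response.raise_for_status()

for order in response.json()["d"]["results"]:  # OData v2 JSON payload shape
    print(order.get("OrderID"), order.get("CustomerName"))
```

Because the interface is plain HTTP plus a standard payload format, any platform with an HTTP client can consume it; no SAP-specific libraries are required on the calling side.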
SAPPHIRE 2011: SAP’s Snabe Wants to Simplify Your IT Environment
Cloud computing, in-memory analytics, and mobility were the buzzwords on Day 1 of SAPPHIRE, but judging by Co-CEO Jim Hagemann Snabe’s own words, SAP is focused on one thing: simplicity. “I actually believe that is one of the biggest tasks in the industry,” said Snabe, speaking to SiliconANGLE founder John Furrier and chief Wikibon analyst ...
Gallagher: Greenplum Acquisition Part of a Continuing Transformation for EMC
When COO Pat Gelsinger asked Brian Gallagher, head of EMC’s enterprise storage division, to examine the data analytics and data warehouse space in December 2009, it didn’t take long for Gallagher to pinpoint where the innovation in that market was coming from. It wasn’t from the mega-vendors – Oracle, IBM, Teradata and Microsoft – but ...
Big Data Processing, Flexible Analytics Tools Key to Understanding Customer Lifecycle
If you want to maintain high levels of customer satisfaction and customer loyalty, you’d better understand your customers’ entire lifecycle. “If a customer comes at you from one viewpoint, maybe the web, and you don’t correlate that back to a call-center experience, you really lose out to when the customer went left or right,” said ...
Schmarzo: EMC Greenplum Customers Need to Think Different, Embrace Big Data Possibilities
The era of Big Data requires a new way of doing business, according to Bill Schmarzo. “We’re going to see a total revamping of the architecture that supports the decisions that users are trying to make,” said Schmarzo, Global Enterprise Information Management Competency Lead at EMC. Speaking to Wikibon’s Dave Vellante and SiliconANGLE’s John Furrier ...
Big Data a Big Part of EMC’s Total Customer Experience Initiative
As EMC transforms from a strictly storage company into an information services company, listening and responding to the voice of the customer is more important than ever, according to Jim Bampos, head of EMC’s Total Customer Experience program. Delivering services is an increasingly critical part of EMC’s business. And Big Data plays a big role. ...
Hadoop End-Users Should Align with Apache Community
Despite the significant progress made by the Apache community and start-up contributors like Cloudera, Hadoop is still in its infancy. Like most young open source technologies, Hadoop is, and will remain for some time, a moving target. Development of Hadoop is highly iterative and experimental in nature, so end-users should carefully consider the ...
Will EMC’s Greenplum-Hadoop Gambit Pay Off?
Wikibon believes EMC’s success or failure in the commercial Hadoop market depends on four critical factors. 1. EMC’s reception by the open source Apache Hadoop community. Credibility is critical to gaining adoption in a young, open source technology community like Hadoop. Traditionally, such credibility is contingent on making significant contributions to the open source project ...