Top Stories

Automation and Easier Aggregation in Hadoop Clusters Signal Data as a Service Trend

Yesterday I wrote about Cascading 2.0, an alternative to programming MapReduce directly. The application framework, managed by Concurrent, lets developers build "Cascading" big data apps in high-level scripting languages; the apps are then scheduled to run across a Hadoop cluster. Also yesterday, HP executives presented their case for integrating Hadoop with Autonomy and HP Vertica, its impressive analytics technology. In both the news from HP and the news from Concurrent, executives repeatedly pointed to "aggregation" as a priority in developing big data systems. It's becoming clear why: aggregation represents the next phase on the road to data as a service. HP executives described... more »
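The sketch below is my own illustration, not from the article: a minimal example of what a Cascading 2.0 pipe assembly looks like, modeled on the project's well-known word-count pattern and using the Java API (the higher-level scripting options mentioned above sit on top of this layer). The HDFS paths and class name are hypothetical placeholders.

    // Minimal Cascading 2.0 word count, planned and run as MapReduce jobs on a Hadoop cluster.
    import cascading.flow.Flow;
    import cascading.flow.hadoop.HadoopFlowConnector;
    import cascading.operation.aggregator.Count;
    import cascading.operation.regex.RegexSplitGenerator;
    import cascading.pipe.Each;
    import cascading.pipe.Every;
    import cascading.pipe.GroupBy;
    import cascading.pipe.Pipe;
    import cascading.scheme.hadoop.TextLine;
    import cascading.tap.Tap;
    import cascading.tap.hadoop.Hfs;
    import cascading.tuple.Fields;

    public class WordCount {
      public static void main(String[] args) {
        // Source and sink taps on HDFS; the paths are placeholders.
        Tap docTap = new Hfs(new TextLine(new Fields("line")), "hdfs:///tmp/articles");
        Tap wcTap = new Hfs(new TextLine(), "hdfs:///tmp/wordcounts");

        // Split each line into words, group by word, and count occurrences.
        Pipe pipe = new Pipe("wordcount");
        pipe = new Each(pipe, new Fields("line"),
            new RegexSplitGenerator(new Fields("word"), "\\s+"));
        pipe = new GroupBy(pipe, new Fields("word"));
        pipe = new Every(pipe, new Count());

        // The flow connector plans the assembly into MapReduce jobs and runs them on the cluster.
        Flow flow = new HadoopFlowConnector().connect("wordcount", docTap, wcTap, pipe);
        flow.complete();
      }
    }

The abstraction is in the last two lines: the developer describes the data flow, and Cascading handles planning and scheduling the resulting jobs across the Hadoop cluster.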
IBM Debuts New DevOps Software and Cloud Services @ Edge

IBM Edge is making it very clear that the IT giant is angling for big data, but that’s not the only thing on the drawing board. This week Big Blue introduced several new DevOps tools, some delivered as a service, looking to attract enterprise clients by facilitating better collaboration between frontend and backend engineers. The company debuted the latest release of its Collaborative Lifecycle Management (CLM) software, a suite of offerings that addresses synchronization across IT. It’s built on Jazz and is natively integrated with Rational Requirements Composer, Rational Team Concert, and Rational Quality Manager. On top of this, a number of services... more »
Evolution of Cloud Adoption through 2015 [Infographic]

Sure, cloud computing may be a marketing construct that allows companies to avoid the certain disaster that would result if sales teams were forced to explain virtualization and dynamic capacity allocation to every potential customer. Yes, Larry Ellison, deep down we agree with your hatred of the term “cloud computing.” However, even if you have disdain for the term, it is impossible to deny the impact of the cloud on the enterprise. A new infographic from software and services provider Axway illustrates that impact and makes a few predictions about how cloud adoption will evolve over the next three years. There... more »
Oracle Purchases Collective Intellect, Pushes Against Salesforce.com/Radian6

Oracle is keeping the acquisition train rolling with the purchase of Collective Intellect, a social analytics firm that provides cloud-based services similar to Radian6, which Salesforce.com purchased last year. And if you think that's a coincidence, I have a multi-tenant, highly scalable bridge to sell you. The rivalry between Salesforce CEO Marc Benioff and Oracle CEO Larry Ellison seems to be getting more intense. Last week, Salesforce announced the $800 million purchase of social media presence optimization service provider Buddy Media. But that came after Oracle's $300 million buy of Vitrue, which offers its own services in the same arena. In... more »
IBM Acquisitions Deliver Storage for the Rest of Us

The Edge conference in Orlando, FL is IBM’s opportunity to show off its wide range of storage solutions to the world. While IBM is typically known for its large enterprise offerings, some of the acquisitions highlighted during the conference make this year’s Edge “the storage coming-out party,” as Wikibon founder David Vellante put it. theCube is covering the IBM Edge conference live. Today, Wikibon analyst John McArthur and Wikibon founder Dave Vellante were fortunate to spend some time with IBM systems storage VP Bob Cancilla (full video below). McArthur and Vellante prompted Cancilla to reminisce a few years back into his... more »
AT&T Accepting Galaxy S III Preorders Tomorrow

Today’s mobile news roundup features the following: US carriers are now accepting preorders for the Samsung Galaxy S III, PlayStation gets deeper into mobile with an HTC partnership, and Apple waits for the Samsung injunction. Samsung Galaxy S III preorders now open: In yesterday’s mobile news roundup we mentioned that five US carriers will be selling the Samsung Galaxy S III starting this June, but the release date and pricing were still a bit of a mystery. The latest news indicates that AT&T will start accepting preorders for Samsung's new flagship phone tomorrow, June 6th. The Samsung Galaxy S III will be... more »
Oracle Comes Up Short in Detailed Comparison of Exadata vs. Best-of-Breed

Wikibon Co-Founder and CTO David Floyer is definitely not on Larry Ellison's Christmas card list this year. Not satisfied with his original exposition of the limited value of Oracle a week ago, Floyer has co-published with Wikibon analyst Nick Allen a detailed, item-by-item examination of why it falls short. Titled “Comparing Oracle Exadata Storage with Best-of-Breed Arrays,” the Wikibon Professional Alert is built around three tables. The first lists the advanced functionality available across the board on best-of-breed Tier 1 storage but not available on Exadata. These include support for numerous RAID types, consistency groups, snapshot copies, writable snapshots, clones, mix-and-match HDD types,... more »
IBM Smarter Computing Upgrades Aimed at Big Data

So far the biggest update to come out of the IBM Edge conference is that Big Blue has rolled out several major enhancements to the offerings in its Smarter Computing portfolio, with an emphasis on data-driven efficiency. The gist of it is that several of its systems are getting some big boosts. IBM is adding real-time compression to both the Storwize V7000 unified storage system and the System Storage SAN Volume Controller, and claims that the technology can shrink actively used data by up to 80 percent. This puts a new twist on the traditional approach of compressing low-activity data,... more »
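As a rough illustration of what that claim implies (the 80 percent figure is IBM's reported claim above; the 100 TB workload is a hypothetical example, not from the article), an 80 percent reduction works out to roughly a 5:1 effective capacity ratio:

    \[
      \frac{\text{logical data}}{\text{stored data}} \;=\; \frac{1}{1 - 0.80} \;=\; 5,
      \qquad \text{e.g. } 100~\text{TB of active data} \;\approx\; 20~\text{TB stored}
    \]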
Veeam Boosts Freemium Approach with Free Backup

Veeam doesn’t have any objections to creating a community of free users in exchange for a lengthy list of paying customers. Today the backup software maker rolled out new features for the commercial edition of its solution, and at the same time introduced the Veeam Backup Free Edition. The latter is the successor to the free Veeam FastSCP tool, which has been around for quite some time and has become quite popular. The main addition to the paid Veeam Backup & Replication 6.1 solution is a feature that’s also included in the free version: Veeam ZIP, a compression mechanism that automatically... more »
The Future for IBM Storage is All About Leveraging Flash

With its debut this week, the IBM Edge conference in Orlando, FL is Big Blue’s opportunity to trumpet its storage offerings in an increasingly competitive market. With EMC’s acquisition of XtremIO rounding out the storage giant’s own flagship event last week at EMC World, IBM finds itself needing to market its storage solutions more aggressively. Here to discuss IBM’s goals around increased messaging for its new and existing products is Ed Walsh, who runs Storage Portfolio Strategy for the company. Walsh stopped by theCube today with SiliconAngle founding editor John Furrier and Wikibon founder Dave Vellante to... more »

Premium Research

- George Gilbert

Premise: 16 years ago, proprietary flavors of Unix domina […]

- Stuart Miniman

Hyperconvergence is the current state of the art: the next generation of converged infrastructure, better, faster, and more distributed than the original converged systems, but still just the beginning. Wikibon’s Server SAN research agenda is to show where we are today and to give guidance on the future of this space.

- David Floyer

Wikibon recommends that CIOs put in place a four-point plan for implementing an all-flash datacenter strategy:
1. Migrate to an all-flash datacenter by 2016.
2. Run a proof of concept for developer data sharing, with an expectation of 2x productivity and 3x code quality.
3. Reorganize datacenter operations around the principle of application data sharing.
4. Use the IT budget savings to fund Systems of Intelligence.
This research provides a technology understanding, the business case, and the strategic first steps on the road to an all-flash datacenter.

- David Floyer

IT will sound like yesterday’s news if it continues to advocate tiering, putting the most important data on high-performance storage and declaring that less expensive, lower-performing storage is key and necessary. Today’s flash reality is that a single compressed and de-duplicated physical version of data must be shared among as many different applications as possible, because the incremental cost of creating and sharing another copy is close to zero.

- George Gilbert

Systems of Intelligence will power the next generation of enterprise applications built on big data. Line-of-business executives charged with digital transformation must understand the dynamics of Systems of Intelligence well enough to be effective sponsors of new systems. IT executives must be effective partners and understand how to build Systems of Intelligence on top of Systems of Record. They will also need to understand how to build a radically new infrastructure to support these systems.

- Stuart Miniman

OpenStack is now positioned as an Integration Engine: not the entire stack but, like Linux, something that could grow into a critical component of IT over the next decade. This article examines the real state of OpenStack after the Kilo release and the 2015 OpenStack Summit in Vancouver.

- David Floyer

Wikibon believes CIOs should place the highest priority on developing an all-flash datacenter strategy and implement it so that all storage acquired and deployed is all-flash by 2016. The key short-term focus for CIOs is to ensure that data sharing is optimized and that IT organization and objectives support and drive increases in data sharing. The key long-term focus is to release the potential of the best and brightest in the organization to imagine an organization supported by applications without data limits.

- Stuart Miniman

The emerging PaaS platforms (Cloud Foundry, Heroku, OpenShift, Docker) are all evolving to capture developer mindshare. While the PaaS battles are still in their early days, Cloud Foundry is emerging as a leading open source platform and ecosystem for building and deploying the new Cloud Native applications that are driving these economic changes via software. But the Platform-as-a-Service (PaaS) model requires changes to technology, skills (people), and internal IT processes. This means enterprises need to become familiar with this development and operational model to understand how they can accelerate the use of Cloud Native applications to differentiate their business.

- David Floyer

Platform-as-a-Service (PaaS) is a full application lifecycle cloud service covering initial development, testing, deployment, operations, and maintenance. Wikibon defines three different PaaS cloud services: 1) PaaS integrated with IaaS, an integrated platform (e.g., IBM Bluemix, EMC Pivotal Cloud Foundry); 2) PaaS on top of IaaS, combining development services onto a specific platform (e.g., AWS, Microsoft Azure); and 3) PaaS on top of SaaS, a development front-end to a SaaS platform with its underlying infrastructure (e.g., ServiceNow, Salesforce). Wikibon will be using this taxonomy to define its PaaS analysis and forecasts.

- Jeff Kelly

When I began covering the Big Data market for Wikibon back in early 2011, it was early days. Big Data technologies such as Hadoop and NoSQL, while still early in their development, were becoming better known among enterprise practitioners. The market for commercial Big Data technologies and services was small but on the verge of rapid growth. From a business use case perspective, though, Big Data was long on promise but short on specifics. A lot has changed over the last four years. Without question, Big Data technologies have developed at breakneck speed, due in large part to a vibrant open source community of developers. Hadoop in particular has taken great strides. What was once a batch-oriented, insecure, and somewhat finicky framework is now a much more comprehensive, enterprise-grade Big Data platform that supports multiple applications.