Top Stories

How we do Big Data to Compute for the Large Hadron Collider

We have seen Big Data put to work in several large computing applications, and it will now serve the Large Hadron Collider (LHC). The European Organisation for Nuclear Research (CERN) is facing a tough challenge in the discovery of a particle consistent with the Higgs Boson: CERN has to keep the Large Hadron Collider online for several months in order to complete this discovery, and that is going to create huge computing needs. They need to collect a lot of event data to have a statistical chance of seeing the same outcome enough times to prove that... more »
VMware Could Launch Project Octopus Later This Month

VMware debuted Project Octopus back in April. The service is the result of a joint effort between the virtualization kingpin and EMC Mozy, and is touted as “Dropbox for the enterprise.” Until now we knew only what it was, but today Bill Bliss, the head of VMware’s end user product development, hinted that it may launch at the VMworld conference his company is holding from Aug. 26-30. He also disclosed a few other details about the actual product and how it will be offered. Project Octopus, a temporary name that will expire once it rolls out of the development stage, will... more »
How Does the Enterprise Solve “The Dropbox Problem”? ShareFile Provider Citrix Speaks

It’s been a lot like the poem The Highwayman recently, with the IT industry a ghostly galleon tossed upon cloudy seas. As a result, many in the enterprise have been looking at how to provide cloud locker services to their employees while maintaining a secure perimeter. New, shiny applications have always found their way into businesses via employee interest (we’ve seen it with instant messengers and with Facebook), and now many IT departments must deal with cloud locker services like Dropbox being used inside their firewalls. This effect is being called the “Dropbox problem,” but industry insiders such as Bill... more »
Microsoft Releases Attack Surface Analyzer 1.0 to Evaluate Malware Attacks

Microsoft recently released Attack Surface Analyzer 1.0, whose beta version was launched last year. The tool is useful for software developers, especially during the verification phase of the Microsoft software lifecycle, and helps them understand changes in a Windows system’s attack surface resulting from the installation of new applications. Compared to the beta version, this 1.0 release includes several performance enhancements and bug fixes to improve the user experience. Attack Surface Analyzer 1.0 performs several checks, including analysis of changed or newly added files, registry keys, services, Microsoft ActiveX controls, listening ports and other parameters that affect a computer's attack surface.... more »
Putting the Work into Big Data, Caserta Concepts Teams with Cloudera

The newest member of Cloudera’s Connect Partner Program is Caserta Concepts, a consulting firm that offers services to enterprises that lack the in-house talent to implement a big data strategy. It’s the latest name to join an already lengthy list of SIs, service providers, OEMs and other partners that comprise Cloudera’s substantial ecosystem. This community is one of the things that helped propel the vendor to its current position. “Cloudera and Caserta Concepts are empowering enterprises to explore the power of Big Data,” said Tim Stevens, Vice President of Business and Corporate Development at Cloudera. “This partnership helps clients use social... more »
SpaceBase Introduces Memory-Centric Data Management

It is difficult enough to find data in a large database, but imagine if you also needed to locate the data and map its location in relation to other pieces of data. A distributed memory system needs to be able to place data on a specific node without putting an enormous amount of stress on the nodes in a cluster. Rather than randomly allocating data using consistent hashing, this type of distributed RAM grid would need to use locality-sensitive distribution. Enter SpaceBase, a real-time spatial data store for massively multiplayer online (MMO) games, location-based services, and... more »
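SpaceBase’s internals aren’t published here, but the contrast the blurb draws can be illustrated with a minimal consistent-hashing sketch in Python (the node names are hypothetical placeholders, not SpaceBase’s API). Under consistent hashing, placement depends only on a key’s hash, so two spatially adjacent game entities can land on different nodes — exactly the scattering a locality-sensitive scheme is meant to avoid.

```python
import hashlib
from bisect import bisect_right

class ConsistentHashRing:
    """A minimal consistent-hash ring: each key maps to the first node
    whose position on the ring follows the key's hash value."""

    def __init__(self, nodes, replicas=100):
        # Each node gets `replicas` virtual points on the ring to
        # smooth out the key distribution.
        self.ring = sorted(
            (self._hash(f"{node}:{i}"), node)
            for node in nodes
            for i in range(replicas)
        )
        self.positions = [pos for pos, _ in self.ring]

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        if not self.ring:
            raise ValueError("empty ring")
        # Wrap around to the first ring point past the end.
        idx = bisect_right(self.positions, self._hash(key)) % len(self.ring)
        return self.ring[idx][1]

ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
# Two spatially adjacent entities: placement is hash-driven, so they
# may end up on different nodes despite being neighbors in the world.
print(ring.node_for("entity(10,10)"), ring.node_for("entity(10,11)"))
```

A locality-sensitive scheme would instead derive placement from the entities’ coordinates (for example, by partitioning space into regions and assigning whole regions to nodes), keeping neighbors co-located at the cost of more careful load balancing.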
No thumbnail available

Tapping the 1 Percent of Twitter Data: FirstRain CEO Penny Herscher Gets It

There is such a thing as the global intelligence stream, and it's empowering consumers as much as it's providing valuable data sets to marketers and corporations. It's a new kind of give-and-take that's blossoming around consumer-corporate interactions, but it's probably the first time such a relationship has had so much potential to be equally, if not more, beneficial to the consumer. And there's a segment of businesses cropping up around this potential, seeking out the best way to unearth consumers' true sentiment from the deep ocean of public web content and contextualize it for business insight. FirstRain is one of the newer companies... more »
CyberWars: Caught in the Crossfire @CyberWars

Ed. note: This is the first in a three-part series on CyberWars that will explore what is happening, why businesses should be concerned, and what they should do. In the last five years the first shots have been fired in a new kind of conflict, dubbed CyberWars by the press but more accurately called “advanced persistent threats” by data security experts. The revelations of Operation Aurora, the 2009 penetration of Google, Juniper Networks, Rackspace and probably several other technologically sophisticated companies; Night Dragon, the penetration of the major oil and gas companies; Shady Rat; the RSA penetration; and, of course, Stuxnet, revealed a... more »
Rally Acquires Agile Advantage, Raising the Bar for Development

Rally Software, the company that offers a Software-as-a-Service-based Application Lifecycle Management (ALM) platform and products, announced the acquisition of Agile Advantage, a performance management solution that helps organizations maximize the financial return of Agile software development projects. ALM is the union of business management with software engineering, and a continuous process of managing the life of an application through governance, development and maintenance. “Just as Rally disrupted the project management market 10 years ago, it’s now disrupting the next generation of portfolio management with the same Agile principles that revolutionized how projects are managed,” said Tim Miller, Rally’s CEO. “Adding Agile Advantage’s... more »
Global Cyberthreats Intensify, China and U.S. Primary Sources of Malicious IPs

AlienVault, the only open Unified Security Management platform that delivers built-in controls and full visibility, has released an infographic detailing trends in malicious cyber activity. Titled “The 2nd United Nations,” the infographic presents the latest threat data generated by the AlienVault Open Threat Exchange™ (OTX). Here are some key revelations from the infographic:

• China and the United States rank first and second among the top five sources of malicious IP addresses, followed by South Korea, the Russian Federation, and Taiwan.

• Host scanning was the most common malicious activity, followed by malware domains, malware IPs, spamming, and malicious... more »

Premium Research

- David Floyer

CIOs and senior IT executives should minimize investments in HDDs for latency storage going forward. Storage practitioners should focus on moving latency storage to flash, implementing a sound catalog strategy for the management of snapshots, and developing a strategy for linking to on-premises or cloud-based capacity resources. Any storage that involves assisting end users and customers should be regarded as latency storage.

- Ralph Finos

A companion piece to Wikibon's Public Cloud Market Forecast 2015-2026, this research examines the revenue of SaaS, IaaS and PaaS vendors. The competitive environment surrounding the Public Cloud is in flux. SaaS remains turbulent, with new entrants successfully gaining share and incumbent licensed software providers trying to develop SaaS offerings and reclaim leadership positions they have held for a decade or more. Leadership in the IaaS segment is beginning to crystallize as a function of scale, but PaaS is still forming and finding its way. As such, enterprises need to be aware of which providers are winning and losing (and where), but more importantly of what they themselves intend to accomplish with the Public Cloud.

- David Floyer

Cataloging and automated policy management are the key enablers of a virtual flash world, where storage snapshots are both King and Knave. Combining cataloging and automated policy management is the only way to enable storage copy reduction in harmony with risk management and compliance. This enables and justifies an all-flash data center, makes data available more quickly to the business and other IT functions, and drives greater business and IT productivity and responsiveness. CIOs and senior management should create a small team of the best and brightest, build an optimized all-flash virtual environment with a programmatically integrated catalog in a subset of the datacenter, and demonstrate the practicality and benefits of this environment to the business and IT.

- Stuart Miniman

VMworld has grown to be one of the largest and most important technology industry events. Wikibon has attended this event for many years and will have its largest presence this year as part of a double-set of theCUBE. Coverage will examine the broad and diverse ecosystem including storage, cloud, networking and much more.

- David Floyer

Wikibon believes latency storage vs. capacity storage is a key storage dimension, with different functional requirements and different cost profiles. Latency storage is found within the datacenter supporting more active applications, and in general has a high read bias. Latencies can vary from 1 millisecond down to a few microseconds; the lower (better) the latency, the closer to the processor resources it is likely to be. It is also used for the metadata layer of capacity data. The boundary for latency storage will come down to 500 microseconds over the next three years. Capacity storage is found in archives, logs, time-series databases for the Internet of Things, and many other similar applications. In general it is write-heavy. Latencies are generally above 1 millisecond, do not have to be so close to the processor, and are suitable for remote private, public and hybrid cloud storage. Some parts of the capacity marketplace will have latencies as low as 500 microseconds over the next three years. Wikibon has added this dimension to the other storage dimensions projected, which include HDD vs. Flash, Hyperscale Server SAN vs. Enterprise Server SAN vs. Traditional SAN/NAS storage, Physical Capacity vs. Logical Capacity, and SaaS Cloud vs. IaaS Cloud vs. PaaS Cloud. All these dimensions are projected for both revenue and terabytes. There is a strong correlation and interaction between the latency/capacity dimension and the HDD/Flash dimension. Wikibon provides a detailed breakdown of the storage projections for Premium clients.

- Brian Gracely

With the EMC Federation in transition, VMware needs to take a greater leadership role in delivering cloud services. VMware's ability to lead its customers' transitions to Hybrid Cloud requires it to deliver a more robust, more agile vCloud Air platform.

- Brian Gracely

While the majority of Enterprise CIOs continue to have Hybrid Cloud near the top of their priorities, the solutions in the marketplace often fall short of expectations. Wikibon analysts look at the state of Hybrid Cloud from a C-Suite perspective.

- Ralph Finos

Maturity by industry is a function of customer and prospect skills, competitive drivers within the industry, the nature of the data in question as well as the complexity of the analytics required, and most importantly the problem that the prospect or customer is trying to solve. Identifying where customers are in their journey and helping them to reach the next level is key to vendor success.

- Ralph Finos

Both customers and vendors need to prioritize how they address adoption barriers. As with all emerging technologies, a full solution will often require extensive 3rd-party participation, such as “data wrangling” and SQL data access tools. SQL data analysis offerings are maturing rapidly in the area of application performance under greater user and data-volume loads, both from the Hadoop distribution vendors and from 3rd parties. Other barriers, such as a skills gap across many roles, are more intractable. Smaller customers with fewer specialized practitioners in each role should include in their evaluations cloud-based solutions that are fully managed services.

- George Gilbert

Across all the primary roles involved in adopting Big Data applications, there are basic gaps in product maturity. However, IT leaders and practitioners should keep in mind that Big Data databases are part of a relatively immature ecosystem that requires advanced skills and integration technology in order to operate successfully. The ecosystem is evolving and maturing rapidly, and there is a tremendous proliferation of technologies to augment early products.