Top Stories

This Week in Big Data: Symantec’s Hadoop Play, the Presidential Race and More

There were a few big updates coming out of the big data ecosystem this week. Symantec upped its investment in analytics by launching an add-on for its Cluster File System that works with Hortonworks’ Hadoop distribution. The result is a product called Symantec Enterprise Solution for Hadoop, which is free for existing customers and can scale up to 16 petabytes. The company lists a number of advantages of its new software, including easier provisioning and faster analytics, as the data doesn’t have to be moved around. Atigeo is also doing something interesting with big data. The firm launched two new websites that... more »
Big Data: The Good, Bad and Ugly

It isn't often that a confluence of news stories leads to an editorial epiphany, as it has this week. Good epiphanies are hard to find; I'm lucky if I get one or two a year. This week, as most weeks go, gave us a heaping helping of stories on Big Data (after all, I've just returned from the Cassandra Summit in Santa Clara with John Furrier and Jeff Kelly). Like most other weeks, it's raining "Big Data" everywhere. Folks like you and me mostly assume it's due to the hype of Big Data-washing we're starting to see in enterprise tech, but... more »
Storing Data as DNA, Harvard Researchers Say Test Tubes Can Contain "the whole Internet"

Test tube data. It may sound strange now, but genomics researcher and serial entrepreneur George Church told the Wall Street Journal that encoding data in DNA “could be the wave of the future” for archives. Church was the senior researcher on a Harvard experiment to encode his forthcoming book, Regenesis, in DNA. According to Kyle Alspach, each of the nearly 55,000 strands of DNA used to store the text contained an indicator of where it belongs in the sequence of the book. The ability to store data as DNA, in the form of a viscous liquid or solid salt, presents... more »
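The indexed-strand idea maps onto a simple data structure: each fragment carries a sequence number, so the whole can be reassembled no matter how the fragments are physically stored. The sketch below is purely illustrative (it models the chunk-plus-index structure in Python, not the team's actual DNA base encoding):

```python
# A minimal sketch of the addressing scheme described above: split a
# text into fixed-size chunks, each tagged with an index saying where
# it belongs in the sequence. (Illustrative only -- the Harvard team
# encoded such records as DNA bases; here we only model the structure.)
import random


def encode_chunks(text, chunk_size=4):
    """Return (index, fragment) records, analogous to the ~55,000
    indexed DNA strands that stored the book."""
    n = (len(text) + chunk_size - 1) // chunk_size  # ceiling division
    return [(i, text[i * chunk_size:(i + 1) * chunk_size]) for i in range(n)]


def decode_chunks(records):
    """Reassemble the text regardless of the order records arrive in."""
    return "".join(frag for _, frag in sorted(records))


records = encode_chunks("the whole Internet in a test tube")
random.shuffle(records)        # physical storage order is irrelevant
print(decode_chunks(records))  # prints the original text
```

Because every record is self-locating, readout can tolerate unordered retrieval, which is exactly the property that makes a "test tube" of mixed strands usable as an archive.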
The Public and the Personal Cloud in the Spotlight This Week

VMworld 2012 is coming later this month, and VMware is already starting to up its cloud play to give customers plenty to talk about ahead of the gathering. This week the virtualization firm unveiled a very reasonable trial version of its vCloud management platform. For a few cents an hour prospective users can get their hands on a Linux VM that runs the software with 1 GB of RAM for testing purposes. This limited version includes all of the features that come with the standard edition. vCloud is rivaled by OpenStack, and this week one cloud provider published some data that interested... more »
How IBM's Acquisition of Texas Memory Affects the Too-Hot Flash Market

IBM acquired the privately held Texas Memory Systems in a move that could drive the other big vendors to dive right into the flash scene. Terms of the deal were not disclosed. TMS has been making storage systems for the past 34 years and is now known mostly for its flash arrays and rackmount systems. Big Blue said that it will integrate the firm’s assets into its Smart Storage strategy, and one thing’s for sure: this buyout is going to have a big impact on the market. Most obviously, a flash-equipped IBM is going to be a threat to... more »
Big Data's Biggest Investors

With Big Data's growth in popularity, it is no surprise that venture capitalists are now turning their attention to the field. Existing companies are realizing the value of the data ore they possess, and many new companies are starting out with Big Data as a focal point. These companies capture various types of data with the intent of mining it and making a profit. Like any mining mission, the extraction of the desired data is where the focus lies. While companies are seeing more value in their data, they may not have the needed knowledge or... more »
“All media fails, but Cloud backup is forever”: A Trip Down Memory Lane for Storage Media [Infographic]

However popular new forms of storage media become, affording ease of use, portability and data security, they have their challenges and, ultimately, become “a useless piece of plastic, tape or metal.” CrashPlan’s recent infographic, “The Lifespan of Storage Media,” presents statistics on storage media ranging from undeveloped film in 1885 to the USB flash drive in 2000, and many other forms of computer, audio, video and photo media in between. The report details how and why past storage media failed. While cloud storage has had its limitations, the infographic puts past and future data storage in perspective to explain why... more »
Windows 8 Becomes Available for MSDN, TechNet Subscribers

Windows 8 is now available to subscribers of the Microsoft Developer Network (MSDN) and TechNet. While most developers have been eagerly awaiting this Windows update, some techies say there is no need to rush for it, as it holds nothing significant for them. So, what are their reviews based on? Is Windows 8 really worth the upgrade? Let’s find out. While Steven Sinofsky, the President of the Windows and Windows Live Division, described Windows 8 as 'reimagining Windows, from the chipset to the experience,' and something that will bring a new PC experience that readies Windows... more »
Dirt Jumper DDoS Toolkit Riddled With Vulnerabilities

A security vendor has turned the tables on malicious hackers by releasing details of a vulnerability in a popular toolkit used to launch distributed denial of service (DDoS) attacks against corporate websites. The toolkit in question is the infamous Dirt Jumper, which is based on the RussKill application that has become one of the most popular DDoS tools in the World Wide Web’s seedy underground. Prolexic revealed earlier this week that it had identified a vulnerability within the command and control (C&C) structure of Dirt Jumper, which makes it possible to use open source pen testing tools to access the toolkit’s database... more »
Introducing GreenHopper 6 for Advancing Agile

Agility is what drives the IT world. Be it Silicon Valley or any other corner of the world, Agile is the key to success for software developers. Here we introduce GreenHopper 6, the latest update to the well-known GreenHopper, which has been serving over 650,000 users worldwide and leading the charge for teams discovering and advancing agile pursuits. The latest version comes with several enhanced features and refinements intended to take Agile and team performance to new heights. Let’s find out how. While GreenHopper is already a powerful way to transform JIRA into an agile project planning tool... more »

Premium Research

- George Gilbert

Hadoop is one of the most innovative ecosystems the industry has ever seen. But fragmentation and complexity are the trade-offs of all this rapid evolution while the platform is still maturing. Choice has a cost. This research report has only examined the compute engines that process data. But the fragmentation in management, governance, and security tools is just as great. There is a continually expanding array of tools such as Oozie, Falcon, Atlas, Knox, Ranger, HDFS DARE, Ambari, Hue, Sentry, Sahara, Cloudera Manager and Navigator, and Zookeeper. At some point it makes sense for customers to consider investing in a tool that can hide much of that complexity. To be clear, there is no magic product that can hide all these technologies. But when customers take the perspective of simplifying an end-to-end process, solutions are available to address the problem.

- David Floyer

CIOs and senior IT executives should minimize investments in HDDs for latency storage going forward. Storage practitioners should focus on moving latency storage to flash, implementing a sound catalog strategy for the management of snapshots, and a strategy for linking to on-premise or cloud-based capacity resources. Any storage that involves assisting end-users and customers should be regarded as latency storage.

- Ralph Finos

A companion piece to Wikibon's Public Cloud Market Forecast 2015-2026, this research examines the revenue from SaaS, IaaS and PaaS vendors. The competitive environment surrounding the Public Cloud is in flux. SaaS remains turbulent, with new entrants successfully gaining share and incumbent licensed software providers trying to develop SaaS offerings and reclaim leadership positions they have maintained for a decade or more. The IaaS segment leadership is beginning to crystallize as a function of scale, but PaaS is still taking shape and finding its way. As such, enterprises need to be aware of which providers are winning and losing (and where), but more importantly of what they themselves intend to accomplish with the Public Cloud.

- David Floyer

Cataloging and automated policy management are the key enablers of a virtual flash world, where storage snapshots are both King and Knave. Combining cataloging and automated policy management is the only way to enable storage copy reduction in harmony with risk management and compliance. This enables and justifies an all-flash data center, makes data available more quickly to the business and other IT functions, and drives greater business and IT productivity and responsiveness. CIOs and senior management should create a small team of the best and brightest, create an optimized all-flash virtual environment with a programmatically integrated catalog in a subset of the datacenter, and demonstrate the practicality and benefits of this environment to the business and IT.

- Stuart Miniman

VMworld has grown to be one of the largest and most important technology industry events. Wikibon has attended this event for many years and will have its largest presence this year as part of a double-set of theCUBE. Coverage will examine the broad and diverse ecosystem including storage, cloud, networking and much more.

- David Floyer

Wikibon believes latency storage vs. capacity storage is a key storage dimension, with different functional requirements and different cost profiles. Latency storage is found within the datacenter supporting more active applications, and in general has a high read bias. Latencies can vary from 1 millisecond down to a few microseconds; the lower (better) the latency, the closer to the processor resources it is likely to be. Latency storage is also used for the metadata layer of capacity data. The boundary for latency storage will come down to 500 microseconds over the next three years. Capacity storage is found in archive, log, and time-series databases for the Internet of Things and many other similar applications. In general it is write-heavy. Latencies are generally above 1 millisecond, do not need to be as close to the processor, and are suitable for remote private, public and hybrid cloud storage. Some parts of the capacity marketplace will have latencies as low as 500 microseconds over the next three years. Wikibon has added this dimension to the other storage dimensions projected, which include HDD vs. Flash, Hyperscale Server SAN vs. Enterprise Server SAN vs. Traditional SAN/NAS storage, Physical Capacity vs. Logical Capacity, and SaaS Cloud vs. IaaS Cloud vs. PaaS Cloud. All these dimensions are projected for both Revenue and Terabytes. There is a strong correlation and interaction between the latency/capacity dimension and the HDD/Flash dimension. Wikibon provides a detailed breakdown of the storage projections for Premium clients.

- Brian Gracely

With the EMC Federation in transition, VMware needs to take a greater leadership role in delivering cloud services. VMware's ability to lead its customers' transitions to the Hybrid Cloud requires it to deliver a more robust, more agile vCloud Air platform.

- Brian Gracely

While the majority of Enterprise CIOs continue to have Hybrid Cloud near the top of their priorities, the solutions in the marketplace often fall short of expectations. Wikibon analysts look at the state of Hybrid Cloud from a C-Suite perspective.

- Ralph Finos

Maturity by industry is a function of customer and prospect skills, competitive drivers within the industry, the nature of the data in question as well as the complexity of the analytics required, and most importantly the problem that the prospect or customer is trying to solve. Identifying where customers are in their journey and helping them to reach the next level is key to vendor success.

- Ralph Finos

Both customers and vendors need to prioritize how they address adoption barriers. As with all emerging technologies, a full solution will often require extensive 3rd-party participation, such as “data wrangling” and SQL data access tools. SQL data analysis offerings are maturing rapidly in the area of application performance under greater user and data volume loads, from both the Hadoop distribution vendors and 3rd parties. Other barriers, such as a skills gap across many roles, are more intractable. Smaller customers with fewer specialized practitioners in each role should include in their evaluations cloud-based solutions that are fully managed services.