Bitcoin Weekly 2014 December 10: Blockchain.info discloses security issue, BitQuest Bitcoin Minecraft server, Tim Draper buys into US Marshals auction again
The VMware/AWS deal will catalyze many changes, but high on the list will be the addition of machine learning features to systems of record.
In a much-anticipated move, VMware and AWS announced a partnership that gives VMware users another path to the cloud and shakes up the power structure of the cloud market. Caveat: many details remain to be worked out.
With the forthcoming announcement of tighter technical collaboration between VMware and AWS, big data pros will have far more choices to consider. They will need to prioritize which workloads to deploy on whichever hybrid cloud platform emerges. Ephemeral workloads should see the largest TCO savings regardless of the hybrid cloud scenario, and the deeper the integration between VMware and AWS, the greater the savings should be.
Architecting data centers to satisfy the growing demand for high performance is challenging, primarily because the existing technology is running out of gas. IT professionals will need to study and apply new approaches to advancing overall system performance wherever possible. Server SAN, which moves storage closer to processors without giving up shared data access, is one such advance. Deploying flash-based storage devices is another, especially in situations that demand either predictable database performance or large numbers of live copies. Finally, technologies that increase I/O parallelism, such as DataCore's Parallel IO, can amplify the benefits of both Server SAN and flash storage.
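To see why I/O parallelism pays off, the sketch below simulates independent storage reads dispatched serially versus concurrently. This is a hypothetical illustration only, not DataCore Parallel IO code; the 5 ms per-request latency and the workload size are assumed figures.

```python
# Illustrative sketch: why parallel dispatch helps when storage requests
# are independent. Latency figure is an assumption, not a measured value.
import time
from concurrent.futures import ThreadPoolExecutor

REQUEST_LATENCY_S = 0.005  # assumed per-request storage latency (5 ms)

def read_block(block_id: int) -> int:
    """Stand-in for one storage read; sleeps to model device latency."""
    time.sleep(REQUEST_LATENCY_S)
    return block_id

def serial_reads(n: int) -> float:
    """Issue n reads one after another; total time scales with n."""
    start = time.perf_counter()
    for i in range(n):
        read_block(i)
    return time.perf_counter() - start

def parallel_reads(n: int, workers: int = 8) -> float:
    """Issue n reads concurrently; total time scales with n / workers."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(read_block, range(n)))
    return time.perf_counter() - start

if __name__ == "__main__":
    n = 32
    print(f"serial:   {serial_reads(n):.3f}s")
    print(f"parallel: {parallel_reads(n):.3f}s")
```

With latency-bound requests, the concurrent version finishes in roughly n/workers latency periods instead of n, which is the same intuition behind keeping more I/O requests in flight per processor.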
Industrial IoT, the largest segment of the Internet of Things (IoT) and the one with the highest potential value, will require deep integration between modern IT (Information Technology) and OT (Operations Technology). To be truly extensible to OT, modern IT technologies will need a hybrid cloud approach, with the large majority of data and processing residing at the so-called "Edge". Architectures and software written by industrialists for industrialists, such as GE Predix, are showing how that can work.
The big data arena is at a crossroads. Use cases and tools are proliferating faster than most big data teams are gaining experience. In establishing the big data business capabilities required to cut through the complexity, CIOs must balance the integration accessibility of traditional SQL DBMSs against the speed of innovation in the mix-and-match open source big data ecosystem.
In the big data domain, businesses are trying to solve complex problems with complex and novel technology -- and often failing. Simplifying the packaging of big data technologies will streamline big data pilots and accelerate big data time-to-value. CIOs looking to establish differentiating big data capabilities need to consider Single Managed Entities to help solve the complexity problem.
Oracle M7 technology is meeting or exceeding the performance claims made at its announcement relative to the previous generation. Even under very conservative assumptions, the business case for migrating from T5 to T7 servers is good. Wikibon concludes that for Oracle software and the servers it runs on, adopting M7 technology (and T7 server technology) is best practice for these high-value application and software workloads.
The premise tested in this research is that high-value applications and software should be run on more capable, converged, performance-optimized infrastructure, even when they constitute a small proportion of the total workload. Cost-optimized infrastructure, by contrast, will save on short-term hardware costs but incur much higher overall costs long-term. The conclusion strongly recommends that IT executives adopt performance-optimized converged infrastructure as the default for all mixed workloads when even a small proportion includes high-cost software and/or high-value applications.
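The arithmetic behind this premise can be sketched with a simple TCO comparison. All dollar figures, core counts, and the licensing model below are illustrative assumptions, not figures from the research: the point is only that when per-core software licensing dominates, doing the same work on fewer, faster cores can outweigh a higher hardware price.

```python
# Hypothetical worked example: performance-optimized servers that complete
# the same work on fewer cores can beat cheaper cost-optimized hardware
# once per-core software licensing is counted. All numbers are assumed.

LICENSE_PER_CORE_PER_YEAR = 10_000   # assumed high-value software license cost
YEARS = 3

def three_year_tco(server_cost: int, cores_needed: int) -> int:
    """Hardware cost plus per-core software licenses over the period."""
    return server_cost + cores_needed * LICENSE_PER_CORE_PER_YEAR * YEARS

# Cost-optimized: cheaper servers, lower per-core throughput -> more cores.
tco_cost_optimized = three_year_tco(server_cost=50_000, cores_needed=32)

# Performance-optimized: pricier servers, higher per-core throughput -> fewer cores.
tco_perf_optimized = three_year_tco(server_cost=150_000, cores_needed=16)

print(f"cost-optimized 3-yr TCO: ${tco_cost_optimized:,}")
print(f"perf-optimized 3-yr TCO: ${tco_perf_optimized:,}")
```

Under these assumed numbers the cost-optimized configuration totals $1,010,000 against $630,000 for the performance-optimized one, because license spend dwarfs the hardware delta; with cheap software the comparison reverses.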
Big data pros need to identify which data feedback loops in their machine learning applications can deliver sustainable differentiation through network effects. Starting early is critical because getting to scale is likely to create the "winner takes most" competitive dynamics that have become so common in tech industries. The biggest sin is to wait for the tooling to become automated enough for all competitors to jump in.