BIG DATA
Hortonworks Inc. is kicking off its DataWorks Summit in San Jose, California, this week with the announcement of a new software support subscription that provides unified pricing to organizations as they move between on-premises and Amazon Web Services Inc.-based cloud deployments.
Hortonworks Flex provides a single Hortonworks Data Platform subscription that is transferable between cloud and on-premises deployments. The subscription encompasses both software support and advisory services covering Apache Spark, data preparation tasks known as extract/transform/load, and analytics workloads in the Hortonworks Data Cloud for AWS.
Hortonworks subscriptions were previously sold on a fixed-capacity basis, even if capacity varied due to seasonal or project factors. “If customers wanted to add four nodes, they’d have to add that to the subscription and they couldn’t give them back,” Hortonworks Chief Technology Officer Scott Gnau said in an interview. “This is largely about flexibility and the ability to deploy across different domains and platforms.”
In making the announcement, the company cited Forrester Research Inc.’s 2016 Global Business Technographics Data and Analytics Survey, which said moving into the public cloud is the No. 1 priority for global data and analytics technology decision-makers. About a quarter of Hortonworks’ customers are using the company’s software in the public cloud today.
Hortonworks isn’t publishing the price list and Gnau wouldn’t speculate about whether the change will save customers money. Hortonworks will continue to sell fixed-capacity subscriptions. “In some cases, customers may have both schemes,” he said. “If their environment is predictable most of the time, this is a more economical way” to add capacity temporarily.
The offer is available both to users of the managed Hortonworks Cloud on AWS and to users who choose to build and manage their own clusters. Gnau said the offer would eventually extend to deployments on Microsoft’s Azure cloud platform.
The company is also using its user conference to announce the general availability of version 3.0 of DataFlow, its open-source stream processing platform. DataFlow, which is based on the Apache NiFi project originally developed at the National Security Agency, can be used to manage data flows from edge devices, providing security and streaming analytics through open-source engines such as Apache Storm and Apache Kafka. “You can plug in different streaming engines; it’s engine-agnostic,” Gnau said.
Streams can now be registered in a data dictionary for sharing using metadata descriptions. “Developers can have access to more streams without being the developer who created them,” Gnau said. HDF 3.0 also introduces Streaming Analytics Manager, which allows developers, business analysts and administrators to build streaming applications without writing code.