Big data firm Hortonworks revamps pricing to cover both on-premises and cloud
Hortonworks Inc. is kicking off its DataWorks Summit in San Jose, California, this week with the announcement of a new software support subscription that provides unified pricing to organizations as they move between on-premises and Amazon Web Services Inc.-based cloud deployments.
Hortonworks Flex provides a single Hortonworks Data Platform subscription that is transferable between cloud and on-premises deployments. The subscription encompasses both software support and advisory services covering Apache Spark, data preparation tasks known as extract/transform/load, and analytics workloads in the Hortonworks Data Cloud for AWS.
Hortonworks subscriptions were previously sold on a fixed-capacity basis, even if capacity varied due to seasonal or project factors. “If customers wanted to add four nodes, they’d have to add that to the subscription and they couldn’t give them back,” Hortonworks Chief Technology Officer Scott Gnau said in an interview. “This is largely about flexibility and the ability to deploy across different domains and platforms.”
In making the announcement, the company cited Forrester Research Inc.’s 2016 Global Business Technographics Data and Analytics Survey, which said moving into the public cloud is the No. 1 priority for global data and analytics technology decision-makers. About a quarter of Hortonworks’ customers are using the company’s software in the public cloud today.
Hortonworks isn’t publishing the price list, and Gnau wouldn’t speculate about whether the change will save customers money. Hortonworks will continue to sell fixed-capacity subscriptions. “In some cases, customers may have both schemes,” he said. “If their environment is predictable most of the time, this is a more economical way” to add capacity temporarily.
The offer is available both to users of the managed Hortonworks Data Cloud for AWS and to users who choose to build and manage their own clusters. Gnau said the offer would eventually extend to deployments on Microsoft Corp.’s Azure cloud platform.
The company is also using its user conference to announce the general availability of version 3.0 of Hortonworks DataFlow, its open-source stream processing platform. DataFlow, which is based on the National Security Agency-developed Apache NiFi project, can be used to manage data flows from edge devices, providing security and streaming analytics through open-source engines such as Apache Storm and Apache Kafka. “You can plug in different streaming engines; it’s engine-agnostic,” Gnau said.
Streams can now be registered in a data dictionary and shared using metadata descriptions. “Developers can have access to more streams without being the developer who created them,” Gnau said. HDF 3.0 also introduces Streaming Analytics Manager, which gives developers, business analysts and administrators the ability to build streaming applications without writing code.
Image: Pixabay