SiliconANGLE: Extracting the signal from the noise.

Redefining Hadoop for better data insights | #BigDataNYC (Fri, 30 Sep 2016 20:30:10 +0000)

This week, ODPi (a nonprofit organization accelerating the open ecosystem of Big Data solutions) announced that DataTorrent, IBM, Pivotal, SAS, Syncsort, WANdisco and Xavient have committed to the ODPi Interoperable Compliance Program. The program will make it simpler for enterprises to choose and adopt Big Data technologies and will ensure these applications are interoperable across a wider range of commercial Hadoop platforms.

Berni Schiefer, IBM fellow, joined Dave Vellante (@dvellante) and Peter Burris (@plburris), cohosts of theCUBE, from the SiliconANGLE Media team, during BigDataNYC 2016 in New York, NY, to discuss the implications of ODPi’s compliance program for the industry, as well as its impact on future Big Data ventures.

Defining Hadoop differently

Vellante asked whether IBM is building a new Hadoop distribution with ODPi as the framework.

“We’ve had [IBM’s] Big SQL for many years … but the market in the distribution market is fragmented. … You can make software portable to different platforms. … Linux standardization [has] made it easier to port and run on different levels of Linux. ODPi is trying to do the same thing for the Hadoop ecosystem,” said Schiefer.

He explained that while the traditional definition of Hadoop is one with HDFS, MapReduce and YARN, people have replaced components with open-source tools, especially Spark. “Today, I think that Hadoop should be redefined as an ecosystem of collaborative tools … very much like Unix became. … [W]e should engineer tools to sit on top of that platform,” said Schiefer.

How ODPi came to be

Vellante asked how ODPi came about and whether it was unusual for competitors to work together on a project like this. Schiefer explained that there is no disconnect between competing at the product-offering level and collaborating with competitors to standardize SQL for the benefit of clients and the industry as a whole. ODPi works to bring different companies together to make sure the “plumbing” is sufficiently standardized for everyone’s benefit.

Burris asked about IBM’s role in SQL and how that relates to Hadoop. “When I was first introduced to Hadoop, it was kind of like the no-SQL space. … It’s kind of turned into the new SQL space,” Schiefer said. “Not that SQL is perfect, but it is powerful, there are abundant skills and there’s lots of tools. Those are things that really help people get going.”

Schiefer concluded: “The ‘holy grail’ of Big Data is not about the data itself, it’s just sitting there doing nothing, [but rather] to get insight from the data.”

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of BigDataNYC 2016.

Photo by SiliconANGLE
Avast completes acquisition of rival antivirus company AVG (Fri, 30 Sep 2016 20:21:01 +0000)

Two months after antivirus company Avast Software s.r.o. announced that it would be acquiring its rival, AVG Technologies N.V., for $1.3 billion, the deal is done.

According to Avast, while the buyout of AVG has finally closed, it will take up to a year for AVG to be fully absorbed into Avast.

Both Avast and AVG have their origins in 1990s post-communist Prague, in what is now the Czech Republic. The two companies specialized in lightweight, consumer-grade antivirus software, offering both a basic free version and a more feature-rich annual subscription. Through its acquisition of AVG, Avast now controls a significant portion of the home antivirus market.

“The combined company now has over 400 million users, more than 40 percent of the world’s consumer PCs outside of China and the largest consumer security installed base in the world,” Vince Steckler, chief executive officer of Avast, said in a statement. The US is its No. 1 market, with 58 million users.

Avast noted in its statement that thanks to its acquisition, the company will be able to support deployments to larger organizations using AVG’s reseller base. The company also said that while AVG’s operations will be folded into Avast’s corporate structure over the next year, AVG’s security and antivirus products will still be offered under their existing brand.

“We want our customers to be reassured that whether you use an AVG product or an Avast product, we will continue to support you,” Steckler explained. “We are nothing without our customers and partners who have helped us get to where we are today.”

Image courtesy of AVG Technologies
Splunk’s efforts to provide Answers to its community | #splunkconf16 (Fri, 30 Sep 2016 19:00:34 +0000)

No matter how sophisticated a set of solution tools may be, the likelihood of retaining customer interest gets a big boost when those tools have an effective support system in place for users who need some extra help.

At this year’s Splunk.conf event, Rich Mahlerwein, senior systems engineer at Forest County Potawatomi Community, and David Shpritz, information security consultant at Aplura, LLC, sat down with John Walls (@JohnWalls21) and John Furrier (@furrier), cohosts of theCUBE, from the SiliconANGLE Media team, to talk about the Splunk community and the many uses people are finding for its utilities.

Coming together

Both speakers were highly positive about the Splunk conference and the community it represented. “The Splunk community is completely different, and it’s unlike any other community I’ve ever seen; [it’s one] where everyone wants to help. Everyone is excited and loves this product, and they want everyone else to be excited and love this product as well, and so they help,” Shpritz said.

“I think Splunk’s willingness to put trust in its community, especially with things like [Splunk] Answers – Answers is a huge success, as far as I’m concerned – in terms of how quickly different questions, when they’re answered, get answers, the quality of those answers [is] usually pretty outstanding,” he added.

Solving the strangest problems

As far as the use-cases for Splunk go, the interviewees pointed to an incredibly wide range of applications, but they were able to home in on some of the biggest areas. “I think security and services… those are the big focuses for a lot of our users who are coming in and talking to the community, looking at things like different security use-cases,” Shpritz said.

For Mahlerwein, the influx of these answer-seekers was a great thing. “I love seeing new people show up in the channels, or on Answers and stuff. Sometimes they come up with the most amazing solutions, too, because they approach things from a totally different angle,” he shared.

“So I really like the new blood. And the other thing I really like about the Splunk community is that, in general, I think it’s like the smartest group of people I know. It’s like there aren’t any slackers in there at all. Even the new people may not know much about Splunk yet, but gosh, they come up with some good things,” he added.

Asked to identify some of the strangest problems they’ve seen people looking to use Splunk to solve, both guests were hesitant, primarily because of the many possible answers they’d encountered. “It’s such a flexible product; it’s put to so many different kinds of uses that to pick out one or two or even three really weird things [people have come looking for solutions to] is really hard, because there’s a whole tier of the weird uses of Splunk,” Mahlerwein said.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of Splunk.conf 2016.

Photo by SiliconANGLE
Data 3.0: Empowering enterprise by bringing data to the center | #BigDataNYC (Fri, 30 Sep 2016 18:00:05 +0000)

Business has a relationship with information. The modern world of data has only enhanced that relationship, allowing businesses to learn more and see further. As the digital revolution continues, data is becoming the key to business success. As this data moves closer to the center of business, the tools used to manage, understand and present this information must also improve.

To gain some understanding of the latest data management tools, Dave Vellante (@dvellante) and Peter Burris (@plburris), cohosts of theCUBE, from the SiliconANGLE Media team, visited the BigDataNYC 2016 conference in New York. There, they sat down with Ronen Schwartz, senior VP and GM of Data Integration and Cloud Integration at Informatica Corp.

Informatica and the history of data

The discussion opened with a brief history of how Informatica and data have changed together. Schwartz related that Informatica was always focused on data. In the beginning, data was always a part of a single application. It was important, but only in the context of that system.

Later, enterprises started to consider cross-application data, but it was still very limited. He called the world we’re living in “Data 3.0,” with Big Data and small data, and more data consumers than ever before.

Schwartz noted that data was moving to the center of the enterprise. He stated that it wasn’t the application or the data warehouse that mattered, but the whole data.

“Data is the key asset of the enterprise in general,” he said.

Intelligence, data and access

If data is the key asset, then mapping the company’s assets is a very important thing, Schwartz pointed out. Think of financials, he said. A company needs to know where its money is, what it’s doing and who has access. The same is true of its data assets. He claimed that one of the things Informatica brings to the table is the intelligence to map that data.

Users must have access, but a company needs to control who can see what and at what level. Schwartz stressed that a true Data 3.0 infrastructure must support this framework with the intelligence to empower users with the data. Governing the data in the right way gets better results and improves the value of the information.

“In many businesses, the data is becoming the business,” he said.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of BigDataNYC 2016.

Photo by SiliconANGLE
What you need to know about Nutanix’s blockbuster IPO (Fri, 30 Sep 2016 17:44:11 +0000)

Technology firms that have been holding off their initial public offerings of stock for fear of lackluster investor interest may be rethinking those plans today.

The latest tech company to go public, the cloud computing company Nutanix Inc., saw its shares jump 66 percent from their initial offering price when trading opened today. Originally expected to sell its shares for $11-$13 apiece, the maker of hyperconverged systems that combine computing and storage into one appliance had hiked the price to $16 Thursday night, netting $238 million.

By the end of trading Friday, shares had jumped 131 percent, to an even $37 a share. The company’s market capitalization of about $5 billion is now roughly two and a half times the $2 billion valuation it received after its last round of private equity financing in 2014.

Investors snapped up Nutanix shares despite the company’s never having turned a profit, a fact that might have turned off some investors a few months ago. But investors are still buying high-growth companies, and Nutanix’s revenues jumped 85 percent in the first half of 2016, according to its pre-IPO earnings disclosure. The cash may help Nutanix pay back the $75 million loan that it took out from The Goldman Sachs Group Inc. earlier this year before the 10 percent annual interest rate starts taking its toll.
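
The reported figures hang together arithmetically; here is a quick sanity check (a sketch, with prices rounded the way the article rounds them):

```python
# Nutanix IPO figures as reported: $16 offering price,
# +66% at the open, +131% at Friday's close.
ipo_price = 16.00

open_price = ipo_price * 1.66    # +66% pop at the open
close_price = ipo_price * 2.31   # +131% by the close

print(round(open_price, 2))      # 26.56
print(round(close_price, 2))     # 36.96, i.e. "an even $37 a share"

# ~$5B market cap vs. the $2B private valuation from 2014
print(5e9 / 2e9)                 # 2.5

# Annual interest on the $75M Goldman Sachs loan at 10%
print(round(75_000_000 * 0.10))  # 7500000
```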

Wall Street’s aversion to money-losing tech firms may have been softened in recent months by the successful IPOs of Talend Inc., Twilio Inc. and Line, the Japanese mobile messaging company. All three companies’ stock market success can be credited to their fast revenue growth and strong competitive positions, factors that are also working in favor of Nutanix.

“There is a strong future for a new generation of infrastructure that delivers the simplicity and agility of cloud,” said Stu Miniman, senior analyst with Wikibon, the research firm owned by the same company as SiliconANGLE. “Nutanix is the leading solution in the hyperconverged infrastructure space and has clear differentiation in a market that is now getting the attention of the largest IT players.”

Although investors clearly liked the offering, the massive first-day pop in price isn’t completely positive for Nutanix itself. When shares rise this much in an IPO, it’s also an indication that underwriters priced it too low, meaning that the company could have raised much more money had it priced shares higher.

IPO pricing can be tricky, especially in an uncertain market. It’s common knowledge that underwriters can push for a lower price that guarantees a pop so they can reward clients and other insiders, who often can sell quickly at a profit. Company insiders don’t benefit immediately because they generally are barred from selling for a period of months to years.

However, the fact that Nutanix raised its offering price on the eve of the IPO might indicate that the demand was simply much higher than underwriters or the company had expected.

Nutanix Chief Executive Dheeraj Pandey spoke about the company to theCUBE, the video unit owned by the same company as SiliconANGLE, at Nutanix’s .NEXT conference in August:

(* Disclosure: TheCUBE was the media partner at the conference. Neither Nutanix nor other sponsors have editorial influence on content on theCUBE or SiliconANGLE.)

Photo courtesy of Nutanix

New alliances focus on open-source, data science empowerment | #BigDataNYC (Fri, 30 Sep 2016 17:00:44 +0000)

How can data science make a true market impact? Through partnerships, particularly among open-source communities. As IBM solidifies its enterprise strategies around data demands, two new partnerships have emerged: one with Continuum Analytics, Inc., advancing open-source analytics for the enterprise; and another with Galvanize, initiating a Data Science for Executives program.

Continuum Analytics, the creator and driving force behind Anaconda — a leading open data science platform powered by Python — has allied with IBM to advance open-source analytics for the enterprise. Data scientists and data engineers in open-source communities can now embrace Python and R to develop analytic and machine learning models in the Spark environment through its integration with IBM’s DataWorks Project.

The new agreement between IBM and Galvanize, which provides a dynamic learning community for technology, will offer an assessment, analysis and training element for Galvanize’s Data Science for Executives program. This program empowers corporations to better understand, use and maximize the value of their data. The program will support IBM’s DataFirst Method, a methodology that IBM says provides the strategy, expertise and game plan to help ensure enterprise customers succeed on their journey to becoming a data-driven business.

Joel Horwitz, director of Corporate and Business Development, Analytics, at IBM; Travis Oliphant, CEO and cofounder of Continuum Analytics, Inc.; and Jim Deters, founder and CEO of Galvanize, Inc., joined Dave Vellante (@dvellante), cohost of theCUBE, from the SiliconANGLE Media team, during BigDataNYC 2016. The group talked about these new alliances and what they mean for all three companies.

Connecting advanced analytics apps

The discussion started with Vellante asking Horwitz to frame IBM’s goals with these recent initiatives.

“We’ve been on a mission … to really accelerate what’s happening in the data science community. … We recognized that Spark … was at the forefront of this new era of data science and analytics. …The actual intention of Spark was machine learning. … It’s now coupled with Hadoop. How do you extend that core Spark framework and go beyond? That’s why we’re partnered with Continuum Analytics … to actually explore that and start broadening the community,” said Horwitz.

As further explained by Oliphant, “Python ‘fits in your head,’ and it helps connect the experts with what they’re trying to accomplish … at the same time, it delivers huge value. We’re real excited about the partnership with IBM, because what we’re trying to do is connect those advanced analytics applications that come from machine learning. … We want to connect that to the Spark system.”

Teaching for the 21st century

Vellante asked Deters what Galvanize is all about.

“We run a 21st-century school where we teach data science, data engineers and software engineering. … We have a dedicated data science faculty, hundreds of students, hundreds of different members of corporate innovation partners. Think of this melting pot for learning; immersive education. … In addition to having open enrollment consumer products, we’ve been taking our curriculum and IP and centering it towards [reskilling] and modernizing corporations … to replatform themselves … as data companies,” said Deters.

The new Data Science for Executives program will include elements of both data science and data engineering. It will give business leaders an understanding of both elements, which is key to developing and running a successful business data strategy, a must in today’s data-driven world, Deters said.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of BigDataNYC 2016.

Photo by SiliconANGLE
Shark Tank investor Robert Herjavec talks security – and ugly Christmas sweaters (Fri, 30 Sep 2016 16:00:19 +0000)

You may know Robert Herjavec, chief executive and founder of the Herjavec Group, as the shark investor on ABC’s Shark Tank. But he began his career in the technology sector, and today his company is one of the largest information security providers in the world.

Herjavec and Atif Ghauri, senior vice president at the Herjavec Group, were at the Splunk .conf 2016 conference in Orlando, FL, this week to speak about cybersecurity and data transformation. The two sat down with John Furrier (@furrier) and John Walls (@JohnWalls21), cohosts of theCUBE, from the SiliconANGLE Media team, to talk about celebrity, ugly Christmas sweaters and cybersecurity. Herjavec is theCUBE‘s Guest of the Week. (*Disclosure below.)

Humble beginnings

Walls opened the interview by asking Herjavec to talk about his company and why its focus is cybersecurity. He addressed the early years when things didn’t begin as expected and explained the company’s growth.

“I’ve been in the security business for about 30 years. I actually helped bring a product called Check Point to Canada — firewalls, URL filtering, that kind of stuff. And we started this company [Herjavec Group] 12 years ago, and our vision was to do managed services … and we thought we’d do $5 million in sales in our first year, and we did $400,000. The market just wasn’t there,” he explained.

“[SIEM] technology, log aggregation wasn’t what it is today. I think at the time it was Envision and our [first company] BRAK Systems that was really the first go-to-market [SIEM]. … So our initial business became around log aggregation, security, writing parsers, and then over time it grew and took us five years to get to $6 million in sales. And we’ll see $170 million this year. We went from a Canadian company to a global entity. We do a lot of business in the States, UK, Australia, everyone,” he said.

Invest in tech or the ugly Christmas sweater?

Furrier remarked that the Internet is creating a great number of entrepreneurs in the enterprise.

“We’ve always been in the enterprise business, so we are seeing a lot of growth in that area,” Herjavec said. “A lot of VC [venture capitalists] money is going into that area because you can measure that level of return and you can go and get those customers. But on our show, we’re a bubble. We don’t do a lot of tech deals, like we’re talking [about], because it’s boring TV.”

Not that tech products are boring, he added. “Tech people love tech, consumers love the benefit of tech,” he said. “No consumer opens up their iPhone and says, ‘Oh my gosh, I love the technology behind my iPhone.’ They just love their iPhone. And our show is really a consumer platform. What we have learned is … we don’t invest in tech anymore. We invest in slippers, ugly Christmas sweaters, food products, because if you can tap into that consumer base, you’re good to go.”

Data is tough to value

Furrier asked Herjavec to put a price on data. He responded by saying that he feels it’s only worth as much as a company values it.

“I don’t think that data has any value. It’s the effect of the data that has value, and it’s very singular. It’s what somebody does to it. Whatever that data is worth to you from a business perspective, it’s worth fundamentally more to an outside bad party because they can package that data and sell it to a competitor, a foreign government, all those kind of places. So it’s the collection of raw data and applying it to something that has meaning to a third party,” he said.

“I don’t see it on the balance sheet as a hard core value because it has to have a transformative value. You have to do something with it,” Herjavec concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of Splunk .conf 2016.

(* Disclosure: TheCUBE was the paid media partner at .conf. Neither Splunk nor other sponsors have editorial influence on theCUBE or SiliconANGLE content.)

Photo by SiliconANGLE
Aiming to supercharge AI, Amazon launches cloud service powered by graphics chips (Fri, 30 Sep 2016 15:56:22 +0000)

Confirming rumors from a few weeks ago, Amazon.com Inc. today unveiled a new cloud computing service for artificial intelligence and other data-intensive applications.

The new addition to Amazon Web Services’ Infrastructure-as-a-Service (IaaS) lineup is based on graphics processing units (GPUs) from Nvidia Corp., which are increasingly used in deep learning, the branch of AI whose neural networks benefit from the ability to do data crunching in parallel on a massive scale.

“These instances were designed to chew through tough, large-scale machine learning, deep learning, computational fluid dynamics, seismic analysis, molecular modeling, genomics, and computational finance workloads,”  AWS Chief Evangelist Jeff Barr said in the announcement.

The cloud giant’s new “instance,” or slice of a computing virtual machine, is called the P2 and uses up to eight of Nvidia’s Tesla K80 accelerator cards. The largest P2 configuration provides about 40,000 cores in total, which Amazon claims makes it the most powerful GPU-optimized cloud instance on the market.

To put this into perspective, AWS EC2 vice president Matt Garman highlighted that the P2 can perform operations involving single-precision floating-point numbers (values occupying 4 bytes) seven times faster than the largest instance in the previous-generation G2 series. And it’s even better at handling double-precision values, delivering up to 60 times more performance than its predecessor for a total of 23 teraflops.
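
Those precision figures map directly onto byte widths, and the quoted core count follows from Nvidia's published spec of 4,992 CUDA cores per Tesla K80; a quick check in Python:

```python
import struct

# Single precision ("float") occupies 4 bytes; double precision 8.
print(struct.calcsize('f'))   # 4
print(struct.calcsize('d'))   # 8

# Eight K80 cards at 4,992 CUDA cores each (per Nvidia's spec)
# gives the "about 40,000 cores" of the largest P2 configuration.
print(8 * 4992)               # 39936
```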

The instance is available immediately from Amazon’s U.S. and Ireland data centers. It should put the cloud giant in a much better position to take on rivals like Microsoft Corp. that have also been targeting GPU-optimized workloads lately. The software giant rolled out an instance series optimized for artificial intelligence workloads only a few weeks ago, while IBM Corp. introduced a competing virtual machine family back in July.

Both offerings are based on the same Tesla K80 as the P2, which means that it’s well within the vendors’ reach to level the playing field. And they’ll no doubt try, if the fierce competition that the IaaS market has seen so far is any indication.

This one-upmanship is good news for the growing number of organizations that are looking to run their AI algorithms, simulations and other high-performance computing in the cloud. In a note to clients today, Global Equities Research analyst Trip Chowdhry said the announcement is a positive for both Amazon and Nvidia.

Image via Pixabay
The database evolution: Changing the way people think about data integration | #BigDataNYC (Fri, 30 Sep 2016 15:45:15 +0000)

In the era of Big Data, some may say what was old is new again. When it comes to data, the notion of collecting it has been around since the old IBM days. Today, with the increase in available data, there is also a wide variety of options for storing and processing it. While technologies like Apache Hadoop offer upfront cost savings for housing your data, what other costs should you be considering when integrating your data onto a new platform?

James Markarian, CTO of SnapLogic Inc., a software provider whose Integration Platform as a Service (IPaaS) tools connect applications and data to cloud data sources, knows the price of moving data. He discussed the evolution of databases and the current unknowns of data integration with George Gilbert (@ggilbert41) and Dave Vellante (@dvellante), cohosts of theCUBE, from the SiliconANGLE Media team, during the BigDataNYC event.

The database evolution

Markarian began the interview talking about the explosion of data and spoke about the marketplace and how it has radically grown over the years. “When you look back 10 or 20 years, you could pick any flavor of database … as long as it was Oracle, IBM or Microsoft. Now you have relational databases; you have NoSQL databases, and then you have purpose-built databases for doing analytics so everything from Vertica [HPE] to Redshift [Amazon] and, of course, Hadoop; it’s exciting for us in technology,” he observed.

However, with all the new and exciting technology to choose from, the customer is left with the challenge of implementing it. Markarian said that almost everything has changed, from the platforms to people’s expectations about the value of their data. He also noted that the complexity has gone up considerably and it is now both challenging and exciting.

Minting IT scientists

According to Markarian, the issue with all this evolution is that things don’t change overnight. “We didn’t mint IT scientists or people with the technological skills all of a sudden,” he commented. He went on to say that while some technology is cheaper from a hardware and software perspective, there is still “people cost” — the cost of hiring individuals who can make it work while keeping up with the latest technology.

He does see a potential solution coming. “I don’t know if you heard about this thing; it’s called the cloud. It’s going to be big,” he deadpanned. He believes that many of the improvements in the app space for transactional applications are going to be moving to the cloud, making things simpler.

Helping to make the move

SnapLogic offers an IPaaS solution for customers who want to integrate applications and data in the cloud. The platform, hosted on Amazon Web Services, allows customers to deploy on-premise or in the cloud, and the data stays where the customer wants it.

The company is also incorporating data integration for analytic purposes and Hadoop integration, both in the cloud and on-premises, Markarian said.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of BigDataNYC 2016.

Photo by SiliconANGLE
Chinese open source blockchain startup Antshares raises $4.5M through crowdsourcing (Fri, 30 Sep 2016 10:22:59 +0000)

Chinese open source blockchain startup Antshares has raised more than $4.5 million in a crowdsourced fundraising round.

Founded in 2015, Antshares is building a decentralized, distributed network protocol based on blockchain technology that will let users digitize assets or shares and carry out financial transactions, from registration and issuance through trading, settlement and payment, over a peer-to-peer network.
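
Antshares’ actual protocol is far more involved, but the core idea of a tamper-evident transaction record can be sketched as a hash-linked chain of entries. The snippet below is purely illustrative; the `make_entry` helper and the `DemoCo` asset are invented for this example and are not part of Antshares’ API:

```python
import hashlib
import json

# Each ledger entry (registration, issuance, transfer, settlement)
# embeds the hash of the previous entry, forming a tamper-evident chain.
def make_entry(prev_hash, action, payload):
    entry = {"prev": prev_hash, "action": action, "payload": payload}
    digest = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    entry["hash"] = digest
    return entry

genesis = make_entry("0" * 64, "register",
                     {"asset": "equity-share", "issuer": "DemoCo"})
transfer = make_entry(genesis["hash"], "transfer",
                      {"asset": "equity-share", "from": "DemoCo", "to": "alice"})

# Verification: strip the stored hash, recompute it, and compare.
body = {k: v for k, v in transfer.items() if k != "hash"}
recomputed = hashlib.sha256(
    json.dumps(body, sort_keys=True).encode()).hexdigest()
print(recomputed == transfer["hash"])   # True
```

A real system layers digital signatures and network consensus on top of this chaining, which is where a full protocol diverges from the sketch.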

Based in Shanghai, China, the company offers the Onchain platform, which aims to support goods and assets ranging from real estate titles to corporate equity, and from supply chain assets to creditors’ claims.

“Our vision is to make Onchain a truly universal Blockchain framework,” Antshares founder and Chief Executive Officer Da Hongfei said in a statement sent to SiliconANGLE. “Utilizing different plug-in modules, our framework could be applied for a public chain, a consortium chain or even a private chain. Our cross-chain adaptor module, currently under development, creates interoperability among these different chains.”

Antshares claims to incorporate a number of firsts, including being the first open-source blockchain project developed in China; the first significant Chinese initial coin offering; and the first organic Chinese blockchain project to work with both Microsoft Azure and the Hyperledger Project.

The company is preparing its wallets for global release by the end of October and is also developing a mobile wallet for iOS, Android and Windows Phone.


Antshares is notable not only for raising $4.5 million through crowdsourcing, but also for attempting what many venture capital-backed startups with their own proprietary versions of the blockchain are pursuing: a blockchain-based financial transaction platform.

Antshares takes the open-source part of its platform seriously, releasing the code as it is written under the MIT license, a permissive license that imposes only minimal restrictions, with the working code hosted on GitHub for anyone to download.

“If you are interested, you can download, copy, modify or fork it,” the company notes on its about page. A copy of the code is available from its GitHub page.

Image credit: Antshares