SiliconANGLE’s Editor-in-Chief Mark “Rizzn” Hopkins interviewed Luc Barthelet of computational knowledge engine Wolfram Alpha at SxSWi 2011, discussing several aspects of the company, including its position in the market. Barthelet explained how Wolfram Alpha’s platform differs from other search tools, most notably Google, and described the company’s emergence in the enterprise sector. He also detailed Wolfram’s evolution alongside big data trends.
In the interview, Luc talked about Wolfram Alpha’s re-engineered API pricing, designed to be more in line with the pricing models that have emerged around other APIs currently on the market. He also talked about a few of the apps the company has recently released, geared less toward winning mainstream adoption themselves (though they have been well received among college students) than toward demonstrating how seamlessly the API can integrate with other graphical user interfaces.
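For a sense of what that kind of integration looks like from a developer’s side, here is a minimal, hypothetical sketch of assembling a request to Wolfram Alpha’s public query API. The endpoint shape follows the v2 REST API; the app ID is a placeholder you would obtain from the developer portal, and a real front end would go on to fetch and parse the response.

```python
from urllib.parse import urlencode

# Base endpoint of Wolfram Alpha's v2 query API.
API_ENDPOINT = "http://api.wolframalpha.com/v2/query"

def build_query_url(question: str, app_id: str) -> str:
    """Assemble a query URL for the computational knowledge engine.

    `app_id` is a placeholder credential; a GUI embedding the API would
    send this URL and render the plaintext pods it gets back.
    """
    params = urlencode({
        "appid": app_id,          # developer credential (placeholder here)
        "input": question,        # the natural-language query
        "format": "plaintext",    # ask for text pods rather than images
    })
    return f"{API_ENDPOINT}?{params}"

url = build_query_url("GDP of France divided by population", "DEMO-APP-ID")
print(url)
```

Any app with its own interface can build URLs this way and present the answers however it likes, which is the integration story Barthelet emphasizes.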
Wolfram Alpha released an update to its “knowledge engine” about two weeks ago, enhancing its calculators to cover additional math concepts including vector and matrix manipulation, GCD, LCM, inverses of functions, linear approximation of functions, 3D plotting, base conversions, and currency conversion.
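For readers who want a feel for a few of the concepts the update covers – GCD, LCM, base conversion, and matrix inversion – they can be reproduced in plain Python. This illustrates the math itself, not Wolfram Alpha’s own implementation:

```python
from math import gcd
from fractions import Fraction

# Greatest common divisor and least common multiple of 12 and 18.
a, b = 12, 18
print(gcd(a, b))             # 6
print(a * b // gcd(a, b))    # LCM via a*b/gcd: 36

# Base conversions: 255 in binary and hexadecimal.
print(bin(255), hex(255))    # 0b11111111 0xff

def inverse_2x2(m):
    """Invert a 2x2 matrix [[p, q], [r, s]] exactly via the adjugate formula."""
    (p, q), (r, s) = m
    det = Fraction(p * s - q * r)
    return [[ s / det, -q / det],
            [-r / det,  p / det]]

# [[2, 1], [1, 1]] has determinant 1, so its inverse is [[1, -1], [-1, 2]].
print(inverse_2x2([[2, 1], [1, 1]]))
```

The point of the engine, of course, is that users get all of this from a single natural-language input box rather than writing code.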
But computational knowledge is a concept the large search portals are also trying to address. Google has tools for things like currency conversion – and it also has Google Instant, as does Yahoo. Yahoo unveiled a new product called Search Direct at a San Francisco press conference, which takes (instant) search to the next level: instead of links, it shows rich content such as widgets and direct answers – and it supposedly does so faster than Google, too.
Big Data, Algorithms, and Replacing Our Boss with an Algorithm
One of the topics Mark has harped on since returning from SxSWi is the day he sees coming sooner rather than later when the editor’s role in the newsroom starts to be replaced by algorithms.
He wrote on his personal blog last week:
This is important: It isn’t clear to me exactly how prominent a role the algorithm will play in the newsroom, but this much is clear after close to 20 interviews with startups: computer-aided curation is here to stay.
We’ve profiled dozens of computer-run, curation-based digital properties: Paper.li, Flipboard, Postpost, Zite, Flavors.me, Genieo, and many others. Several more such efforts debuted at SxSWi this year, along with even more services that fit within the ecosystem of products aiding that computer-automated editing process.
Mark asked Wolfram Alpha, in the context of this trend, where it sees itself within the coming New Media ecosystem. Luc doesn’t see its toolset as one for curation or total automation, but it clearly has applications in fact checking.
Luc, from the interview:
“I was listening to President Barack Obama speak about the productivity of the nation, and he said that the United States was first in productivity. It surprised me because I didn’t think it was, so I used Wolfram Alpha to take GDP divided by population, ranked by country, and found out that the United States was 19th. Since I’m sure the president has a number of people triple-checking his facts, I knew that couldn’t be the metric they used.
“I then tried GDP divided by eligible workers, and found that the United States ranked 13th on that list, so here we are, we’re making progress, but not quite there yet. If you divide GDP by employed worker, you find that in that category, the US was number one.”
“It took some work to do, but I think that’s what we’re going to see in the future of journalism: people using tools like Wolfram Alpha to make quick calculations against super-large datasets and check facts as they cross their newsdesks.”
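The kind of fact-check Barthelet walks through – ranking countries under different productivity metrics – can be sketched in a few lines of Python. The country names and figures below are invented placeholders, not real statistics; they only show how the choice of denominator reorders a ranking:

```python
# Hypothetical data: name -> (GDP, population, employed workers).
# These numbers are illustrative placeholders, not real economic figures.
countries = {
    "Alphaland": (1000.0, 100.0, 20.0),
    "Betavia":   (600.0,   40.0, 30.0),
    "Gammastan": (300.0,   25.0, 10.0),
}

def rank_by(metric):
    """Return country names sorted from highest to lowest metric value."""
    return sorted(countries, key=metric, reverse=True)

# GDP per capita vs. GDP per employed worker give different orderings.
per_capita = rank_by(lambda c: countries[c][0] / countries[c][1])
per_worker = rank_by(lambda c: countries[c][0] / countries[c][2])
print(per_capita)  # ranking by GDP / population
print(per_worker)  # ranking by GDP / employed worker
```

In this toy data, Alphaland ranks last per capita but first per worker, which is exactly the kind of discrepancy a journalist would want to notice before quoting a “productivity” claim.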
Ultimately, as the term and concepts of Big Data gain widespread adoption, competition in this market will increase. As it stands, Wolfram Alpha stands head and shoulders above any potential competitor in computation and big-data-based fact-checking.