

Scott Howser, vice president of marketing for Hadapt, joined Dave Vellante and Jeff Kelly of Wikibon on theCUBE this afternoon, broadcast live from the MIT Chief Data Officer and Information Quality (CDOIQ) Symposium.
Before discussing the now widely adopted Hadoop ecosystem, Howser reflected on the early days of Hadoop and the authentication challenges that have evolved as the ecosystem has matured.
For the full interview, watch it below:
Early data practitioners had to triangulate on an individual user in order to engage them and determine their authenticity. As Howser noted, a single sign-on was the Holy Grail, but that approach is no longer realistic between a provider and a user. As additional channels come online, verifying that a user is who they claim to be becomes increasingly complicated, in no small part because engagement has changed dramatically and the requirements for granting a user access to data are no longer uniform across channels.
Before omnichannel was even a concept, databases were necessarily small and disparate. Once the collected data was merged, drawing any insight from analytics across the pooled information took months or years. The need to act faster on collected data led directly to the rise of the Hadoop ecosystem.
Addressing how data quality is reconciled with omnichannel and schema, Howser commented, “We allow users to dump all that data into one schema.” By doing this, a user can normalize data through iterations in Hadoop. As more channels are brought online, a marketer can draw from mountains of data in one common repository. A direct result is a marked decrease in the time required to cull and normalize data, letting users interact with it in a more timely and meaningful way.
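To make that idea concrete, here is a minimal sketch, not Hadapt's actual product, of landing raw data from several channels into one common repository and normalizing it in a first iteration. It uses PySpark as a stand-in for the Hadoop stack; the file paths, column names, and the shape of the common schema are all hypothetical.

```python
# Hypothetical sketch: dump multi-channel data into one repository, then
# normalize iteratively. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("omnichannel-landing").getOrCreate()

# Land raw data from each channel as-is (schema-on-read: no upfront modeling).
web = spark.read.json("hdfs:///raw/web_clicks/")        # hypothetical path
email = spark.read.json("hdfs:///raw/email_events/")    # hypothetical path
mobile = spark.read.json("hdfs:///raw/mobile_events/")  # hypothetical path

# First normalization pass: map each channel's fields onto a small common
# shape; later iterations can refine this without reloading the raw data.
def to_common(df, channel, user_col, ts_col):
    return (df
            .withColumn("channel", F.lit(channel))
            .withColumnRenamed(user_col, "user_id")
            .withColumnRenamed(ts_col, "event_ts")
            .select("channel", "user_id", "event_ts"))

unified = (to_common(web, "web", "visitor_id", "clicked_at")
           .unionByName(to_common(email, "email", "recipient_id", "opened_at"))
           .unionByName(to_common(mobile, "mobile", "device_user", "event_time")))

# One common repository that can be queried now and refined in later passes.
unified.write.mode("overwrite").parquet("hdfs:///analytics/unified_events/")
```

The point of the sketch is the workflow, not the specific tool: raw channel data lands first, and normalization happens in repeated, cheap passes over the shared repository rather than in a months-long upfront modeling effort.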
Howser acknowledged there would naturally be a trade-off between accuracy and performance. He said this can be mitigated by building models that let you decide when the more rapid analysis is good enough; aiming for an 80 percent accuracy rate would, in his opinion, justify the expedited analytics. “The element of time…the time it takes to define something that rigid takes months,” Howser said. That span of time is not practical for running a business or organization. “Let’s fail fast. Do a lot of iterations. You set some sort of confidence in this particular application.”
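One way to read the “good enough” idea is sampling with a confidence threshold: estimate a metric from progressively larger samples and stop as soon as the estimate is tight enough, rather than scanning everything. The sketch below is only an illustration of that trade-off; the dataset, the 20 percent relative-error target (a loose stand-in for the 80 percent figure Howser mentions), and the batch size are all assumptions.

```python
# Illustrative only: "fail fast" by stopping once a sampled estimate is
# good enough, instead of computing an exact answer over all the data.
import math
import random

random.seed(7)
events = [random.gauss(50.0, 12.0) for _ in range(1_000_000)]  # stand-in dataset

def good_enough_mean(data, rel_error=0.20, confidence_z=1.96, batch=5_000):
    """Sample in batches; return early once the 95% CI is within rel_error."""
    sample = []
    while len(sample) < len(data):
        sample.extend(random.sample(data, batch))
        n = len(sample)
        mean = sum(sample) / n
        stdev = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
        half_width = confidence_z * stdev / math.sqrt(n)
        if half_width / abs(mean) <= rel_error:  # "set some sort of confidence"
            return mean, n                       # fail fast: stop sampling here
    return sum(sample) / len(sample), len(sample)

estimate, rows_used = good_enough_mean(events)
print(f"estimate={estimate:.2f} using {rows_used:,} of {len(events):,} rows")
```

Run as-is, this typically settles after the first small batch, which is the point: a bounded, iterative answer arrives in seconds, and the threshold can be tightened on later iterations if the application demands it.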
Vellante noted that the role of CDO is only a decade old. Given that the CDO's job is to achieve information quality, and that Hadoop is changing that model, Vellante asked Howser, “How do you see the notion of information quality adapting?” Howser responded that the CDO should have an unparalleled command of the entire business, and should be able to accelerate what the business is trying to accomplish by applying quality information to help solve its problems.
The MIT CDOIQ Symposium is being held on the MIT campus in Cambridge, Massachusetts. Cambridge, as it turns out, is also the new home of Hadapt's headquarters. Howser discussed how the move into the heart of the tech corridor has been a boon for recruiting and for spreading the word about the benefits of Hadoop. “I believe Hadoop is the operating system of big data. I stand behind that.” The continued maturation of Hadoop is signaled by Hadapt's rapid growth. “What people are engaging us to do is transition from legacy methodologies.”