SiliconANGLE: Extracting the signal from the noise. Thu, 24 Jul 2014 12:56:02 +0000

Study: People are “more honest” when chatting to a robot
Thu, 24 Jul 2014 12:35:44 +0000

A study has shown that chat-bots might be a better alternative to filling in questionnaires when it comes to screening applicants for security clearance.

The research, carried out at the National Center for Credibility Assessment along with military IT contractor ManTech International, found that people were more honest when chatting with a robot than when writing their replies.

ManTech is a mega-corporation that does everything from providing IT and software to intelligence agencies, to multimillion-dollar maintenance jobs for the US Navy. Meanwhile, the National Center for Credibility Assessment’s reason for existence is to “provide graduate and continuing education courses in psychophysiological detection of deception (PDD)”.

The study saw 120 US Army trainees fill out traditional pen ‘n’ paper questionnaires about their lifestyles, before sitting down in front of an automated chat-bot. On the whole, when seated in front of a computer-generated avatar, the trainees were a lot more forthcoming about sensitive topics like drug use, alcohol abuse and psychological problems they might have suffered. The results suggest that chat-bots might be a better alternative than bog-standard questionnaires – though we should point out that ManTech is just the kind of company that would like to build such software.

“Automating this process using a [computer graphics] interview format could save time, and allow agencies to utilize their human interviewers more effectively,” the researchers wrote.

Interestingly, the study, which was published in the scientific journal Computers in Human Behavior, didn’t involve any sophisticated AI. The software relied on a simple scripted speech format that would ask follow-up questions based on the answers it received. The avatar’s face was ethnically ambiguous and didn’t display any emotion, though almost a quarter of those interviewed claimed they did see an emotional response.

As well as being more open in front of the chat-bot, most interviewees said they felt more comfortable with the chat-bot than filling out an old-fashioned questionnaire on a sheet of paper.

However, the researchers noted that chat-bots are still some way off being able to replace humans for screening interviews. But if they can replace questionnaires, it might help us save a few trees at least :-)

photo credit: Alisa Perne – via photopin cc
Your guide to international Big Data universities: IBM edition
Thu, 24 Jul 2014 12:00:20 +0000

The Big Data market is expected to grow to $28.5 billion by the end of 2014 and to top $50 billion by 2017, according to a recent Wikibon report. With the growing market come millions of new Big Data and Analytics jobs being created across the globe. But the number of new jobs far outweighs the number of scientists and engineers who have the education to fill them.

“By 2015, 4.4 million IT jobs globally will be created to support Big Data,” said Peter Sondergaard, Senior Vice President and Global Head of Research at Gartner, in a statement. “But there is a challenge. There is not enough talent in the industry.”

This is good news for those interested in becoming data scientists, as there are a large number of universities with which IBM and other industry vendors are now partnering to develop undergraduate and graduate degree programs designed to prepare students for Big Data and Analytics careers. Here, we will take a look at some of the universities that are partnering with IBM to offer degrees for aspiring data scientists.


Athens University of Economics and Business


To help narrow the data scientist skills gap, IBM is partnering with more than 1,000 universities across the globe to develop curriculum to prepare students for Big Data and analytics careers. “We’re…working with…universities globally to actually put together a curriculum—both in the business school as well as in the technical schools—for certifications and advanced…Masters classes around various data type jobs,” said Inhi Cho Suh, Vice President and General Manager of Big Data, Integration & Governance on theCUBE to co-hosts John Furrier and Jeff Kelly at Hadoop Summit 2014.

IBM’s latest educational initiative in Europe came in April 2014 when IBM announced that it is collaborating with the Athens University of Economics and Business (AUEB) in Greece to create the first national Postgraduate Degree in Business Analytics, expected to launch in September 2014.


University of Piraeus in Greece


IBM also announced in April 2014 that it will partner with the University of Piraeus in Greece on the development and design of hands-on practical sessions in Business Intelligence and Business Analytics modules of MBA programs.


Ohio State University


The worldwide shortage of professionals trained in data analysis and critical thinking is occurring at a pivotal moment in history, according to Christine A. Poon, dean of Ohio State University’s Fisher College of Business. “While leaders in all industries have the data at their fingertips,” she wrote in a blog post, “they lack the highly skilled workforce to connect the dots and advance their businesses and organizations to new heights.”

To help with the shortage problem in the United States, IBM announced a collaboration between Ohio State University and the IBM Client Center for Advanced Analytics in Columbus, Ohio, to develop new curricula at the undergraduate, graduate and executive education levels to help students (and mid-career professionals) gain the latest skills in analytics. “Our strong collaboration with IBM will help our students across a variety of majors gain the latest skills in this burgeoning Big Data discipline and set them on a path to secure the high skilled jobs of the future,” said Poon in a statement.

In February 2014, Ohio State University announced the details of its new undergraduate major in data analytics. The new major is structured in three parts: core subject matter (mathematical, statistical, and computing foundations), discipline-specific specializations (visual analytics and sense-making, system modeling, pattern recognition, and machine learning), and what the school calls “an integrative experiential education” component.


More U.S.-based Big Data programs


In addition to partnering with Ohio State University, IBM in May 2014 announced that it is also now partnering with Boston University, Case Western Reserve University, Johns Hopkins University and the University of Missouri to offer Big Data and Analytics curricula. Boston University’s Metropolitan College is offering a Master of Science degree in Computer Information Systems with a concentration in Database Management & Business Intelligence. Case Western Reserve University is launching a new undergraduate program in data science and analytics in the Fall 2014 semester. The program includes a major and a minor in applied data science, and eventually a post-baccalaureate certificate program.

The Johns Hopkins University’s DC-based Center for Advanced Governmental Studies is offering a Master of Science in Government Analytics and a Certificate in Government Analytics to “provide students with the needed skills to address contemporary political, policy and governance challenges.” And the University of Missouri is developing an interdisciplinary Master of Science in Data Science and Analytics degree, providing students with access to IBM’s Open Cloud Architecture to “have a comprehensive skill set in building, deploying, and managing cloud resources to analyze big data in journalism, engineering, informatics, and learning analytics.”

Other universities in the United States with which IBM is partnering to develop Big Data courses include: Arizona State University, Babson College, Dakota State University, Illinois Institute of Technology, Illinois State University, Indiana University, Iowa State University, Northwestern University, Rensselaer Polytechnic Institute, San Jose State University, Southern Methodist University, University of Arkansas at Little Rock, University of Arkansas at Fayetteville, University of Denver, University of Colorado at Boulder, University of Maryland in College Park, University of Massachusetts in Boston, University of North Carolina at Charlotte, University of Southern California, University of Texas at Austin, University of Tennessee at Chattanooga, University of Tennessee at Knoxville, University of Virginia and Worcester Polytechnic Institute.


Over 30 universities throughout China


Earlier this month, IBM announced a major collaboration with China’s education ecosystem focused on “addressing the Big Data and Analytics skills opportunity” in China. As part of the collaboration, IBM will initially help launch undergraduate and graduate programs in 30 universities to help prepare students for Data Scientist and Chief Data Officer jobs. “Big Data is big business, but its rapid growth has outpaced colleges’ and universities’ ability to develop and implement new curriculums,” said Li Shu Chong, President of CCID Consulting, in a statement. “IBM’s extensive initiative is poised to help develop new talent in China that will be needed to realize the full potential of Big Data.”

The seven pilot schools that will roll out new Big Data and Analytics programs this Fall include the Beijing Institute of Technology, Fudan University, Guizhou University, Huazhong University of Science and Technology, Peking University, South China University of Technology and Xi’an Jiaotong University.


Watch the Hadoop Summit 2014 interview between theCUBE co-hosts John Furrier and Jeff Kelly and IBM’s Inhi Cho Suh:


Photo credit: Herkie via photopin cc
Photo credit: dcJohn via photopin cc
Photo credit: marsmet547 via photopin cc
Video courtesy of theCUBE
The extinction of the CIO | #MITIQ
Thu, 24 Jul 2014 11:00:51 +0000

The takeaway from Day 1 of MIT’s CDOIQ Symposium seemed to focus on the redefinition of the CDO in the corporate structure and the expected dissolution of the position of CIO in many organizations.

Commenting on this fact, social media strategist Paul Gillin stated, “I think that is a remarkable comment about the CIO going away. We took it for granted that organizations need a top tech person in charge.” The advent of tech delivered in a service model has hastened this latest development. The CDO seems ready to help companies leverage their competitiveness by helping to translate their assets into data form.

Wikibon’s lead Big Data analyst Jeff Kelly noted how General Electric seemed to have recognized and adopted that strategy early on. “[GE's] value will be less about equipment and more about data coming from that equipment,” he stated. He explained how the CIO’s role was to bring stakeholders into the process early on by engaging them with a general direction the organization should head in. “I don’t know if the CIO is going away, but the CDO is becoming much more strategic going forward.”

While many companies are looking at the new world of data, one thing appears clear: the best use of data will not be realized by throwing money at it and hoping for a solution to appear. Much like the storied Oakland A’s and their use of data to defy the odds, dramatized in the film ‘Moneyball’, the true innovations will arise from resource-starved organizations that have to rely on the data to show them the way forward.

“I think what may be happening in MLB and other industries,” Kelly noted, “is that one team applies it and it works and others want to follow suit. It’s not a tech problem. You can’t buy tech, throw money at it and solve the problem.” He instead feels the true strength of an organization’s data is found in its people and its processes. If a company is unwilling or unable to change its culture around this new paradigm, it will be unable to get the best results from its data.

Alluding to an earlier interview with MIT’s Jeanne Ross, Wikibon co-founder and chief analyst Dave Vellante stated, “Companies are not as data driven as they say they are. That is my takeaway. They are metrics driven.” This distinction is important because those that are metrics driven focus much more on their output, while a data-driven organization puts a premium on its input. Vellante notes that companies like Google and Amazon are good examples of industry-transforming companies that will continue to lead the way into the future.

This year’s MIT CDOIQ Symposium will enjoy continued live coverage on SiliconANGLE’s theCUBE.

photo credit: zetson via photopin cc
Can ‘laser air waveguides’ replace fiber-optics?
Thu, 24 Jul 2014 10:20:44 +0000

The Internet is delivered to your home, place of work or favorite coffee shop by fiber-optic cables. Lying just underneath our feet, these cables carry data at close to the speed of light.

But those fiber-optic cables can be a bit of a nuisance for those who actually have to lay them, especially if you’re talking about doing so in remote places like deserts, or even outer space.

So, a team of researchers led by Howard Milchberg, professor of physics and electrical and computer engineering at the University of Maryland, is working on a plan to do away with those fiber-optic cables altogether, using just air to guide the light. That’s not an easy task, because the cables are needed for a very good reason – shoot a laser beam through the air and it’ll spread apart and lose intensity the further it travels.

What fiber-optic cables do is create a kind of tunnel that allows the laser to travel to its destination without degrading in this fashion. It bounces along the cable without losing its intensity, which means data can travel across vast distances in milliseconds.

But Milchberg and his team have devised a way to make air mimic fiber-optic cables by creating a ‘wall’ of low-density air surrounding a core of much denser air. Called “air waveguides”, these invisible ‘tubes’ are created by firing short, powerful laser pulses through the air. As each laser pulse passes, it heats the air, leaving behind a low-density filament that is less refractive than the denser air surrounding it.
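The guiding principle at work here, in both a glass fiber and an air waveguide, is total internal reflection: light traveling in a higher-index region reflects off a lower-index boundary instead of escaping. As a standard optics aside (a textbook result, not a figure from the study), Snell’s law gives the critical angle beyond which reflection is total:

```latex
% Light in a core of refractive index n_core, bounded by a lower-index
% region n_clad (glass cladding, or the heated low-density filaments),
% is totally internally reflected when its angle of incidence theta
% at the boundary exceeds the critical angle:
\theta_c = \arcsin\!\left(\frac{n_{\mathrm{clad}}}{n_{\mathrm{core}}}\right),
\qquad \text{total internal reflection for } \theta > \theta_c .
```

The denser air the filaments enclose plays the role of the higher-index core, which is why light “passing between these holes stays focused,” as the researchers’ illustration puts it.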

These tubes can only sustain themselves for a few milliseconds, but that’s about a million times longer than the laser pulse itself. It’s therefore possible to send a second beam carrying data along the tube formed by the first laser. As Milchberg explains, “milliseconds is infinity” when you’re talking about lasers.

“It’s like you could just take a physical optical fiber and unreel it at the speed of light, put it next to this thing that you want to measure remotely, and then have the signal come all the way back to where you are,” said Milchberg in a release.

According to their tests, signals that passed through the ‘air cable’ were 1.5 times more powerful than those which were beamed through ‘normal’ air. The beams were transmitted a distance of three feet – the next step for the researchers will be to increase this range to 150 feet.

Laser air waveguide

Illustration of an air waveguide. The filaments leave ‘holes’ in the air (red rods) that reflect light. Light (arrows) passing between these holes stays focused and intense.

Image credit: University of Maryland


As far as practical uses for this technology go, the team believes air cables could be used to facilitate communications in some of the world’s most remote locations, where it simply isn’t practical to go and start laying fiber-optics everywhere. They also believe air cables could be used to communicate with humans and machines in space. Furthermore, they say the technology could be used to probe the Earth and other planets to make topographic maps, or to examine chemicals in the atmosphere.

Main photo credit: TheAlieness GiselaGiardino²³ via photopin cc
You may not need Big Data after all | #MITIQ
Thu, 24 Jul 2014 00:02:33 +0000

The business buzzword over the past two years has been “Big Data”. Companies are trying to figure out how they can leverage their collected data and translate it into a competitive advantage. However, according to the Director of MIT’s Sloan School Center for Information Systems Research, Jeanne Ross, this approach is not necessarily one-size-fits-all for today’s organizations.

Ross, co-author of the article ‘You May Not Need Big Data After All’, cautions businesses against buying into the hype around Big Data.

“I think you grow into Big Data,” Ross notes. She explains that there are companies that find the competitive advantage works within their specific industries. As an example, she notes that the oil and gas industry has long employed Big Data to help decide when and where to place a billion-dollar well. Success in one industry, however, doesn’t necessarily translate into success in others. “Many times we know great things about our customers. We just haven’t figured out a way to address them.”

Asked whether the fear some companies feel, that they can’t address the Big Data they have, is misplaced, Ross states, “No, not misplaced at all. If you don’t think you can do it, you probably can’t.” For organizations recognizing the potential value of Big Data for the first time, this news could be disheartening.

Watch the interview in its entirety here:

“I don’t think most companies are data-driven,” explains Ross. “I think they are metric driven.”

This differentiation is important. Today’s companies can respond to certain kinds of data but in order to truly be a data-driven organization, they have to recognize which data is important. As an example, Ross cites Foxtel, a pay TV service based out of Australia.

“They saw what products were going out and what channels people wanted,” she states. Even with that information they were unable to make strategic decisions. “They went back and started looking at segments and realized what ‘data driven’ would be. They didn’t have the stomach to go back and do that.”

Where the CDO fits in


Discussing the emerging role of the CDO, Ross explained that too often there is a propensity to assume that once a CDO is brought into an organization all data issues will be addressed by that role and that little to no further attention is required. With Gartner projecting a 25 percent adoption of a CDO role in companies by next year, Ross claims most companies likely don’t need to create this position.

The key to running a successful organization is identifying and maintaining a single source of truth with respect to data. Many divisions within a company will manipulate data to show that they are running at a profit or contributing significantly to the organization’s bottom line. In the long run, this can be detrimental to the company because different data can show different outcomes.

Once companies adopt a single source of truth in their data, Ross believes it is of utmost importance that it is adopted in a top-down strategy. “We have to let people know mistakes have to be made. The faster you make mistakes, the more you can learn and the faster you can grow.” This strategy is ineffectual, however, if you start in the middle of the organization as people will be less willing to admit mistakes and failure if it hasn’t been adopted into the company’s cultural model.

The swiftly moving current of technology, especially over the previous five years, should be viewed critically by companies hoping to somehow gain a competitive advantage. Leveraging Big Data requires more than just a willingness to throw money at the problem. It requires a full understanding on the part of the company as a whole.

photo credit: Free Grunge Textures – via photopin cc
How a CDO impacts business decisions: Understand the problem first | #MITIQ
Wed, 23 Jul 2014 22:00:09 +0000

Data is a key enabler in enterprise IT, but there are more aspects to consider before deciding what solutions or systems to implement. Deborah Nightingale, Professor of the Practice of Aeronautics and Astronautics and Engineering Systems and Co-Director of the MIT Lean Advancement Initiative, recommends an approach that is “looking at the whole enterprise in a more holistic way. It starts with the strategic objective, and then understanding what’s going on around you” – ecosystem, stakeholders, and then thinking about it from multiple dimensions, she explained in a live interview with theCUBE co-hosts Jeff Kelly and Dave Vellante at the MIT CDOIQ conference.

Vellante stressed that IT permeates every part of our life and our organizations much more so than ten or twenty years ago. He asked if this development has made her mission harder or easier. Nightingale’s answer was, “both.”

Because IT is such a key enabler, organizations can do things they were not able to do before. “You can come up with new business models, new ways of doing things,” but IT needs to be used in an effective manner, otherwise it creates more chaos, Nightingale explained. That requires presenting a more systematic approach, based on concepts of systems engineering and organizational change, that highlights where an organization is going and how it gets there.

Asked to describe the typical situation she encounters in an enterprise, Nightingale said “often times they have taken a very siloed, reductionist approach.” Or they have already gone through failures and want to avoid them. The typical go-to scenario is “When in doubt, let’s reorganize.”

She used Netflix as an example: a few years back, the company decided to split its online streaming service from its DVD business. It did not check with stakeholders, who were not happy with the decision, and the company had to reverse course.

She also typically helps organizations that have grown really rapidly around a new technology or product, but “don’t know how to design the organization to really take advantage of that.”

Explaining how she goes about having everyone involved better communicate, Nightingale said one of the most effective ways is to get everyone in the same room — top management, IT, HR, finance, and “getting them to understand who are the stakeholders, what is their value. They do not understand what the stakeholders want from them,” although typically they know what they want from the stakeholders.

She then helps them understand what’s working and what’s not working. The next step is “getting them to create a shared vision of where they want to go. Often times they all have a vision of where they want to be. Getting them to collectively define this, and share it, and then getting them to think about the future and which are some options on how to architect it” is the goal, Nightingale explained. In short, “getting them to think about things before they go and change everything.”

Impacting business decisions as CDO


“When you work with an organization, how prescriptive do you get, specifically with the role of the CDO?” Vellante asked. Nightingale responded that “data and information, they run everything within the organization. Once you get the strategic direction and everyone agrees on that,” then you can decide on a specific data system.

Asked if there was a consensus on the need for a CDO in the organization, Nightingale said “there isn’t a consensus yet, but it is evolving. There’s also this movement towards distributed things, and part of the challenge is what things do we need to standardize, and what things does that really not matter.”

Commenting on whether the CDO should report to the CIO, Nightingale stated, “they need to be tightly integrated. Where they report is very much a function of the culture of the enterprise they are working in. They need enough access to the CEO to understand what the strategic drivers are. I think that reporting higher in the organization is an important piece to that.”

CDOs have to be integrated along with a whole lot of other things. “I don’t think you can totally separate them,” Nightingale shared, going on to say that CDOs should be given autonomy to do their job. “I think it’s probably better if they report to the same individual,” she concluded.

Asked about the most common mistakes organizations make, Nightingale said that many times they want to solve a problem in a certain way, when they actually have a different solution that is more effective.

She gave the example of a hospital wanting to fix its emergency department. “They wanted to put in some new technology to check in people faster.” After analyzing the whole hospital, the conclusion was that 30 percent of patients were in the ER because they could not see their primary physicians, because they were waiting for their rooms to be cleaned and prepared for admittance, or because they were waiting for lab results. The solution was to open an after-hours walk-in clinic, fix lab turnaround time, and implement a way of signaling when rooms needed to be made available.

Another prevalent problem is starting with “the IT data system as the solution when they don’t really understand the problem.”

The confusing case of email court rulings: Complexity in the cloud
Wed, 23 Jul 2014 21:00:41 +0000

Can an entire Gmail account become evidence in a court case, or should access to personal data be limited to relevant content only? The question is one of privacy rights, and it has garnered contradictory rulings in several courts across America.

In a money laundering hearing, New York District Judge Gabriel Gorenstein ruled that authorities can have access to a user’s entire Gmail account when a warrant is presented. The ruling set a new precedent for law enforcement’s access to a person’s data.

“In the case of electronic evidence, which typically consists of enormous amounts of undifferentiated information and documents, courts have recognised that a search for documents or files responsive to a warrant cannot possibly be accomplished during an on-site search,” Gorenstein explained.

Gorenstein’s ruling contradicts an earlier ruling by Magistrate Judge John Facciola, who rejected a warrant to access an email account because its scope was too broad. The warrant did not clearly state what the federal authorities wanted to access and what the service provider, in this case Apple, should hand over to the authorities.

“[T]he government continues to submit overly broad warrants and makes no effort to balance the law enforcement interests against the obvious expectation of privacy e-mail account holders have in their communications,” Facciola stated.

Throughout the years, access to email has been the subject of contradictory court rulings, enough to confuse anyone. Here are a couple of additional examples of contradictory rulings, highlighting the need for clarity when it comes to law enforcement’s access to private user data.

Accessing someone’s email could land you in jail…


In 2011, Leon Walker faced prison time after he accessed his wife’s email account without permission to find out if she was cheating on him. Through his investigation, Walker determined that his wife was having an affair with her second husband. Scorned and bitter, Walker used the information he acquired from her email account in their divorce, and gave his then-wife’s first husband, who was locked in a custody battle with her, ammunition to try to prove neglect on the mother’s part and win the custody case.

Because of his snooping, Walker was charged with felony misuse of a computer, but his lawyer argued that the statute, which “prohibits the unauthorized access of computers, computer programs, computer systems, and computer networks,” does not apply in his case, as an email account does not fall under any of the mentioned categories.

The circuit court countered with the following statement: “Gmail is a computer system and, although the statute does not refer to email, the district court analogized the case to a felonious assault where a gun was used but an item such as a bullet or gun powder that is not specifically named in the statute harmed the victim.”

…But the cloud doesn’t count


In the case of Jennings v. Jennings, the court determined that accessing someone else’s email is not illegal. The tale of Jennings v. Jennings involves a wife who used information obtained from her husband’s email account to prove infidelity and use that evidence in divorce proceedings, and a husband who filed a lawsuit against the wife, her lawyer and a private investigator for violating the 1986-era Stored Communications Act (SCA), which only allows a civil suit to proceed if the emails obtained without authorization were in “electronic storage.”

The court ruled in favor of the wife, as the judges presiding over the case stated that the emails obtained by the wife were read messages on Yahoo, which are stored in the cloud. They did not consider the cloud a form of electronic storage, and added that the read emails were not backup copies.

“All of the discussions regarding backups, temporary copies, and the read/unread distinction seem to have very little to do with the way that most people perceive their use of e-mail. Ultimately, this problem is likely best resolved by the legislature, but the specifics of a politically palatable update to the SCA have yet to be fully agreed upon,” Woodrow Hartzog, a professor at the Cumberland School of Law at Samford University, said.

Technology is dynamic and ever changing, and it is obviously causing great confusion for those who are not directly involved in it, especially magistrates or judges who need to determine which way to go in technology-based cases.

New laws may be passed in order to address broader tech issues, but before that happens, we can expect to have more confusing cases appear in the headlines.

Privacy concerns remain


Sweeping warrants, such as the one handed down by Gorenstein, raise issues such as time limits on access to private accounts. How can we be sure that authorities will not abuse their power, accessing email accounts long after they’ve found what they’re looking for?

“There clearly need to be limits on the scope of digital searches, whether the government is seizing a hard drive or an account with a third party,” Jim Dempsey, senior counsel at the Center for Democracy & Technology, said.

Dempsey goes on to explain that limits are not determined, and “giving the government access to everything digital is no longer an acceptable approach, even with a warrant.”

Though sweeping warrants are troubling, some believe that Gorenstein’s ruling is fair.  Those siding with the judge believe there’s no difference in the police searching a person’s hard drive that may contain terabytes of data, versus a person’s email account.

The matter is still up for debate, and isn’t likely to be resolved in the near future. The number of devices used to access personal data is only growing, presenting more opportunities for law enforcement to request warrants on such devices’ storage, and more questions as to citizen privacy.

photos by rovlls and coffish via photopin cc
Bitcoin Weekly 2014 July 23: Bitcoin Girl, Winklevoss Bitcoin Index API, NYC Yellow Taxi and Bitcoin, Newegg 10% promotion, and more
Wed, 23 Jul 2014 20:03:15 +0000

Recent weeks have seen some fairly good adoption news for Bitcoin, with new businesses climbing on board, most importantly computer giant Dell, and the market price has stayed relatively stable, residing within the $610 to $630 band. The last major market upheaval was the sale of the US Marshals Service’s seized Silk Road bitcoins.

This week, check into the Bitcoin Weekly to learn about Naomi Brockwell, The Bitcoin Girl, who released a music video parody of “Uptown Girl” that plays on many of the Bitcoin community’s memes as part of outreach to the public about cryptocurrency. In preparation for their Bitcoin ETF, the Winklevoss twins have released an Internet-available Bitcoin Index called the “Winkdex.”

"Bitcoin Accepted" signage has been spotted in NYC Yellow Cabs as drivers start to test the waters. And finally, Newegg is making sure that Bitcoin customers don't miss out on a 10 percent off promotion by unexpectedly extending it until July 31.

Bitcoin Girl Naomi Brockwell argues for education and advocacy

The Moving Pictures Institute and Naomi Brockwell recently made news with the music video "Bitcoin Girl." The video, a play on the song "Uptown Girl," introduces Bitcoin community memes and philosophy in a humorous manner and guides viewers to the campaign's website, where Brockwell hopes the public can find education on the subject of virtual currency.

Brockwell's advocacy has not stopped at making an amusing music video; she also recently appeared in an interview at FreedomFest 2014 in Las Vegas, where she spoke about her work changing public perception of Bitcoin.

Brockwell believes that most members of the Bitcoin community don't experience much of the negative exposure Bitcoin receives, because the community is very self-supportive and passionate. With so much of the news focused on how Bitcoin is used for money laundering or could be used for terrorism, plus the bust of Silk Road, the fall of Mt. Gox and other grim headlines, she hopes to counteract the negativity through public outreach and education about what makes the technology work and the community thrive.

She’s a Bitcoin girl, she’s living in her Bitcoin world…

Winklevoss “Winkdex” Bitcoin Index gets the API treatment

It’s open season for developers now that the Winklevoss twins have released the first version of their Bitcoin Index: The Winkdex. The front-end of the whole thing is a lovely, crisp display showing a chart with price points, volume data, and all the wiggly lines everyone enjoys; but the back-end is being opened up to developers with an API.

First, the front-end.

Front-end of the Winkdex.

The twenty-four-hour high, low, net change and volume are displayed prominently. Bitcoin market stats sit right next door: total BTC in circulation, market cap (at $8.1 billion) and current hash rate. These are all the sort of metrics that someone curious about the current Bitcoin market might want on a dashboard, all wrapped up in a tidy display.

Finally, the back-end.

The Winkdex API reference is available on its own page and developers can get access immediately. Currently the API appears to be open (no key needed) and replies to GET requests with JSON. While there’s no authentication needed, the API documentation does ask developers to abide by a one-request-a-minute rate limit and reserves the right to enforce that limit and ban abusive API users.

So far so good. The API supports gzip compression and uses ISO-8601 for time units. It can reply with the current price (via "/price") or a time series of data points (via "/series"), and it returns a sensible set of HTTP error responses.
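
As a rough sketch of how a developer might consume such an API, here is a few lines of Python. The base URL, the JSON field names and the assumption that prices are reported in US cents are all hypothetical, invented for illustration; only the endpoint paths ("/price", "/series") and the one-request-per-minute courtesy limit come from the API documentation described above.

```python
import json

# Hypothetical base URL -- the documentation names the endpoints but
# this host and version path are assumptions for illustration.
BASE_URL = "https://winkdex.com/api/v0"

def price_url():
    """Build the URL for the current-price endpoint ("/price")."""
    return BASE_URL + "/price"

def parse_price(body):
    """Parse a JSON price reply into dollars.

    The field name "price" and the cents denomination are illustrative;
    the real response shape is defined by the Winkdex API reference.
    """
    data = json.loads(body)
    return data["price"] / 100.0

# Example: parsing a hypothetical reply offline. Against the live API,
# remember to stay under the one-request-per-minute rate limit.
sample = '{"price": 62150, "timestamp": "2014-07-23T20:03:15Z"}'
print(parse_price(sample))  # -> 621.5
```

A real client would wrap the GET request itself (with gzip enabled) and back off on the documented HTTP error responses rather than retrying immediately.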

The API itself is extremely simple and, this being version "v0," chances are new features will be added as more people seek to use it.

Meanwhile, the Winkdex API has some strong competition in the market from the APIs of BitcoinAverage and Bitcoin Charts, and the API released by Coinbase (which is used by Google).

Bitcoin accepted spotted in NYC Yellow Taxi

Reddit user CodeSquad spotted a cab in New York with a Bitcoin Accepted Here QR code on the divider window.

In a conversation with the cabbie, CodeSquad says, "He pays 5% processing fee for credit card transactions (or $13 for 12 hour shift), over 90% of his customers pay with credit card. It takes him 2-3 business days to get the money in form of a check for every day’s credit cards earnings."

The argued benefit of Bitcoin would be a much smaller processing fee and virtually instant remittance. Although there could still be processing fees for converting BTC to USD, and a small time lag depending on the service, those fees would still be far less than 5 percent.
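
The arithmetic is simple enough to sketch. The 5 percent card fee comes from the driver's account above; the $260 in card-paid fares per shift and the 1 percent BTC-to-USD conversion fee are hypothetical figures chosen only to make the driver's "$13 for 12 hour shift" line work out:

```python
# Rough per-shift fee comparison. Only the 5% card rate is from the
# driver; the fare total and the 1% BTC conversion fee are assumptions.
CARD_FEE_RATE = 0.05
BTC_FEE_RATE = 0.01    # assumed BTC-to-USD processor fee
shift_fares = 260.00   # hypothetical card-paid fares in one 12-hour shift

card_fee = shift_fares * CARD_FEE_RATE  # matches the driver's ~$13/shift
btc_fee = shift_fares * BTC_FEE_RATE

print(card_fee, btc_fee)  # -> 13.0 2.6
```

On those assumptions the driver would keep roughly $10 more per shift, before counting the faster settlement.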

As for questions of legality, the cab driver dismisses them. Like many drivers, he leases his cab and pays up front for it. How he takes payment for services is not recorded, even though the cost of the trip is logged. The driver joked with CodeSquad that he could agree to take payment in rice should he choose.

According to CodeSquad, the cabbie has had the sign up in his cab for a week; people ask a lot of questions, but nobody has used BTC to pay a fare yet.

Newegg extends 10% off Bitcoin purchases until July 31st

Staying competitive with so many new electronics e-retailers picking up Bitcoin means coupons for Bitcoin users. As a result, Newegg has extended its promotion for Bitcoin buyers by offering 10% off merchandise until July 31st.

During the promotion, buyers can save up to $100 (in the form of 10 percent off products) by paying with Bitcoin at checkout with the "BITCOIN" promo code.
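
In other words, the discount is 10 percent of the cart total, capped at $100, so the cap kicks in on orders above $1,000. A one-line sketch (the cart values are illustrative):

```python
# Newegg "BITCOIN" promo: 10% off, capped at $100 per the terms above.
def bitcoin_discount(cart_total):
    """Return the promo discount in dollars for a given cart total."""
    return min(cart_total * 0.10, 100.00)

print(bitcoin_discount(450.00))   # -> 45.0  (10% applies in full)
print(bitcoin_discount(1500.00))  # -> 100.0 (capped above $1,000)
```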

Inside Nimble’s cloud analytics op | #CubeConversations Wed, 23 Jul 2014 19:42:45 +0000 Solid-state memory offers a number of compelling advantages over traditional disk, but when there are dozens of different vendors all touting the same value proposition, it becomes much more difficult for CIOs to cut through the jargon. It’s even harder for the suppliers themselves, especially newer players such as hybrid array maker Nimble Storage, which is going up against larger and better-established names that already have organizations’ trust.

Yet despite the tremendous challenges, the company managed to break away from the crowd and settle onto a stable growth trajectory that culminated in an expectation-breaking stock market debut last December. Rod Bagg, the vice president of support at Nimble, credits much of that success to his firm’s early focus on hardware instrumentation. He appeared in a recent episode of our CubeConversations series to share what that means for practitioners with SiliconANGLE founder John Furrier.

Deep-diving into the data for better business results


Every day, Nimble collects between 30 million and 100 million data points from each individual array in its vast install base into a centralized cloud-based repository that Bagg refers to as the “stats library.” The system constitutes the back end of the company’s Storage InfoSight platform, a managed service that pushes that information back down to customers in the form of useful insights about the health of their storage environments. The tool provides much-needed visibility into metrics that Bagg said practitioners have not had access to in the past and that are becoming more important by the day.

“As an environment gets more and more complex, and there’s more and more demand on the storage, you just need that transparency,” he explained. “A lot of our customers tell us that with other storage devices they’ve used in the past, they haven’t had that kind of transparency; it was a mystery to them, so we opened that up.”

On top of shedding light on system performance, InfoSight automatically spots operational gaps and generates what Bagg described as “detailed instructions” on how to bring a deployment – be it an individual array or an entire environment – up to par with Nimble’s best practices. And that’s only the beginning, he said. The platform also utilizes a set of advanced predictive algorithms to map out usage trends and identify potential problems before they occur.

That capability is applied mainly to alert organizations when they have to add more capacity before that need turns into a bottleneck which bogs down the entire stack, but Bagg said it can be just as useful in other parts of the data center.

“It might be the simple things like needing more cache but it can be a lot more complex. We do correlation analysis around the environment and what’s happening there: we may uncover a network issue in InfoSight that allows the storage guys to get an alert that there’s an issue in the network causing the database problems,” he detailed.

“Drinking our own champagne”

Nimble is putting the data it collects from customer deployments to good use internally as well, according to Bagg. The company recently rolled out a sizing tool for its salespeople that taps into the stats library to calculate the optimal configuration for any given set of workloads, he said. That functionality allows representatives to provide an accurate assessment of how many, and what kind of, arrays a prospect needs based on details given over the phone, which greatly streamlines the onboarding process.

More automation, less troubleshooting


The fact that InfoSight eliminates much of the manual work involved in managing and monitoring storage environments makes life easier not only for users but for Nimble’s support organization as well, Bagg highlighted. Not having to wrangle with mundane tasks such as best-practice enforcement has enabled his team to do away with the traditional tiered structure of care centers in favor of a much simpler setup wherein the engineer who handles the first call stays with the customer until the issue is resolved, he said.

That approach has saved untold amounts of time and effort for both users and Nimble itself, but implementing it was a challenge. The company serves organizations all over the world, Bagg pointed out, which required establishing a local presence in multiple regions outside North America in order to maintain the level of service they have come to expect.

“Because we’re only hiring very senior people, it’s difficult to hire senior people in Silicon Valley that are gonna work through the night. But you can get great top talent when they’re allowed to work 9AM through 5PM,” he explained. “So we have support centers around the world so that everybody is working regular daylight hours and it’s seamless from that perspective.”

Docker ships virtual container DevOps and orchestration with Orchard acquisition Wed, 23 Jul 2014 18:45:06 +0000 Docker, Inc., the commercial sponsor of the Docker open source project, has announced the acquisition of Orchard Laboratories, Ltd. for an undisclosed amount. Orchard has a long history with Docker as a creator of solutions for Docker, including a hosting service and Fig, an application composition and orchestration tool.

About the acquisition, Ben Golub, CEO of Docker, said, “Orchard really stood out to us with their vision of what developers need and their delivery of well-designed services and products.” He emphasized that Docker’s mission is to give developers the best tools to build, deploy, and frictionlessly move web-scale applications between bare metal, virtualized, and cloud environments.

Orchard’s orchestration tool, Fig, is particularly useful to developers because it allows easy composition and management of multi-container Docker applications. With a simple configuration file, a developer can define application dependencies, storage dependencies and anything else needed to rapidly boot the application from a cold start. The orchestration process then automatically starts all the components.
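
Fig's configuration file is a YAML document with one top-level key per container. A minimal sketch of the kind of file a developer might write is below; the service names, image and ports are hypothetical, invented only to illustrate the idea:

```yaml
# fig.yml -- hypothetical two-container application
web:
  build: .         # build the app's image from the local Dockerfile
  links:
    - db           # declare a dependency on the "db" container
  ports:
    - "8000:8000"  # expose the app on port 8000
db:
  image: postgres  # pull a stock database image
```

Running `fig up` against a file like this builds and starts both containers, with the dependency links ensuring the database is available to the web container, which is the "automatically starts all the components" behavior described above.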

Ben Firshman, CEO and co-founder of Orchard, and co-founder Aanand Prasad will lead developer environment initiatives at Docker while continuing to maintain Fig.

DevOps and virtual containers

Linux virtual containers (LXC) have come into vogue in the developer community for providing easily shippable applications for use in virtualized and cloud environments. As a result, tools that can easily move, deploy, configure, and monitor them will become part of the toolset every DevOps team needs.

Docker’s acquisition of Orchard will allow the open source project to ship Fig and other tools with its distribution, enabling easier out-of-the-box configuration and orchestration.

When a virtual container is sealed, it may have all the configuration it needs to run itself, but it still knows nothing about where it will be deployed. An LXC management and orchestration tool is still needed to oversee the deployment environment and coordinate the different virtual containers when they are activated. Automating the process of checking dependencies, applying network connections and handling other configuration means a faster cycle through development, testing, and deployment, and easier maintenance.

This follows a trend of DevOps configuration and management tool vendors integrating with Docker. For example, Chef Software, Inc. recently announced a new version of Chef designed for “web-scale IT” (a Gartner-coined term for the use of cloud computing and virtualization at scale) that also includes strong Docker and virtual container integration.
