SiliconANGLE: Extracting the signal from the noise.

Future role of the CIO will focus on business optimization | #MITCDOIQ
Fri, 01 Aug 2014 19:00:37 +0000

The CIO of a major health care firm speculated last week that the CIO job itself could disappear over the next 10 years, with responsibility accruing to the emerging role of Chief Data Officer.

James Noga, VP and CIO of Partners HealthCare, said CIOs will have less responsibility for managing infrastructure complexities as more of those functions move to the cloud. In an interview with Dave Vellante and Jeff Kelly of SiliconANGLE at the MIT Chief Data Officer and Information Quality Symposium, Noga said this kind of change isn’t new.

“In the 1800’s, every manufacturing plant had its own power plant. Yes, we’re still building data centers, but I think you’re going to see less of that,” said Noga.

Noga advises CIOs to focus on the optimization of workflow processes, offering Lean and Six Sigma as examples. Although CIOs are typically enablers of the business strategy, optimizing the business itself is going to become more important. Noga said CIOs are effectively morphing into chief operating officers, and that the day is coming “when a COO is fairly competent in information technology as well as business operations,” with a CDO or CTO reporting to them. The COO and CIO roles are converging, he said.

Getting Involved in Business Operations


The challenge for CIOs is to address non-IT issues from a business strategy perspective. Noga recommended that CIOs actively engage senior leadership, adding that they may even have to push some doors open rather than waiting for doors to be opened for them.

“You’re no longer at the end of the line. As decisions are made, you become part of that formation of the strategy rather than a receiver of the strategy,” said Noga.

Advice for CIOs


Rather than seeing this transition as a threat, Noga thinks CIOs should see it as an opportunity for additional influence and visibility. “Maybe [the CIO] will retain the title, but regardless, I think it will be as I’ve described it – more of a COO role,” he said.

See Noga’s entire segment below:

Software that keeps employees happy, automates deployments faster
Fri, 01 Aug 2014 17:34:32 +0000

This week’s Smart IT roundup features a new program to keep employees in the loop, a virtual machine that reduces automated deployment time, and the state of software-defined data centers.


Klick Health’s Genome


Klick Health (Klick Inc.), a Canadian company providing digital marketing services for the health care industry, is following in the footsteps of hyperscale leaders like Amazon.com Inc. and Google Inc. by improving connection points among its workers.

Genome is an enterprise operating system built on ticketing software that enables smoother workflow and open communication among employees. Every project generates a ticket, and every task an employee performs is tied to a ticket, as are all communications. This allows employees to pitch in ideas or track the progress of a project as easily as following an email thread.
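As a rough sketch of that ticket-centric model (illustrative only; the class and field names here are hypothetical, not Klick’s actual schema), every task and every message hangs off a ticket, so a project’s history reads like a single thread:

```python
# Hypothetical sketch of a ticket-centric workflow model like the one
# described above; names are illustrative, not Klick's actual schema.
from dataclasses import dataclass, field

@dataclass
class Ticket:
    project: str
    tasks: list = field(default_factory=list)      # work performed
    messages: list = field(default_factory=list)   # all communications

    def timeline(self):
        """Merge tasks and messages chronologically so anyone can follow
        the project as easily as an email thread."""
        return sorted(self.tasks + self.messages, key=lambda e: e["at"])

t = Ticket(project="Campaign launch")
t.tasks.append({"at": 1, "who": "ana", "did": "drafted creative"})
t.messages.append({"at": 2, "who": "raj", "said": "looks good, ship it"})
history = [e["who"] for e in t.timeline()]
```

Because every event is attached to a ticket, “who did what, when” falls out of a single sorted merge rather than a search across inboxes.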

Genome also utilizes Big Data and social technologies to customize the employee experience, increase engagement and speed up training. It can anticipate when an employee is about to perform a task and make appropriate recommendations, such as tutorial videos or other employees who can help with the task. It can even alert the employee when a knowledgeable colleague is nearby, since Klick Health uses a card-based security system that tracks the whereabouts of employees inside the building.

LinMin’s Server Provisioning Virtual Appliance


LinMin Corp., provider of the IT automation software LinMin Bare Metal Provisioning, announced the immediate availability of its Server Provisioning Virtual Appliance. The goal is to accelerate the automated deployment of systems running Linux, Windows Server, VMware ESXi and other hypervisors in fast-growing or frequently repurposed data centers.

The Server Provisioning Virtual Appliance is packaged as an Open Virtualization Format (OVF) virtual machine with Linux and LinMin Bare Metal Provisioning 6.5 pre-installed. It significantly reduces the time needed to implement automated deployment of servers, blades and VMs running Windows Server, Linux and hypervisors, and it can also capture, restore and clone entire systems.

“LinMin is vendor-neutral, offering customers the ultimate flexibility when selecting system manufacturers, OS providers and data center topologies, with the assurance that our Virtual Appliance will meet their data center system deployment requirements,” said Laurent Gharda ( @LinMin ), CEO and founder of LinMin. “The Server Provisioning Virtual Appliance sets new standards for ease of installation, configuration and usability in an arena where traditional data center solutions have been costly and difficult to implement.”

Cirba to lead SDIC revolution


Cirba Inc., a provider of software-defined infrastructure control solutions, has shared its perspective regarding the importance of intelligent control and management of the software-defined data center (SDDC).

According to the statement, SDDC is not achieved by “simply bolting together virtualization, software-defined networking, and other cutting-edge and software-defined technologies,” but by “eliminating current silos of compute, storage, network and software and adopting a new way of managing and controlling all the moving parts within the infrastructure.”

Cirba hopes to pioneer what it believes is the key to aligning the capabilities of the infrastructure with the requirements of the applications it supplies, in a way that stays true to the goals of the SDDC. Dubbed Software-Defined Infrastructure Control (SDIC), the approach comprises several aspects, including demand management, capacity control, policy and automation, to enable this type of control.

“The ability to make unified, automated decisions that span compute, storage, network and software resources, that are based on the true demands and requirements of the applications, and that are accurate enough to drive automation without fear, is the foundation of the next generation of control of IT infrastructure. SDIC bridges the gap that has opened up in the data center management ecosystem and in many ways is the heart of the SDDC,” Cirba said in its press release.

photo credit: wili_hybrid via photopin cc
The ‘father of data warehousing’ moves on to bigger things | #MITCDOIQ
Fri, 01 Aug 2014 16:00:16 +0000

The “father of the data warehouse” hasn’t talked data warehousing in a decade. Instead, Bill Inmon is tackling the thorny problem of making the 80 percent of corporate data that’s in unstructured form easily available for analysis.

Inmon, who is credited with writing the first book about data warehousing as well as coining the term, said his latest venture holds even more promise than his previous two successful startups. Forest Rim Technology Inc. provides textual ETL (extract, transform, load) software that mines insight from free-form text. Inmon dropped by SiliconANGLE’s theCUBE at the recently concluded MIT Chief Data Officer and Information Quality (CDOIQ) Symposium to provide a rare look behind the scenes of the ambitious effort.

The Colorado-based Forest Rim sets itself apart with a unique approach that attempts to discern the meaning of raw text by its context rather than just the contents, Inmon told hosts Dave Vellante and Paul Gillin, a methodology that it claims allows it to tackle even the most unwieldy of datasets. That includes everything from contact center transcripts to the clinical narratives used by doctors to describe a specific medical situation to their colleagues, linguistically complex workloads that necessitate an equally elaborate analytic process to ingest.

“We take language for granted because we speak it and to us language is very natural and normal, but when you start to put language into a computer, it’s anything but natural and normal,” Inmon explained. “So when it comes to the question of how you do contextual analysis, you have a hundred different ways that you do it because in language, there’s a hundred different ways context can occur and appear to us.”

It took Inmon and his team of researchers a full 12 years to develop a set of algorithms that can handle that tremendous variety, he detailed, and they’re still nowhere near the 100 mark. But he said the offering has nonetheless proven effective in the field at addressing the formidable challenge of mapping unstructured text into the rows and columns of the relational databases enterprises rely on today.

From there, the information can be easily fed into a downstream analytical process, be it a traditional business intelligence (BI) solution like SAP BusinessObjects or a visualization product like Tableau or QlikView. Inmon said that simplicity has earned Forest Rim several paying customers across multiple industries, including several household names he wouldn’t identify.
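To make the idea concrete, here is a heavily simplified sketch of what a textual ETL pass might look like. This illustrates the general concept only, not Forest Rim’s algorithm; the vocabulary and field names are hypothetical:

```python
# A toy "textual ETL" pass (illustrative only, not Forest Rim's method):
# scan free-form, clinical-style notes for known terms and emit one
# relational row per hit, keeping a window of surrounding words as a
# crude stand-in for context.
import re

TERMS = {"fever", "cough", "fracture"}  # hypothetical domain vocabulary

def textual_etl(doc_id, text, window=2):
    words = re.findall(r"[a-z]+", text.lower())
    rows = []
    for i, w in enumerate(words):
        if w in TERMS:
            ctx = " ".join(words[max(0, i - window):i + window + 1])
            rows.append({"doc": doc_id, "term": w, "context": ctx})
    return rows  # rows/columns ready to load into an RDBMS for BI tools

rows = textual_etl("note-1", "Patient reports mild fever and a dry cough.")
```

The real work, as the article notes, lies in the hundred-odd ways context can occur in natural language; a fixed word window is the crudest possible proxy for that.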

See the entire interview below:

Report: SDN landscape divided among fast-moving majority and cautious pragmatists
Fri, 01 Aug 2014 14:28:16 +0000

The ability to provision network resources at the push of a button and upgrade the underlying infrastructure without worrying about compatibility issues may seem like a distant dream for practitioners today, but a new report suggests that programmable transport capacity is on a fast track to becoming enterprise reality.

In a survey of 400 North American technology leaders across the government, education, financial services and healthcare sectors, Juniper Networks Inc. found that more than half of organizations, or 52.5 percent, plan to implement software-defined networking (SDN) in their environments. Even more surprising, the majority of respondents who said they intend to embrace the paradigm expect to do so within the next 12 months, a stark departure from the sluggish pace of technology adoption that has historically characterized the networking world.

As Wikibon principal research contributor Stu Miniman observed in a 2013 paper, far less disruptive advancements have historically taken a decade to pass the various standards committees and propagate through the market. But the benefits of SDN are apparently too great to hold off. Juniper’s study reveals that IT leaders universally acknowledge the tremendous potential of the technology, yet remain divided on what the single biggest new opportunity is.

Topping the list was improved network performance and utilization with 26 percent of the vote, followed by simplified management at 19 percent and operational efficiencies and automation tied for third with 13 percent each. Meanwhile, 9 percent of the participants said that greater security is the most important advantage of SDN and 7 percent named reduced equipment costs as the biggest plus, placing hardware-related cost savings second from last only to the obligatory “Other” category, which accounted for the remaining 3 percent.

The poll indicates that SDN is well-recognized as a game changer for networking, but it also makes clear that most organizations still have a long way to go towards realizing the potential benefits in the field.

Just 27 percent of participants said their companies are prepared or almost entirely prepared to adopt the technology, considerably fewer than the number of respondents who had earlier stated they intend to launch an SDN initiative in the foreseeable future. The education segment led the pack in readiness at 34 percent, while government trailed behind with just 13 percent of leaders in that segment saying their organizations are more or less set to make the transition. Another 38 percent of those polled gave their companies a “C” for software-defined networking preparedness, saying they are “somewhat” ready to implement the technology.

Cost remains the biggest barrier


The main barrier to adoption is cost, it turns out, with a full 50 percent of participants naming it as the biggest challenge. That aligns with the fact that 63 percent said they expect business networks will include a mix of software-defined and legacy components in five years’ time, which underscores the cross-sector reluctance to jump into costly “rip-and-replace” initiatives. Meanwhile, 35 percent of respondents named lack of integration with existing systems as the top barrier to adoption, while 34 percent pointed at security.

photo credit: subarcticmike via photopin cc
Twitter notes sharp rise in government requests for user data
Fri, 01 Aug 2014 11:37:32 +0000

Twitter has just released its latest transparency report, revealing a nearly 50 percent rise in the number of user data requests over the last six months.

A total of 54 governments made requests for data tied to specific user accounts, with eight of those nations doing so for the first time. In total, Twitter received 2,058 requests from governments in the first six months of this year, which is 46 percent more than in the previous six months.

Twitter said the requests pertained to 3,131 accounts, which is 48 percent more than in its last transparency report. The company agreed to 52 percent of the requests.

“The continued rise may be attributed to Twitter’s ongoing international expansion, but also appears to follow the industry trend,” the company wrote. “As always, we continue to fight to provide notice to affected users when we’re not otherwise prohibited.”

Not surprisingly, the United States government was the greediest of the lot, accounting for 61 percent of all requests Twitter received. U.S. authorities also seem good at getting their way: 72 percent of their requests for data were approved, a higher success rate than for any other nation.

Coming in second was Japan, which made 192 requests for data, or nine percent of the total. Saudi Arabia was in third place with 189 data requests, while the UK government made 78 requests in total, achieving a 46 percent success rate.

Twitter’s figures follow a key legal ruling yesterday that could see law enforcement agencies granted even more power to access people’s data. Microsoft has just been ordered by a U.S. judge to allow investigators to access its data facility in Ireland. Unless Microsoft’s appeal succeeds, the decision could set a precedent for courts to force other companies to open up their offshore data centers as well.

photo credit: TarikB via photopin cc
Stellar debuts with a bang, gives away digital currency for free
Fri, 01 Aug 2014 11:00:31 +0000

There’s a new kind of cryptocash in town going by the name of “Stellar”, but unlike other digital monies, it plays well with both existing cryptocurrencies and regular fiat currencies. What’s more, its creators are giving away stellars for free. Sound intriguing? Read on.

Stellar, as described in its blog post, is a “decentralized protocol for sending and receiving money in any pair of currencies.” Simply put, it’s a service that makes it easy to send money anywhere while letting you choose the currency it arrives in. For example, if you send US dollars to a friend in Europe, you can set it so they receive the cash in euros. What sets it apart from traditional money transfer services is that it has its own digital currency, also called stellar, whose value is determined by the market, just like Bitcoin.

The developers behind Stellar explain that its main purpose is to provide a conversion path between other currencies. It uses a consensus algorithm instead of mining, so transactions are completed in seconds.

“Each node in the network communicates with a set of other nodes that it believes will not collude (such as nodes run by universities, governments, and companies). Importantly, it doesn’t need to trust the nodes themselves — it just needs to believe the nodes won’t work together to produce the same malicious result. Consensus is then reached by an iterative process, which results in each new ledger being decided upon every few seconds. Correspondingly, transactions confirm nearly instantly, and no mining is needed,” says the project’s explanation of how its algorithm works.
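A toy simulation can make that quorum-style process concrete. This is a sketch of the general idea only, not Stellar’s actual implementation; the 80 percent threshold and node names are arbitrary assumptions:

```python
# Each node trusts a set of peers and accepts a candidate transaction once
# a supermajority of that trusted set votes for it; repeating the round
# lets agreement ripple across the network in a few iterations, with no
# mining involved. A loose illustration of the description quoted above.

THRESHOLD = 0.8  # arbitrary supermajority threshold for this sketch

def converge(trust, votes, max_rounds=10):
    """trust: node -> set of trusted peers; votes: node -> bool."""
    for _ in range(max_rounds):
        new_votes = {
            node: votes[node]
            or sum(votes[p] for p in peers) / len(peers) >= THRESHOLD
            for node, peers in trust.items()
        }
        if new_votes == votes:  # settled: no node changed its opinion
            return new_votes
        votes = new_votes
    return votes

# Four nodes that each trust the other three; one starts out dissenting.
trust = {n: set("abcd") - {n} for n in "abcd"}
votes = {"a": True, "b": True, "c": True, "d": False}
result = converge(trust, votes)
```

In this run the dissenting node sees all three of its trusted peers voting yes and flips within one round, which is why, as the quote says, ledgers can close every few seconds rather than every ten minutes.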

Another thing you need to know about Stellar is that its supply increases at a fixed rate of one percent per year to account for both economic growth and lost stellars.

Stellar is backed by Stripe, a company that allows users to accept payments via the Internet. Stripe has invested $3 million in Stellar and in return has received two percent of all stellars, while Stripe CEO Patrick Collison has been made a member of Stellar’s board. Even so, Stripe doesn’t run the new cryptocurrency, but it is bullish on the cryptocurrency space in general and believes that systems like Stellar’s should have a place in the world.

Stellar is in the early stages of bridging the gap between traditional and digital currencies, so it’s inviting developers who are interested to work on this new open source project to participate.

If you’re still doubtful about digital currencies and don’t want to risk your own money investing in it or just trying it out, you can start using Stellar for free.

All you have to do is register on the Stellar site, create a memorable username and password, and optionally provide your email address so you can retrieve your account if you ever forget your credentials.

After registering, you’ll be presented with a guide that explains how to get your free stellars. You need to connect your Stellar account to Facebook to receive your first 5,000 STR, and if you want another 1,000 STR, you need to confirm your email address and enter a retrieval code sent to your inbox. The tutorial will then teach you how easy it is to send stellars. Since it’s your first transaction, if you send 1,000 STR you’ll get 1,000 STR back for your good deed; Stellar recommends three recipient organizations, which variously help coders, fight poverty or fight illiteracy in poverty-stricken lands.

When you’ve completed all the tasks, you’ll have 6,000 STR in your account. So what are you going to do with your STR? You can keep donating to charities or organizations of your choice, hold onto it, or use it to entice your friends to create their own Stellar accounts and send one another stellars. Stellar is still in its early stages, so its use is limited, but just like Bitcoin, it has huge potential.

There’s no harm in trying when you have nothing to lose, as it’s being given away entirely free. So, what are you waiting for?

photo credit: zcopley via photopin cc
Microsoft ordered to let Feds snoop inside its overseas servers
Fri, 01 Aug 2014 10:20:59 +0000

Microsoft has just lost a key court case that could result in it being forced to give U.S. authorities access to its overseas data centers.

On Thursday, District Judge Loretta Preska of the U.S. District Court for the Southern District of New York said that a warrant issued by U.S. authorities seeking access to Microsoft’s servers in Dublin, Ireland, was valid. As a result, Microsoft will have to hand over reams of customers’ emails and other data.

Microsoft claimed that because the data was managed by a foreign subsidiary, the U.S. doesn’t have jurisdiction. Judge Preska didn’t buy that argument. “It is a question of control, not a question of the location of that information,” she ruled.

Microsoft has been one of the staunchest defenders of data privacy in recent years. It’s been particularly vocal in opposing requests for customer data, especially in cases where the request is served with a gag order that prevents it from informing the customers concerned.

Last May, Microsoft successfully challenged a National Security Letter it received from the FBI. Following that decision, the company said it would continue to challenge any future letters it received that demand data from its government and enterprise customers.

In this particular case, the search warrant demanded access to the emails of one of Microsoft’s European customers, ostensibly as part of a drug investigation. It’s not known which agency issued the warrant.

Microsoft claimed that the order wasn’t valid because U.S. warrants cannot reach across the country’s borders. It said the order could set a dangerous precedent.

“If the U.S. government prevails in reaching into other countries’ data centers, other governments are sure to follow,” wrote Microsoft general counsel Brad Smith in the Wall Street Journal. “One already is. Earlier this month the British government passed a law asserting its right to require tech companies to produce emails stored anywhere in the world. This would include emails stored in the U.S. by Americans who have never been to the U.K.”

The judge was more convinced by the Justice Department’s argument that U.S. law already recognizes warrants for some types of data held offshore. This includes the financial records of U.S. banks located overseas. It reasoned that because Microsoft is a U.S. company and controls the data held in Ireland, the same rules should apply.

Microsoft further argued that the emails belong to their customers, just as postal letters do, and should therefore be given more privacy protections than business records. Judge Preska wasn’t persuaded by that argument, either.

Microsoft has already announced its intention to appeal the ruling, and Judge Preska has agreed to suspend her decision until it can be challenged in the Second US Circuit Court of Appeals.

photo credit: via photopin cc
BadUSB exploit can hack any device, and there’s no cure in sight
Fri, 01 Aug 2014 08:03:10 +0000

Ready for another security scare? Good, because every single USB device that you plug into your computer could pose a threat that’s worse than any malware.

It really is as bad as it sounds. Security researchers Karsten Nohl and Jakob Lell of SR Labs have stumbled upon a flaw that affects every USB device ever made, including thumb drives, mice, keyboards and headphones. The flaw makes it possible for the firmware on a USB device to be reprogrammed by malicious software to launch just about any kind of attack once the device is plugged into a computer.

Worse, there appears to be no way to protect against this vulnerability.

“No effective defenses from USB attacks are known,” wrote the researchers on the SR Labs site. “Malware scanners cannot access the firmware running on USB devices.”

This is because traditional anti-virus software is designed to scan only the file contents of attached drives. Most programs can even scan hidden files easily, but this is not where Nohl and Lell’s nasty code resides.

Rather, the pair managed to reverse engineer the fundamental firmware on USB devices, which is where the virus hides out. The firmware is a piece of code that tells the PC what to do when a device is plugged into it, and it can’t be scanned by current anti-virus programs.

The bad news is that a compromised USB device could basically do anything it wants to your PC. In an interview, Nohl said, “It can do whatever you do with a keyboard, which is basically everything a computer does.”

Even worse, the BadUSB exploit spreads easily. It can be passed from an infected USB device to a PC, and from there it can infect any other USB device that’s plugged in later. “You can give it to your IT security people, they can scan it, delete some files and give it back to you telling you it’s ‘clean’”, said Nohl. But that isn’t true.

What this means is that you can no longer trust any USB device that’s been plugged into another PC.

Nohl and Lell are expected to describe the exploit in more detail at next month’s Black Hat security conference in Las Vegas. Assuming the threat is as bad as they claim, it could force manufacturers to carry out a massive re-architecting of USB standards to protect against the exploit.

For now, though, Nohl and Lell’s findings have yet to be independently verified, so there’s a chance the threat could be debunked once experts get a chance to examine the research in more detail. But until then, be careful about which USB devices you plug into your PC, if you care at all about the data on it.

photo credit: ntr23 via photopin cc
XpoLog hopes search makes the difference in log management
Thu, 31 Jul 2014 19:47:21 +0000

The larger a computing infrastructure gets, the more difficult it becomes to pore through server logs to track down anomalies or errors. XpoLog Ltd. has introduced log management tools for Big Data that promise better searches at what it claims is a fraction of the cost of its competitors.

XpoLog, which has been around for a decade, focuses on building log analytics software for applications rather than for networking or security appliances. This has led it to develop an approach to searching that CEO Haim Koschitzky calls “augmented search”.

Augmented search combines machine semantic analytics on application log data with the context of a user’s search to provide layers of results that are organic and can be tagged and naturally filtered according to a user’s previous searches.

“You can create your own saved searches on the data, can tag those searches for later, and when other users search, you will be made aware that those searches coincide with the context of your search,” Koschitzky explained.

This is not only good for traditional system administrators but can also help DevOps teams continuously deploy new services, triage system anomalies, and diagnose and address problems without necessarily being familiar with the contents of a log. The augmented search engine can be tailored to their needs and specific understanding.
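As a rough sketch of how such an engine might behave (a guess at the concept, not XpoLog’s actual implementation; the data and function names are hypothetical), a query returns both matching log lines and previously saved searches whose tags overlap the query’s terms, so users benefit from each other’s prior work:

```python
# Illustrative sketch of "augmented search" as described above; this is
# a simplification of the concept, not XpoLog's actual engine. Saved
# searches carry tags, and a new query also surfaces saved searches
# whose tags intersect the query's terms.
saved_searches = [
    {"query": "disk latency spike", "tags": {"storage", "latency"}},
    {"query": "login failures",     "tags": {"auth", "errors"}},
]

def augmented_search(query, logs):
    terms = set(query.lower().split())
    hits = [line for line in logs if terms & set(line.lower().split())]
    related = [s["query"] for s in saved_searches if s["tags"] & terms]
    return hits, related

logs = ["ERROR auth token expired", "disk write ok"]
hits, related = augmented_search("auth latency", logs)
```

Even in this toy form, a DevOps engineer querying “auth latency” is pointed at colleagues’ earlier searches on authentication errors and storage latency without having to know what the logs contain.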

This video provides a glimpse of augmented search in action:

XpoLog competes with Splunk, Inc. and several open source options, such as Graylog2. Koschitzky said XpoLog’s advantages are its augmented search, its flexible pricing plan and its interoperability with log management tools like Logstash.

XpoLog currently offers its log analytics software free for systems with 1GB per day or less of log data, and it has already handed out more than 500 free licenses. The free licenses have actually helped XpoLog crowd-source information about usage that ultimately improves the product, Koschitzky said.

Moving forward, XpoLog plans to strengthen its log analytic offerings and also build a cloud service.

“Our roadmap is really around how we help applications running on a variety of infrastructures in DevOps and production support using Hadoop,” Koschitzky said. “We have integration on collecting data sources from Hadoop and also for running the semantic analytical capabilities on the Hadoop logs.”

Does XpoLog have the wherewithal to take on Splunk and other log analytics vendors? You can be the judge by downloading a trial version of the software or trying out the free license. What’s clear is that the landscape for Big Data log analytics is now blooming with choices.

photo credit: Kris Krug via photopin cc
Report: Amazon feeling the competitive heat in cloud services
Thu, 31 Jul 2014 18:25:30 +0000

Microsoft Corp. and IBM Corp. have made significant gains in the fast-growing markets for platform-as-a-service (PaaS), infrastructure-as-a-service (IaaS), software-as-a-service (SaaS) and related businesses, according to a new report from market watcher Synergy Research Group. (See graph below.)

The report shows Microsoft’s cloud-related business growing 164 percent year-over-year in the second quarter of 2014 while IBM/SoftLayer grew 86 percent. Amazon.com Inc.’s Amazon Web Services (AWS), by comparison, grew 49 percent. The AWS growth was on a much larger base, but Synergy now shows that the total actual growth of AWS’s four major competitors (among them Microsoft, IBM and Google) is larger than that of AWS, with Microsoft and IBM accounting for the largest share of the increase. A year ago, AWS’s growth was significantly larger than that of those four competitors combined.

Synergy estimates that quarterly cloud infrastructure service revenues now have reached $3.7 billion with annual revenues exceeding $13 billion. Microsoft and IBM have been gaining market share steadily over the last four quarters while the market shares of AWS and Google have been essentially unchanged.

The report estimates total AWS revenues at well in excess of $1 billion per quarter, nearly all of it from cloud infrastructure services. It reports that IBM and Microsoft also claim quarterly revenues of around $1 billion, but those include revenue from SaaS, cloud-related hardware products and associated professional and technical services.

“Microsoft is making huge strides in IaaS and PaaS, while IBM now has clear leadership in the private & hybrid infrastructure services segment,” said Synergy Chief Analyst and Research Director John Dinsdale.

Shift to hybrid


IBM’s hybrid strength could prove particularly significant. For more than a year, analysts and corporate CIOs have been saying that medium-to-large enterprises have a strong preference for moving to private and hybrid clouds rather than making the leap directly to full external IaaS services for their core infrastructure. In October, for example, Gartner Inc. issued a report predicting that nearly half of large enterprises will have hybrid cloud deployments by the end of 2017.

AWS is handicapped in this growth market since it has not announced an on-premise version of its IaaS platform and does not offer services to help companies integrate their internal private clouds with its service. IBM, Hewlett-Packard Co. and other enterprise players have on-premise private cloud solutions designed to integrate with their IaaS platforms.

AWS remains the overall leader in the cloud services market and is likely to maintain its lead in IaaS for some time. However, it no longer has the market to itself. The question is whether it will be happy with the market share it can capture and hold as a pure PaaS/IaaS provider while these competitors gain share steadily, or whether it will seek to branch out into hybrid clouds. So far Amazon has remained in its comfort zone, but complacency has never been characteristic of the company.
Cloud Infrastructure Revenues Growth graph courtesy Synergy Research Group