Big Data needs drive R as a powerful enterprise-ready language

As Big Data reaches broader enterprise adoption, the programming languages used to write schemas and produce Big Data analysis algorithms are rushing to keep up. As a result, the open source statistical language R has become a go-to skill for Big Data scientists and developers, and its popularity is soaring.

Combined with Big Data tools, the R language gives analysts a deep statistical handle on large data sets, supporting statistical analysis and data-driven visualization. R is particularly widely used in finance, pharmaceuticals, media and marketing, where it helps guide data-driven business decisions.
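
For readers new to the language, here is a minimal, self-contained sketch of the kind of exploratory analysis and visualization described above, using only base R and the built-in mtcars data set so it runs anywhere:

```r
# Exploratory statistics and visualization in base R, on the built-in mtcars data
data(mtcars)

summary(mtcars$mpg)                       # descriptive statistics
fit <- lm(mpg ~ wt + hp, data = mtcars)   # ordinary least-squares regression
summary(fit)                              # coefficients, R-squared, p-values

# Data-driven visualization: fuel economy against weight, with a fitted trend
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1,000 lbs)", ylab = "Miles per gallon",
     main = "Fuel economy vs. vehicle weight")
abline(lm(mpg ~ wt, data = mtcars), col = "red")
```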

The popularity of R has grown significantly in recent years. A 2013 survey of data mining professionals conducted by Rexer Analytics indicated that the R programming language is by far the most popular statistical analysis tool, with 70% of respondents saying they use it at least occasionally. Developers interested in learning more about R can look into training on the subject to get a better grasp of its use in the Big Data paradigm.

In the enterprise market, numerous companies and projects have risen to harness R and bring it to Big Data scientists and business users alike. These projects and tools include the use of R in Microsoft's Azure Machine Learning cloud platform, IBM's Big R, Teradata Aster R, Oracle R Enterprise, Pivotal's PivotalR package, and SAP's R integration for HANA.

Azure Machine Learning is a game changer with R

Microsoft last month announced Azure Machine Learning (ML), a platform dedicated to cloud-based predictive analytics on large volumes of data. The Azure ML cloud service lets scientists and developers integrate predictive analytics directly into their applications.

What is interesting is that Microsoft is providing APIs and templates based on the R language. Azure ML supports more than 300 R packages and allows users to assemble a model suited to their needs out of existing pieces rather than forcing developers to build from scratch. That ease of implementation makes machine learning accessible to investigators from a wide variety of backgrounds, even non-data scientists.

Microsoft says the Azure ML platform can predict future trends in systems such as search engines, online recommendations, ad targeting, virtual assistants, demand forecasting, fraud detection, spam filtering and more.

IBM integration with Big R

IBM InfoSphere BigInsights Big R is a library of functions that provides end-to-end integration between the R language and InfoSphere BigInsights. Big R can be used for comprehensive data analysis on the InfoSphere BigInsights server, removing some of the complexity of manually writing MapReduce jobs.

This makes it easy to write and execute R programs that operate on big data: using Big R, an R user can explore, transform, and analyze data hosted in a BigInsights cluster with familiar R syntax and paradigms.

Teradata Aster R

The rapid adoption of R and its proven value mean that organizations looking to drive new revenue-generating insights should make R part of their predictive analytics strategy. Teradata, the analytic data platforms company, recently introduced Teradata Aster R, which extends the power of open source R analytics by lifting its memory and processing limitations.

Teradata Aster R gives analysts an enterprise-ready business analytics solution that is highly scalable, reliable and easy to use, letting companies process massive amounts of data at high speed to meet their analytical needs. To support R analysts, Teradata offers the familiar R language and tools, massive processing power, and a rich set of analytics. In addition, analysts get access to an immense volume of integrated data from multiple sources.

Teradata Aster R runs on a high-performance computing platform, with its advantages in security and data management, and includes a set of analytics components: the Teradata Aster R Library, the Teradata Aster R Parallel Constructor, and Teradata Aster SNAP Framework integration.

Oracle R Enterprise

Oracle R Distribution is Oracle's free distribution of open source R. The database company also offers Oracle R Enterprise, which integrates the Oracle database with R chiefly by overloading many R data types with database-backed variants.
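
As a rough illustration of that overloading idea, the hypothetical Oracle R Enterprise session below shows the general idiom: tables surface as ore.frame proxies, and ordinary R operations on them are pushed down to the database. The connection details and the ORDERS table are invented, and the exact API surface may differ by ORE version:

```r
# Hypothetical Oracle R Enterprise session (connection details and the
# ORDERS table are invented; API details can vary by ORE version).
library(ORE)

ore.connect(user = "rquser", sid = "orcl", host = "db.example.com",
            password = Sys.getenv("ORA_PWD"), all = TRUE)

# ORDERS is exposed as an ore.frame: a proxy object, not local data
big <- ORDERS[ORDERS$AMOUNT > 1000, ]   # the filter runs in the database
head(big)                               # only a few rows travel back to R

ore.disconnect()
```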

The company also offers Oracle Big Data Connectors that facilitate interaction and data exchange between a Hadoop cluster and Oracle Database. Oracle R Connector for Hadoop is a set of R packages that supports the interface between a local R environment, Oracle Database, and Hadoop.

Oracle's strategy with R Enterprise is to provide in-database analytics capabilities for its widely adopted enterprise RDBMS and for its Exadata appliance.

R for Big Data with PivotalR

PivotalR is an R package with a familiar user interface that enables R users to interact with the Pivotal (Greenplum) Database, as well as Pivotal HD and HAWQ, performing in-database and in-Hadoop computations for Big Data analytics.
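
A hedged sketch of the PivotalR idiom follows: db.data.frame wraps a database table without pulling rows into R, and model fitting is delegated to MADlib inside the database. The connection parameters and the sales table are hypothetical:

```r
# Sketch of the PivotalR idiom: computation stays in the database and only
# results come back to R. Connection details and the sales table are invented.
library(PivotalR)

cid <- db.connect(host = "gpdb.example.com", port = 5432,
                  dbname = "analytics", user = "datasci")

sales <- db.data.frame("public.sales", conn.id = cid)    # a wrapper; no data pulled
fit <- madlib.lm(revenue ~ units + region, data = sales) # fit via MADlib, in-database
summary(fit)

db.disconnect(cid)
```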

Pivotal positions HAWQ as the key differentiating technology that makes Pivotal HD, in its words, the world's most powerful Hadoop distribution. Alongside its R language support, HAWQ offers Dynamic Pipelining, a world-class query optimizer, horizontal scaling, SQL compliance, interactive querying, deep analytics, and support for common Hadoop formats.

SAP integrates R with HANA

SAP has integrated R with its in-memory database HANA, which it positions as the modern platform for mobile, analytics, data services and cloud integration. SAP HANA works with R by using Rserve, a package that allows communication with an R server.

The data exchange between SAP HANA and R is efficient because both use a column-oriented storage style. SAP's strategy for integrating HANA with R is to provide a modern platform for all applications, enabling customers to innovate and transform their businesses in the cloud. The solutions include a comprehensive set of prepackaged rapid-deployment offerings that aim to automate deployment and simplify the journey to the cloud.
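
SAP's actual bridge runs through HANA procedures that call out to Rserve, so the sketch below shows only the generic Rserve round trip the integration builds on; the host name and port are assumptions:

```r
# Generic Rserve round trip -- the transport HANA's R integration builds on.
# On the machine hosting R, start the server once:
#   library(Rserve)
#   Rserve(args = "--vanilla")   # listens on TCP port 6311 by default

library(RSclient)                # client-side counterpart to Rserve
conn <- RS.connect(host = "r-server.example.com", port = 6311)  # assumed host

result <- RS.eval(conn, {        # this block is evaluated remotely, in R
  x <- rnorm(1e6)
  c(mean = mean(x), sd = sd(x))
})
print(result)

RS.close(conn)
```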

Contributors: Kyt Dotson and Saroj Kar.

SAP’s 4 keys to data governance | #MITCDOIQ

For some people, data governance is still a little unclear in terms of what it is and how it applies to the enterprise. In an interview with Dave Vellante and Jeff Kelly at this year’s MIT CDOIQ Symposium in Cambridge, MA, Tina Rosario, VP of Global Operations for SAP America, offered a brief explanation of data governance by sharing how the company keeps its program simple and works closely with data analysts.

Focusing on key data and axing the data speak

Rosario said that SAP America centers everything it does on data governance around four key capabilities:

  1. Having good organization and practices around data governance, meaning rules, standards and policies.
  2. Looking at the right engineering processes for simplifying how data is created, updated and maintained.
  3. Looking at data from an ongoing maintenance point of view and determining the right operations and tools to automate data maintenance.
  4. Having good technical and business-driven IT solutions.

With this in mind, SAP America distills the ‘data speak’ into simple business language. This means figuring out the critical bits of information needed to run a business process, and how current that information must be. Rosario said this is how the company determines where it’s going to govern. She added that data governance is ultimately about learning what to do to enable business processes to run more efficiently, and how to get data to the business faster and with the right level of content.

Governance and Analytics

Kelly asked Rosario if there’s any tension between data governance and data analytics. She responded by saying it’s actually the opposite, and the two work very closely together.

“I think it’s our job in terms of governance and management to make sure that the data is at the right level of quality and is at the right level of standards, so the analytics people don’t have to spend time normalizing, rationalizing,” said Rosario. This makes the data easily accessible for analysts.

She described the relationship between the two as symbiotic. For example, before running a report, analytics would ask governance for access to a certain level of data and then help to ensure that it’s from the right source, at the right level of quality and also available. On the flip side, governance needs help to drive data analytics, using tools like SAP’s Information Steward to analyze the current level of data quality.

Data governance, ultimately, is the ongoing practice of improving business processes. SAP America has been successful at it by following four core capabilities, keeping things simple for the business and working closely with analytics.

photo credit: gideon_wright via photopin cc
PostgreSQL enhancements aimed at luring skittish MySQL users

Four and a half years after Oracle Corp. completed its acquisition of Sun Microsystems, MySQL continues to cast a formidable shadow over the web database market, despite losing much of its original open-source character under Oracle’s ownership. But to say that the buyout has not been felt in the ecosystem would be an understatement.

A growing number of users, including prominent tech firms such as Google and Red Hat, Inc., have already moved or are in the process of moving their data from MySQL to community-led alternatives that aren’t controlled by any one vendor. While Oracle has committed to maintaining an open source version of MySQL, the user community has had its doubts about the company’s sincerity. This has put wind in the sails of historic underdog PostgreSQL, which is finally hitting its stride but still lacks many of the capabilities needed to fulfill its potential and plug the hole left in the wake of the Sun acquisition.

EnterpriseDB Corp., a top distributor of the open-source platform, is trying to change that, one new feature and performance improvement at a time. The latest batch of enhancements announced by the company marks another big step in the right direction.

The most notable addition is a new Foreign Data Wrapper (FDW) for Hadoop that allows users to pull in data from their analytic clusters using familiar SQL syntax, without the trouble of cobbling together a connector from scratch. The extension levels the playing field against Oracle, which added support for the batch processing framework to MySQL last April. It also lowers the technical barriers that have made managing databases a painful task in the past.

The new Hadoop connector, set to hit general availability in the fall, is joined by a revamped wrapper for MongoDB. Both take advantage of the FDW upgrade introduced with the 9.3 release of PostgreSQL, which EnterpriseDB says speeds response times and helps keep code maintainable through the use of a formal client library specification.
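
To illustrate why the wrappers matter, here is a hedged sketch of querying a Hadoop-backed foreign table from R over a standard PostgreSQL connection. The table name is hypothetical and assumes a DBA has already defined it through the Hadoop FDW; to the client, it behaves like any other table:

```r
# Querying a Hadoop-backed foreign table from R. "clickstream_hdfs" is a
# hypothetical foreign table assumed to be already defined via the Hadoop FDW;
# from the client side it looks like an ordinary PostgreSQL table.
library(RPostgreSQL)

con <- dbConnect(PostgreSQL(), host = "pg.example.com", dbname = "warehouse",
                 user = "analyst", password = Sys.getenv("PGPASSWORD"))

daily <- dbGetQuery(con, "
  SELECT date_trunc('day', event_time) AS day, count(*) AS events
  FROM   clickstream_hdfs
  GROUP  BY 1
  ORDER  BY 1")

head(daily)
dbDisconnect(con)
```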

The extensions were unveiled in conjunction with a pair of new tools that the company is releasing to the community in a bid to smooth out some of the trickier aspects of managing PostgreSQL environments. The first of the free utilities, pg_catcheck, is a diagnostics engine that scans the metadata used to keep track of database objects for errors and inconsistencies. The other, pg_hibernator, helps maintain consistent performance after a failure by automatically restoring the data held in cache at the time of the shutdown.

EnterpriseDB is also updating two of its premium products to further simplify life for administrators. Replication Server 5.1 reduces latency, provides more room to scale across clusters, makes it easier to search rows and allows users to define custom policies for handling data conflicts, according to the firm. EDB Failover Manager 1.1, meanwhile, adds an advanced authentication capability and comes with new agents that run as operating system services so as to stay available even when the database itself goes down.

photo credit: Thomas Hawk via photopin cc
GlobalFoundries reportedly pulls out of deal to buy IBM’s chip business

IBM’s hopes of pulling off another fire sale of one of its businesses look to have been dashed, with reports suggesting that GlobalFoundries, Inc. has pulled out of a deal to buy IBM’s loss-making semiconductor division.

Bloomberg says that GlobalFoundries refused to budge on its valuation of the business, which was reportedly listed for sale last February when Big Blue retained the services of Goldman Sachs to value it.

GlobalFoundries, which is owned by an investment arm of Abu Dhabi’s government, was long considered the front-runner in any such deal. Speculation intensified earlier this month when the company hired ex-IBM employee Henry DiMarco as its new VP for site construction and facilities. DiMarco was previously responsible for designing, building and running IBM’s 300mm chip factory in New York. GlobalFoundries is also a key supplier of chips to Big Blue.

But no matter how good a relationship the two firms have, it looks like any deal is off the table. Bloomberg cites unnamed “people with knowledge of the matter” who say that negotiations have completely broken down. Those anonymous sources also say GlobalFoundries wasn’t really interested in the business itself, but was looking to snatch up IBM patents and engineers. As for the business’s manufacturing facilities, GlobalFoundries deemed them to be of “little or no value,” Bloomberg said.

It’s not clear if Big Blue has any other buyers lined up, but even if it does find one, it’s unlikely to ditch the chip market altogether. Earlier this month, IBM CEO Ginni Rometty announced that the company will spend a whopping $3 billion on chip R&D over the next five years. The plan is to use bleeding-edge technologies like carbon nanotubes and silicon photonics to shrink its transistors to just 7nm.

Rometty didn’t commit to building the chips, though, so it remains to be seen whether IBM will make its own chips or offload them to someone else.

photo credit: Pete Morawski via photopin cc
MIT CDOIQ day 2 wrap-up: Public cloud to put CIOs to pasture? | #MITIQ

Oftentimes on SiliconANGLE’s theCUBE, guests are posed a final question about what the bumper sticker on the vehicle pulling away from the event should read: in short, a briefly worded takeaway that encapsulates the overarching message of the conference’s keynotes and breakout sessions. Last week’s MIT CDOIQ Symposium, held in Cambridge, Massachusetts, may herald the sunset of the Chief Information Officer position in organizations across several industries.

The canary in the coal mine for the CIO position may well be healthcare. That industry in particular is moving away from the headaches associated with infrastructure and provisioning, looking outside the organization to cloud providers that ease the burden. As author and social media strategist Paul Gillin notes, among current CIOs, “…there is no detection of regret. They are going to be focusing more on data governance and strategy which they like better anyway.” He believes the CIO role will go away, but that the skills of those currently in the position will likely lead them to become the future COOs and CDOs of their organizations.

Everything’s Easier In The Cloud

Wikibon’s Dave Vellante believes cloud providers like Amazon are going to be more and more instrumental in streamlining business processes. “I’m strongly of the opinion that Amazon is going to be provisioning infrastructure better than anyone else in the next 10 years.” He continued, “You’re seeing companies like Amazon step up in the area of compliance and doing things that are making us comfortable.” He concedes there are still entire industries, like the financial services sector, that haven’t embraced the world of public cloud. “The marginal economics of the public cloud are going to be so compelling over the next 10 years as to overwhelm the business case,” he predicted. “Public cloud will become too good an option to ignore.”

This is likely being hastened by the price war currently under way in the public cloud market. “Why should you worry about investing in hardware that you’re going to have to depreciate and you’re ultimately going to lose a lot of that money,” asked Gillin, “when you can just pay a monthly fee and the prices are just going to keep going down?”

With the infrastructure and provisioning playing fields likely to be leveled for all players, companies will differentiate instead on application development and customization.

As Wikibon’s Jeff Kelly explained, “The differentiation is in data and analytics and how you use it.” He agrees that the growing acceptance of cloud computing is driving this monumental shift in how business will be conducted. “The differentiator is going to be how organizations monetize their data assets. It’s as simple as that,” he stated. As that shift progresses, the CDO role will carry a dual mandate: first, governance and compliance issues with respect to the data; second, enacting mission-critical strategies that present new and better ways to leverage the organization’s data as an asset.

As Gillin pointed out, having been around in the late ’80s when the CIO role made its first appearance in the business world, the CDO role appears to be taking an almost identical trajectory. While some question the validity of having a CDO, Gillin believes that in five years’ time it will simply be an accepted norm in most organizations.

“Generally speaking, we are seeing the role of CDO solidify a bit,” Kelly concurred. “We are hearing about emerging best practices like executive buy in. That tells me the role is becoming real. The role is being tied to large strategic initiatives.” He continued, “Both of those things are encouraging to me. Overall, we are moving in the right direction.”

photo credit: Philip Taylor PT via photopin cc
Rackspace powers up bare-metal cloud servers

Rackspace, Inc. has just revealed pricing for its new OnMetal servers, which are now generally available following a limited trial phase.

The dedicated, single-tenant bare-metal machines are designed for applications that can run without hypervisors. They can be spun up in less than a minute using the Rackspace cloud OpenStack API, the company said.

Rackspace used designs provided by the Facebook-led Open Compute Project to build its OnMetal servers and added its own tweaks including external cooling and 100 percent solid-state storage. Customers can choose from three different configurations optimized for different kinds of workloads. Two pricing tiers are offered for each configuration – one comes with Rackspace’s standard Managed Infrastructure support, while the other comes with more inclusive Managed Operations support.

The cheapest configuration is OnMetal Compute, which offers ten Intel Xeon CPU cores, 32GB of RAM and no extra storage. These servers are aimed at web and app serving, load balancing and queue processing, and come priced at $550 and $700 per server/month for Managed Infrastructure and Managed Operations, respectively.

Next up are the OnMetal Memory servers, which come with 12 CPU cores and 512GB of memory. These are meant for caching, in-memory analytics and search indexing operations, says Rackspace. These are available for $1,650 or $1,800 per server/month.

At the high end, OnMetal I/O servers provide 20 CPU cores and 128GB of RAM, plus 3.2TB of solid-state storage. These servers are optimized for online transaction processing (OLTP) and database-intensive applications, and will set users back $1,800 or $1,950 per server/month, depending on the level of support.

There’s a catch in the price of Managed Operations support. While the service costs an extra $200 per server/month, Rackspace demands a minimum service charge of $500/month, so the benefits don’t kick in until the user is running at least three servers.

There are other charges, too, such as outgoing network bandwidth, which starts at $0.12 per GB for the first 10TB and decreases on a sliding scale with volume.
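
Putting the quoted figures together, a back-of-the-envelope R function for an OnMetal monthly bill might look like the following; the per-GB rate beyond the first 10TB is not specified in the article, so it is left as an explicit assumption:

```r
# Back-of-the-envelope OnMetal monthly bill from the figures quoted above.
# ASSUMPTION: no per-GB rate beyond the first 10TB is given, so the
# 'rate_after' default simply reuses the first-tier rate.
onmetal_monthly <- function(n_servers,
                            base_per_server = 550,     # OnMetal Compute, Managed Infrastructure
                            managed_ops     = FALSE,
                            ops_fee         = 200,     # extra per server for Managed Operations
                            ops_minimum     = 500,     # monthly floor for Managed Operations
                            egress_gb       = 0,
                            rate_first_10tb = 0.12,
                            rate_after      = 0.12) {  # assumption, see above
  compute   <- n_servers * base_per_server
  support   <- if (managed_ops) max(n_servers * ops_fee, ops_minimum) else 0
  tier1_gb  <- min(egress_gb, 10 * 1024)
  bandwidth <- tier1_gb * rate_first_10tb +
               max(egress_gb - 10 * 1024, 0) * rate_after
  compute + support + bandwidth
}

# Two servers on Managed Operations: the $500 floor applies, not 2 x $200 = $400
onmetal_monthly(2, managed_ops = TRUE, egress_gb = 2048)
```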

The pricing structure means that Rackspace’s OnMetal servers cost about the same as comparable servers offered by cloud rivals like AWS, Google and Microsoft. However, those rivals all offer pay-as-you-go pricing plans, which might make more sense for customers with lighter demands.

Rackspace tested the systems with some customers in a limited availability release last month and “customers have shown great interest,” wrote Ev Kontsevoy, director of products at Rackspace, in a post on the company blog. “Some of them are looking to move away from the unpredictable nature of virtualized, multi-tenant environments, while others are intrigued by our promise of ‘elasticity of the cloud, plus economy of colocation,’” he wrote.

While Rackspace has struggled to compete with its cloud rivals, these new offerings play squarely to its wheelhouse. The company’s most recent financial results showed that over 70 percent of its revenue comes from dedicated, single-tenant hosting; the OnMetal servers should help it capitalize on that business.

photo credit: bdu via photopin cc
US consular database crash: Not a good time to renew your passport

Thousands of travellers awaiting US passports and visas have been left on tenterhooks following an unspecified glitch in a database used by the Bureau of Consular Affairs, according to officials from the State Department.

“The Bureau of Consular Affairs has been experiencing technical problems with our passport and visa system,” said spokeswoman Marie Harf in a press briefing last week. “The issue is worldwide, not specific to any particular country.”

The Consular Consolidated Database (CCD) is built on Oracle software and is believed to be one of the largest data warehouses in the world. It stores data on more than 100 million visa applicants and contains over 75 million photographs, with details of around 35,000 new applicants added every day.

The unspecified glitch occurred following scheduled maintenance work last week that knocked the database out of action for “a few days.” Technicians have since restored “limited capacity” service, but the downtime has caused a backlog of passport and visa processing that will take time to clear. It’s not clear whether the problem lay with Oracle or with the State Department’s IT staff, but Harf did at least make it clear that nothing malicious took place.

“We do not believe there was any malicious action or anything untoward here,” said Harf. “This was a technical issue, and again, we are working to correct it and should be fully operational again soon.”

It’s not clear how many people have been left waiting for their visas and passports, but US officials told the Associated Press that up to 50,000 persons were affected in just one unnamed country.

The State Department could not say how long it might take to clear the backlog. It also declined to say when the database would be back up and running at full capacity.

“It’s going to take a little while, so we ask people to be patient,” added Harf.

Image credit: geralt via Pixabay.com
What you missed in Big Data: another spike in the analytics investment chart

It’s been a historic week for the analytics market, with Hortonworks Inc. bagging $50 million in equity from Hewlett-Packard Co. as part of a broad strategic alliance that will focus on integrating the startup’s version of Hadoop with the HP HAVEn analytics suite. The move comes hot on the heels of Google’s venture capital arm pouring $80 million into rival MapR Technologies Inc., and four months after Cloudera Inc., another distributor of the batch processing platform, netted a massive $740 million from Intel.

That means all three of the top Hadoop distributors now have prominent industry backers, which speaks to how far the project has come from its humble beginnings as an internal project at Yahoo and underscores the consensus that the analytics phenomenon is here to stay. But while the most recent funding is certainly a positive indicator for the market as a whole, it raises questions about the future of open-source community favorite Hortonworks, which is now neck-deep in a partnership with an incumbent vendor that naturally puts its own interests above those of the ecosystem.

HP isn’t the only big-name industry player to have invested in Hadoop this week. Teradata Corp. upped the ante with not one but two strategic acquisitions: Revelytix Inc., which develops solutions for integrating information inside the framework, and structured-query specialist Hadapt Inc. The data warehousing giant hopes the deals will help improve its position against historical rivals such as IBM and SAP, which are also doubling down on analytics functionality, as well as against newer threats like HP Vertica.

Over in the startup scene, Zettaset Inc. announced that it’s making the individual encryption, high-availability and role-based access control capabilities of its flagship Orchestrator management solution for Hadoop available as standalone modules. The decision comes as distributors such as the freshly funded Hortonworks work to integrate similar features directly into their platforms, which is presumably making organizations less willing to buy a full-fledged data protection and security product from another vendor and shoulder the added complexity that comes with it. Zettaset now faces tough choices about how to move ahead.

photo credit: krazydad / jbum via photopin cc
No more ‘army of one’ with Big Data accountability revolution | #MITIQ

At this week’s MIT CDOIQ Symposium, held in Cambridge, Massachusetts, you couldn’t be faulted for noticing that two industries that have embraced the Big Data revolution are healthcare and finance; the benefits of applying analytics models in those fields are increasingly apparent. You might be surprised, however, to learn of one public sector player that is seeking an operational advantage via Big Data: the Department of Defense.

Mort Anvari, Director of Programs and Strategy within the Deputy Assistant Secretary of the Army’s Cost and Economics division, has overseen the creation and implementation of directives aimed at the cost culture and cost management of the US Army.

“Private industry is easier because everyone is cost conscious,” Anvari explains. “In government, particularly during a war, that mission is driving everything. Most of our officers are only concerned with how much money they need and how they spend it.” Anvari admits that his task of getting commanders to accede to a cost culture mindset was a big one. “We had to look at it from the people’s perspective. That includes convincing leadership that attention to cost doesn’t create a bad image for the country.” The main argument centered on the perception that being cost conscious could appear to put service members unnecessarily in harm’s way.

Surviving the Culture Clash

Anvari claims an early success in educating commanders and other officers that attention to cost and soldier safety are not mutually exclusive. “You can be cost conscious. You can do more with your resources. You can be more efficient with it and still care about safety and care about the soldiers.”

Projects budgeted above $10 million automatically trigger a cost/benefit analysis, and Anvari oversees more than 2,000 Army analysts who perform and validate each one. Knowing the process, commanders will typically prepare an additional course of action beyond what they submit for review. “Talking to commanders and leadership, we ask: What is your information need?” he stated. “Based on that, we develop a data need for that organization.”

Echoing challenges faced in the private sector, one issue the Army had to overcome was convincing organizations that possessed data to share it with others. “Communicating the data need from organization A to organization B, telling organization B you need to provide this data that is not for you, it’s for someone else,” Anvari said, “that was a big culture shock.”

However, by describing the military’s funding structure as an upside-down tree, Anvari was able to bring about an understanding that all funding comes from the top and spreads out to all of the “branches” within the Army. “We call it fund centers. They have all the money. The cost centers are the ones using this money. It could be them or it could be others. It’s truly like a neural network of information.”

Unlike private sector companies, the Army realized it had to streamline its budgeting and allocation process because it is subject to strict oversight by Congress. And since all of the funding is taxpayer money, citizens are also privy to the budgeting process through Freedom of Information Act requests.

As noted above, the cost/benefit analysis process is automatic for projects in excess of $10 million. Anvari notes, however, that projects under that threshold, undertaken by commanders of smaller outfits, are largely self-monitored, because those commanders want to show they can apply the new cost culture analysis critically, in hopes of being promoted to head larger projects in the future.

“Accountability is hard to swallow,” says Anvari, with regard to the early push back from Army leadership, “no matter how normalized the process is.” Anvari’s work is seeing results, however. “Cost management is on its feet and working,” he concluded.

photo credit: The U.S. Army via photopin cc
IoT upgrades for better translations and security: Software-led trends

This week’s Smart IT roundup features an Internet of Things (IoT) translator emerging from stealth, improved security for OpenStack, and an updated Spend Automation software release from SciQuest Inc.

Octoblu emerges from stealth

Octoblu, a real-time connections and communications management platform spanning systems, people and physical devices, emerged from stealth this week with the goal of making connected devices communicate better with each other using Meshblu.

Meshblu is an open source machine-to-machine (M2M) instant messaging platform created by Chris Matthieu, cofounder and CTO of Octoblu. It can be used for the discovery, control and management of any API-based software application, any hardware or appliance, or any social media network, connecting devices through a variety of protocols across a common platform.

Octoblu is a member of the AllSeen Alliance, a nonprofit open source consortium dedicated to driving the widespread adoption of products, systems and services that support the Internet of Everything. It is also working on a robust security and rights management architecture to help enterprises of all sizes create innovative IoT services.

Catbird releases v.6.0 to enhance OpenStack security

Security vendor Catbird Networks, Inc. claims that data centers built on OpenStack are vulnerable to exploitation because users are abandoning the default security policies that come with branded packages.

To address this issue, the company released Catbird 6.0, which automates the management of security and compliance for VMware ESX and will support Microsoft Hyper-V as well as software-defined networking frameworks, including VMware NSX and OpenStack’s advanced network services through Neutron.

Catbird CTO Randal Asay acknowledges the massive enthusiasm for OpenStack, but pointed out there is a “huge development effort required and tons of security decisions to make. And if you know anything about development communities, they are not known for their security policies.”

Analyst David Monahan, research director at Enterprise Management Associates, agreed that the security around OpenStack needs to evolve in order to provide protection for the various components within virtualized environments. He added that there’s a need for new, adaptive security tools to streamline the process.

SciQuest’s upgraded Spend Automation Solution Suite

SciQuest, a pure-play provider of cloud-based business automation solutions for spend management, announced an upgraded version of its spend automation suite that helps companies better manage source-to-pay cycles, collaborate with suppliers and extend visibility into procurement activities, ultimately turning spending into a strategic source of savings.

The new version of the suite includes the redesigned Sourcing Director product, which delivers a centrally managed e-sourcing solution for automating the entire sourcing cycle, from event creation to supplier evaluation and awarding. It is integrated with other SciQuest products, such as Total Supplier Manager, enabling buyers to further increase their efficiency while allowing suppliers to manage their profiles and responses to sourcing opportunities from a central location.

The upgrade also includes Salesforce.com integration, accelerated and improved invoice routing and management, general improvements to the user experience, and a new reporting add-on module that extends supplier management capabilities with a focus on usability and on setting and tracking goals.

photo credit: tinkernoonoo via photopin cc