SiliconANGLE: Extracting the signal from the noise.

MIT professor shares his take on the CDO | #MITIQ Mon, 28 Jul 2014 19:08:33 +0000

About 25 years ago, a study was conducted at MIT to assess the impact of the education provided by the university on the overall academic performance of students. The results set off a series of probes into the school’s conduct: the researchers found that the average IQ score of graduating seniors was lower, by a small but statistically significant margin, than that of the entering freshmen they had tested four years earlier.

After the investigation ran its course, however, it became apparent that the situation was not at all as it appeared. Management, it turned out, had acted on the assumption that the study examined the same students on both occasions when, in fact, the seniors and the freshmen were two different groups. To Stuart Madnick, a professor of Information Technology and Engineering Systems at the MIT Sloan School of Management, the survey serves as a prime example of how important it is to take a measured and careful approach to processing data. This holds especially true for organizations pursuing strategically important analytics initiatives today.
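The anecdote is easy to reproduce. The short simulation below (a hypothetical illustration, not from the actual study) shows how comparing two independent cohorts drawn from the same population yields an apparent shift in the average that is pure sampling noise:

```python
import random
import statistics

def cohort_iq(n, mean=100, sd=15, seed=None):
    """Draw a simulated cohort of n IQ scores from one shared population."""
    rng = random.Random(seed)
    return [rng.gauss(mean, sd) for _ in range(n)]

# Two *different* groups, both drawn from the same distribution.
freshmen = cohort_iq(200, seed=1)
seniors = cohort_iq(200, seed=2)

# A naive longitudinal reading treats this gap as a real change in IQ,
# when it is nothing but sampling noise between two separate cohorts.
diff = statistics.mean(seniors) - statistics.mean(freshmen)
print(f"apparent change in mean IQ: {diff:+.2f}")
```

Rerunning with different seeds can flip the sign of the gap, which is exactly why the direction of the apparent "decline" carried no meaning on its own.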

“That ability to understand the data we’re using is so important, and we see time and time again where we misunderstand our data and that leads to amusing stories like that, or sometimes serious issues occurring,” Madnick told Wikibon co-founder Dave Vellante and co-host Paul Gillin in his latest appearance on SiliconANGLE’s theCUBE at the recently concluded MITCDOIQ Symposium.

Rushing to automation without the rush to quality


As was the case with the student study, many of the inconsistencies that spring up in the course of enterprise analytics projects these days are not the result of errors in the data but rather of an incorrect reading of the statistical facts at hand, Madnick noted. That remains a burning challenge even as analytic technologies such as Hadoop take on increasingly central roles in decision-making across a wide range of sectors, an obstacle he blames on what he perceives as the industry’s somewhat rushed approach to the Big Data phenomenon.

“Sometimes innovations go through a gestation period that takes a while to sort out, but the mere fact there are these bumps in the road doesn’t mean you should go in without being open to realizing there are bumps in the road and doing everything possible to realize,” Madnick said. “That’s where things have been let down, the rush to getting automated without the rush to quality.”

Addressing the issue of incorrect data interpretation at the organizational level requires change from the top, which is where the chief data officer (CDO) is supposed to come in. The emerging role is touted by its proponents, many of whom hail from regulated industries such as healthcare and financial services, as the answer to the information governance woes of the modern enterprise. Chief among those woes is security, and specifically the majority of breaches that originate internally.

“Let’s go back to two of the widely discussed cyber incidents: WikiLeaks on one hand and Snowden with the NSA on the other hand. Would a better firewall have stopped that? Would a better cryptographic code have stopped that? The answer is no, and a lot of it has to do with organizational structure and training,” Madnick explained. “That’s not part of the CIO’s role normally, it’s not part of the CSO’s role normally. If you think about what the CDO’s role should be, that should be dead center.”

See Madnick’s entire segment below:

3 options to build private clouds with public cloud perks Mon, 28 Jul 2014 18:00:45 +0000

At the OpenStack Enterprise forum, Wikibon co-founder Dave Vellante proclaimed 2014 “the year of the cloud.” Now that the hype has died down, he said, true change is beginning to occur in large-scale businesses.

Many of those businesses are still wrestling with the question of whether to adopt public clouds, private clouds or a hybrid. Private clouds are considered the safest on-ramp to cloud computing, and for companies that face strict regulations over data ownership or location, they may be the only option.

That doesn’t mean these companies need to sacrifice the flexibility, availability and efficiency of cloud computing, though. Here’s a look at how three companies (Hotlink Corp., Microsoft and Hewlett-Packard Co.) are attempting to offer enterprises ways to build private clouds that function like their public counterparts.

Hotlink Hybrid IT Platforms


Lynn LeBlanc, CEO and founder of Hotlink, explains that the company enables the enterprise to operate more like a public cloud through compromise: Hotlink offers “hybrid IT platforms that combine the best of both public and private clouds to provide the scalability, flexibility and speed users need at an economical price.”

In concert with VMware’s management infrastructure, Hotlink frees administrators from worrying about “building out costly infrastructure for various projects or creating […] redundant architecture at off-site locations for disaster recovery and business continuity […].” Instead, enterprises can use VMware’s vCenter to “leverage the public cloud to supplement their existing on-premise data centers.” Using vCenter as a “single point of administration management” ensures that administrators have simultaneous access to their “infrastructure, tools, [and] workflows,” so that interoperation between private and public cloud services is seamless.

When asked what mid-market firms and enterprises are looking for when it comes to infrastructure, LeBlanc described a system that embraces change, one that “evolves with the growth of the company, as well as expands easily to handle the growing and varied compute demand from users.”

Microsoft Win Server


Indeed, while enterprises seek to become more like the public cloud, few want to be “locked in” to it, according to Brad Anderson, corporate VP of Windows Server & System Center at Microsoft: “they want to be secure.” Because enterprises operate at massive scale, their services and workloads require different levels of security to manage. Enterprises need the flexibility to move between public and private clouds without a huge hassle.

Microsoft, he says, works to ensure that its clients have all options on the table when it comes to cloud. Clients can move a workload from private to public cloud without adding even a line of code. Anderson explains: “We can promise and prove that when you develop an application to run on Win Server, you can move it across the clouds and you are not locked in any cloud.”

HP Helion Cloud Portfolio


Bill Hilf, senior VP of HP Helion products and services, is in the same camp as Brad Anderson and Lynn LeBlanc. HP Helion also emphasizes flexibility and scalability, giving enterprises the reins when it comes to the type of setup they need: “…we believe that customers will build the cloud that they need and not necessarily the cloud that the vendor is trying to describe. So against that vision, what we built is a cloud portfolio that is very composable,” he said. Based on their needs, HP Helion clients can put together an environment of “distinct but complementary components” specifically suited to their business requirements.

Solutions from Hotlink, Microsoft and HP Helion allow enterprises to implement key characteristics of the public cloud without requiring a total migration. The possibilities a hybrid cloud environment opens up mean enterprises need to consider not only the environment in which each workload should exist, but also how to move easily between public and private clouds. Making their internal cloud more like a public one makes it easier to maneuver deftly between the two options.

photo credit: Marty.FM via photopin cc
Smart kitchens get wave of new tools and funding Mon, 28 Jul 2014 17:00:11 +0000

This week’s Smart Living roundup features a smart baking companion, an immersive movie and lighting experience for home entertainment, and more star power behind smart home products.


PantryChic’s smart baking tool


Nicole Sollazzo Lee, inventor, founder and president of Nik of Time, Inc., knows the frustrations of bakers, so she created a connected kitchen tool for precise measurements, a necessity for any pastry chef.

PantryChic is a smart storage and dispensing system that allows the user to precisely measure ingredients, as the base of the system doubles as a digital scale. It features airtight, stackable containers that fit on top of the dispensing mechanism, which accurately measures by weight and connects to a smartphone or tablet via Bluetooth for following recipes and tracking which ingredients are used most or which recipes are frequently prepared.

PantryChic can also help people who are trying to lose weight, as they can use the scale to make sure they consume just the right amount of food, and it can keep custom mixes of dry ingredients for those who have special dietary needs or restrictions.

You can help PantryChic meet its goals by funding it on Kickstarter, where the campaign launched today.

Syfy + Philips team up for immersive movie experience


Now your TV can talk to your light bulbs, thanks to a partnership between the Syfy channel and Koninklijke Philips N.V. (commonly known as Philips).

On Wednesday, July 30, 2014, “Sharknado 2: The Second One” will have its global premiere on Syfy, and the cable channel has partnered up with Philips to make your viewing experience more interesting.

Sharknado and Sharknado 2 will be aired at 7PM and 9PM EST, respectively, and if you happen to have Hue, Philips’ color-changing connected lightbulb, you can download the Syfy Sync app to experience what it calls a “light track,” or visual soundtrack. The app will sync with the Hue bulbs to deliver an immersive movie-watching experience, and it also acts as a second screen where users can get additional information about the two shark-infested movies, as well as a social experience for movie fans.

In other Philips news, the company has partnered with Indiegogo for the second annual Innovation Fellows Competition, a contest aimed at finding the ‘next big thing’ created by everyday people. It seeks out inventors who have no means to get their product to market or who have failed to get financial support from industry or government. The projects will be launched on Indiegogo, and the winner will be awarded $60,000 plus the money raised through the crowdfunding platform. The winner will also be mentored by Philips executives, and four runners-up will be awarded $10,000 each.

The Orange Chef Co. gets backing from M7 Tech Partners


M7 Tech Partners, the venture capital firm founded by NBA star Carmelo Anthony and his friend Stuart Goldfarb, a former executive at NBC and Bertelsmann, announced that it has invested in The Orange Chef Co., makers of smart connected kitchen tools such as the Prep Pad.

Terms of the investment were not disclosed but Anthony and Goldfarb will become advisors who have the power to influence the expansion of the company’s focus on fitness and health, as well as add to its marketing strategy.

Image via PantryChic
Big Data needs drive R as a powerful enterprise-ready language Mon, 28 Jul 2014 16:01:46 +0000

As Big Data continues to see wider enterprise adoption, the programming languages used to write schemas and produce Big Data analysis algorithms are rushing to keep up. As a result, the open source statistical language R has become a go-to skill for Big Data scientists and developers, with its popularity soaring among programming languages and in-demand skills.

Combined with Big Data tools, the R language provides a deep statistical toolkit for handling large data sets, conducting statistical analysis and rendering data-driven visualizations. R is particularly widely used in finance, pharmaceuticals, media and marketing, where it can help guide data-driven business decisions.

The popularity of R has grown significantly in recent years. A 2013 survey of data mining professionals conducted by Rexer Analytics indicated that the R programming language is by far the most popular statistical analysis tool, with 70% of respondents saying they use it at least occasionally. Developers interested in learning more about R can look into training on the subject to get a better grasp of its use in the Big Data paradigm.

In the enterprise market, numerous companies and projects have risen to harness R and bring it to Big Data scientists and business users alike. These projects and tools include the use of R in Microsoft’s cloud computing Azure Machine Learning platform, IBM’s Big R, Teradata Aster R, Oracle R Enterprise, Pivotal’s PivotalR Big Data package, and SAP’s R for HANA.

Azure Machine Learning is a game changer with R

Microsoft last month announced Azure Machine Learning (ML), a new platform dedicated to cloud-based predictive analytics on large volumes of data. The Azure ML cloud service allows scientists and developers to effectively integrate predictive analytics into their applications.

What is interesting is that Microsoft is providing APIs and templates based on the R language. Azure ML supports more than 300 R packages and allows users to assemble a model suited to their needs out of existing pieces rather than forcing developers to build something from scratch. The ease of implementation makes machine learning accessible to a larger number of investigators with various backgrounds, even non-data scientists.

Microsoft says the Azure ML platform can predict future trends in systems such as search engines, online recommendations, ad targeting, virtual assistants, demand forecasting, fraud detection, spam filters and more.

IBM integration with Big R

IBM InfoSphere BigInsights Big R is a library of functions that provides end-to-end integration between the R language and InfoSphere BigInsights. Big R can be used for comprehensive data analysis on the InfoSphere BigInsights server, removing some of the complexity of manually writing MapReduce jobs.

This integration makes it easy to write and execute R programs that operate on big data: using Big R, an R user can explore, transform and analyze big data hosted in a BigInsights cluster with familiar R syntax and paradigms.

Teradata Aster R

The rapid adoption of R and its proven value mean that organizations looking to drive new revenue-generating insights should make R part of their predictive analytics strategy. Teradata, the analytic data platform company, recently introduced Teradata Aster R, which extends the power of open source R analytics by lifting its memory and processing limitations.

Teradata Aster R gives analysts an enterprise-ready business analytics solution that is highly scalable, reliable and easy to use, allowing them to process massive amounts of data at high speed to meet each company’s analytical needs. The platform delivers the power of R analytics to the enterprise. To support R analysts, Teradata offers the familiar R language and tools, massive processing power, and a rich set of analytics. In addition, analysts have access to an immense volume of integrated data from multiple sources.

Teradata Aster R runs on a high-performance computing platform and brings advantages in security and data management, along with a set of analytics components including the Teradata Aster R Library, the Teradata Aster R Parallel Constructor and Teradata Aster SNAP Framework Integration.

Oracle R Enterprise

Oracle R Distribution is Oracle’s free distribution of open source R. On top of it, the database company offers Oracle R Enterprise, which primarily overloads many R data types in order to integrate Oracle Database with R.

The company also offers Oracle Big Data Connectors that facilitate interaction and data exchange between a Hadoop cluster and Oracle Database. Oracle R Connector for Hadoop is a set of R packages that supports the interface between a local R environment, Oracle Database, and Hadoop.

Oracle’s strategy with R Enterprise is to provide in-database analytics capabilities for its widely adopted enterprise RDBMS and for its Exadata appliance.

R for Big Data with PivotalR

PivotalR is a package that enables users of R to interact with the Pivotal (Greenplum) Database as well as Pivotal HD and HAWQ for Big Data analytics. PivotalR is an R library with a familiar user interface that enables data scientists to perform in-database and in-Hadoop computations.

HAWQ is the key differentiating technology in making Pivotal HD the world’s most powerful Hadoop distribution, according to the company. With support for the R language, it offers Dynamic Pipelining, a world-class query optimizer, horizontal scaling, SQL compliance, interactive queries, deep analytics and support for common Hadoop formats.

SAP integrates R with HANA

SAP has integrated R with its in-memory database HANA, which it positions as a modern platform for mobile, analytics, data services and cloud integration services. SAP HANA works with R by using Rserve, a package that allows communication with an R server.

The data exchange between SAP HANA and R is very efficient because both use column-based storage. SAP’s strategy for integrating HANA with R is to provide a modern platform for all applications, enabling customers to truly innovate and transform their businesses in the cloud. The solutions include a comprehensive set of prepackaged rapid-deployment offerings that aim to automate deployment and simplify the journey to the cloud.

Contributors: Kyt Dotson and Saroj Kar.

SAP’s 4 keys to data governance | #MITCDOIQ Mon, 28 Jul 2014 15:14:12 +0000

For some people, data governance is still a little unclear in terms of what it is and how it applies to the enterprise. In an interview with Dave Vellante and Jeff Kelly at this year’s MIT CDOIQ Symposium in Cambridge, MA, Tina Rosario, VP of Global Operations for SAP America, offered a brief explanation of data governance by sharing how the company manages its program: keeping things simple and working closely with data analysts.

Focusing on key data and axing the data speak


Rosario said that SAP America centers everything it does in data governance on four key capabilities:

  1. Having good organization and practices around data governance, meaning rules, standards and policies.
  2. Looking at the right engineering processes for simplifying how data is created, updated and maintained.
  3. Looking at data from an ongoing maintenance point of view and determining what the right operations and tools are to automate the maintenance of data.
  4. Having good technical and business-driven IT solutions.


With this in mind, SAP America distills the ‘data speak’ into simple business language. This means figuring out the critical pieces of information needed to run the business process and the currency of that information. Rosario said this is how the company determines where it’s going to govern. She added that data governance is ultimately about learning what to do to enable business processes to run more efficiently and how to get data to businesses faster and with the right level of content.

Governance and Analytics


Kelly asked Rosario if there’s any tension between data governance and data analytics. She responded by saying it’s actually the opposite, and the two work very closely together.

“I think it’s our job in terms of governance and management to make sure that the data is at the right level of quality and is at the right level of standards, so the analytics people don’t have to spend time normalizing, rationalizing,” said Rosario. This makes the data easily accessible for analysts.

She described the relationship between the two as symbiotic. For example, before running a report, analytics would ask governance for access to a certain level of data and then help to ensure that it’s from the right source, at the right level of quality and also available. On the flip side, governance needs help to drive data analytics, using tools like SAP’s Information Steward to analyze the current level of data quality.

Data governance is the practice of learning how to improve business processes. SAP America has been successful at it by following four core capabilities, keeping things simple for businesses and working closely with analytics.

See Rosario’s entire segment below:

photo credit: gideon_wright via photopin cc
PostgreSQL enhancements aimed at luring skittish MySQL users Mon, 28 Jul 2014 14:34:12 +0000

Four and a half years after Oracle Corp. completed the acquisition of Sun Microsystems, MySQL continues to cast a formidable shadow over the web database market despite losing much of its original open source character under Oracle’s ownership. But to say that the buyout has not been felt in the ecosystem would be an understatement.

A growing number of users, including prominent tech firms such as Google and Red Hat, Inc., have already moved or are in the process of moving their data from MySQL to community-led alternatives that aren’t controlled by any one vendor. While Oracle has committed to maintaining an open source version of MySQL, the user community has had its doubts about the company’s sincerity. This has put wind in the sails of historic underdog PostgreSQL, which is now finally hitting its stride but still lacks many of the capabilities needed to fulfill its potential to plug the hole left in the wake of the Sun acquisition.

EnterpriseDB Corp., a top distributor of the open-source platform, is trying to change that, one new feature and performance improvement at a time. The latest batch of enhancements announced by the company marks another big step in the right direction.

The most notable addition is a new Foreign Data Wrapper, or FDW, for Hadoop that allows users to pull in data from their analytic clusters utilizing familiar SQL syntax without going through the trouble of cobbling together a connector from scratch. The extension levels the playing field in this area against Oracle, which added support for the batch processing framework to MySQL last April. It also lowers the technical barriers that have made managing databases a painful task  in the past.

The new Hadoop connector, set to hit general availability in the fall, is joined by a revamped wrapper for MongoDB. Both take advantage of the FDW upgrade introduced with the 9.3 release of PostgreSQL, which EnterpriseDB says speeds response times and helps keep code maintainable through the use of a formal client library specification.

The extensions were unveiled in conjunction with a pair of new tools that the company is releasing to the community in a bid to smooth out some of the trickier aspects of managing PostgreSQL environments. The first of the free utilities – pg_catcheck – is a diagnostics engine that scans the metadata used to keep track of database objects for errors and inefficiencies. The other – pg_hibernator – helps maintain consistent performance after a failure by automatically restoring the data held in cache at the time of the shutdown.

EnterpriseDB is also updating two of its premium products to further simplify life for administrators. Replication Server 5.1 reduces latency, provides more room to scale across clusters, makes it easier to search rows and allows users to define custom policies for handling data conflicts, according to the firm. EDB Failover Manager 1.1, meanwhile, adds an advanced authentication capability and comes with new agents that run as operating system services so as to stay available even when the database itself goes down.

photo credit: Thomas Hawk via photopin cc
GlobalFoundries reportedly pulls out of deal to buy IBM’s chip business Mon, 28 Jul 2014 11:31:42 +0000

IBM’s hopes of pulling off another fire sale of one of its businesses look to have been dashed, with reports suggesting that GlobalFoundries, Inc. has pulled out of a deal to buy IBM’s loss-making semiconductor division.

Bloomberg says that GlobalFoundries refused to budge on its valuation of the business, which was reportedly listed for sale last February, when Big Blue retained the services of Goldman Sachs to put a valuation on it.

GlobalFoundries, which is owned by an investment arm of Abu Dhabi’s government, was long considered to be the front-runner in any such deal. Speculation intensified earlier this month, when the company hired ex-IBM employee Henry DiMarco as its new VP for site construction and facilities. DiMarco was previously responsible for designing, building and running IBM’s 300mm chip factory in New York. GlobalFoundries is also a key supplier of chips to Big Blue.

But no matter how good a relationship the two firms have, it looks like any deal is off the table. Bloomberg cites unnamed “people with knowledge of the matter” who say that negotiations have completely broken down. Those anonymous sources also say GlobalFoundries wasn’t really interested in the business at all, but was looking to snatch up IBM patents and engineers. As for the business’s manufacturing facilities, GlobalFoundries deemed these to be of “little or no value,” Bloomberg said.

It’s not clear if Big Blue has any other buyers lined up, but even if it does find one, it’s unlikely to ditch the chips market altogether. Earlier this month, IBM CEO Ginni Rometty announced that the company will spend a whopping $3 billion on chip R&D over the next five years. The plan is to utilize bleeding-edge technologies like carbon nanotubes and silicon photonics to reduce its transistor sizes to just 7nm.

Rometty didn’t commit to building the chips, though, so it remains to be seen whether IBM will make its own chips or offload them to someone else.

photo credit: Pete Morawski via photopin cc
MIT CDOIQ day 2 wrap-up: Public cloud to put CIOs to pasture? | #MITIQ Mon, 28 Jul 2014 11:00:36 +0000

Oftentimes on SiliconANGLE’s theCUBE, guests will have a final question posed to them regarding what they believe the bumper sticker on the vehicle pulling away from the event should read. In short, it is a briefly worded takeaway from the keynotes and breakout sessions that encapsulates the overarching message of the conference. Last week’s MIT CDOIQ Symposium, held in Cambridge, Massachusetts, may herald the sunset of the position of chief information officer in many organizations across several industries.

The canary in the coal mine for the CIO position may well be the field of healthcare. This industry, in particular, is moving away from the headaches associated with infrastructure and provisioning, looking outside the organization for cloud providers that help ease that burden. As author and social media strategist Paul Gillin notes, among current CIOs, “…there is no detection of regret. They are going to be focusing more on data governance and strategy which they like better anyway.” He believes the CIO role will go away, but that those currently in the position will find their skills likely lead them to become the future COOs and CDOs of their organizations.

Watch the Day 2 wrap-up in its entirety here:

Everything’s Easier In The Cloud

Wikibon’s Dave Vellante believes cloud providers like Amazon are going to be more and more instrumental in streamlining business processes. “I’m strongly of the opinion that Amazon is going to be provisioning infrastructure better than anyone else in the next 10 years.” He continued, “You’re seeing companies like Amazon step up in the area of compliance and doing things that are making us comfortable.” He concedes there are still entire industries, like the financial services sector, that haven’t embraced the world of public cloud. “The marginal economics of the public cloud are going to be so compelling over the next 10 years as to overwhelm the business case,” he predicted. “Public cloud will become too good an option to ignore.”

This is likely being hastened by the price war we are currently witness to in the public cloud market. “Why should you worry about investing in hardware that you’re going to have to depreciate and you’re ultimately going to lose a lot of that money,” asked Gillin, “when you can just pay a monthly fee and the prices are just going to keep going down?”

With the likelihood of the infrastructure and provisioning playing fields being leveled for all players, the differentiation in companies will transition to their application development and customization.

As Wikibon’s Jeff Kelly explained, “The differentiation is in data and analytics and how you use it.” He agrees that the growing acceptance of cloud computing is driving this monumental shift in how business will be conducted. “The differentiator is going to be how organizations monetize their data assets. It’s as simple as that,” he stated. As that shift continues to progress, the CDO role will be presented with a dual mandate: first, governance and compliance issues with respect to the data; second, enacting mission-critical strategies that present new and better ways to leverage the organization’s data as an asset.

As Gillin pointed out, having been around in the late ’80s when the CIO role was making its first appearance in the business world, the role of the CDO appears to be taking an almost identical trajectory. While some question the validity of having a CDO, Gillin believes that in five years’ time it will simply be an accepted norm in most organizations.

“Generally speaking, we are seeing the role of CDO solidify a bit,” Kelly concurred. “We are hearing about emerging best practices like executive buy in. That tells me the role is becoming real. The role is being tied to large strategic initiatives.” He continued, “Both of those things are encouraging to me. Overall, we are moving in the right direction.”

photo credit: Philip Taylor PT via photopin cc
Rackspace powers up bare-metal cloud servers Mon, 28 Jul 2014 10:06:49 +0000

Rackspace, Inc. has just revealed pricing for its new OnMetal servers, which are now generally available following a limited trial phase.

The dedicated, single-tenant bare-metal machines are designed for applications that can run without hypervisors. They can be spun up in less than a minute using the Rackspace cloud OpenStack API, the company said.

Rackspace used designs provided by the Facebook-led Open Compute Project to build its OnMetal servers and added its own tweaks including external cooling and 100 percent solid-state storage. Customers can choose from three different configurations optimized for different kinds of workloads. Two pricing tiers are offered for each configuration – one comes with Rackspace’s standard Managed Infrastructure support, while the other comes with more inclusive Managed Operations support.

The cheapest configuration is OnMetal Compute, which offers ten Intel Xeon CPUs, 32GB of RAM and no extra storage. These servers are aimed at those running web and app servers, load balancing and queue processing, and come priced at $550 and $700 per server/month for Managed Infrastructure and Managed Operations, respectively.

Next up are the OnMetal Memory servers, which come with 12 CPU cores and 512GB of memory. These are meant for caching, in-memory analytics and search indexing operations, says Rackspace. These are available for $1,650 or $1,800 per server/month.

At the high end, OnMetal I/O servers provide 20 CPU cores and 128GB of RAM, plus 3.2TB of disk storage. These servers are optimized for online transaction processing (OLTP) and database-intensive applications, and will set users back $1,800 or $1,950 per server/month, depending on the level of support.

There’s a catch in the price of Managed Operations support. While the service costs an extra $200 per server/month, Rackspace demands a minimum service charge of $500/month, so the per-server rate doesn’t kick in until the user is running at least three servers.
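The arithmetic behind that three-server threshold is straightforward, as this quick sketch shows:

```python
def managed_ops_fee(servers, per_server=200, minimum=500):
    """Monthly Managed Operations fee: $200 per server,
    subject to a $500/month minimum service charge."""
    return max(servers * per_server, minimum)

for n in range(1, 5):
    print(n, managed_ops_fee(n))
# One or two servers still pay the $500 minimum; at three servers
# (3 x $200 = $600) the per-server rate finally takes over.
```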

There are other charges, too, such as outgoing network bandwidth, which starts at $0.12 per GB for the first 10TB and decreases on a sliding scale with volume.

The pricing structure means that Rackspace’s OnMetal servers cost about the same as comparable servers offered by cloud rivals like AWS, Google and Microsoft. However, those providers all offer pay-as-you-go pricing plans, which might make more sense for customers with lighter or more variable workloads.
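A simple break-even calculation illustrates the trade-off. The $1.10/hour figure below is an assumed, hypothetical pay-as-you-go rate for a comparable instance, used only to show the shape of the comparison:

```python
def break_even_hours(monthly_rate, hourly_rate):
    """Hours of use per month above which a flat-rate monthly server
    becomes cheaper than an equivalent pay-as-you-go instance."""
    return monthly_rate / hourly_rate

# Hypothetical: a $550/month OnMetal Compute server vs. a
# pay-as-you-go instance at an assumed $1.10/hour.
hours = break_even_hours(550, 1.10)
print(round(hours))  # ~500 of the roughly 730 hours in a month
```

Under those assumptions, a server busy most of the month favors the flat rate, while one needed only part-time favors hourly billing.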

Rackspace tested the systems with some customers in a limited availability release last month and “customers have shown great interest,” wrote Ev Kontsevoy, director of products at Rackspace, in a post on the company blog. “Some of them are looking to move away from the unpredictable nature of virtualized, multi-tenant environments, while others are intrigued by our promise of ‘elasticity of the cloud, plus economy of colocation,’” he wrote.

While Rackspace has struggled to compete with its rivals in the cloud, these new offerings are aimed at its wheelhouse. The company’s most recent financial results showed that over 70 percent of its revenues come from dedicated, single-tenant hosting. The OnMetal servers should help it to capitalize on that business.

photo credit: bdu via photopin cc
US consular database crash: Not a good time to renew your passport Mon, 28 Jul 2014 05:15:21 +0000

Thousands of travelers awaiting US passports and visas have been left on tenterhooks following an unspecified glitch in a database used by the Bureau of Consular Affairs, according to officials from the State Department.

“The Bureau of Consular Affairs has been experiencing technical problems with our passport and visa system,” said spokeswoman Marie Harf in a press briefing last week. “The issue is worldwide, not specific to any particular country.”

The Consular Consolidated Database (CCD) is built on Oracle software, and is believed to be one of the largest data warehouses in the world. It stores data on more than 100 million visa applicants and contains over 75 million photographs, with details of around 35,000 new applicants added every day.

The unspecified glitch occurred following scheduled maintenance work last week that knocked the database out of action for “a few days.” Technicians have since restored “limited capacity” service, but the downtime has caused a backlog of passport and visa processing that’s going to take time to get through. It’s not clear whether the problem lay with Oracle or the State Department’s IT staff, but Harf did at least make it clear that nothing malicious took place.

“We do not believe there was any malicious action or anything untoward here,” said Harf. “This was a technical issue, and again, we are working to correct it and should be fully operational again soon.”

It’s not clear how many people have been left waiting for their visas and passports, but US officials told the Associated Press that up to 50,000 persons were affected in just one unnamed country.

The State Department could not say how long it might take to clear the backlog. It also refused to say when the database would be back up and running at full capacity.

“It’s going to take a little while, so we ask people to be patient,” added Harf.

Image credit: geralt via