

New research suggests it may be possible to cut the cost of cloud computing by 30%, but only if a data center's servers are using the right kind of chip.
A team of researchers from Finland and Germany discovered that some server clusters are as much as 40% faster than others, with the main difference in speed coming down to the type of processor the individual servers run on. The analysis was carried out on Amazon's EC2 cloud service.
The news will certainly be well received by the cloud computing industry, especially since the cost benefits of the cloud have already become apparent for many SMEs. By shaving an additional 30% off day-to-day running costs, cloud companies would be able to pump money into developing other aspects of their business.
The BBC states that "Amazon promotes its service as using generic hardware that offers the same performance to every customer", but this appears not to be the case.
During the study, the researchers analyzed a number of different servers used by Amazon's customers over a 12-month period. They employed a software program that could "interrogate" the machines behind each group of servers used by individual customers and identify the speed of the chips that each group was running.
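The article does not describe the researchers' actual tool, but the general idea of identifying which processor a cloud instance runs on is straightforward. As a minimal sketch, assuming a Linux guest, the CPU model string can be read from /proc/cpuinfo; the function name here is hypothetical:

```python
# Minimal sketch: identify the processor model on a Linux cloud instance.
# Illustrative only; this is an assumption about the approach, not the
# researchers' actual measurement tool.

def cpu_model(path="/proc/cpuinfo"):
    """Return the CPU model string reported by the Linux kernel."""
    with open(path) as f:
        for line in f:
            if line.startswith("model name"):
                # Line looks like: "model name : Intel(R) Xeon(R) CPU ..."
                return line.split(":", 1)[1].strip()
    return "unknown"

if __name__ == "__main__":
    print(cpu_model())
```

Run on a fleet of instances, a script like this would reveal which hardware generation each one landed on.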
The researchers found that for the most part, newer servers running the latest hardware performed far better than older models.
As the researchers put it: "Through a set of detailed micro-benchmark and application-level benchmark measurements, we observe that the performance variation within the same sub-type of instance is relatively small, whilst the variation between different sub-types can be up to 60%."
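For illustration, a micro-benchmark in this spirit times a fixed CPU-bound workload so that the same script can be run on different instances and the wall-clock times compared. This is a minimal sketch with an arbitrary hashing workload, not the study's actual benchmark suite:

```python
# Minimal micro-benchmark sketch: time a fixed CPU-bound workload.
# Running this on two instances and comparing the elapsed times gives a
# rough measure of their relative processor speed. Illustrative only.
import hashlib
import time

def cpu_benchmark(iterations=200_000):
    """Repeatedly hash a fixed-size buffer; return elapsed seconds."""
    buf = b"x" * 4096
    start = time.perf_counter()
    for _ in range(iterations):
        # SHA-256 digest is 32 bytes; repeating it 128 times keeps the
        # buffer at 4 KB so every iteration does the same amount of work.
        buf = hashlib.sha256(buf).digest() * 128
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"elapsed: {cpu_benchmark():.2f} s")
```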
The report concludes that cloud computing companies could make significant savings by investing in the latest hardware, as newer chips are generally able to process data at much greater speeds than older models.
You can download the report in full here.