Computing at CERN

One of my favorite technology and startup blogs, GigaOm, published a great piece today on CERN and how it plans to use cloud computing to support the physicists and other scientists who are conducting some of the most advanced research in the world. From the article, we learn that CERN already uses massive compute resources, and they are hungry for more!

The Helix Nebula Science Cloud aims to bring enough firepower to solve very hard problems and handle the torrents of data churned out every second. As an example of the size of the task, CERN alone stores about 15 petabytes of data per year, uses 150,000 CPUs continuously, and writes data at 6 GB per second.

The article goes on to say that CERN currently consumes compute power from 150 public (non-government) data centers. That got me thinking about how much CERN is likely spending on compute infrastructure, so I did some quick math:

At typical Amazon EC2 rates: 150,000 CPUs x 8,766 hrs/year x $0.25/hr = $328,725,000

At rock-bottom Amazon EC2 rates: 150,000 CPUs x 8,766 hrs/year x $0.04/hr = $52,596,000
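
If you want to check the math yourself, here is a quick back-of-the-envelope sketch in Python. The CPU count and hourly rates are the figures above; the hours-per-year constant assumes continuous use over a 365.25-day year.

```python
# Back-of-the-envelope estimate of CERN's annual compute spend at EC2 rates.
HOURS_PER_YEAR = 8_766   # 365.25 days x 24 hours
CPUS = 150_000           # CPUs in continuous use, per the article

def annual_cost(rate_per_cpu_hour: float) -> float:
    """Annual cost of running CPUS cores non-stop at the given hourly rate."""
    return CPUS * HOURS_PER_YEAR * rate_per_cpu_hour

print(f"Typical rate ($0.25/hr): ${annual_cost(0.25):,.0f}")  # $328,725,000
print(f"Rock bottom  ($0.04/hr): ${annual_cost(0.04):,.0f}")  # $52,596,000
```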

Amazing, isn’t it? Now imagine how large the industry is. Actually, you don’t have to imagine. Gartner tells us the Infrastructure-as-a-Service (IaaS) market was near $4 billion in 2011 and is projected to top $10 billion by 2014!

This is just one reason we are doing what we are doing at CPUsage: to capture a piece of this massive market, but also to fundamentally change it.
