Saturday, July 21, 2007

How Much Data Will Be Collected By CERN?

According to Slashdot:

"Think your storage headaches are big? When it goes live in 2008, CERN's ALICE experiment will use 500 optical fiber links to feed particle collision data to hundreds of PCs at a rate of 1GB/second, every second, for a month. 'During this one month, we need a huge disk buffer,' says Pierre Vande Vyvre, CERN's project leader for data acquisition. One might call that an understatement.'s story has more details about the project and the SAN tasked with catching the flood of data."

Now, by my estimate, that is 3600 × 24 × 30 = 2,592,000 GB, or about 2.592 petabytes of data, in one month. That is a lot of data. Put another way, it's 2,592 TB, and 2,592 / 15 = 172.8 Marylou4s' worth (at roughly 15 TB each) in a month.
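The arithmetic above can be checked with a quick back-of-the-envelope script (assuming, as in the quote, a sustained 1 GB/s over a 30-day month, and the ~15 TB per Marylou4 figure implied by the division):

```python
# Back-of-the-envelope check of the ALICE data-volume estimate.
# Assumes a sustained 1 GB/s for a full 30-day month.
RATE_GB_PER_S = 1
SECONDS_PER_MONTH = 3600 * 24 * 30  # 2,592,000 seconds

total_gb = RATE_GB_PER_S * SECONDS_PER_MONTH
total_tb = total_gb / 1000
total_pb = total_tb / 1000

print(total_gb)  # 2592000 GB
print(total_pb)  # 2.592 PB

# How many ~15 TB stores (a Marylou4-sized machine, per the estimate above)
# it would take to hold one month of data:
print(total_tb / 15)  # 172.8
```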


  1. While 2.592 petabytes of data seems like an awful lot, by supercomputing standards it's really not that much. For example, the San Diego Supercomputing Center has over 125 terabytes of disk space and over 25 petabytes of tape storage. It seems to me, then, that the real issue isn't storing 2.5 PB for a month but rather processing that 2.5 PB in a month to be ready for the next month's 2.5 PB.

  2. Yes, good old SDSC. They have my runs on their tape storage.
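The first commenter's point can be made quantitative: to drain the month's buffer before the next month's data arrives, sustained processing throughput has to at least match the acquisition rate. A minimal sketch of that calculation (using the 2.592 PB figure from the post):

```python
# To process one month's buffer within one month, the sustained
# processing rate must equal the sustained acquisition rate.
BUFFER_PB = 2.592
SECONDS_PER_MONTH = 3600 * 24 * 30

required_gb_per_s = BUFFER_PB * 1_000_000 / SECONDS_PER_MONTH
print(required_gb_per_s)  # ≈ 1.0 GB/s, the same as the ingest rate
```

So the storage is only half the problem; the compute has to keep pace at the same ~1 GB/s, continuously.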

