Is Two Million Gigabytes of Data Enough to Uncover the Mystery of the Big Bang?


Here’s a little clip about the grid computing facility that nabs and stores the data from CERN’s Large Hadron Collider project.  Having forgotten 99% of my college physics math, I still don’t understand why it takes such massive computing power to analyze data from particles so small that, if they were dollars, you could pay off the US national debt with a grain of sand’s worth of them.
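
The title’s “two million gigabytes” is roughly two petabytes, and the scale problem gets a bit clearer with a back-of-the-envelope calculation. The sketch below uses made-up but plausible figures of my own (1 MB recorded per collision event, 100 MB/s per disk, 10,000 grid nodes are assumptions, not CERN’s actual numbers): just reading the dataset once on a single machine would take the better part of a year, which is why the work gets spread across a computing grid.

```python
# Back-of-the-envelope look at why "two million gigabytes" needs a grid.
# All per-event and hardware figures below are illustrative assumptions,
# not official CERN numbers.

total_bytes = 2_000_000 * 10**9      # two million gigabytes ~= 2 petabytes
event_size = 1 * 10**6               # assume ~1 MB recorded per collision event
events = total_bytes // event_size   # ~2 billion events to sift through

disk_read_rate = 100 * 10**6         # assume one disk streams ~100 MB/s
seconds_single_disk = total_bytes / disk_read_rate

print(f"Events to analyze:         {events:,}")
print(f"Single-disk read time:     {seconds_single_disk / 86400 / 365:.1f} years")
print(f"Across 10,000 grid nodes:  {seconds_single_disk / 10_000 / 3600:.1f} hours")
```

And that’s just reading the raw bytes once, before any actual physics analysis happens on each event.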

One Response to Is Two Million Gigabytes of Data Enough to Uncover the Mystery of the Big Bang?

  1. glenn says:

    Maybe we should change our currency to these small particles. Sounds like we could solve a lot of problems quickly 🙂
