CERN Transfers 500TB To Europe And The US In 10 Days
On the 25th of April 2005, in a significant milestone for scientific grid computing, eight major computing centres successfully completed a challenge to sustain a continuous data flow of 600 megabytes per second (MB/s) on average for 10 days from CERN in Geneva, Switzerland to seven sites in Europe and the US.
The total amount of data transmitted during this challenge, 500 terabytes, would take about 250 years to download using a typical 512 kilobit per second household broadband connection.
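These figures are easy to sanity-check. A minimal sketch, assuming decimal units (1 TB = 10^12 bytes, 1 MB = 10^6 bytes, 1 kbit = 10^3 bits), confirms both the 10-day transfer window and the 250-year broadband comparison:

```python
# Back-of-the-envelope check of the article's figures.
# Assumes decimal units: 1 TB = 10**12 bytes, 1 MB = 10**6 bytes, 1 kbit = 10**3 bits.

TOTAL_BYTES = 500 * 10**12      # 500 terabytes transferred in total
RATE_BPS = 600 * 10**6          # sustained rate: 600 megabytes per second
BROADBAND_BITS = 512 * 10**3    # 512 kilobit/s household link, in bits per second

# Time to move 500 TB at the sustained challenge rate
days = TOTAL_BYTES / RATE_BPS / 86_400
print(f"500 TB at 600 MB/s: {days:.1f} days")        # ~9.6 days

# Time to download 500 TB over a 512 kbit/s broadband line
years = TOTAL_BYTES * 8 / BROADBAND_BITS / (365.25 * 86_400)
print(f"500 TB at 512 kbit/s: {years:.0f} years")    # ~248 years
```

Both results agree with the article: roughly ten days at the sustained rate, and just under 250 years on household broadband of the era.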
This exercise was part of a series of service challenges designed to test the global computing infrastructure for the Large Hadron Collider (LHC) currently being built at CERN to study the fundamental properties of subatomic particles and forces. The service challenge participants included Brookhaven National Laboratory and Fermilab in the US, Forschungszentrum Karlsruhe in Germany, CCIN2P3 in France, INFN-CNAF in Italy, SARA/NIKHEF in the Netherlands and Rutherford Appleton Laboratory in the UK.
“This service challenge is a key step on the way to managing the torrents of data anticipated from the LHC,” said Jamie Shiers, manager of the service challenges at CERN. “When the LHC starts operating in 2007, it will be the most data-intensive physics instrument on the planet, producing more than 1500 megabytes of data every second for over a decade.”
The goal of LHC computing is to use a world-wide grid infrastructure of computing centres to provide sufficient computational, storage and network resources to fully exploit the scientific potential of the four major LHC experiments: ALICE, ATLAS, CMS and LHCb. The infrastructure relies on several national and regional science grids. The service challenge used resources from the LHC Computing Grid (LCG) project, the Enabling Grids for E-SciencE (EGEE) project, Grid3/Open Science Grid (OSG), INFNGrid and GridPP.
Fermilab Computing Division head Vicky White welcomed the results of the service challenge. “High energy physicists have been transmitting large amounts of data around the world for years,” said White. “But this has usually been in relatively brief bursts and between two sites. Sustaining such high rates of data for days on end to multiple sites is a breakthrough, and augurs well for achieving the ultimate goals of LHC computing.”
The next service challenge, due to start in the summer, will extend to many other computing centres and aim at a three-month period of stable operations. That challenge will allow many of the scientists involved to test their computing models for handling and analyzing the data from the LHC experiments.