The Impact of Modern, Large-scale Science on the Design of National and International Research and Education Networks

Date: 
Wednesday, September 24, 2008 - 17:30
Location: 
TH 331
Presenter: 
William E. Johnston, Senior Scientist and Adviser, ESnet
Abstract: 
Modern science is completely dependent on high-speed networking. As the instruments of science become larger and more sophisticated, their cost rises to the point where only a very few are built (one LHC, one ITER, one James Webb Space Telescope, and so on). Further, these instruments are mostly based on solid-state sensors and therefore follow the same Moore's Law growth as computer CPUs, although the technology refresh cycle for instruments is 10-20 years rather than roughly 1.5 years for CPUs. As a result, data volumes are growing exponentially, sometimes to the point where modern computing and storage technology are at their limits in managing the data. Moreover, it takes worldwide collaborations of large numbers of scientists to conduct the science and analyze the data from a single instrument, so the data from that instrument must be distributed all over the world.

The volume of data generated by such instruments has reached many petabytes per year, the point where dedicated 10-100 Gb/s networks spanning the country and crossing international boundaries are required to distribute the data. Designing and building such networks, and providing suitable network services to support science data movement, has pushed R&E networks to the forefront of network technology: no commercial network currently handles individual data flows of the size generated by modern science. (The aggregate of small flows in commercial networks is, of course, much larger, but not by as much as one might think.)

In this talk I will describe several large-scale science projects that generate petabytes of data per year and require network paths of multiple tens of Gb/s to transport that data around the world. I will also describe the design of the network infrastructure needed to handle this data and the services that must be provided, both to network operators and to users, in order to make effective use of this new generation of network.
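
To make the bandwidth claim in the abstract concrete, the following back-of-envelope sketch (not part of the talk; the data volumes used are illustrative assumptions) shows the sustained rate needed to move a given yearly data volume, and why petabyte-per-year instruments push toward dedicated 10-100 Gb/s paths once bursty transfers and fan-out to many collaborating sites are taken into account.

```python
# Back-of-envelope: average bandwidth needed to move a yearly data volume.
# Illustrative only; the per-year volumes below are assumed, not quoted
# from the abstract.

def sustained_gbps(petabytes_per_year: float) -> float:
    """Average rate (Gb/s) needed to move the given volume in one year."""
    bits = petabytes_per_year * 1e15 * 8      # PB -> bits (decimal units)
    seconds_per_year = 365 * 24 * 3600        # ~3.15e7 seconds
    return bits / seconds_per_year / 1e9      # bits/s -> Gb/s

if __name__ == "__main__":
    for pb in (1, 10, 50):
        print(f"{pb:>3} PB/year -> {sustained_gbps(pb):5.2f} Gb/s sustained average")
    # 10 PB/year is only ~2.5 Gb/s averaged over the whole year, but real
    # transfers are bursty and must reach many sites at once, which is why
    # dedicated paths of tens of Gb/s are needed in practice.
```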