OptIPuter Goal: Removing Bandwidth Barriers to e-Science ATLAS Sloan Digital Sky Survey LHC ALMA.

Presentation transcript:

1 OptIPuter Goal: Removing Bandwidth Barriers to e-Science (ATLAS, Sloan Digital Sky Survey, LHC, ALMA)

2 Why Optical Networks Will Become the 21st Century Driver (Scientific American, January 2001)
Performance per Dollar Spent vs. Number of Years:
–Optical Fiber (bits per second), Doubling Time 9 Months
–Data Storage (bits per square inch), Doubling Time 12 Months
–Silicon Computer Chips (number of transistors), Doubling Time 18 Months
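The doubling times on this slide imply very different long-run price/performance curves. A quick sketch of the implied growth (the 5-year horizon is just an illustrative choice, not from the slide):

```python
# Compare the price/performance growth implied by the slide's doubling
# times: chips (18 months), storage (12 months), fiber (9 months).
def growth_factor(years, doubling_months):
    """Multiplicative improvement after `years`, given a doubling time."""
    return 2 ** (years * 12 / doubling_months)

# Over 5 years: chips ~10x, storage 32x, fiber ~102x.
for name, months in [("Silicon chips", 18), ("Data storage", 12), ("Optical fiber", 9)]:
    print(f"{name}: {growth_factor(5, months):.0f}x in 5 years")
```

The gap between the 9-month and 18-month curves is the slide's point: fiber capacity per dollar pulls away from compute exponentially.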

3 The OptIPuter Project – Removing Bandwidth as an Obstacle in Data-Intensive Sciences
NSF Large Information Technology Research Proposal
–UCSD and UIC Lead Campuses; Larry Smarr PI
–USC, UCI, SDSU, NW, TA&M Partnering Campuses
Industrial Partners: IBM, Sun, Telcordia/SAIC, Chiaro, Calient
$13.5 Million Over Five Years
Optical IP Streams from Lab Clusters to Large Data Objects
–NIH Biomedical Informatics Research Network
–NSF EarthScope

4 Application Barrier One: Gigabyte Data Objects Need Interactive Visualization
Montages – Hundred-Million-Pixel 2-D Images
–Microscopy or Telescopes
–Remote Sensing
GigaZone 3-D Objects
–Seismic or Medical Imaging
–Supercomputer Simulations
Interactive Analysis and Visualization of Such High-Resolution Data Objects Requires:
–Scalable Visualization Displays
–Montage and Volumetric Visualization Software
–UIC EVL's JuxtaView and Vol-a-Tile
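The core trick that makes a hundred-megapixel montage interactive is fetching only the tiles under the current viewport. A hypothetical sketch in the spirit of a JuxtaView-style viewer (this is not the EVL code; the 512-pixel tile size is an assumption):

```python
# Hypothetical tiled-montage bookkeeping: given a huge image cut into
# fixed-size tiles, find which tiles intersect the viewport so only
# those are fetched and drawn. Tile size is an assumed parameter.
TILE = 512  # tile edge in pixels (assumption)

def visible_tiles(view_x, view_y, view_w, view_h):
    """Return (col, row) indices of tiles overlapping the viewport."""
    first_col, first_row = view_x // TILE, view_y // TILE
    last_col = (view_x + view_w - 1) // TILE
    last_row = (view_y + view_h - 1) // TILE
    return [(c, r) for r in range(first_row, last_row + 1)
                   for c in range(first_col, last_col + 1)]

# Panning a 1024x768 window across the montage touches only a few tiles:
print(visible_tiles(8000, 3000, 1024, 768))
```

Because the working set is a handful of tiles rather than the whole gigabyte object, pan-and-zoom stays interactive even when the montage itself never fits in one machine's memory.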

5 OptIPuter Project Goal: Scaling to 100 Million Pixels
JuxtaView (UIC EVL) on PerspecTile LCD Wall
–Digital Montage Viewer
–8000x3600 Pixel Resolution (~30M Pixels)
Display Is Powered By:
–16 PCs with Graphics Cards
–2 Gigabit Networking per PC
Source: Jason Leigh, EVL, UIC; USGS EROS; NCMIR – Brain Microscopy (2800x layers)
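The slide's provisioning can be sanity-checked with back-of-envelope arithmetic. Assuming an uncompressed 24 bits/pixel stream at 30 frames/s (both figures are assumptions, not from the slide):

```python
# Check the slide's numbers: 8000x3600 pixels split across 16 PCs,
# streamed uncompressed at an assumed 24 bits/pixel and 30 frames/s,
# against the 2 Gigabit of networking provisioned per PC.
pixels = 8000 * 3600           # 28.8 Mpixels total (the slide's "~30M")
per_pc = pixels / 16           # 1.8 Mpixels driven by each PC
gbps = per_pc * 24 * 30 / 1e9  # uncompressed stream per PC, in Gb/s
print(f"{gbps:.2f} Gb/s per PC")  # ~1.30 Gb/s, under the 2 Gb/s provisioned
```

So 2 Gb/s per PC leaves headroom above a full-rate uncompressed feed to that PC's slice of the wall, which is why the design scales by adding PCs rather than fattening one link.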

6 Application Barrier Two: Campus Grid Infrastructure Is Inadequate
Campus Infrastructure Is Designed for Web Objects
–Being Swamped by Sharing of Digital Multimedia Objects
–Little Strategic Thinking About Needs of Data Researchers
Challenge of Matching Storage to Bandwidth
–Need to Ingest and Feed Data at Multi-Gbps
–Scaling to Enormous Capacity
–Use Standards-Based Commodity Clusters (Rocks)
OptIPuter Aims at Prototyping a National Architecture
–Federated National and Global Data Repositories
–Lambdas on Demand
–Campus Laboratories Using Clusters with TeraBuckets
–Campus Eventually with a Shared PetaCache
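"Matching storage to bandwidth" has a concrete sizing consequence: a single disk cannot absorb a lambda, so the cluster must stripe. A rough estimate, assuming a ~50 MB/s sustained write per commodity disk (an era-typical figure, not from the slide):

```python
# Rough sizing for matching storage to bandwidth: how many commodity
# disks must a cluster stripe across to absorb a multi-Gb/s ingest?
# The 50 MB/s sustained-write figure per disk is an assumption.
import math

def disks_needed(ingest_gbps, disk_mb_per_s=50):
    ingest_mb_per_s = ingest_gbps * 1000 / 8   # Gb/s -> MB/s (decimal units)
    return math.ceil(ingest_mb_per_s / disk_mb_per_s)

print(disks_needed(10))  # a 10 Gb/s lambda needs ~25 striped disks
```

This is why the slide pairs multi-Gbps ingest with commodity clusters (Rocks): spindle counts in the tens are cheap in a cluster but impossible in a single server of that era.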

7 OptIPuter UCSD Coupling Linux Clusters with High Resolution Visualization

8 OptIPuter Is Studying the Best Application Usage for Both Routed vs. Switched Lambdas
OptIPuter Is Evaluating Both:
–Routers: Chiaro, Juniper, Cisco, Force10
–Optical Switches: Calient, Glimmerglass
–Lightpath Accelerators: BigBandWidth
UCSD Focusing on Routing Initially; UIC Focusing on Switching Initially
Next Year: Merge into Mixed Optical Fabric
(Pictured: Chiaro Estara, Glimmerglass)

9 Application Barrier Three: Shared Internet Makes Interactive Gigabyte Impossible
NASA Earth Observation System
–Over 100,000 Users Pull Data from Federated Repositories
–Two Million Data Products Delivered per Year
–10-50 Mbps (May 2003) Throughput to Campuses
–Typically over Abilene from Goddard, Langley, or EROS
Biomedical Informatics Research Network (BIRN)
–Between UCSD and Boston – Similar Story
–Lots of Specialized Network Tuning Used
–50-80 Mbps
Remote Interactive Megabyte Is Possible, but Interactive Gigabyte Is Impossible
Solution: IP over Lambdas with Alternate Protocols
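The megabyte-vs-gigabyte claim follows directly from the throughputs quoted on the slide. Time to move a 1 GB data object at those rates, versus a dedicated lambda:

```python
# Why the remote interactive gigabyte is impossible on the shared
# Internet: transfer time for a 1 GB object at the slide's quoted
# throughputs versus 1 Gb/s and 10 Gb/s lambdas.
def transfer_seconds(gigabytes, mbps):
    return gigabytes * 8000 / mbps  # 1 GB = 8000 Mb (decimal units)

for mbps in (10, 50, 1000, 10000):
    print(f"{mbps:>6} Mb/s: {transfer_seconds(1, mbps):7.1f} s")
```

At 50 Mb/s a gigabyte takes ~160 s, hopeless for interaction; a 10 Gb/s lambda delivers it in under a second, which is the bandwidth argument for IP over lambdas.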

10 Multi-Latency OptIPuter Laboratory: National-Scale Experimental Network
Source: Tom West, CEO, NLR (Booth 3409)
Chicago OptIPuter: StarLight, NU, UIC
SoCal OptIPuter: USC, UCI, UCSD, SDSU
2000 Miles, ~10 ms = 1000x Campus Latency
National Lambda Rail Partnership Serves Very High-End Experimental and Research Applications
–4 x 10Gb Wavelengths Initially
–Capable of 40 x 10Gb Wavelengths at Buildout
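The slide's "2000 miles, ~10 ms" is essentially speed-of-light delay and cannot be engineered away. A vacuum-c estimate gives ~10.7 ms one way; light in silica fiber travels at roughly 2/3 c, so the realistic one-way figure is closer to 16 ms:

```python
# Speed-of-light floor on the Chicago<->SoCal one-way latency.
METERS_PER_MILE = 1609.344
C = 299_792_458        # m/s in vacuum
C_FIBER = C * 2 / 3    # approximate group velocity in silica fiber

d = 2000 * METERS_PER_MILE
print(f"vacuum: {d / C * 1000:.1f} ms, fiber: {d / C_FIBER * 1000:.1f} ms")
```

This is why the laboratory is explicitly multi-latency: bandwidth scales with more lambdas, but the ~1000x gap over campus latency is a hard physical constraint that applications must be designed around.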

11 An International-Scale OptIPuter Is Operational over the First Set of 76 International GE TransLight Lambdas
European Lambdas to US
–8 GEs Amsterdam–Chicago
–8 GEs London–Chicago
Canadian Lambdas to US
–8 GEs Chicago–Canada–NYC
–8 GEs Chicago–Canada–Seattle
US Lambdas to Europe
–4 GEs Chicago–Amsterdam
–3 GEs Chicago–CERN
European Lambdas
–8 GEs Amsterdam–CERN
–2 GEs Prague–Amsterdam
–2 GEs Stockholm–Amsterdam
–8 GEs London–Amsterdam
TransPAC Lambda
–1 GE Chicago–Tokyo
IEEAF Lambdas (blue)
–8 GEs NYC–Amsterdam
–8 GEs Seattle–Tokyo
(Pictured: UKLight, CERN, Northern Light)
Source: Tom DeFanti, EVL, UIC
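The per-route GigE counts on this slide do account exactly for the 76 lambdas claimed in the title, which a quick tally confirms:

```python
# Tally the per-route GigE counts listed on the slide against the
# slide's claimed total of 76 international lambdas.
lambdas = {
    "Amsterdam-Chicago": 8, "London-Chicago": 8,
    "Chicago-Canada-NYC": 8, "Chicago-Canada-Seattle": 8,
    "Chicago-Amsterdam": 4, "Chicago-CERN": 3,
    "Amsterdam-CERN": 8, "Prague-Amsterdam": 2,
    "Stockholm-Amsterdam": 2, "London-Amsterdam": 8,
    "Chicago-Tokyo": 1, "NYC-Amsterdam": 8, "Seattle-Tokyo": 8,
}
print(sum(lambdas.values()))  # -> 76
```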
