A Wide Range of Scientific Disciplines Will Require a Common Infrastructure: Two e-Science Grand Challenges (NSF's EarthScope/USArray and NIH's Biomedical Informatics Research Network)

1 A Wide Range of Scientific Disciplines Will Require a Common Infrastructure
Example: Two e-Science Grand Challenges
– NSF's EarthScope (USArray)
– NIH's Biomedical Informatics Research Network
Common Needs
– Large Number of Sensors / Instruments
– Daily Generation of Large Data Sets
– Data on Multiple Length and Time Scales
– Automatic Archiving in Distributed Federated Repositories
– Large Community of End Users
– Multi-Megapixel and Immersive Visualization
– Collaborative Analysis From Multiple Sites
– Complex Simulations Needed to Interpret Data

2 NSF's EarthScope: USArray
Resolution of Crust & Upper Mantle Structure to Tens of km
Transportable Array
– Fixed-Design Broadband Array
– 400 Broadband Seismometers
– ~70 km Spacing
– ~1500 x 1500 km Grid
– ~2-Year Deployments at Each Site
– Rolling Deployment Over More Than 10 Years
Permanent Reference Network
– GSN/NSN-Quality Seismometers
– Geodetic-Quality GPS Receivers
All Data to Community in Near Real Time
– Bandwidth Will Be Driven by Visual Analysis in Federated Repositories
Source: Frank Vernon (IGPP, SIO, UCSD)
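The Transportable Array figures above can be sanity-checked with quick arithmetic: ~70 km spacing across a ~1500 x 1500 km footprint implies on the order of 21 station rows per side, i.e. roughly 450-500 stations, which is in the same ballpark as the ~400 quoted once edge effects and the slide's "~" tolerances are allowed for. A minimal check (variable names are illustrative; all figures come from the slide):

```python
# Consistency check of the Transportable Array numbers quoted above:
# ~400 seismometers at ~70 km spacing on a ~1500 x 1500 km grid.
grid_side_km = 1500
spacing_km = 70

rows = grid_side_km / spacing_km   # station rows per side of the grid
stations = rows ** 2               # square grid of stations

print(f"{rows:.1f} rows per side -> ~{stations:.0f} stations")
```

This lands within ~15% of the quoted 400, reasonable agreement given that every input carries a "~".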

3 Rollout Over 14 Years Starting With Existing Broadband Stations

4 Federated Repositories Are Needed to Link Brain Multi-Scale Structure and Function
– Filling Information Gaps with Advanced 3D & 4D Microscopies and New Labeling Technologies
– Leveraging Advances in Computational Capabilities
– Electron Tomography Over Multiple Scales
Source: Mark Ellisman, UCSD

5 NIH Is Funding a National-Scale Grid Federating Multi-Scale Biomedical Data
– Biomedical Informatics Research Network (BIRN)
– Part of the UCSD CRBS (Center for Research on Biological Structure)
– National Partnership for Advanced Computational Infrastructure
– NIH Plans to Expand to Other Organs and Many Laboratories

6 Similar Needs for Many Other e-Science Community Resources
– ATLAS
– Sloan Digital Sky Survey
– LHC
– ALMA

7 A LambdaGrid Will Be the Backbone for an e-Science Network
[Diagram: a control plane spanning apps, middleware, clusters, dynamically allocated lightpaths, switch fabrics, and physical monitoring]
Metro Area Laboratories Springing Up Worldwide
– Developing GigE and 10GigE Applications and Services
– Testing Optical Switches
– Metro Optical Testbeds: the Next GigaPOP?

8 Campus Laboratory LambdaGrid "On-Ramps" Are Needed to Link to MetroGrid
TND2 = Datamining Clusters at NU and UIC Lab. for Advanced Computing
– 32 Deerfield processors with 10GigE networking each, NetRam storage
TNV2 = Visualization Clusters at NU and UIC EVL
– 27 Deerfield processors with 10GigE networking each, 25 screens
TNC2 = TeraGrid Computing Clusters at EVL
– 32 Deerfield processors with 10GigE networking each
[Diagram: TND2, TNV2, and TNC2 clusters at LAC, EVL, and StarLight/Northwestern linked by 10x10GigE and 2x40GigE paths through routers, DWDM, and O-O-O switches]
Source: Tom DeFanti, EVL, UIC

9 Research Topics for Building an e-Science LambdaGrid
Provide Integrated Services in the Tbit/s Range
– Lambda-Centric Communication & Computing Resource Allocation
– Middleware Services for Real-Time Distributed Programs
– Extend Internet QoS Provisioning Over a WDM-Based Network
Develop a Common Control-Plane Optical Transport Architecture
– Transport Traffic Over Multiple User Planes With Variable Switching Modes
– Lambda Switching
– Burst Switching
– Inverse Multiplexing (One Application Uses Multiple Lambdas)
– Extend GMPLS: Routing, Resource Reservation, Restoration
UCSD, UCI, USC, UIC, & NW
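The inverse-multiplexing item above (one application striped across several lambdas) can be illustrated with a minimal round-robin sketch. This is an assumption-laden toy, not the deck's design: Python lists stand in for independent lightpaths, and real lambda services would be provisioned through the control plane the slide describes.

```python
# Toy inverse multiplexing: one application byte stream is dealt
# round-robin across N "lambdas" (modeled as plain lists) and then
# reassembled in order at the receiver.

def stripe(data: bytes, n_channels: int, chunk: int = 4) -> list[list[bytes]]:
    """Split `data` into fixed-size chunks and deal them round-robin."""
    channels = [[] for _ in range(n_channels)]
    for i in range(0, len(data), chunk):
        channels[(i // chunk) % n_channels].append(data[i:i + chunk])
    return channels

def reassemble(channels: list[list[bytes]]) -> bytes:
    """Interleave the per-channel chunks back into the original order."""
    out = []
    for j in range(max(len(c) for c in channels)):
        for c in channels:
            if j < len(c):
                out.append(c[j])
    return b"".join(out)

payload = b"one application, multiple lambdas"
lanes = stripe(payload, n_channels=3)
assert reassemble(lanes) == payload  # round-trips losslessly
```

In a real deployment the hard parts are exactly what the slide lists as research topics: per-lambda allocation, reordering under skew, and restoration when a lightpath fails.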

10 Research Topics for Building an e-Science LambdaGrid
Enhance Security Mechanisms
– End-to-End Integrity Check of Data Streams
– Access Multiple Locations With Trusted Authentication Mechanisms
– Use Grid Middleware for Authentication, Authorization, Validation, Encryption, and Forensic Analysis Across Multiple Systems and Administrative Domains
Distribute Storage While Optimizing Storewidth
– Distribute Massive Pools of Physical RAM (Network Memory)
– Develop Visual TeraMining Techniques to Mine Petabytes of Data
– Enable Ultrafast Image Rendering
– Create, for Optical Storage Area Networks (OSANs): Analysis and Modeling Tools, OSAN Control and Data Management Protocols, and Buffering Strategies and Memory Hierarchies for WDM Optical Networks
UCSD, UCI, USC, UIC, & NW
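One common way to realize the "end-to-end integrity check of data streams" item above is per-chunk digesting: hash each chunk at the sender, ship the digest manifest with the data, and recompute at the receiver. The slide names no specific hash or protocol, so SHA-256 over fixed-size chunks is purely an assumption for this sketch:

```python
# Sketch of an end-to-end stream integrity check: per-chunk SHA-256
# digests computed at the sender and verified at the receiver.
# (Hash choice and chunk size are assumptions, not from the slide.)
import hashlib

CHUNK = 1 << 16  # 64 KiB chunks

def digest_stream(data: bytes) -> list[str]:
    """Per-chunk SHA-256 digests computed at the sending end."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def verify_stream(data: bytes, digests: list[str]) -> bool:
    """Recompute digests at the receiving end and compare."""
    return digest_stream(data) == digests

sent = b"x" * 200_000                # stand-in for a large data stream
manifest = digest_stream(sent)       # travels with (or ahead of) the data
assert verify_stream(sent, manifest)                   # intact stream passes
assert not verify_stream(sent[:-1] + b"y", manifest)   # corruption is caught
```

Per-chunk digests also localize which chunk was damaged, so only that chunk needs retransmission, which matters at the multi-gigabit rates these slides target.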

11 A Layered Software Architecture Is Needed for Defense and Civilian Applications
www.ndia-sd.org/docs/NDIA_20June00.pdf
Source: SPAWAR Systems Center San Diego

