University of Illinois at Chicago STAR TAP Present/Future Applications: Enabling Grid Technologies and e-Science Advanced Applications Maxine Brown Co-Principal Investigator, STAR TAP Associate Director, Electronic Visualization Laboratory

University of Illinois at Chicago EVL VR & Tele-Immersion Display Devices: Large Rooms and Shared Environments. CAVE® and ImmersaDesk2®. Introduced the CAVE in 1992; ~600 projection VR devices in 2001.

University of Illinois at Chicago Tele-Immersion Networking

University of Illinois at Chicago Tele-Immersion and CAVERNsoft. Tele-Immersion requires expertise in graphics, VR, audio/video compression, networking, and databases. CAVERNsoft enables applications: rapidly build new tele-immersive applications; retro-fit old applications.

University of Illinois at Chicago CAVERNsoft uses Grid Software. The Grid is a global software development effort to enable the use of high-performance networks like Abilene (Internet2), APAN, SURFnet, etc. Grids are changing the way we do science and engineering: from computation to large-scale data. Grids are designed to schedule, allocate, authenticate, and manage advanced networking, computing, and collaboration services on [optical] networks.
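The slide's list of Grid responsibilities (authenticate, schedule, allocate, manage) can be read as a pipeline. The sketch below is a minimal, hypothetical Python illustration of that flow; the names (GridBroker, Resource, allocate) are illustrative assumptions, not the API of CAVERNsoft or of any real Grid middleware such as Globus.

```python
# Hypothetical sketch of the authenticate -> allocate -> schedule flow
# described on the slide. Names are illustrative, not a real Grid API.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str          # e.g. a compute cluster or an optical-network lambda
    cpus_free: int

class GridBroker:
    def __init__(self, resources):
        self.resources = resources
        self.sessions = set()

    def authenticate(self, user, credential):
        # Stand-in for certificate-based authentication of a user.
        if credential == f"cert:{user}":
            self.sessions.add(user)
            return True
        return False

    def allocate(self, user, cpus_needed):
        # Schedule the request on the first resource with enough free CPUs.
        if user not in self.sessions:
            raise PermissionError("user not authenticated")
        for r in self.resources:
            if r.cpus_free >= cpus_needed:
                r.cpus_free -= cpus_needed
                return r.name
        return None

broker = GridBroker([Resource("cluster-a", 64), Resource("cluster-b", 128)])
assert broker.authenticate("alice", "cert:alice")
print(broker.allocate("alice", 96))   # -> cluster-b
```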

University of Illinois at Chicago The Grid: Blueprint for a New Computing Infrastructure. I. Foster, C. Kesselman (editors), Morgan Kaufmann, 1999. Chapters by expert authors including Andrew Chien, Jack Dongarra, Tom DeFanti, Andrew Grimshaw, Roch Guerin, Ken Kennedy, Paul Messina, Cliff Neuman, Jon Postel, Larry Smarr, Rick Stevens, and many others. “A source book for the history of the future” -- Vint Cerf

University of Illinois at Chicago Today’s Information Infrastructure. Network-centric: simple, fixed end systems; few embedded capabilities; few services; no user-level quality of service. O(10^6) nodes.

University of Illinois at Chicago Tomorrow’s Information Infrastructure: Not Just “Fatter and More Reliable”. Application-centric: heterogeneous, mobile end-systems; many embedded capabilities; rich services; user-level quality of service. QoS, resource discovery, caching. O(10^9) nodes.
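To make the contrast concrete, here is a minimal, hypothetical sketch of what "resource discovery with user-level QoS and caching" could look like from an application's point of view. The directory contents, service names, and QoS fields are made-up illustrations, not part of any real middleware.

```python
# Hypothetical sketch: application-centric resource discovery with a
# user-level QoS constraint and cached lookups. All names are illustrative.
from functools import lru_cache

DIRECTORY = {
    "render-farm": {"latency_ms": 40, "bandwidth_mbps": 1000},
    "storage-archive": {"latency_ms": 120, "bandwidth_mbps": 155},
}

@lru_cache(maxsize=128)          # caching: repeated lookups avoid re-querying
def discover(service, max_latency_ms):
    # Return the service's bandwidth only if its latency meets the QoS bound.
    entry = DIRECTORY.get(service)
    if entry and entry["latency_ms"] <= max_latency_ms:
        return entry["bandwidth_mbps"]
    return None

print(discover("render-farm", max_latency_ms=50))     # -> 1000
print(discover("storage-archive", max_latency_ms=50)) # -> None (QoS not met)
```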

University of Illinois at Chicago iGrid 2000 at INET 2000 July 18-21, 2000, Yokohama, Japan STAR TAP International Grid Demonstrations

University of Illinois at Chicago Cyber Infrastructure NSF Computer and Information Science and Engineering (CISE) Directorate. NSF is proposing a “Cyber Infrastructure” initiative – to fund a number of Major Research Equipment (MRE) facilities that require a similar distributed storage and networked computing information infrastructure. Large MRE projects are the result of strategic planning by the broad university research community. NSF FY 2001 Budget Request to Congress: $138,540,000.

University of Illinois at Chicago EarthScope NSF Major Research Equipment. USArray: a dense array of high-capability seismometers to be deployed throughout the US to improve our resolution of subsurface structure. San Andreas Fault Observatory at Depth (SAFOD): a 4km-deep hole into the San Andreas fault zone close to the hypocenter of the 1966 M~6 Parkfield earthquake, to access a major active fault at depth. To provide input to NSF’s Network for Earthquake Engineering Simulation (NEES) project, which studies the response of the built environment to earthquakes. EarthScope: USArray and SAFOD. $74.81M FY01-04 NSF support requested. NSF, USGS, NASA consortium. EarthScope will bring real-time Earth Science data to our desktops, providing unprecedented opportunities to unravel the structure, evolution, and dynamics of the North American continent, and to better understand earthquakes and fault systems, volcanoes and magmatic processes, and links between tectonics and surficial processes.

University of Illinois at Chicago Network for Earthquake Engineering Simulation NSF Major Research Equipment. Network for Earthquake Engineering Simulation (NEES): $81.8M FY01-04 NSF support requested. Scoping study managed by NCSA; sponsored by NSF. NEES will provide a networked, national resource of geographically distributed, shared-use, next-generation, experimental research equipment installations, with tele-observation and tele-operation capabilities. NEES will shift the emphasis of earthquake engineering research from its current reliance on physical testing to integrated experimentation, computation, theory, databases, and model-based simulation, using input data from EarthScope and other sources. NEES will be a collaboratory – an integrated experimental, computational, communications, and curated repository system, developed to support collaboration in earthquake engineering research and education.

University of Illinois at Chicago Terascale Computing Systems NSF Major Research Equipment. Terascale Computing Systems: $136M FY00-02 NSF support requested. Distributed Terascale System to be awarded in June 2001; PSC Terascale awarded in 2000. As part of the ITR initiative, the Terascale project enables US researchers to gain access to leading-edge computing capabilities. The project is aligned with NSF’s Partnerships for Advanced Computational Infrastructure (PACI) initiative, and is coordinated with other agencies, such as DOE, to leverage the software, tools, and technology investments. The two Terascale Computing Systems will receive regular upgrades to take advantage of technology trends in speed and performance while providing the most advanced, stable systems possible to research users.

University of Illinois at Chicago National Ecological Observatory Network NSF Major Research Equipment. National Ecological Observatory Network (NEON): $100M FY01-06 NSF support requested. 10 observatories nationwide; sponsored by NSF, Archbold Biological Station, and SDSC. 10 geographically distributed observatories nationwide to serve as national research platforms for integrated, cutting-edge research in field biology. To enable scientists to conduct experiments on ecological systems at all levels of biological organization – from molecular genetics to whole ecosystems – and across scales – from seconds to geological time, and from microns to regions and continents. Observatories will have scalable computation capabilities and will be networked via satellite and landlines – to each other and to specialized facilities, such as supercomputer centers. By creating one virtual installation via a cutting-edge computational network, all members of the field biology research community will be able to access NEON remotely.

University of Illinois at Chicago Atacama Large Millimeter Array NSF Major Research Equipment. Prior to developing ALMA, the US conceived the MMA as an aperture-synthesis radio telescope operating in the wavelength range from 3 to 0.4 mm. ALMA will be the world’s most sensitive, highest-resolution, millimeter-wavelength telescope. It will combine an angular resolution comparable to that of the Hubble Space Telescope with the sensitivity of a single antenna nearly 100 meters in diameter. ALMA will consist of no less than meter antennas located at an elevation of 16,400 feet on the Llano de Chajnantor, Chile. Atacama Large Millimeter Array (ALMA), an expanded Millimeter Array (MMA): $32M FY98-01 NSF support requested for Design and Development. US: National Radio Astronomy Observatory and Associated Universities, Inc., with NSF funding. Europe: European Southern Observatory, Centre National de la Recherche Scientifique, Max-Planck-Gesellschaft, Netherlands Foundation for Research in Astronomy and Nederlandse Onderzoekschool Voor Astronomie, and the UK Particle Physics and Astronomy Research Council.

University of Illinois at Chicago Large Hadron Collider NSF Major Research Equipment. Large Hadron Collider (LHC): $80.9M FY99-04 NSF support requested. CERN (Switzerland) and international consortium. Construction of two detectors of the LHC: ATLAS (A Toroidal LHC ApparatuS) and CMS (Compact Muon Solenoid). The research, design, and prototyping of Petascale Virtual Data Grids, which will support the LHC as well as the SDSS (Sloan Digital Sky Survey) and LIGO (Laser Interferometer Gravitational-wave Observatory), is being carried out by GriPhyN, a multi-institutional team that received the largest NSF ITR grant in FY00.

University of Illinois at Chicago Grid Physics Network. A team of physicists and computer scientists who plan to implement Petascale Virtual Data Grids (PVDGs), computational environments for data-intensive science. Four physics experiments – CMS, ATLAS, LIGO, SDSS – share common challenges: massive datasets, large-scale computational resources, and diverse communities of thousands of scientists spread across the globe. –The LHC CMS and ATLAS experiments will search for the origins of mass and probe matter at the smallest length scales. –LIGO will detect the gravitational waves of pulsars, supernovae, and in-spiraling binary stars. –SDSS will carry out an automated sky survey enabling systematic studies of stars, galaxies, nebulae, and large-scale structure. GriPhyN estimates 20 Tier 2 sites (6 CMS, 6 ATLAS, 5 LIGO and 2 SDSS), with a projected five-year cost of ~$85M-90M, half of which is for hardware (a back-of-the-envelope breakdown follows below). SDSS – Apache Point Observatory, Cloudcroft, New Mexico. LIGO Livingston Observatory, Louisiana (Caltech/MIT project). Grid Physics Network (GriPhyN): $11.9M FY00 NSF support for R&D (largest ITR award). Led by Univ Florida and Univ Chicago; includes US institutions.
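Using only the figures quoted on this slide (20 Tier 2 sites, ~$85M-90M over five years, half for hardware), the per-site budget works out roughly as follows. This is a back-of-the-envelope check that assumes the cost is spread evenly across the proposed sites.

```python
# Back-of-the-envelope check using only the figures on the slide:
# 20 Tier 2 sites, ~$85M-90M over five years, half of which is hardware.
low, high = 85e6, 90e6
sites, years = 20, 5

per_site = (low / sites, high / sites)                      # ~$4.25M-4.50M per site
per_site_per_year = (per_site[0] / years, per_site[1] / years)
hardware_per_site = (per_site[0] / 2, per_site[1] / 2)

print(f"per site over 5 years: ${per_site[0]/1e6:.2f}M-{per_site[1]/1e6:.2f}M")
print(f"per site per year:     ${per_site_per_year[0]/1e6:.2f}M-{per_site_per_year[1]/1e6:.2f}M")
print(f"hardware per site:     ${hardware_per_site[0]/1e6:.2f}M-{hardware_per_site[1]/1e6:.2f}M")
```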


Future Directions of the US National Science Foundation’s Division of Advanced Networking Infrastructure & Research. Tom Greene, National Science Foundation.