SURA Infrastructure Workshop (Dec. 7, 2005), Paul Avery, University of Florida: Open Science Grid, Linking Universities and Laboratories in National CyberInfrastructure

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 1 University of Florida Open Science Grid: Linking Universities and Laboratories in National CyberInfrastructure. SURA Infrastructure Workshop, Austin, TX, December 7, 2005

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 2 Bottom-up Collaboration: “Trillium”
 Trillium = PPDG + GriPhyN + iVDGL
 PPDG: $12M (DOE) (1999 – 2006)
 GriPhyN: $12M (NSF) (2000 – 2005)
 iVDGL: $14M (NSF) (2001 – 2006)
 ~150 people with large overlaps between projects
 Universities, labs, foreign partners
 Strong driver for funding agency collaborations
 Inter-agency: NSF – DOE
 Intra-agency: Directorate – Directorate, Division – Division
 Coordinated internally to meet broad goals
 CS research, developing/supporting Virtual Data Toolkit (VDT)
 Grid deployment, using VDT-based middleware
 Unified entity when collaborating internationally

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 3 Common Middleware: Virtual Data Toolkit
[Diagram: sources from CVS are patched and turned into GPT source bundles, then built, tested, and packaged on the NMI Build & Test facility (a Condor pool spanning 22+ operating systems); the resulting VDT builds, with many contributors, are published as a Pacman cache, RPMs, and binaries, followed by a further test stage.]
A unique laboratory for testing, supporting, deploying, packaging, upgrading, & troubleshooting complex sets of software!
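The slide itself has no code; as a rough illustration of the Pacman-based distribution shown in the diagram, here is a minimal Python sketch of how a site administrator might pull one packaged component out of a cache. The cache URL and package name are placeholders rather than verified VDT values, and the `pacman -get <cache>:<package>` form is assumed from typical Pacman usage.

```python
"""Minimal sketch: install a component from a Pacman cache.

Assumptions: the Pacman tool is already on PATH; the cache URL and package
name are illustrative placeholders, not verified VDT values.
"""
import subprocess
import sys

CACHE = "http://vdt.cs.wisc.edu/vdt_cache"   # placeholder cache URL
PACKAGE = "Condor"                           # illustrative package name


def pacman_get(cache: str, package: str) -> int:
    """Run 'pacman -get cache:package' and return its exit status."""
    cmd = ["pacman", "-get", "%s:%s" % (cache, package)]
    print("running:", " ".join(cmd))
    return subprocess.call(cmd)


if __name__ == "__main__":
    sys.exit(pacman_get(CACHE, PACKAGE))
```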

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 4 VDT Growth Over 3 Years (1.3.8 now)
[Chart: number of VDT components versus time, with milestones marked for VDT 1.0 (Globus 2.0b, Condor), the VDT switch to Globus 2.2, the VDT release used by Grid3, and the first real use of the VDT by LCG.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 5 Components of VDT:  Globus  Condor  RLS 3.0  ClassAds  Replica  DOE/EDG CA certs  ftsh  EDG mkgridmap  EDG CRL Update  GLUE Schema 1.0  VDS 1.3.5b  Java  Netlogger  Gatekeeper-Authz  MyProxy 1.11  KX509  System Profiler  GSI OpenSSH 3.4  MonALISA  PyGlobus  MySQL  UberFTP 1.11  DRM 1.2.6a  VOMS  VOMS Admin  Tomcat  PRIMA 0.2  Certificate Scripts  Apache  jClarens  New GridFTP Server  GUMS 1.0.1
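To show how a couple of these pieces fit together in practice, here is a hedged sketch that uses VOMS to obtain a VO-authorized proxy and Condor-G to route a simple job to a Globus (GT2) gatekeeper. The VO name and gatekeeper contact string are placeholders; the submit-file keywords are standard Condor grid-universe syntax.

```python
"""Sketch: combine two VDT components -- VOMS (credentials) and Condor-G
(routing a job to a remote Globus gatekeeper).

The VO name and gatekeeper contact string below are placeholders.
"""
import subprocess

VO = "myvo"                                              # placeholder VO name
GATEKEEPER = "gatekeeper.example.edu/jobmanager-condor"  # placeholder contact

SUBMIT_FILE = """\
universe      = grid
grid_resource = gt2 %s
executable    = /bin/hostname
output        = job.out
error         = job.err
log           = job.log
queue
""" % GATEKEEPER


def main() -> None:
    # Obtain a VOMS-extended proxy for the chosen VO (prompts for a passphrase).
    subprocess.check_call(["voms-proxy-init", "-voms", VO])

    # Write the Condor-G submit description and hand it to condor_submit.
    with open("grid_job.sub", "w") as f:
        f.write(SUBMIT_FILE)
    subprocess.check_call(["condor_submit", "grid_job.sub"])


if __name__ == "__main__":
    main()
```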

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 6 VDT Collaborative Relationships
[Diagram: the Virtual Data Toolkit links computer science research (Globus, Condor, NMI) with partner science, networking, and outreach projects (TeraGrid, OSG, EGEE, WLCG, Asia, South America; QuarkNet, CHEPREO, Digital Divide) and the broader science, engineering, and education communities. Arrows indicate deployment and feedback, tech transfer, techniques & software, requirements, and prototyping & experiments. Other linkages: work force, CS researchers, industry, U.S. grids, international, outreach.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 7 Major Science Driver: Large Hadron Collider (CERN)
Search for:  Origin of mass  New fundamental forces  Supersymmetry  Other new particles  2007 – ?
[Figure: the 27 km LHC tunnel in Switzerland & France, with the CMS, ATLAS, ALICE, LHCb, and TOTEM experiments.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 8 LHC: Petascale Global Science
 Complexity: millions of individual detector channels
 Scale: PetaOps (CPU), 100s of petabytes (data)
 Distribution: global distribution of people & resources
CMS example: physicists from 250+ institutes in 60+ countries. BaBar/D0 example: physicists from 100+ institutes in 35+ countries.

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 9 CMS Experiment: LHC Global Data Grid (2007+)
[Diagram: the CMS online system feeds the CERN Tier 0 computer center; >10 Gb/s links carry data to Tier 1 centers (USA, Korea, Russia, UK), with further Gb/s links on to Tier 2 centers (e.g., Maryland, Iowa, UCSD, Caltech, U Florida), Tier 3 physics caches and PCs, and Tier 4 sites (e.g., FIU).]
 5000 physicists, 60 countries  10s of petabytes/yr by 2008  1000 petabytes in < 10 yrs?
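As a back-of-envelope check of what those volumes mean for the links in the diagram (my arithmetic, not from the slide), 10 PB/yr corresponds to roughly 2.5 Gb/s sustained:

```python
"""Back-of-envelope: convert an annual data volume to a sustained rate.
10 PB/yr is a round figure taken from the slide's "10s of petabytes/yr"."""
PETABYTE_BITS = 8e15                     # 1 PB (decimal) = 8e15 bits
SECONDS_PER_YEAR = 365.25 * 24 * 3600

volume_pb = 10.0
avg_gbps = volume_pb * PETABYTE_BITS / SECONDS_PER_YEAR / 1e9
print("10 PB/yr ~ %.1f Gb/s sustained" % avg_gbps)   # ~2.5 Gb/s
```

Peaks, reprocessing, and replication push instantaneous needs well above that average, which is consistent with the >10 Gb/s Tier 0 to Tier 1 links shown.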

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 10 Grid3 and Open Science Grid

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 11 Grid3: A National Grid Infrastructure
 October 2003 – July 2005
 32 sites, 3,500 CPUs: universities + 4 national labs
 Sites in US, Korea, Brazil, Taiwan
 Applications in HEP, LIGO, SDSS, genomics, fMRI, CS
[Map: Grid3 sites, including Brazil.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 12 Grid3 Lessons Learned
 How to operate a Grid as a facility
 Tools, services, error recovery, procedures, docs, organization
 Delegation of responsibilities (project, VO, service, site, …)
 Crucial role of the Grid Operations Center (GOC)
 How to support people-to-people relations
 Face-to-face meetings, phone cons, 1-on-1 interactions, mail lists, etc.
 How to test and validate Grid tools and applications
 Vital role of testbeds (see the sketch after this list)
 How to scale algorithms, software, process
 Some successes, but “interesting” failure modes still occur
 How to apply distributed cyberinfrastructure
 Successful production runs for several applications
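The testbed and validation lessons above are the kind of thing a Grid Operations Center automates with simple site probes. Below is a minimal sketch of such a probe, assuming the GT2 `globus-job-run` client from the VDT is installed and a valid proxy already exists; the gatekeeper contact strings are placeholders.

```python
"""Sketch: probe a list of Globus gatekeepers and report which ones answer.

Assumes the GT2 'globus-job-run' client is installed and a grid proxy has
already been created; the contact strings below are placeholders.
"""
import subprocess

SITES = [
    "cms.example.edu/jobmanager-fork",     # placeholder gatekeeper contacts
    "atlas.example.org/jobmanager-fork",
]


def probe(contact: str) -> bool:
    """Return True if a trivial fork job runs at the given gatekeeper."""
    try:
        subprocess.check_call(["globus-job-run", contact, "/bin/true"],
                              timeout=120)
        return True
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired, OSError):
        return False


if __name__ == "__main__":
    for site in SITES:
        print("%-40s %s" % (site, "OK" if probe(site) else "FAILED"))
```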

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 14 Open Science Grid: July 20, 2005
 Production Grid: 50+ sites, 15,000 CPUs “present” (available, but not all at one time)
 Sites in US, Korea, Brazil, Taiwan
 Integration Grid: sites
[Map labels: São Paulo; Taiwan, S. Korea.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 15 OSG Operations Snapshot (30 days ending November 7)

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 16 OSG Participating Disciplines
 Computer Science (Condor, Globus, SRM, SRB): test and validate innovations: new services & technologies
 Physics (LIGO, Nuclear Physics, Tevatron, LHC): global Grid: computing & data access
 Astrophysics (Sloan Digital Sky Survey): CoAdd of multiply-scanned objects; spectral fitting analysis
 Bioinformatics (Argonne GADU project; Dartmouth Psychological & Brain Sciences): BLAST, BLOCKS, gene sequences, etc.; functional MRI
 University campus (resources, portals, apps): CCR (U Buffalo), GLOW (U Wisconsin), TACC (Texas Advanced Computing Center), MGRID (U Michigan), UFGRID (U Florida), Crimson Grid (Harvard), FermiGrid (Fermilab Grid)

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 17 OSG Grid Partners
 TeraGrid: “DAC2005”: run LHC apps on TeraGrid resources; TG Science Portals for other applications; discussions on joint activities (security, accounting, operations, portals)
 EGEE: joint operations workshops, defining mechanisms to exchange support tickets; joint security working group; US middleware federation contributions to core middleware (gLite)
 Worldwide LHC Computing Grid: OSG contributes to the LHC global data handling and analysis systems
 Other partners: SURA, GRASE, LONI, TACC; representatives of VOs provide portals and interfaces to their user groups

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 18 Example of Partnership: WLCG and EGEE

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 19 OSG Technical Groups & Activities
 Technical Groups address and coordinate technical areas
 Propose and carry out activities related to their given areas
 Liaise & collaborate with other peer projects (U.S. & international)
 Participate in relevant standards organizations
 Chairs participate in Blueprint, Integration and Deployment activities
 Activities are well-defined, scoped tasks contributing to OSG
 Each Activity has deliverables and a plan
 … is self-organized and operated
 … is overseen & sponsored by one or more Technical Groups
TGs and Activities are where the real work gets done

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 20 OSG Technical Groups (deprecated!)
 Governance: charter, organization, by-laws, agreements, formal processes
 Policy: VO & site policy, authorization, priorities, privilege & access rights
 Security: common security principles, security infrastructure
 Monitoring and Information Services: resource monitoring, information services, auditing, troubleshooting
 Storage: storage services at remote sites, interfaces, interoperability
 Support Centers: infrastructure and services for user support, helpdesk, trouble tickets
 Education / Outreach: training, interface with various E/O projects
 Networks (new): including interfacing with various networking projects

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 21 OSG Activities
 Blueprint: defining principles and best practices for OSG
 Deployment: deployment of resources & services
 Provisioning: connected to deployment
 Incident response: plans and procedures for responding to security incidents
 Integration: testing, validating & integrating new services and technologies
 Data Resource Management (DRM): deployment of specific Storage Resource Management technology
 Documentation: organizing the documentation infrastructure
 Accounting: accounting and auditing use of OSG resources
 Interoperability: primarily interoperability between
 Operations: operating Grid-wide services

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 22 OSG Integration Testbed: Testing & Validating Middleware
[Map: integration testbed sites, including Brazil, Taiwan, and Korea.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 23 Networks

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 24 Evolving Science Requirements for Networks (DOE High Performance Network Workshop)
End-to-end throughput today / in 5 years / in 5-10 years, with remarks in parentheses:
 High Energy Physics: 0.5 Gb/s / 100 Gb/s / 1000 Gb/s (high bulk throughput)
 Climate (data & computation): 0.5 Gb/s / Gb/s / N x 1000 Gb/s (high bulk throughput)
 SNS NanoScience: not yet started / 1 Gb/s / 1000 Gb/s + QoS for control channel (remote control and time-critical throughput)
 Fusion Energy: 0.066 Gb/s (500 MB/s burst) / 0.2 Gb/s (500 MB / 20 sec. burst) / N x 1000 Gb/s (time-critical throughput)
 Astrophysics: 0.013 Gb/s (1 TB/week) / N*N multicast / 1000 Gb/s (computational steering and collaborations)
 Genomics data & computation: Gb/s (1 TB/day) / 100s of users / 1000 Gb/s + QoS for control channel (high throughput and steering)
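As a quick unit check on the table's smaller entries (my arithmetic, not part of the workshop numbers), the bulk volumes convert to sustained rates as follows; 1 TB/week does come out near the 0.013 Gb/s quoted for astrophysics:

```python
"""Unit check for the table above: convert bulk data volumes to Gb/s."""
TERABYTE_BITS = 8e12   # 1 TB (decimal) = 8e12 bits


def gbps(volume_tb: float, seconds: float) -> float:
    """Average rate in Gb/s for moving volume_tb terabytes in 'seconds'."""
    return volume_tb * TERABYTE_BITS / seconds / 1e9


print("1 TB/week = %.3f Gb/s" % gbps(1, 7 * 24 * 3600))  # ~0.013 (astrophysics)
print("1 TB/day  = %.3f Gb/s" % gbps(1, 24 * 3600))      # ~0.093 (genomics row)
```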

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 25 Integrating Advanced Networking in Applications: UltraLight
 10 Gb/s+ network
 Caltech, UF, FIU, UM, MIT, SLAC, FNAL
 International partners
 Level(3), Cisco, NLR

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 26 Education, Training, Communications

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 27 iVDGL, GriPhyN Education/Outreach Basics
 $200K/yr
 Led by UT Brownsville
 Workshops, portals, tutorials
 Partnerships with QuarkNet, CHEPREO, LIGO E/O, …

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 28 Grid Training Activities
 June 2004: first US Grid tutorial (South Padre Island, TX): 36 students, diverse origins and types
 July 2005: second Grid tutorial (South Padre Island, TX): 42 students, simpler physical setup (laptops)
 Reaching a wider audience: lectures, exercises, video, on web; students, postdocs, scientists
 Coordination of training activities: “Grid Cookbook” (Trauner & Yafchak); more tutorials, 3-4/year; CHEPREO tutorial in 2006?

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 29 QuarkNet/GriPhyN e-Lab Project

CHEPREO: Center for High Energy Physics Research and Educational Outreach, Florida International University
 Physics Learning Center
 CMS research
 iVDGL Grid activities
 AMPATH network (S. America)
 Funded September 2003: $4M initially (3 years); MPS, CISE, EHR, INT

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 31 Grids and the Digital Divide
Background:  World Summit on the Information Society  HEP Standing Committee on Inter-regional Connectivity (SCIC)
Themes:  Global collaborations, Grids and addressing the Digital Divide  Focus on poorly connected regions  Brazil (2004), Korea (2005)

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 32 Science Grid Communications: broad set of activities (Katie Yurkewicz)
 News releases, PR, etc.
 Science Grid This Week
 OSG Newsletter
 Not restricted to OSG

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 33 Grid Timeline
[Timeline chart: project funding bars for GriPhyN ($12M), PPDG ($9.5M), iVDGL ($14M), UltraLight ($2M), CHEPREO ($4M), and DISUN ($10M), with milestones for VDT 1.0, the first US-LHC Grid testbeds, Grid3 operations, OSG operations, Grid Summer Schools, Grid Communications, Digital Divide Workshops, the LIGO Grid, and the start of the LHC.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 34 Future of OSG CyberInfrastructure
 OSG is a unique national infrastructure for science: large CPU, storage and network capability crucial for science
 Supporting advanced middleware: long-term support of the Virtual Data Toolkit (new disciplines & international collaborations)
 OSG currently supported by a “patchwork” of projects: collaborating projects, separately funded
 Developing a workplan for long-term support: maturing, hardening facility; extending the facility to lower barriers to participation; Oct. 27 presentation to DOE and NSF

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 35 OSG Consortium Meeting: Jan. 2006
 University of Florida (Gainesville)
 About 100 – 120 people expected
 Funding agency invitees
 Schedule:
 Monday morning: applications plenary (rapporteurs)
 Monday afternoon: partner Grid projects plenary
 Tuesday morning: parallel
 Tuesday afternoon: plenary
 Wednesday morning: parallel
 Wednesday afternoon: plenary
 Thursday: OSG Council meeting

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 36 Disaster Planning / Emergency Response

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 37 Grids and Disaster Planning / Emergency Response
 Inspired by recent events
 Dec. 2004 tsunami in Indonesia
 Aug. 2005 Katrina hurricane and subsequent flooding
 (Quite different time scales!)
 Connection of DP/ER to Grids
 Resources to simulate detailed physical & human consequences of disasters
 Priority pooling of resources for a societal good
 In principle, a resilient distributed resource
 Ensemble approach well suited to Grid/cluster computing (see the sketch after this list)
 E.g., given a storm’s parameters & errors, bracket likely outcomes
 Huge number of jobs required
 Embarrassingly parallel
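The ensemble bullets above lend themselves to a concrete sketch: perturb a storm's parameters within their quoted errors and emit one independent job per ensemble member. The `surge_model` executable and the parameter names are hypothetical; the point is only the embarrassingly parallel structure.

```python
"""Sketch: turn an ensemble of perturbed storm parameters into independent
grid jobs (embarrassingly parallel). 'surge_model' and the parameter names
are hypothetical placeholders.
"""
import random

N_MEMBERS = 500

# Central values and 1-sigma errors for a storm track (illustrative numbers).
CENTRAL = {"landfall_lon": -90.1, "landfall_lat": 29.6, "max_wind_ms": 62.0}
SIGMA = {"landfall_lon": 0.4, "landfall_lat": 0.3, "max_wind_ms": 6.0}


def member(i: int) -> str:
    """Return one Condor queue entry with Gaussian-perturbed parameters."""
    args = " ".join("--%s %.3f" % (name, random.gauss(CENTRAL[name], SIGMA[name]))
                    for name in sorted(CENTRAL))
    return "arguments = %s\noutput = member_%04d.out\nqueue\n" % (args, i)


if __name__ == "__main__":
    header = ("universe   = vanilla\n"
              "executable = surge_model\n"
              "log        = ensemble.log\n\n")
    with open("ensemble.sub", "w") as f:
        f.write(header + "\n".join(member(i) for i in range(N_MEMBERS)))
    print("wrote ensemble.sub with %d members" % N_MEMBERS)
```

Each member runs independently, so the jobs can be scattered across whatever sites are available and the results reassembled afterwards to bracket the likely outcomes.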

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 38 DP/ER Scenarios
 Simulating physical scenarios
 Hurricanes, storm surges, floods, forest fires
 Pollutant dispersal: chemical, oil, biological and nuclear spills
 Disease epidemics
 Earthquakes, tsunamis
 Nuclear attacks
 Loss of network nexus points (deliberate or as a side effect)
 Astronomical impacts
 Simulating human responses to these situations
 Roadways, evacuations, availability of resources
 Detailed models (geography, transportation, cities, institutions)
 Coupling human response models to specific physical scenarios
 Other possibilities
 “Evacuation” of important data to safe storage

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 39 DP/ER and Grids: Some Implications
 DP/ER scenarios are not equally amenable to a Grid approach
 E.g., tsunami vs. hurricane-induced flooding
 Specialized Grids can be envisioned for very short response times
 But all can be simulated “offline” by researchers
 Other “longer term” scenarios
 ER is an extreme example of priority computing
 Priority use of IT resources is common (conferences, etc.)
 Is ER priority computing different in principle?
 Other implications
 Requires long-term engagement with DP/ER research communities (atmospheric, ocean, coastal ocean, social/behavioral, economic)
 Specific communities with specific applications to execute
 Digital Divide: resources to solve problems of interest to the Third World
 Forcing function for Grid standards?
 Legal liabilities?

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 40 Grid Project References
 Open Science Grid
 Grid3
 Virtual Data Toolkit
 GriPhyN
 iVDGL
 PPDG
 CHEPREO
 UltraLight
 Globus
 Condor
 WLCG
 EGEE

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 41 Extra Slides

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 42 Grid3 Use by VOs Over 13 Months

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 43 CMS: “Compact” Muon Solenoid
[Figure: CMS detector drawing; the label “inconsequential humans” marks the people drawn for scale.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 44 LHC: Beyond Moore’s Law
[Chart: projected LHC CPU requirements versus a Moore’s Law extrapolation from 2000.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 45 Grids and Globally Distributed Teams
 Non-hierarchical: chaotic analyses + productions
 Superimpose significant random data flows

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 46 Sloan Digital Sky Survey (SDSS): Using Virtual Data in GriPhyN
[Figure: galaxy cluster size distribution derived from Sloan data.]

SURA Infrastructure Workshop (Dec. 7, 2005) Paul Avery 47 The LIGO Scientific Collaboration (LSC) and the LIGO Grid
 LIGO Grid: 6 US sites + 3 EU sites (Birmingham and Cardiff in the UK, AEI/Golm in Germany)
 LHO, LLO: LIGO observatory sites
 LSC: LIGO Scientific Collaboration