1 LIGO Plans for OSG
J. Kent Blackburn, LIGO Laboratory, California Institute of Technology
Open Science Grid Technical Meeting, UCSD, December 15-17, 2004

2 LSC Data Grid
The LIGO Scientific Collaboration's Data Grid:
- Nine clusters (CIT, MIT, LHO, LLO, UWM, PSU, AEI, ISI, Birmingham)
- Close to 2000 CPUs within the LSC
- Condor is the primary tool (a minimal submit file sketch follows this slide)
- LDAS is used for data reduction, the database, and other analyses
- Learn more at http://www.lsc-group.phys.uwm.edu/lscdatagrid/
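
For readers unfamiliar with Condor: each analysis job on these clusters is described by a submit file handed to condor_submit. The sketch below is a minimal, hypothetical example; the executable, arguments, and file names are illustrative and not taken from the presentation.

    # Minimal, hypothetical Condor submit description for one analysis job
    # (executable, arguments, and file names are illustrative)
    universe   = vanilla
    executable = lalapps_inspiral
    arguments  = --gps-start-time 751658000 --gps-end-time 751660048
    output     = inspiral_$(cluster).out
    error      = inspiral_$(cluster).err
    log        = inspiral.log
    queue

Each cluster in the list above runs its own Condor pool, so a file like this is typically written against one site's local paths and executables.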

3 Issues with LSC Data Grid
- Many LIGO analysis groups take a local approach to using the LSC Data Grid
- Concrete "DAG" workflows have been the workhorse, targeting specific sites (see the sketch after this list)
- This culture developed out of the particulars of the analysis methods and the nature of the compute resources, e.g.:
  - Where is the data I need most likely to be?
  - Where are the compute nodes the fastest?
- Scientific results from the inspiral and pulsar analyses are limited by available compute resources
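
To make the "concrete DAG" point tangible: a Condor DAGMan workflow is a plain text file listing node jobs, each bound to a site-specific submit file, plus explicit parent/child ordering. The node and submit file names below are hypothetical:

    # Hypothetical DAGMan file for a small, site-specific workflow
    JOB  datafind   datafind.sub
    JOB  tmpltbank  tmpltbank.sub
    JOB  inspiral   inspiral.sub
    PARENT datafind  CHILD tmpltbank
    PARENT tmpltbank CHILD inspiral

Because the submit files name concrete executables, paths, and (implicitly) a particular pool, such a DAG only runs where it was written, which is the portability limitation the abstract DAX approach on the next slide addresses.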

4 Supercomputing 2004
- Deployed an "inspiral" analysis across the LSC Data Grid
- Used Pegasus to "plan execution" across this distributed grid
- First use of an abstract "DAX" in a LIGO analysis (a minimal sketch follows this slide)
- Included use of the LSU cluster
- Considered very successful by the LSC
- Encountered transfer client timeouts due to the large number of connections to any single GridFTP server; a solution is currently under development by the Pegasus team
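
By contrast with a concrete DAG, an abstract DAX describes the workflow in terms of logical transformations and logical file names, leaving site selection and physical file lookup to Pegasus at planning time. The fragment below is only a rough sketch: the element and attribute names follow my understanding of the GriPhyN/Chimera DAX schema of that period, and the job and file names are hypothetical.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Rough, hypothetical sketch of an abstract two-job workflow -->
    <adag name="ligo-inspiral" count="1" index="0">
      <job id="ID000001" namespace="ligo" name="tmpltbank" version="1.0">
        <uses file="H1-GWDATA-751658000.gwf" link="input"/>
        <uses file="H1-TMPLTBANK.xml" link="output"/>
      </job>
      <job id="ID000002" namespace="ligo" name="inspiral" version="1.0">
        <uses file="H1-TMPLTBANK.xml" link="input"/>
        <uses file="H1-INSPIRAL.xml" link="output"/>
      </job>
      <child ref="ID000002">
        <parent ref="ID000001"/>
      </child>
    </adag>

At planning time Pegasus turns a description like this into a concrete DAG of the kind shown on the previous slide, choosing execution sites and adding the data transfer jobs; it was those transfer jobs, all connecting to a single GridFTP server, that produced the timeouts noted above.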

5 Going Beyond SC04
- SC04 demonstrated that non-localized usage of the LSC Data Grid by LSC analysis groups is possible!
- Pegasus will soon efficiently support LIGO's dataset challenges through bundled transfers over a single connection
- In January, a workshop on Pegasus is planned for the LSC to bootstrap other analysis groups on using "DAX" workflows on a distributed grid

6 Migration to Grid3
- The January workshop will also include a tutorial on using Grid3
- The goal is to carry out the inspiral analysis on Grid3 when possible
- Hope to deploy the stochastic analysis across the LSC Data Grid and onto Grid3 as well
- LIGO plans to build up in-house technical expertise for running on Grid3

7 On to OSG
Based on experience running on Grid3 in late winter 2005, the plan is to migrate the inspiral analysis, and the stochastic analysis if available, onto OSG0 once it is up and available in the spring.

