Towards deploying a production interoperable Grid Infrastructure in the U.S. Vicky White U.S. Representative to GDB.

May 18, 2004, GDB - US, Slide 2: Grid Deployment and Data Challenges
- US CMS participated in DC04; this generally required a lot of "glue" development and operational work.
- Simulation production benefited from a 50% increase in resources through opportunistic use of Grid3.
- US ATLAS DC2 is deployed and ramping up production.
- Don Quixote isolates the application from the different Replica Location Service infrastructures.
- End-to-end "virtual data"-based US production runs on Grid3.
- US Edge Computing is deployed at CERN to facilitate Tier-0 to Tier-1 data movement.

Slide 3: Daily data transferred to FNAL (Y. Wu, CPT week) [figure]

Slide 4: Fermilab dCache nodes network traffic (Y. Wu, CPT week) [figure]

Slide 5: (figure only, no caption)

Slide 6: Progress on Storage Services
- dCache/SRM used on the US CMS Tier-1 and Tier-2s for DC04 analysis.
- dCache/SRM being evaluated at BNL.
- LBNL DRM being tested in the VDT. LBNL HRM (SRM) is in use for STAR production data movement between BNL and LBNL. There are plans for interoperability tests between LBNL HRM and dCache at BNL.
- A DESY/Fermilab collaboration is providing support for the LCG deployment of dCache/SRM.
- Separation of the SRM layer is underway at Fermilab, in collaboration with the University of Wisconsin to enable a NeST back end.
- Starting to see other uses: dCache is in use for biology applications at Wisconsin; there is interest from BTeV at Vanderbilt.
- In the process of sorting out the detailed issues of source-code licensing and of ongoing development and support.

Slide 7: Virtual Organization Services
- EDG-VOMS in use on Grid3.
- The VOX VO registration service (VOMRS) is in test at BNL for nuclear physics experiments.
- VOMRS is being evaluated by LCG for short-term interim use. We are ready to collaborate further on request.
- Authorization: the Privilege Management joint project spans ATLAS, CMS, Run II, and facility storage services.
- We plan for deployment on the US grid by the end of the year.
- The computer-science technology lead is Markus Lorch from Virginia Tech.
- We are ready to collaborate further on request.
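The goal of the Privilege Management project is essentially a policy lookup: a user's VO membership and role, as asserted by VOMS, determine which local account or privilege set the site grants. A minimal sketch of that mapping step, with a hypothetical policy table and function names (the real project uses the VOMS attribute infrastructure, not this toy interface):

```python
# Illustrative role-based authorization sketch: map (VO, role) from a
# VOMS-style credential to a local account. The POLICY table and
# map_to_local_account() are hypothetical, for explanation only.
POLICY = {
    ("uscms", "production"): "cmsprod",   # production role gets a shared account
    ("uscms", None): "uscms_pool",        # default mapping for the VO
    ("usatlas", None): "usatlas_pool",
}

def map_to_local_account(vo, role=None):
    """Prefer an exact (vo, role) match, then fall back to the VO default."""
    if (vo, role) in POLICY:
        return POLICY[(vo, role)]
    if (vo, None) in POLICY:
        return POLICY[(vo, None)]
    raise PermissionError(f"no mapping for VO {vo!r} with role {role!r}")
```

The key design point is that the site keeps control of the final mapping while the VO asserts membership and roles, so adding a new role means editing one policy entry rather than every gatekeeper.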

Slide 8: Computation Services
- US CMS is testing the sharing and publication of compute resources to Grid3 and LCG.
- Job-execution policy is provided by the batch-queue interface and supports sharing of the CE among local batch users, Grid3 jobs, and LCG jobs.
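Sharing a CE among local, Grid3, and LCG users typically comes down to batch-system scheduling policy. As one hedged illustration, in a Condor-managed pool this kind of split could be expressed with hierarchical group quotas (the group names and slot counts below are invented for the example; real sites would tune these to their own policy):

```
# Hypothetical Condor negotiator configuration sketch: divide a CE's
# slots among local users, Grid3 jobs, and LCG jobs.
GROUP_NAMES = group_local, group_grid3, group_lcg

# Static slot quotas per group (example numbers only).
GROUP_QUOTA_group_local = 40
GROUP_QUOTA_group_grid3 = 30
GROUP_QUOTA_group_lcg   = 30

# Let groups borrow idle slots beyond their quota (opportunistic use).
GROUP_AUTOREGROUP = True
```

Jobs would then declare their accounting group at submission, and idle capacity in one share can be scavenged by the others, which matches the opportunistic-use model described on the Grid3 slides.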

Slide 9: Security Services
- US people (Dane Skow and Rich Baker) are participating in the LCG security group.
- Comments were provided on the User Registration Document; the result seems reasonable to us.
- In the US, a possible response to the TeraGrid site security incident seems to be a move towards the KCA-like model deployed at Fermilab.
- LCG/EGEE security group: a broader group is needed to evolve policy and technology choices.
- We are aware of the security aspects of the ARDA prototype; Dane Skow is talking with Miron.

Slide 10: Software Services
- Pacman3 in use for US ATLAS DC2.
- A Pacman3 evaluation is planned soon for the VDT.
- A Boston U & VDT agreement on support and maintenance of Pacman is near.

Slide 11: Grid3 Status and Evolution
- The Grid infrastructure remained stable through the completion of CMS DC04 and initial use for reconstruction.
- Little operational support load; we see VO jobs rise and fall at the level of several hundred.
- The Grid has been split into development and operational grids.
- Run II is converting to use of the VDT over the next couple of months (a lesson from running experiments: they would benefit from an SGI port).

Slide 12: Stability of the Grid3 infrastructure
- SDSS is currently doing a two-week, ~40,000-job, individual-investigator science run ("opportunistic science"). The coadd itself consists of transferring all the images of a location in the sky out to a grid node, remapping the images into a standard coordinate system, and performing a variance-weighted average. There are roughly 7,000 fields in 5 colors, so there are 35,000 jobs to be run per coadd (we are doing two to explore parameter space). Each job takes about minutes to run. After the coadd itself, we run the object detection and measurement code, Photo, on the output, and bring everything back to Fermilab.
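The variance-weighted average at the heart of each coadd job is simple once the images are remapped onto a common grid: weight each pixel by the inverse of its variance, so noisier exposures contribute less. A minimal sketch (function and argument names are illustrative, not the actual SDSS Photo pipeline interface):

```python
import numpy as np

def variance_weighted_coadd(images, variances):
    """Combine remapped exposures of one sky field into a single coadd.

    images, variances: array-likes of shape (n_exposures, ny, nx),
    already remapped into the standard coordinate system.
    Returns the coadded image and its per-pixel variance.
    """
    stack = np.asarray(images, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)  # inverse-variance weights
    coadd = (weights * stack).sum(axis=0) / weights.sum(axis=0)
    coadd_var = 1.0 / weights.sum(axis=0)  # variance of the weighted mean
    return coadd, coadd_var
```

With equal variances this reduces to a plain mean; the per-coadd job count on the slide follows from the same bookkeeping, 7,000 fields times 5 colors giving 35,000 jobs.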

Slide 13: Development and Operational parallel environments
- This enables testing of the VDT and use of earlier versions of the VDT in parallel.

Slide 14: Participation in ARDA and EGEE Middleware?
- US LHC funds Miron Livny on ARDA/EGEE middleware.
- US CMS/CMS are working on end-to-end ARDA prototypes for the fall.
- US CMS will participate in the CCS milestone for analysis across grids.
- NSF funds a joint VDT/EGEE effort.
- The Wisconsin certification testbeds for national NMI testing, VDT, LCG, and EGEE middleware, etc., encourage coherence across the middleware versions.
- PPDG helps fund the ATLAS ARDA representative.
- The PPDG extension proposal is funded, enabling some immediate attention to Open Science Grid needs.

Slide 15: Open Science Grid
- Open Science Grid is the roadmap in the US for ensuring that all of our Grid efforts, including and in particular the LHC ones, come together towards a coherent and sustained Grid infrastructure that:
  - provides computing for LHC science (the US contribution to LCG);
  - is open to other experiments and other sciences;
  - will work with, collaborate with, and interoperate with the Grid infrastructure provided through EGEE.
- The Open Science Grid consortium is being formed; we hope to come out of this week's joint Trillium Steering, Laboratory Facility, and Experiment software and computing management meeting with more formal structures in place. (It is not a project.)
- The scope of the Open Science Grid "Technical Groups" explicitly includes cooperation with the EGEE/LCG peer groups.