OSG site utilization by VOs
Ilya Narsky, Caltech
Seattle OSG Meeting, Aug

Help VOs utilize OSG sites
- Query VO reps at regular time intervals (monthly or quarterly).
- Learn what prevents VOs from running jobs on OSG sites, help solve problems, and report status to the OSG Council.
- Motivated by the fact that utilization of OSG resources is far below 100%.
- Effort on my part started in early June.
- Communication is through the osg-users list (low traffic). 11 VOs are currently on the list; 2 more may be added in the near future.

Surveyed VOs

VO         Rep
ATLAS      Torre Wenaus
CMS        Frank Wuerthwein
CDF        Matt Norman
D0         Parag Mhashilkar
LIGO       Kent Blackburn, David Meyers
GADU       Dina Sulakhe
NANOHUB    Steve Clark
SDSS/DES   Nickolai Kouropatkine
STAR       Jerome Lauret
GLOW       Dan Bradley
FERMILAB   Keith Chadwick
iVDGL      ?
MARIACHI   ?

Feedback
- All feedback is summarized at the VO twiki (see the "status" column on the right).
- VO enthusiasm and response time vary.
- Various problems, both technical and sociological.
- I classify the VOs into 3 groups:
  1) ATLAS, CMS
  2) LIGO, GADU, NANOHUB (fighting technical issues)
  3) everyone else (planning to expand their usage of OSG sites some time in the future, but with no immediate requests or complaints)

LIGO and GADU
VOs that are actively trying to run on more OSG sites but need to solve various technical problems.

LIGO (see Kent Blackburn's talk today):
- Relies on VDS tools; working with the VDS team on development and support of these tools.
- New site-verify.pl is in the ITB.
- Jobs need TB-scale data on disk; work is underway on partitioning jobs (see the sketch after this slide).
- 1% of LIGO jobs run on OSG, 99% on the LIGO grid.

GADU:
- Also relies on VDS tools. Unable to stage out to the worker node location because it is not reported by VDS; problems similar to LIGO's. Interacting with LIGO and the VDS team to solve this.
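As a rough illustration of the job-partitioning idea mentioned above, here is a minimal sketch in Python that greedily groups input files into jobs whose total input stays under a per-job disk quota. The file names, sizes, and the 100 GB quota are hypothetical, not LIGO's actual figures or tools.

    # Hypothetical sketch of partitioning a TB-scale workflow into jobs
    # that each fit a per-site disk quota; not LIGO's actual tooling.

    def partition_by_size(files, max_bytes):
        """Greedily group (name, size) pairs into chunks whose total size
        stays under max_bytes; each chunk becomes one grid job's input."""
        chunks, current, current_size = [], [], 0
        for name, size in files:
            if current and current_size + size > max_bytes:
                chunks.append(current)       # close the current job
                current, current_size = [], 0
            current.append(name)
            current_size += size
        if current:
            chunks.append(current)           # last, possibly smaller, job
        return chunks

    # Example: 500 files of 4 GB each (~2 TB total), 100 GB per job.
    files = [("frame_%04d.gwf" % i, 4 * 10**9) for i in range(500)]
    jobs = partition_by_size(files, max_bytes=100 * 10**9)
    print(len(jobs), "jobs; first job holds", len(jobs[0]), "files")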

NANOHUB
- Jobs take too long (6-10 days on a typical OSG site), so they do not work well on sites with runtime limits. Jobs run successfully at Purdue and UNM.
- They recently implemented suspend-and-resume functionality at the application level: a single simulation job is split into dozens of sweeps (see the sketch below).
- They can now run jobs on 3 sites with runtime limits (GRASE-CCR-U2, OSG_LIGO_PSU, VAMPIRE-Vanderbilt).
- At the moment, ¾ of their jobs are evicted.
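To make the suspend-and-resume idea concrete, here is a minimal sketch, assuming a checkpoint file that records the last completed sweep; the file name, sweep count, and run_sweep body are illustrative, not nanoHUB's actual implementation. An evicted job that is restarted picks up from the recorded sweep instead of losing 6-10 days of work.

    # Hypothetical sketch of application-level suspend-and-resume:
    # the simulation is split into short sweeps, and a checkpoint file
    # records progress so a restarted job resumes where it left off.
    import json
    import os

    CHECKPOINT = "checkpoint.json"

    def load_checkpoint():
        """Return the index of the next sweep to run (0 if starting fresh)."""
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT) as f:
                return json.load(f)["next_sweep"]
        return 0

    def save_checkpoint(next_sweep):
        # Write to a temp file and rename, so an eviction mid-write
        # cannot leave a corrupt checkpoint behind.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"next_sweep": next_sweep}, f)
        os.replace(tmp, CHECKPOINT)

    def run_sweep(i):
        """Placeholder for one short unit of simulation work."""
        print("running sweep", i)

    def main(total_sweeps=48):
        for i in range(load_checkpoint(), total_sweeps):
            run_sweep(i)
            save_checkpoint(i + 1)   # resume point if evicted after this sweep

    if __name__ == "__main__":
        main()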

Talks at the Seattle meeting
- Torre: OSG extensions (slides)
- Ajit: CMS MC production on OSG (slides)
- Kent: LIGO experience (slides)
- Michael Miller: STAR
  For now, STAR jobs are confined to STAR sites. MC production will likely need OSG sites. Analysis users are not likely to start using OSG resources outside STAR soon. STAR applications need a substantial amount of disk space.
- Maytal: TACC
  Learning curve for scientists to start using grid resources.
Notes will be posted at

Future
- It has been agreed that one VO will be selected for reporting at every Monday Operations meeting.
- We also asked VOs to make web pages covering the VO's scientific goals, rules of membership, how to join, how to submit jobs, and whom to complain to when jobs are not running. CDF, ATLAS, D0, and LIGO already have some approximation of such pages.
- Hopefully we will get more feedback from VOs.

Questions
- Is our way of collecting information from VOs (my reports and the Operations meetings) adequate?
- What else can or should we do?
- Does your VO want to participate in these surveys?