OSG User Support
March 13, 2013
Chander Sehgal

Focus of User Support
Enable new communities to quickly adopt the OSG DHTC model, and improve productivity for all VOs as OSG services and capabilities evolve.
- Research Communities **
- OSG as a service provider in XSEDE
- New Sites
- Supporting technologies
** Researchers are also supported by distributed OSG consortium members including GLOW/CHTC, HCC, UCSDGrid, CSIU, UC3, and RENCI/Engage

User Support in OSG
Contact us at

Our Method
- Inform science communities about OSG, how best to join OSG, and how to leverage DHTC in their own environment
- Actively support communities who want to join and use OSG:
  - Understand the goals of the research community
  - Possibly embed our staff in their team for a limited period to help adapt their applications to DHTC
  - Support the community in resolving technical issues and achieving production goals
- Serve as a general entry point to OSG for information and user support

Research Communities supported by User Support

Science | Researcher | Affiliation | Project Title | Hours | Team | Contact
Astronomy | Ewa Deelman, Bruce Berriman | USC ISI / NASA IPAC | Atlas of periodicities present in the time-series data sets released by the Kepler satellite | 355,382 | OSG-XSEDE | Mats Rynge
Biology | Paul Wolberg | University of Michigan | Multi-scale computational models to study the human immune response to infection with M. tuberculosis | 2,836 | OSG-XSEDE | Mats Rynge
Biology | Don Krieger | University of Pittsburgh | Very high resolution functional brain mapping | 1,107,068 | OSG-XSEDE | Mats Rynge
Civil Engineering | Andre Barbosa, Patricia Clayton | Oregon State University, University of Washington | Simulation of structures' responses to earthquakes | | | Marko Slyz
Mathematics | Alexander Arlange | Rochester Institute of Technology | Ramsey Numbers R(C4,Km) | 140,121 | User-Support | Mats Rynge
Medicine | Martin Purschke | Brookhaven National Lab | Positron Emission Tomography (PET) at BNL | | | Alex Zaytsev
Physics | Armando Fella | SuperB experiment; CNRS–Orsay | Test jobs in preparation for designing the SuperB accelerator | | | Marko Slyz
Physics | Tobias Toll | Brookhaven National Lab | Electron Ion Collider (EIC) at BNL | | | Alex Zaytsev
Physics | Don Petravick, Brian Yanny | NCSA & FNAL | Basic processing of DES exposures | | | Gabriele Garzoglio
Physics | John Peterson | Purdue | Software development for the LSST telescope | | | Gabriele Garzoglio
Physics | Robert McIntosh | University of Texas at Dallas | Global distribution of characteristics of auroral particles | 49,684 | OSG-XSEDE | Mats Rynge
Physics | Stefan Hoeche | SLAC | Validation and use of software for particle physics phenomenology | | | Marko Slyz
Physics | Pran Nath | Northeastern University | Search for Beyond the Standard Model physics at the LHC | 103,679 | OSG-XSEDE | Mats Rynge

Total hours = 3,122,384

Example #1 - NEES
Science Goals
The Network for Earthquake Engineering Simulation (NEES) studies the response of buildings and other structures to earthquakes.
A. R. Barbosa, J. P. Conte, J. I. Restrepo, UCSD
Workflow
1. Use Condor/glideinWMS to submit the OpenSees simulation application to sites; Condor transfers the input data (see the sketch below).
2. Use Condor to return the output data to the submit host.
3. Use Globus Online to transfer the data to the user's archive.
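A minimal HTCondor submit description for a workflow of this shape might look as follows. This is a sketch, not the NEES team's actual configuration: the wrapper script, input file names, and job count are all assumptions.

    # Sketch of a submit description for an OpenSees run via Condor/glideinWMS.
    # All file and script names are illustrative.
    universe                = vanilla
    executable              = run_opensees.sh            # assumed wrapper that invokes OpenSees
    arguments               = model.tcl
    transfer_input_files    = model.tcl, ground_motion.dat
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT                    # Condor returns outputs to the submit host
    output                  = opensees_$(Cluster).$(Process).out
    error                   = opensees_$(Cluster).$(Process).err
    log                     = opensees.log
    queue 1

Once the outputs are back on the submit host, a Globus Online transfer moves them to the user's archive (step 3).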

Example #2 - LSST
Science Goals
Produce simulated images for the Large Synoptic Survey Telescope (LSST) project for use in verifying the LSST software. LSST will be an 8-meter wide-field telescope that will image the entire visible sky every few nights for 10 years.
LSST Image Simulation team
Workflow
1. (Only once) Pre-stage the star catalog and focal-plane configuration files.
2. Submit 1 job to trim the pre-staged catalog file into 189 files, one per CCD chip in the camera.
3. Submit 2 x 189 jobs: simulate 1 image pair (the same image with 2 exposures). Transfer an "instance catalog" (telescope position, wind speed, etc.) with each job.
4. Gather output, perform bookkeeping, etc.
A DAGMan sketch of this fan-out appears below.
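The trim-then-fan-out structure maps naturally onto HTCondor DAGMan. The sketch below is illustrative only; the submit-file names and node layout are assumptions, not the team's actual workflow files.

    # Hypothetical DAGMan file for the LSST image-simulation fan-out.
    JOB trim trim.sub                 # one job: trim the catalog into 189 per-CCD files
    JOB sim_000 simulate.sub          # one node per (CCD, exposure) pair
    VARS sim_000 ccd="0" exposure="0"
    # ... sim_001 through sim_377 declared the same way (189 CCDs x 2 exposures)
    JOB gather gather.sub             # output gathering and bookkeeping
    PARENT trim CHILD sim_000         # every sim_* node waits for the trim job
    PARENT sim_000 CHILD gather       # gather waits for every sim_* node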

Example #3 - EIC
Science Goals
The Electron Ion Collider (EIC) is a proposed facility for studying the structure of nuclei. Engineers need a large amount of computation to define its design.
Tobias Toll, Thomas Ullrich, Brookhaven National Lab
Workflow
1. (Only once) Pre-stage a 1 GB read-only file to each site, so that the file does not need to be repeatedly transferred over the wide-area network.
2. Submit the jobs to the sites with the pre-staged files; transfer the application with each job (see the sketch below).
3. Jobs run and read the pre-staged files.
4. Condor transfers the output data back to the submit host.
5. The user takes possession of the results.
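A submit-description sketch for step 2 might look like the following. The executable name and the location of the pre-staged file are assumptions; the sketch assumes the site exposes its shared data area through the $OSG_DATA environment variable, as OSG sites conventionally did.

    # Hypothetical submit description for an EIC job using a pre-staged input.
    universe                = vanilla
    executable              = eic_simulate.sh            # assumed application wrapper
    # Only the small application is shipped with the job; the 1 GB read-only
    # file is found at run time on site-local storage, e.g. under $OSG_DATA.
    transfer_input_files    = eic_app.tar.gz
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT                    # outputs return to the submit host
    output                  = eic_$(Cluster).$(Process).out
    error                   = eic_$(Cluster).$(Process).err
    log                     = eic.log
    queue 100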

OSG as an XSEDE Service Provider
[Diagram: the XSEDE User Portal and XSEDE admin systems connect to the login host osg-xsede.grid.iu.edu, which submits work to OSG sites.]
Operational since 4/1/2012. A typical user session is sketched below.
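From the user's point of view the entry path is conventional: log into the OSG login host obtained through an XSEDE allocation and submit HTCondor jobs from there. The username and submit file below are hypothetical.

    # Log into the OSG-XSEDE login host from the user's workstation.
    ssh username@osg-xsede.grid.iu.edu
    # On the login host, submit and monitor an HTCondor job.
    condor_submit myjob.sub      # myjob.sub is a hypothetical submit description
    condor_q                     # check the job's status in the queue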

OSG/XSEDE Integration Challenges
- XSEDE is based on allocations; OSG on opportunistic use
- XSEDE has a central database of allocations and users; OSG has distributed VOs
- XSEDE users assume there is a login host for each resource
- Integration of the XSEDE and OSG software stacks
- Collecting and reporting accounting data to both XSEDE and OSG

Site Support
Assist communities in connecting new resources to OSG, in collaboration with Production (Marco Mambelli):
- Pacific Northwest National Lab - complete
- University of Maryland - Institute for Genome Sciences
- Ohio Supercomputer Center
- North Dakota State University

OSG Public Storage using iRODS
- Enable non-LHC VOs whose computation requires "large" data to use OSG sites more easily
- Ease the task of VO data management by:
  - Providing quota management
  - Moving data and software to the sites for caching
  - Retrieving the output data from the sites
  - Providing a metadata catalog
- Demonstrated for certain use cases: EIC, Pheno, SAGA
- Available as an "alpha"-grade service today; working to determine next steps for the prototype
An illustrative command sequence follows.
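Day-to-day, a VO user would interact with the service through standard iRODS icommands. The zone, resource, collection, and file names below are hypothetical:

    # Hypothetical session against an OSG iRODS service (all names illustrative).
    iinit                                    # authenticate to the iRODS server
    iput -R osgDiskResc app.tar.gz /osgZone/home/myvo/apps/       # stage software for caching at sites
    ils /osgZone/home/myvo/apps              # list the collection
    imeta add -d /osgZone/home/myvo/apps/app.tar.gz version 1.0   # attach metadata
    iget /osgZone/home/myvo/results/out.dat  # retrieve output produced at the sites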

OSG Public Storage Architecture
[Diagram: a submit host and a production manager interact with a GOC-hosted iRODS server (iCAT catalog, rules, disk cache, UnivMSS interface), which moves data to the storage elements (SE A, SE B) at sites A and B; each site's CE and worker nodes access the data through the OSG_APP and OSG_DATA areas.]

What's Next
Continue active support of current communities, and identify and assist new communities.
Recently active:
- SAGA - community portal back-end connection to OSG
- iPlant - community portal back-end connection to OSG
- Snowmass Group - theoretical physics
Make it easier for researchers to leverage OSG. Initiatives:
- Streamline the process of integrating new sites into OSG
- Improve the time for new VOs to be accepted at most sites
- Monitor production of VOs that submit via glideinWMS (gWMS); proactively identify issues; and assist VOs in resolving them
- Improve support for egress (a.k.a. "flocking") from campus grids to the OSG production fabric

The User Support Team
Gabriele Garzoglio - FNAL
Tanya Levshina - FNAL
Mats Rynge - USC ISI
Marko Slyz - FNAL
Alex Zaytsev - BNL
Chander Sehgal - FNAL (Coordinator)
Contact us at