OSG Area Coordinators: Campus Infrastructures Update
Dan Fraser
Miha Ahronovitz, Jaime Frey, Rob Gardner, Brooklin Gore, Marco Mambelli, Todd Tannenbaum, Derek Weitzel
March 27, 2013

OSG Campus Program
Focus on the Researcher (…or Artist)
- What can we do to increase the throughput of your computing?
Bosco ( )
- Help ease the motivated researcher into a distributed environment (setup sketch below)
- Make Bosco bulletproof
- Broaden the user base
Campus Infrastructure Community
- Community-building focal point, led by Rob Gardner
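To make the "ease the researcher in" point concrete, the sketch below shows the typical Bosco setup flow from a researcher's own machine, following the Bosco v1.x documentation. The tarball name, username, and hostname are placeholders, not real resources.

    # Unpack the Bosco release tarball (exact name varies by version)
    tar xzf bosco-1.1.2.tar.gz
    cd bosco-1.1.2
    ./bosco_install

    # Load the Bosco environment and start the local submit-side daemons
    source ~/bosco/bosco_setenv
    bosco_start

    # Register a campus PBS cluster; Bosco connects over plain SSH,
    # so only a login account on the cluster is required
    bosco_cluster --add jdoe@login.cluster.example.edu pbs

    # Verify that the cluster can accept and run a test job
    bosco_cluster --test jdoe@login.cluster.example.edu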

Campus Infrastructure Community
A center where researchers can learn about and adopt campus technologies.
- Led by Rob Gardner
- First meeting held in November at UC Santa Cruz
- Second meeting at the OSG AHM (March)
- Monthly "Interactive Forum" meetings, now moving to bi-weekly
- Attendance has been good (~15-20)

Bosco Plan (release ~quarterly)
Current version: v1.1.2
- Invitation sent out to the Condor users mailing list
- Previous campus grids have been upgraded to Bosco: Nebraska, Virginia Tech
- View on OSG Display ( )
Working on v1.2 (April 2013)
- Focus on reliability, reducing annoyances
- Scaling to 1000 submissions
Start on v1.3 (July)
- Gratia accounting (integrated probe)
- SLURM support
- Additional features TBD
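For context on what these releases are exercising: jobs reach a Bosco-attached cluster through HTCondor's grid universe and its SSH-based batch GAHP. A minimal submit file might look like the sketch below; the hostname and file names are placeholders, and the SLURM comment reflects the v1.3 plan rather than a shipped feature.

    universe = grid
    # "batch pbs" routes the job through Bosco's SSH-based batch GAHP;
    # the planned v1.3 SLURM support would presumably just swap the
    # batch type keyword (e.g. "batch slurm")
    grid_resource = batch pbs jdoe@login.cluster.example.edu

    executable = analysis.sh
    output     = analysis.out
    error      = analysis.err
    log        = analysis.log
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue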

New Bosco Developments
- Bosco invitation sent to the Condor users list
- New Bosco downloads site
- BoscoAnalytics
New use cases:
- CMS submission without a gatekeeper
- Pegasus HTPC workflow to XSEDE resources
- Exploring an HTC scientific/artistic competition
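The gatekeeper-less CMS use case works because Bosco drives the remote scheduler over SSH, so no Globus gatekeeper or Compute Element has to be installed on the remote side. A hedged sketch, assuming the remote site runs an HTCondor pool reachable by SSH (hostname and account are placeholders):

    universe = grid
    # Submit straight to a remote HTCondor scheduler over SSH --
    # no gatekeeper or Compute Element on the remote end
    grid_resource = batch condor cmsuser@remote-pool.example.edu

    executable = cms_job.sh
    log        = cms_job.log
    queue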

Campus Grid Issues
An emerging concern is that Bosco, while making it easier to submit HTC jobs, does not yet meet the needs of end users.
- It may require integration with an infrastructure; currently exploring "R" (sketch below)
- This conclusion is due in part to Miha's user interviews so far
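As one illustration of what "integration with an infrastructure" like R could mean: the submission side already works, so the missing piece is packaging the researcher's environment. A hypothetical wrapper (script and file names are invented for illustration) that a Bosco-submitted job could execute on the remote cluster:

    #!/bin/sh
    # run_r.sh -- hypothetical wrapper executed on the remote worker node;
    # assumes Rscript is available there and that analysis.R and input.csv
    # were shipped with the job (e.g. via transfer_input_files)
    Rscript analysis.R input.csv > results.txt

Roughly, the interview finding is that even this small amount of shell and submit-file plumbing is more than many domain researchers want to write by hand.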

Campus HTC Infrastructure Direction
Help the researcher use local HTC resources:
- Run on a local campus cluster
- Run on several local clusters
- Use local authentication credentials
Use/share resources with a collaborator on another campus.
Access the national cyberinfrastructure:
- OSG (and also XSEDE) resources
Submit Locally, Run Globally: Local → Campus → Grid
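"Submit Locally, Run Globally" is the same mechanism applied repeatedly: each additional cluster, whether a second campus system or a collaborator's machine at another institution, is one more bosco_cluster registration on the researcher's local submit host. A sketch with placeholder accounts and hostnames:

    # A second local cluster (SGE) and a collaborator's cluster on
    # another campus (PBS) -- both become submission targets from
    # the same local submit point
    bosco_cluster --add jdoe@hpc2.campus.example.edu sge
    bosco_cluster --add jdoe@collab.otheruniversity.edu pbs

    # List every cluster this Bosco installation can reach
    bosco_cluster --list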