Alain Roy Computer Sciences Department University of Wisconsin-Madison Condor & Middleware: NMI & VDT

Condor Team Experience
› Condor works on many platforms
› Condor build & packaging experience
› Condor deployed in many environments

Condor: platforms
› Condor Today (6.6)
   Linux Intel (3 versions)
   Linux Alpha
   Windows
   Mac OS X
   Solaris (3 versions)
   HPUX
   IRIX
   Digital Unix
   Tru64
   AIX
› This isn't new (Condor 6.0, 1999)
   Linux Intel
   Solaris (2 versions)
   HPUX
   IRIX (3 versions)
   Digital Unix
› We have experience with many platforms

Condor: builds & tests
› We believe in nightly builds & tests
   Running since early 2000
   Runs on every platform
   Developers immediately know when something breaks

Condor: build pool (subset)

Condor: Many environments
› Maintain large pool in CS department
   Three platforms
   Many diverse applications
› Have assisted with many pools worldwide:
   You've seen many of them today
   Commercial
   Research
   Large & small

Question
› Can we apply this expertise to building, packaging, and distributing Grid software?
› YES
   NSF Middleware Initiative (NMI)
   Virtual Data Toolkit (VDT)

What is NMI?
› NMI is a software infrastructure for middleware, especially Grid middleware
› Many groups contribute software to NMI
› Building, packaging, and testing are based here in Madison, with Condor Team members

NMI Builds
› Build pipeline: Sources (CVS) → Patching → GPT src bundles → Build & Test on a Condor pool (30+ computers) → Binaries → Web
› Builds distributed with Condor
› 18 distinct platforms
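To give a flavor of how a pipeline like this hands builds to the pool, here is a minimal Condor submit file of the kind such a system might use. The script name, arguments, and requirements expression are illustrative assumptions, not the actual NMI build glue:

    # Minimal sketch of one build job sent to the pool.
    # build.sh and the requirements expression are hypothetical.
    universe     = vanilla
    executable   = build.sh
    arguments    = condor-6.6
    requirements = (OpSys == "LINUX") && (Arch == "INTEL")
    output       = build.out
    error        = build.err
    log          = build.log
    queue

Because each platform's build is just another Condor job, targeting a new platform amounts to adjusting the requirements expression and adding a machine of that type to the pool.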

Builds benefit from Condor
› Fault tolerance
   Computer fails? The build just runs later
› Can use DAGMan for ordered builds (see the sketch below)
› New platform? Just add a computer
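As a sketch of what an ordered build could look like under DAGMan, the DAG file below builds one component before the component that depends on it. The submit-file names and the particular two-stage ordering are assumptions for illustration, not the actual NMI DAG:

    # Hypothetical DAG: build Globus first, then Condor-G, which depends on it.
    JOB GLOBUS  globus-build.submit
    JOB CONDORG condorg-build.submit
    PARENT GLOBUS CHILD CONDORG
    # Run with: condor_submit_dag build.dag

DAGMan resubmits failed nodes and only starts a child once its parents succeed, which is exactly the fault tolerance the bullet above describes.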

Condor & NMI
› NMI infrastructure builds lots of software
   Capable of doing nightly builds and tests
   Condor is one of these pieces of software
   Helps Condor with nightly builds and regular tests
› We use Condor to build Condor (via NMI)
   We eat our own dog food

NMI GRIDS Status
› NMI Release 4: December 2003
   Globus
   Condor-G
   Network Weather Service
   KX509
   GSI OpenSSH
   MyProxy
   MPICH-G2
   Grid Packaging Tools
   Gridconfig
   GridSolve
   PyGlobus
   UberFTP
› NMI Release 5: coming soon
› Six platforms:
   RedHat 7.2, 8.0, 9.0
   Solaris 8
   IA64 RedHat
   IA64 SuSE Linux

What is the VDT?
› A collection of software
   NMI software + extras tailored for specific collaborations
   Common Grid middleware (Condor, Globus, and more…)
   Virtual data software
   Utilities
› An easy installation & configuration mechanism
   Goal: push a button, and everything you need to be a consumer or provider of Grid resources just works
   Two methods (see the Pacman sketch below):
      Pacman: installs and configures it all
      RPM: installs some of the software, no configuration
› A support infrastructure
   Coordinate bug fixing
   Help desk
   Understand community needs and wishes
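For flavor, a Pacman installation of that era looked roughly like the following. The cache URL and package name here are placeholders, not the exact ones for any specific VDT release:

    # Hypothetical example: fetch and configure one VDT package
    # from a Pacman cache. Cache URL and package name are illustrative.
    pacman -get http://vdt.cs.wisc.edu/vdt_cache:Condor

A single command like this is what "push a button" means in practice: Pacman pulls the package, its dependencies, and the VDT configuration scripts in one step.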

What's in the VDT?
› Condor Group
   Condor/Condor-G
   DAGMan
   Fault Tolerant Shell
   ClassAds
› Globus Alliance
   Job submission (GRAM)
   Information service (MDS)
   Data transfer (GridFTP)
   Replica Location (RLS)
› EDG & LCG
   Make Gridmap
   Certificate Revocation List Updater
   Glue Schema/Info provider
› ISI & UC
   Chimera & Pegasus
› NCSA
   MyProxy
   GSI OpenSSH
› LBL
   PyGlobus
   NetLogger
› Caltech
   MonALISA
› VDT
   VDT System Profiler
   Configuration software
› Others
   KX509 (U. Mich.)

VDT Builds
› Build pipeline: Sources (CVS) → Patching → GPT src bundles → NMI Build & Test (Condor pool, 30+ computers) → Binaries → Test → Package → Pacman cache & RPMs
› Contributors (VDS, etc.) supply their own builds for packaging
› Will use NMI processes soon

Recent Success: Grid2003
› VDT was deployed on 27 sites
   VDT/Condor Team members contributed expertise and support
› VDT provided access to CPUs
› Condor-G used by most users
   Close collaboration promoted understanding

Many VDT Customers
› GriPhyN collaborators
   US-CMS
   US-ATLAS
   LIGO
   SDSS
› Particle Physics Data Grid
› European DataGrid Project
› Enabling Grids for e-Science in Europe Project
› LHC Computing Grid Project
› Alliance Grid Testbed (AGT)

Questions?