Open Science Grid (OSG)
Miron Livny, Center for High Throughput Computing, Computer Sciences Department, University of Wisconsin-Madison

Big Picture
› Starting year six of the ‘original’ five-year project and year 12 of the early ‘grid’ projects: PPDG (DOE), GriPhyN (NSF-ITR), and iVDGL (NSF).
› A proposal for the next five years is under joint review by DOE and NSF.
› Overall funding: 30+ FTEs.
› 10+ institutions involved.

“The members of the OSG are united by a commitment to promote the adoption and to advance the state of the art of distributed high throughput computing (DHTC) – shared utilization of autonomous resources where all the elements are optimized for maximizing computational throughput.”

“The OSG project is organized into five technical areas (Production, Technology, Software, Security, and User Support), each with a lead, and a support area that includes the following activities: Education, Documentation and Training, Communication, International Outreach, and Assessment.”

“As a focal point, Production is positioned to enhance this economy in other areas: by working toward interoperability with other CIs, and by replicating the DHTC pattern so that any interested university can benefit. We begin by helping universities build out their own local fabric of shared DHTC services to support a local community of students and faculty.”

Open Science Grid (OSG): HTC at the National Level

Overlay Resource Managers
Ten years ago we introduced the concept of Condor glide-ins as a tool to support ‘just in time’ scheduling in a distributed computing infrastructure that consists of resources managed by (heterogeneous) autonomous resource managers. By dynamically deploying a distributed resource manager on resources allocated (provisioned) by the local resource managers, the overlay resource manager can implement a unified resource allocation policy. In other words, we use remote job invocation to get resources allocated.
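
To make the pattern concrete, here is a minimal sketch of the glide-in idea in plain Python. This is not OSG or HTCondor code; the class names, site names, and FIFO policy are all hypothetical stand-ins. It shows how pilots, submitted as ordinary jobs to autonomous local resource managers, report their slots back to an overlay manager that then applies a single allocation policy across every site.

# Sketch of the glide-in / overlay resource manager pattern.
# All names (Pilot, LocalResourceManager, site-A, ...) are illustrative only.
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class Pilot:
    """A glide-in: a worker started by a pilot job that joins the overlay pool."""
    site: str
    job: Optional[str] = None


class LocalResourceManager:
    """An autonomous site scheduler; it sees pilots as ordinary jobs."""

    def __init__(self, site: str, free_slots: int):
        self.site = site
        self.free_slots = free_slots

    def submit_pilot(self) -> Optional[Pilot]:
        # Remote job invocation: allocating a slot means running a pilot job.
        if self.free_slots > 0:
            self.free_slots -= 1
            return Pilot(site=self.site)
        return None  # site is busy; the overlay simply gains no slot here


class OverlayResourceManager:
    """Applies one unified allocation policy across heterogeneous sites."""

    def __init__(self):
        self.idle_pilots = []
        self.job_queue = deque()

    def register(self, pilot: Pilot):
        # A pilot that started at some site reports back and offers its slot.
        self.idle_pilots.append(pilot)
        self._match()

    def submit(self, job: str):
        # Users submit to the overlay, never to the individual sites.
        self.job_queue.append(job)
        self._match()

    def _match(self):
        # The unified policy (FIFO here) decides which job gets which slot.
        while self.job_queue and self.idle_pilots:
            pilot = self.idle_pilots.pop()
            pilot.job = self.job_queue.popleft()
            print(f"job {pilot.job} -> slot at {pilot.site}")


if __name__ == "__main__":
    overlay = OverlayResourceManager()
    for job in ["sim-001", "sim-002", "sim-003"]:
        overlay.submit(job)
    # 'Just in time' scheduling: acquire slots by submitting pilots everywhere.
    for lrm in [LocalResourceManager("site-A", 2), LocalResourceManager("site-B", 1)]:
        pilot = lrm.submit_pilot()
        if pilot is not None:
            overlay.register(pilot)

The key design point the sketch preserves is the separation of concerns: the local managers only ever allocate resources (by running pilots), while all job-to-slot matching happens under the overlay's single policy.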

Subject: [Chtc-users] Daily CHTC OSG glidein usage
[Daily group usage summary; the sender, date, and per-group hour and percentage figures were lost in transcription. Top five groups by glidein hours: 1. Morgridge_Thomson, 2. Biochem_Attie, 3. Chemistry_Schmidt, 4. Chemistry_Cui, 5. ChE_dePablo.]

“The vision of the OSG partnership centers on enabling scientific research and discovery through DHTC. We aspire to be open to the broadest possible range of stakeholders in terms of diversity of science and organizational structure.”