Presentation transcript:

Condor Team 2011 (Established 1985)
www.cs.wisc.edu/~miron

Welcome to Condor Week #13 (year #28 for our project)

Open Science Grid was funded for another five years!

“The members of the OSG consortium are united in a commitment to promote the adoption and to advance the state of the art of distributed high throughput computing (DHTC)”

“We present a five year plan to sustain and extend our fabric of DHTC services, to transform the computing landscape on our campuses through a new generation of technologies that enable scientists to access “any data, anytime, anywhere” via a single identity, and to facilitate the transformation of the LHC computing capabilities from petascale to exascale.”

“We define DHTC to be the shared use of autonomous resources toward a common goal, where all the elements are optimized for maximizing computational throughput. Sharing of such resources requires a framework of mutual trust whereas maximizing throughput requires dependable access to as much processing and storage capacity as possible.”

High Throughput Computing
We first introduced the distinction between High Performance Computing (HPC) and High Throughput Computing (HTC) in a seminar at the NASA Goddard Space Flight Center in July of 1996, and a month later at the European Laboratory for Particle Physics (CERN). In June of 1997, HPCWire published an interview on High Throughput Computing.

Why HTC? For many experimental scientists, scientific progress and quality of research are strongly linked to computing throughput. In other words, they are less concerned with instantaneous computing power. Instead, what matters to them is the amount of computing they can harness over a month or a year: they measure computing power in units of scenarios per day, wind patterns per week, instruction sets per month, or crystal configurations per year.

High Throughput Computing is a 24-7-365 activity:
FLOPY ≠ (60*60*24*7*52)*FLOPS
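
To make the inequality concrete, here is a minimal Python sketch (not from the talk) contrasting the naive "peak FLOPS times seconds in a year" figure with the operations actually harvested over a year; the peak rate and utilization values are made-up assumptions.

# Illustrative only: the peak rate and utilization below are made-up
# assumptions, not figures from the talk.

SECONDS_PER_YEAR = 60 * 60 * 24 * 7 * 52   # 31,449,600 s in 52 seven-day weeks

def flopy(peak_flops: float, utilization: float) -> float:
    """Floating-point operations actually completed in a year, given a
    peak rate and the fraction of the year the jobs are really running."""
    return peak_flops * utilization * SECONDS_PER_YEAR

peak = 1e12                                # hypothetical 1 TFLOPS resource
naive = peak * SECONDS_PER_YEAR            # right-hand side of the inequality
sustained = flopy(peak, utilization=0.15)  # idle nights, queues, downtime, ...

print(f"naive peak estimate : {naive:.3e} FLOP per year")
print(f"sustained throughput: {sustained:.3e} FLOP per year")
# The gap between the two lines is the point of FLOPY != (60*60*24*7*52)*FLOPS.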

From July 2010 to June 2011, 45 million hours were used by 54 research groups in 35 departments. Researchers who use the CHTC are located all over campus (the red buildings on the accompanying campus map).

Better hearing with Cochlear Implants
Normal-hearing listeners need a sound's temporal fine structure to understand speech in noise, localize sounds, and recognize talkers and melodies. Conventional cochlear implant speech-processing strategies discard the temporal fine structure and represent only the temporal envelopes of sound. Experimental question: if we set the timing of the stimulating electric pulses of cochlear implants to represent this temporal fine structure, will cochlear implant listeners hear better? The algorithm for extracting the fine structure and shifting the pulses is computationally expensive, roughly 15 minutes per stimulus, and creating the more than 25,000 stimuli for a cochlear implant experiment would take a lab computer 260 days to complete. OSG allowed us to make all the stimuli within a day.
Tyler Churchill, Binaural Hearing and Speech Lab, Waisman Center
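
As a sanity check on the figures above, the sketch below redoes the arithmetic: 25,000 stimuli at roughly 15 minutes each is about 6,250 CPU-hours, or around 260 days on one machine. The slot count used to show the one-day turnaround is a hypothetical assumption, not a number from the talk.

# Back-of-the-envelope check of the serial estimate on the slide, plus an
# assumed slot count (hypothetical) to show how HTC compresses it to about a day.

stimuli = 25_000
minutes_per_stimulus = 15                  # rough per-stimulus cost from the slide

serial_hours = stimuli * minutes_per_stimulus / 60
serial_days = serial_hours / 24
print(f"one lab computer: {serial_hours:,.0f} hours, about {serial_days:.0f} days")
# -> 6,250 hours, about 260 days, matching the slide.

slots = 300                                # assumed number of concurrent OSG slots
print(f"{slots} slots: about {serial_hours / slots:.0f} hours of wall-clock time")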

Subject: [Chtc-users] Daily CHTC OSG glidein usage
From:
Date: 5/1/ :15 AM

Total Usage between and

     User               Hours   Pct
  1  Statistics_Wahba            %
  2  Atlas                       %
  3  Physics_Perkins             %
  4  BMRB                        %
  5  CMS                         %
  6  ChE_dePablo                 %
  7  Chemistry                   %
  8  MIR_Thomson                 %
  9  Statistics_Shao             %
     TOTAL                       %

From desktop to discovery…
Desktop: 8,760 hours
UW-Madison CHTC: 45 million hours
Open Science Grid: 496 million hours
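
A small sketch of my own arithmetic behind these tiers: the desktop baseline is one core running around the clock for a year (24 × 365 = 8,760 hours), so the other two figures can be read as equivalent desktop-years.

# The desktop baseline is a single core running around the clock for a year.
DESKTOP_HOURS_PER_YEAR = 24 * 365          # 8,760 hours

tiers = {
    "Desktop": 8_760,
    "UW-Madison CHTC": 45_000_000,
    "Open Science Grid": 496_000_000,
}

for name, hours in tiers.items():
    print(f"{name:>17}: {hours:>11,} hours "
          f"~ {hours / DESKTOP_HOURS_PER_YEAR:>8,.0f} desktop-years")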

Thank you for building such a wonderful (D)HTC community