Welcome to HTCondor Week #14 (year #29 for our project)

HTCondor Team 2012 (Established 1985)

A year of change, a year of new starts and a year of discovery

“This discovery is a huge triumph for mankind. There were more than 40 nations that came together for a long time to do this one thing that — even if it all worked out — wasn’t going to make anyone rich. It’s a powerful demonstration of the spirit of collaboration.”
From “Collaborative computing, pioneered at UW–Madison, helped drive LHC analysis,” by Chris Barncard, July 31, 2012

Many New Terms:
- Exa-scale
- Big data
- Software Defined Networks (SDN)
- Dark data
- Spot Pricing
- Advanced Computing Infrastructure (ACI) and Condo of Condos

How do we prepare for the HTC needs of 2020?

The Open Science Grid perspective

Scientific Collaborations at Extreme-Scales: dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science
A collaboration of five institutions: ANL, ISI, UCSD, UND, and UW
Funded by the Advanced Scientific Computing Research (ASCR) program of the DOE Office of Science

“Using planning as the unifying concept for this project, we will develop and evaluate by means of at-scale experimentation novel algorithms and software architectures that will make it less labor intensive for a scientist to find the appropriate computing resources, acquire those resources, deploy the desired applications and data on these resources, and then manage them as the applications run. The proposed research will advance the understanding of resource management within a collaboration in the areas of: trust, planning for resource provisioning, and workload, computer, data, and network resource management.”
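The quote describes a concrete loop: plan which resources fit the workload, acquire them, deploy the application and data, and manage execution. The sketch below is only a minimal illustration of that loop; every name in it (Resource, find_candidates, acquire, run_workload, the example pool) is a hypothetical placeholder and does not come from the dV/dt project.

    # Illustrative planning/acquisition/execution loop.
    # All names here are hypothetical placeholders, not dV/dt APIs.
    from dataclasses import dataclass

    @dataclass
    class Resource:
        name: str
        cores: int
        cost_per_hour: float

    def find_candidates(pool: list[Resource], cores_needed: int) -> list[Resource]:
        """Planning step: keep only resources that can run the workload."""
        return [r for r in pool if r.cores >= cores_needed]

    def acquire(candidates: list[Resource]) -> Resource:
        """Acquisition step: pick the cheapest adequate resource."""
        return min(candidates, key=lambda r: r.cost_per_hour)

    def run_workload(resource: Resource, tasks: list[str]) -> None:
        """Deploy-and-manage step: a stub that just 'runs' each task."""
        for task in tasks:
            print(f"running {task} on {resource.name}")

    pool = [Resource("campus-cluster", 64, 0.0),
            Resource("cloud-spot", 128, 0.90)]
    chosen = acquire(find_candidates(pool, cores_needed=32))
    run_workload(chosen, [f"task-{i}" for i in range(3)])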

The HTCondor perspective

“… a mix of continuous changes in technologies, user and application requirements, and the business model of computing capacity acquisition will continue to pose new challenges and opportunities to the effectiveness of scientific HTC. … we have identified six key challenge areas that we believe will drive HTC technologies innovation in the next five years.”
- Evolving resource acquisition models
- Hardware complexity
- Widely disparate use cases
- Data intensive computing
- Black-box applications
- Scalability
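Several of these challenges (widely disparate use cases, data intensive computing, black-box applications) surface in how users describe work to HTCondor. Below is a minimal sketch of an HTCondor submit description for a 100-job cluster; the executable and input file names are hypothetical, and a real submission would tune the resource requests to the application.

    # Minimal illustrative submit description; analyze.sh and the
    # input.*.dat files are hypothetical placeholders.
    executable              = analyze.sh
    arguments               = input.$(Process).dat
    transfer_input_files    = input.$(Process).dat
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    request_cpus            = 1
    request_memory          = 2GB
    request_disk            = 4GB
    output                  = out.$(Process)
    error                   = err.$(Process)
    log                     = jobs.log
    queue 100

Each of the 100 queued jobs gets its own input, output, and error files via the $(Process) macro, while the request_* lines let the matchmaker place each job only on machines that actually fit it.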

“Over the last 15 years, Condor has evolved from a concept to an essential component of U.S. and international cyberinfrastructure supporting a wide range of research, education, and outreach communities. The Condor team is among the top two or three cyberinfrastructure development teams in the country. In spite of their success, this proposal shows them to be committed to rapid development of new capabilities to assure that Condor remains a competitive offering. Within the NSF portfolio of computational and data-intensive cyberinfrastructure offerings, the High Throughput Computing Condor software system ranks with the NSF High Performance Computing centers in importance for supporting NSF researchers.”
From a recent anonymous NSF review

Thank you for building such a wonderful HTC community