Open Science Grid
OSG Engagement Strategy and Status
ETP Conference Call, Oct 19, 2006, 5:30 PM EST
Bringing additional non-physicists onto OSG
John McGee, Renaissance Computing Institute (RENCI)

Engagement Plan
Getting new users: Direct Approach
– Specific targeted opportunities: develop relationships and provide technical assistance with running applications on OSG
– RENCI Bioportal and workflow
Getting new users: Broad Approach
– Domain science conferences, journals, newsletters, etc. (bio, genomics, chemistry, environment, geo)
– Funded NSF and NIH researchers
– Unsolicited e-mail, but with non-trivial context
– Generate leads to be moved into the direct-approach process
Engagement Operations
– Establish an Engagement VO to support these activities
– Engagement conference calls
– Bring in OSG staff as appropriate

OSG Engagement VO
Establish an Engagement VO at RENCI to facilitate access to OSG resources
VOMS installation and setup
– Software installation complete; some configuration remaining
– Working to obtain host and SSL certificates
– Hostname: osg-engage.renci.org; would like a CNAME for engage.opensciencegrid.org
Next step is VO registration, as described on the wiki
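
Once the VO is registered and a user's DN is enrolled in VOMS, getting onto OSG resources through the Engagement VO would look roughly like the following. This is a sketch: the VO name "Engage" is an assumption based on this slide, not a confirmed registration.

    # Request a VOMS proxy carrying Engagement VO attributes
    # ("Engage" is an assumed VO name; use whatever name gets registered)
    voms-proxy-init -voms Engage

    # Verify the proxy and its attribute certificate
    voms-proxy-info -all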

Broad Approach
International Society for Computational Biology
– The society is based at SDSC
– RECOMB Systems Biology, Dec 1-2 at UCSD
– Rocky Mountain Bioinformatics Conference, Dec 1-3
– IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology, April 2007
– International Symposium on Bioinformatics Research and Applications, May 2007
Funded grant reviews underway
SuperComputing 2006

Example of a Direct Engagement
National Renewable Energy Lab (NREL)
– Conference call on 9/22/2006
– Participants: NREL HPC manager, NREL users, RENCI, UW-Madison
– They have an existing internal Condor pool
– Motivated to collaborate with national Grid communities

NREL Application 1
Nick Long, Buildings and Thermal Systems Center
– EnergyPlus application; batches of Fortran-90 jobs
– 30 to 100 jobs per iteration, about 1,500 simulations in total
– A self-extracting zip file is sent to the Condor machines
– Modest data sizes, 5 to 10 MB coming back
– No IP issues
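
A batch like this maps naturally onto a Condor submit description. The sketch below is illustrative only: the executable and file names are hypothetical, not NREL's actual setup.

    # Hypothetical submit file for one EnergyPlus batch of 100 runs
    universe                = vanilla
    executable              = energyplus_bundle.exe   # self-extracting zip that unpacks and runs
    transfer_input_files    = case_$(Process).idf     # per-run input; name is illustrative
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT                 # brings the 5-10 MB of results back
    output                  = case_$(Process).out
    error                   = case_$(Process).err
    log                     = energyplus_batch.log
    queue 100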

NREL Application 2
Marshall Buhl, National Wind Technology Center
– Offshore wind project CCON
– 3 different configurations for wind turbines (shallow, medium, and deep water)
– 50k simulations for each of the three turbine configurations
– 100,000s of cases over the next year or two
– Planning to use Condor locally, with Perl scripts to generate the jobs and random seeds
– Non-commercial; the models are public-domain models that NREL distributes with its software
– Wall-clock runtime: 10 minutes per simulation
– File sizes are modest (a few megabytes); the aggregate will be hundreds of GB
– Executables are Windows, but could be modified for Linux
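
The slide mentions Perl scripts that generate the jobs and random seeds; a minimal sketch of that pattern is below. The program name, argument convention, and submit-file layout are assumptions for illustration.

    #!/usr/bin/perl
    # Sketch: emit one Condor submit entry per case, each with its own random seed.
    use strict;
    use warnings;

    my @configs = qw(shallow medium deep);    # the three turbine configurations
    my $cases_per_config = 50_000;

    print "universe   = vanilla\n";
    print "executable = turbine_sim.exe\n\n"; # hypothetical name
    foreach my $config (@configs) {
        foreach my $case (1 .. $cases_per_config) {
            my $seed = int(rand(2**31));
            print "arguments = --config $config --case $case --seed $seed\n";
            print "queue\n";
        }
    }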

NREL Application 3
Tony Markel, Center for Transportation Technologies and Systems
– Vehicle simulation with a detailed model
– Matlab must be on the compute node
– 2 to 5 minute runtime per job
– This app uses some functions that cannot be compiled to an executable
– Consider UW's suggestion of Octave instead of Matlab (Octave is an open-source clone of Matlab)
2nd potential application
– A simplified, compiled version of the above
– Explores the design space: trade-offs on the parameters people use to decide on a vehicle
– Running thousands of jobs, 10 to 15 seconds per job
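
The Matlab dependency is the kind of constraint Condor expresses through machine ClassAd attributes. A sketch, assuming a pool whose administrators advertise a custom HAS_MATLAB attribute (hypothetical; each pool defines its own):

    # Steer jobs only to execute nodes that advertise Matlab
    universe     = vanilla
    executable   = run_vehicle_model.sh      # hypothetical wrapper script
    requirements = (HAS_MATLAB =?= True)     # custom attribute; =?= avoids matching UNDEFINED
    queue 1000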

NREL Status
Condor job router functionality is currently required
We are negotiating an alternative with NREL to enable job flow sooner
– A Linux-based Condor installation at NREL
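
For context, the Condor job router transforms locally submitted vanilla jobs into grid jobs according to a routing table in the Condor configuration. A rough sketch of one route (the gatekeeper host and route name are invented for illustration):

    # Hypothetical JOB_ROUTER_ENTRIES route sending jobs to an OSG gatekeeper
    JOB_ROUTER_ENTRIES = \
       [ name = "OSG_Engage"; \
         GridResource = "gt2 gatekeeper.example.edu/jobmanager-condor"; \
         MaxIdleJobs = 10; \
       ]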