PRAGMA 13, NCSA, 24th September 2007. Amber 8 on the PRAGMA Gfarm-V2 Datagrid.

Presentation transcript:

PRAGMA 13, NCSA, 24th September 2007 Amber 8 on PRAGMA Gfarm-V2 Datagrid

PRAGMA 13, NCSA, 24th September 2007 Motivation – Running AMBER on the Gfarm data grid is intended to provide information for assessing the performance of molecular dynamics simulations on the range of computer architectures found in the PRAGMA testbeds, both with and without Gfarm.

PRAGMA 13, NCSA, 24th September 2007 PRAGMA Clusters: gfml17, gfml18 – all benchmarked both with and without Gfarm.

PRAGMA 13, NCSA, 24th September 2007 AMBER 8 (Sander)
Benchmark system ( atoms), Generalized Born model on Sander 8; thermalization input file:

  thermalization
  &cntrl
    nstlim=10, nrespa=4, ntx=5, irest=1, ntpr=8,
    ntf=2, ntc=2, ntb=0,
    temp0=298.0, ntt=1, tautp=1.0,
    cut=12.0, rgbmax=12.0, igb=1,
    saltcon=0.2, gbsa=0, nmropt=1
  /
  &wt type='END' /
  DISANG=heme_tether.rst
  END
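For reference, an input like the one above is handed to sander on the command line. The line below is only a minimal sketch; the file names mdin, mdout, prmtop, inpcrd and restrt are the conventional Amber defaults, assumed here rather than taken from the slides.

  # Run the parallel sander binary on 4 processes. ntx=5/irest=1 in the
  # input mean the run restarts from earlier coordinates and velocities;
  # heme_tether.rst (the DISANG file) must sit in the working directory.
  mpirun -np 4 sander -O -i mdin -o mdout -p prmtop -c inpcrd -r restrt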

PRAGMA 13, NCSA, 24th September 2007 Protocols
– Identify clusters with Gfarm installed.
– Make sure all the identified clusters have GridMPI installed.
– Log in to each cluster and mount the Gfarm directory.
– Recompile AMBER with GridMPI.
– Copy the compiled program to Gfarm.
– Create an RSL file for each job (see the sketch after this list).
– Make sure Globus works from the current working cluster to the execution clusters.
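The slides list the steps but not the commands. As a rough sketch, the protocol could look like the following; the host name gatekeeper.example.org, the mount point, all paths, and the jobmanager-pbs job manager are assumptions, not taken from the presentation. A GT2 RSL file for the MPI job (job.rsl, hypothetical paths) might read:

  & (jobtype=mpi)
    (count=4)
    (directory=/gfarm/amber8/run1)
    (executable=/gfarm/amber8/sander)
    (arguments="-O" "-i" "mdin" "-o" "mdout" "-p" "prmtop" "-c" "inpcrd")

and the surrounding shell steps:

  # Mount the Gfarm v2 file system locally via its FUSE client.
  mkdir -p ~/gfarm
  gfarm2fs ~/gfarm
  # Copy the GridMPI-compiled sander binary into Gfarm so every cluster sees it.
  cp ~/amber8/exe/sander ~/gfarm/amber8/sander
  # Check that Globus submission to the execution cluster works at all.
  globus-job-run gatekeeper.example.org /bin/hostname
  # Submit the RSL job through the remote job manager.
  globusrun -r gatekeeper.example.org/jobmanager-pbs -f job.rsl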

PRAGMA 13, NCSA, 24th September 2007 Execution time in seconds

  System                                        Completion time (s)
  gfml17 (local NFS)                            63.78
  gfml17 (Gfarm v2)
  gfml18 (local NFS)                            63.78
  gfml18 (Gfarm v2)
  rock32 (local NFS)                            60.60
  rock32 (Gfarm v2, metadata server in Japan)

Note: no recompilation on the clusters – the binaries are the same. The executable binary and all input/output files are stored in the global Gfarm file system.
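The slides do not show how the completion times were collected; a plausible way to reproduce the comparison is simply to time the same binary once against local NFS and once against the Gfarm mount. All paths below are hypothetical.

  # Same binary, same inputs, two storage back-ends (paths assumed).
  cd /home/user/run1            # inputs on local NFS
  time mpirun -np 4 /home/user/amber8/sander -O -i mdin -o mdout -p prmtop -c inpcrd
  cd /gfarm/amber8/run1         # identical inputs on the Gfarm mount
  time mpirun -np 4 /gfarm/amber8/sander -O -i mdin -o mdout -p prmtop -c inpcrd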

PRAGMA 13, NCSA, 24th September 2007 Previous Benchmark: Gfarm v1

PRAGMA 13, NCSA, 24th September 2007 [Chart: Xcluster-GFARM vs. Xcluster]

PRAGMA 13, NCSA, 24th September 2007 Not on GFARM: h 3 mins; on GFARM: 1 hour.

PRAGMA 13, NCSA, 24th September 2007 Advantage
– All files are shared across clusters, which saves the time of copying the input files to each target cluster.
– Users save time by logging in only once, rather than every time they move to another cluster.