The Mapper project receives funding from the EC's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° RI-261507. Pilot 2: MAPPER Use Case.


The Mapper project receives funding from the EC's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° RI-261507.

Pilot 2: MAPPER Use Case
Presented by: Mariusz Mamonski, on behalf of Derek Groen, James Suter, Peter Coveney and the MAPPER consortium

Slide 2: Clay-polymer nanocomposites
● We develop quantitative coarse-grained models of clay-polymer nanocomposites to predict materials properties, e.g.:
  ● The thermodynamically favourable state of the composites.
  ● Their elasticity.
● Wide range of potential applications.

Slide 3: Nanocomposites
● Main ingredients:
  ● Montmorillonite clay, both "charged" and "uncharged".
  ● Polymers, such as polyvinyl alcohol and polyethylene glycol.
● Simulations start off in an encapsulated state.
● We are assessing the properties of these composite systems to find cases where the materials exfoliate.

Slide 4: Scale Separation Map (figure slide)

Slide 5: Nanomaterials use case, extensive version (figure slide)

Slide 6: Step 1: Quantum mechanical
● Goal: calculate energy potentials to be used in Step 2.
● Code: CPMD (optionally CASTEP).
● # of simulations: 1 (or a few).
● # of cores per simulation: <64.
● Duration per simulation: ~24 hours.
● Data produced per simulation: typically MBs, although the restart file is ~3 GB.
● Data transfer required: MBs before and after the simulations.
● Site type: local cluster (a launch-and-staging sketch follows below).
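The slide does not show how this step is driven; since Perl/Python scripts appear in the tools list (slide 10), here is a minimal Python sketch of launching CPMD on the local cluster and staging only the small outputs. The input-deck name, pseudopotential path, and core count are assumptions, not taken from the slides; cpmd.x is the conventional CPMD executable name, and the ~3 GB restart file is deliberately left in place.

```python
import shutil
import subprocess
from pathlib import Path

# Illustrative names only, not taken from the slides:
INPUT_DECK = "clay_qm.inp"     # hypothetical CPMD input deck
PP_PATH = "/opt/cpmd/pseudo"   # hypothetical pseudopotential directory
N_CORES = 32                   # slide constraint: fewer than 64 cores

def run_cpmd(workdir: Path) -> None:
    """Launch one ~24 h CPMD run on the local cluster via mpirun."""
    with open(workdir / "cpmd.out", "w") as log:
        subprocess.run(
            ["mpirun", "-np", str(N_CORES), "cpmd.x", INPUT_DECK, PP_PATH],
            cwd=workdir, stdout=log, check=True,
        )

def stage_small_outputs(workdir: Path, dest: Path) -> None:
    """Copy the MB-scale result files; skip the ~3 GB RESTART file."""
    dest.mkdir(parents=True, exist_ok=True)
    for f in workdir.iterdir():
        if f.is_file() and not f.name.startswith("RESTART"):
            shutil.copy2(f, dest / f.name)

if __name__ == "__main__":
    workdir = Path("step1_qm")
    run_cpmd(workdir)
    stage_small_outputs(workdir, Path("staged/step1"))
```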

Slide 7: Step 2: All-atom
● # of simulations: 1.
● # of cores per simulation: 1,024-8,192.
● Duration per simulation: ~24 h.
● Data produced per simulation: ~1 GB.
● Data transfer required per simulation between the PRACE site and the manager: ~1 GB (a transfer sketch follows below).
● Access mechanisms required/supported:
  ● Required: GridFTP; support for remote job submission using UNICORE (via QCG-Broker).
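The slide names GridFTP as the required mechanism for moving the ~1 GB of results between the PRACE site and the manager. A hedged sketch using the standard globus-url-copy client from the Globus Toolkit; both endpoints and paths are hypothetical, and a valid grid proxy is assumed to already exist.

```python
import subprocess

# Hypothetical GridFTP endpoints; gsiftp:// is the GridFTP URL scheme.
SRC = "gsiftp://prace-site.example.org/scratch/step2/output.tar.gz"
DST = "gsiftp://manager.example.org/data/step2/output.tar.gz"

# Assumes a valid grid proxy (e.g. created with grid-proxy-init).
subprocess.run(["globus-url-copy", SRC, DST], check=True)
```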

Slide 8: Step 3: CG parametrization
● # of simulations: ~20-40 (one after another; see the driver-loop sketch below).
● # of cores per simulation: 16-256.
● Duration per simulation: 1-4 h.
● Data produced per simulation: 75 MB-4 GB.
● Data transfer required per simulation between the EGI site and the manager: GB (the rest are particle positions, which can be stored for future reference).
● Access mechanisms required/supported:
  ● Required: GridFTP; support for QCG job submission; preferably with QCG-Computing installed.
● Advance reservation provides a performance benefit here.
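These parametrization runs execute strictly one after another, each refining the coarse-grained parameters from the previous run's output. A minimal driver-loop sketch, assuming a LAMMPS binary named lmp_mpi and a hypothetical input template; the actual parameter-update rule is not given on the slides and is reduced to a comment here.

```python
import shutil
import subprocess
from pathlib import Path

N_RUNS = 30        # slide: ~20-40 runs, one after another
N_CORES = 64       # slide: 16-256 cores per run
TEMPLATE = Path("templates/in.lammps")   # hypothetical input template

for i in range(N_RUNS):
    run_dir = Path(f"cg_param/run_{i:02d}")
    run_dir.mkdir(parents=True, exist_ok=True)
    # In the real workflow the coefficients in this input would be
    # updated from run i-1's results; here we just copy the template.
    shutil.copy(TEMPLATE, run_dir / "in.lammps")
    subprocess.run(
        ["mpirun", "-np", str(N_CORES), "lmp_mpi", "-in", "in.lammps"],
        cwd=run_dir, check=True,   # 1-4 h per run on the slide's numbers
    )
```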

Slide 9: Step 4: CG large simulation
● # of simulations: 1.
● # of cores per simulation: 8,192-65,536.
● Duration per simulation: ~12 h.
● Data produced per simulation: 1 TB+.
● Data transfer required: MBs to start, 1 TB+ afterwards (see the transfer sketch below).
● The particle positions of these simulations are to be stored for future reference and analysis.
● Access mechanisms required/supported:
  ● Required: GridFTP; support for QCG job submission.
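Moving the 1 TB+ of particle positions over a single GridFTP stream would be slow; globus-url-copy can open parallel data streams, which is the natural choice here. Endpoints and flags are again assumptions in this sketch.

```python
import subprocess

# Hypothetical endpoints for the 1 TB+ position archive.
SRC = "gsiftp://hpc-site.example.org/scratch/step4/positions.tar"
DST = "gsiftp://archive.example.org/mapper/step4/positions.tar"

# -p 8 opens eight parallel TCP streams; -fast reuses data channels.
subprocess.run(["globus-url-copy", "-fast", "-p", "8", SRC, DST], check=True)
```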

Slide 10: Tools
● GridFTP.
● UNICORE.
● QosCosGrid Environment.
● AHE.
● CPMD.
● LAMMPS.
● Perl / Python scripts.
● GridSpace (on the user side).

Slide 11: A few other data aspects
● All data is stored in files.
● Filenames may be non-unique (e.g., "in.lammps").
● Filename + directory tree is unique (see the indexing sketch below).
● Position files are typically large (>1 GB); other files are much smaller.
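Because only filename plus directory tree is unique, any bookkeeping over these files should key on the full relative path rather than the basename. A small stdlib-only sketch; the directory name is carried over from the hypothetical Step 3 layout above.

```python
from pathlib import Path

def index_files(root: Path) -> dict[str, int]:
    """Map each file's root-relative path to its size in bytes.
    'in.lammps' may occur in many run directories, but a key such as
    'run_03/in.lammps' is unique within the tree."""
    return {
        str(p.relative_to(root)): p.stat().st_size
        for p in root.rglob("*") if p.is_file()
    }

if __name__ == "__main__":
    index = index_files(Path("cg_param"))
    # Large position files (>1 GB) separate cleanly from small inputs:
    positions = {k: v for k, v in index.items() if v > 1 << 30}
    print(f"{len(positions)} large position files of {len(index)} total")
```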

Questions?