Progress from PRAGMA 7 PRAGMA 8 Workshop 3 May 2005 Singapore Bioinformatics Institute

PRAGMA 7 September 2004 San Diego

PRAGMA at SC04

Contents: Overview, Accomplishments, PRIME, Working Groups, Institutions, References, Opportunities, Sponsors

Accomplishments: Achieving Success through Partnership
Telescience: KBSI, software for camera
Computational Chemistry: Nimrod/GAMESS, APBS/Kepler (ligand-protein docking)
EcoGrid and Lake Metabolism
– Prototype international lake observatory
– Coral reef sensing
– Meeting in September 2004 (plan global lake observatory network; link coral reef experts)
– Follow-on meeting March 2005
Gfarm and iGAP
– Middleware integration
– Proteome analysis
Bandwidth Challenge Awards from SC03
– Distributed infrastructure (Gfarm)
– Application (Telescience)
Middleware Interoperability
– Rocks Rolls, Ninf-G, Gfarm
– KRocks (krocks.cluster.or.kr), Nov 04

People
Deputy Chair
– Huge thanks to Jysoo Lee: job well done
– Huge thanks to Fang-Pang Lin: more to do
Steering Committee
– BII: Arun Krishnan
– KISTI: Kum Won Cho
– USM: Yussof Hassan Admad

NSF Changes
– Bill Chang, Head, Beijing Office, NSF

Teri Simas

Routine Use: Tremendous Steps Forward!
Testbed of several sites: http://pragma-goc.rocksclusters.org/pragma-grid-status/setup.html
– 15 institutions
Five applications
– Time-Dependent Density Functional Theory (TDDFT)
– mpiBLAST, QM-MD, Savannah Case Study
– iGAP-Gfarm
Lessons learned
– Time to disseminate results to the broader community via publications

28 April 2005

1st application: Time-Dependent Density Functional Theory (TDDFT)
Computational quantum chemistry application
Grid-enabled by Nobusada (IMS), Yabana (Tsukuba Univ.) and Yusuke Tanimura (AIST) using Ninf-G
Experiment ran 6/1/04 ~ 8/31/04
– 10 sites, 8 countries, 198 CPUs
– Drivers: Yusuke and Cindy
– Number of major executions: 43
– Total execution time: 1210 hours (50.4 days)
– Longest run: 164 hours (6.8 days)
– Average length of run: 28 hours (1.2 days)
Major enhancements to the application
Major enhancements to Ninf-G
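For context on what "grid-enabled using Ninf-G" involves, below is a minimal client sketch using the standard GridRPC API that Ninf-G implements. It is a sketch under stated assumptions only: the configuration file name, the server host, the remote function name, and the arguments are hypothetical placeholders for illustration, not the actual TDDFT code or deployment.

```c
/*
 * Minimal GridRPC client sketch (the client API that Ninf-G implements).
 * Hypothetical placeholders: "client.conf", "cluster.example.org",
 * the remote function "tddft_step", and its arguments.
 */
#include <stdio.h>
#include <grpc.h>

int main(void)
{
    grpc_function_handle_t handle;
    double energy = 0.0;
    int nsteps = 1000;   /* illustrative input parameter */

    /* Read the client configuration (server list, protocol settings). */
    if (grpc_initialize("client.conf") != GRPC_NO_ERROR) {
        fprintf(stderr, "grpc_initialize failed\n");
        return 1;
    }

    /* Bind a handle to a remote executable registered on a grid server. */
    if (grpc_function_handle_init(&handle, "cluster.example.org",
                                  "tddft_step") != GRPC_NO_ERROR) {
        fprintf(stderr, "grpc_function_handle_init failed\n");
        grpc_finalize();
        return 1;
    }

    /* Synchronous remote call; the argument list follows the remote
     * function's interface definition. */
    if (grpc_call(&handle, nsteps, &energy) == GRPC_NO_ERROR)
        printf("energy = %f\n", energy);

    grpc_function_handle_destruct(&handle);
    grpc_finalize();
    return 0;
}
```

For a long-running, multi-site experiment like the one described above, the asynchronous variants of the same API (grpc_call_async together with grpc_wait_all) would be the natural way to keep all sites busy concurrently.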

Routine Use Applications

Resources and Networking
Gfarm Roll for clusters (part of the Rocks distribution)
New Internet links via TransPAC
– LA - Tokyo: OC192 (25 April)
– Tokyo - Hong Kong: OC48
– Singapore … (planned by September)
National Lambda Rail started recently
– 10GE links: San Diego - Seattle, LA - Seattle, Chicago - Seattle
PNWGP
– 2.5 Gbps to Korea (soon to be 10 Gbps)
– 2.5 Gbps to Taiwan

Lake Metabolism Website

Typhoon, Yuan Yang Lake, Taiwan – August 2004
– Part of a growing global lake observatory network
– An example of episodic events and threshold dynamics
– Access can be difficult during the most interesting times
Photo by Peter Arzberger, October 2004; used by NSF Director, Feb 2005

Taiwan's Natural Beauty

PRIME 2004

PRIME 2005
Osaka University
– Three students: Telescience, Biogrid
NCHC
– Four students: Ecogrid, OptIPuter, Systems Biology (one from Wisconsin)
Monash University
– Five students: Computational Chemistry, Bioinformatics, Cardiac Modeling
CNIC
– Two students: Networking Analysis, Protein Structure Analysis
Looking at ways to enhance the students' cultural competency

Publications Since Oct 2004 (incomplete)
Telescience, Sensors, and Ecogrid
– Juncai Ma, Shoji Hatano, Shinji Shimojo, "Implementation of field monitoring system by IPv6 and GRID Authentication on the Loess Plateau", Agricultural Information Research, 13(4), 2004 (in Japanese).
– Toyokazu Akiyama, Kazunori Nozaki, Seiichi Kato, Shinji Shimojo, Steven T. Peltier, Abel Lin, Tomas Molina, George Yang, David Lee, Mark Ellisman, Sei Naito, Atsushi Koike, Shuichi Matsumoto, Kiyokazu Yoshida, Hirotaro Mori, "Scientific Grid Activities in Cybermedia Center, Osaka University", 5th IEEE/ACM CCGrid proceedings (BioGrid'05 Workshop), 2005 (to appear).
– Porter, J.H., Arzberger, P., Braun, H-W., Bryant, P., Gage, S., Hansen, T., Hanson, P., Lin, F-P., Lin, C-C., Kratz, T., Michener, W., Shapiro, S., and Williams, T., "Wireless Sensor Networks for Ecology", BioScience, 2005 (accepted for publication).
– "Sensors for Environmental Observations", NSF Workshop Report.
Life Sciences
– Yoshiyuki Kido, Susumu Date, Shingo Takeda, Shoji Hatano, Juncai Ma, Shinji Shimojo, and Hideo Matsuda, "Architecture of a Grid-enabled research platform with location-transparency for bioinformatics", Genome Informatics, Vol. 15, No. 2, 2004.
– Baldridge, K.K., Sudholt, W., Greenberg, J.P., Amoreira, C., Potier, Y., Altintas, I., Birnbaum, A., Abramson, D., Enticott, C., Slavisa, G., "Cluster and Grid Infrastructure for Computational Chemistry and Biochemistry", in Parallel Computing for Bioinformatics (invited book chapter), A.Y. Zomaya (Ed.), John Wiley & Sons, 2005, in press.
– Sudholt, W., Baldridge, K., Abramson, D., Enticott, C., Garic, S., Kondric, C., Nguyen, D., "Application of Grid Computing to Parameter Sweeps and Optimizations in Molecular Modeling", Future Generation Computer Systems (invited).
– Shahab, A., D. Chuon, T. Suzumura, W.W. Li, R.W. Byrnes, K. Tanaka, L. Ang, S. Matsuoka, P.E. Bourne, M.A. Miller, and P.W. Arzberger, "Grid Portal Interface for Interactive Use and Monitoring of High-Throughput Proteome Annotation", Lecture Notes in Computer Science.
– Wei, X., W.W. Li, O. Tatebe, G. Xu, H. Liang, and J. Ju (2005), "Implementing data aware scheduling in Gfarm using LSF scheduler plugin mechanism", Proceedings of the 2005 International Conference on Grid Computing and Applications (GCA'05), Las Vegas, in press.
– Li, W., C.L. Yeo, L. Ang, O. Tatebe, S. Sekiguchi, K. Jeong, S. Hwang, S. Date, J-H. Kwak, "Protein Analysis using iGAP in Gfarm", presented at Life Science Grid.
Resources
– Tanaka, Y., Takemiya, H., Nakada, H., and Sekiguchi, S., "Design, implementation and performance evaluation of GridRPC programming middleware for a large-scale computational Grid", Proceedings of the 5th IEEE/ACM International Workshop on Grid Computing, Nov. 2004, Pittsburgh, USA.

Key Events
November 2004 – SC04 (Pittsburgh)
March 2005 – GGF13 (Seoul)
May 2005 – Grid Asia 2005
– PRAGMA 8 (2-4 May)
– NEESit Meeting (5 May)
– Life Science Grid 2005 (5-6 May)
September 2005 – iGRID 2005 (San Diego)
September 2005 – APAC 2005 (Gold Coast)
October 2005 – PRAGMA 9 (Hyderabad)
November 2005 – SC05 (Seattle)

PRAGMA Institutions at iGRID 2005
Demos from 18 countries; Pacific Rim demonstrations from Australia, China, Japan, Korea, Taiwan, US, Canada, and Mexico
World's First Demonstration of X GRID Application Switching using User Controlled Lightpaths
– KISTI, NCHC, institutions in Canada and Spain
Real Time Observational Multiple Data Streaming and Machine Learning for Environmental Research using Lightpath
– NCHC, others
Great Wall Cultural Heritage
– CNIC, others
Coordination of Grid Scheduler and Lambda Path Service over GMPLS: Toward Commercial Lambda Path Service
– AIST, Osaka, Titech
From Federal Express to Lambdas: Transporting Sloan Digital Sky Survey (SDSS) Data Using UDT
– KISTI, CNIC, APAC, Starlight
Real-time Multi-scale Brain Data Acquisition, Assembly, and Analysis using an End-to-End OptIPuter
– Osaka, KISTI, NCHC, UCSD, Starlight
Global Lambda Visualization Facility
– KISTI, Starlight, NCSA
iGRID APAC
– APAC, Starlight, PNWGP

Steering Committee Agenda
Review Application for Membership
– Pacific Northwest Gigapop (Wednesday)
Review Application to Host PRAGMA 10
– Queensland and APAC, March/April 2006 (Wednesday)
Plan activities for iGRID 2005, SC05, PRAGMA brochure
Discuss and outline plans and strategies for several years into the future
– Including a multi-institutional proposal to a variety of funding agencies
Discuss outcomes of the study done at PRAGMA 7

Pilot Study: PRAGMA
Background: based on 11 interviews at PRAGMA 7
– Understanding the social interactions needed for the success of a virtual organization
– Understanding views of success and challenges to date (for a path forward)
Highlights
– Successes
  – Built a collaborative network, trust, and openness, based on a shared vision
  – Exchanged information and technology that have benefited participants
  – Make things happen, make things function
  – Spun off other activities and collaborations
– Challenges
  – Balance growth without losing tight collaborations
  – Balance and harness the diversity of interests
  – Maturity of national, large-scale grids (PRAGMA's role)
  – Move beyond demo mode to persistence and broader usability
  – Development of applications
– Future
  – That is what we create
Conducted by Lyn Headley, UCSD

Expanding Routine Use: Challenges for the Resource Working Group
Publish lessons learned, including observations of shortcomings of grid software
– Conference papers will force PRAGMA to think critically about these issues
Continue to evolve the deployed infrastructure, so that it is deemed persistent
– Move beyond demos, such as at SC05 or iGRID 2005, toward daily use: a system that remains usable after the event
– Make the testbed usable by others, allowing multiple users

Expanding Routine Use: Challenges for Application Working Groups
Help define the testbed infrastructure, to make it part of your daily use
Define challenging runs that will lead to fundamentally new results
– E.g., run a complete genome through the iGAP pipeline

Expanding Routine Use
PRAGMA is about making things work. PRAGMA has made strides toward routine use of the grid.
Make these experiments replicable
– More than just the experts, the drivers, the developers
– More than just for the meeting
– More than just for the original application
"Replicability is a fundamental tenet of good science." – Phil Papadopoulos

Welcome PRAGMA 8 Workshop 3 May 2005 Singapore Bioinformatics Institute