Research Computing: Critical Needs and Opportunities at a University-Based Academic Computing Center
Vijay K. Agarwala, Senior Director, Research Computing and Cyberinfrastructure, The Pennsylvania State University, University Park, PA 16802 USA
HPC User Forum, High Performance Computing Center Stuttgart (HLRS), Germany, October 2008
Imperial College, London, October 16th, 2008

ITS Organization
Vice Provost for Information Technology and CIO; Associate Vice Provost
Staff offices: Human Resources, Financial Services, Marketing and Communication
Units: Research Computing and Cyberinfrastructure (RCC), Consulting and Support Services (CSS), Security Operations and Services (SOS), Teaching and Learning with Technology (TLT), Digital Library Technologies (DLT), Administrative Information Services (AIS), Telecommunications and Networking Services (TNS)
Functional areas: Research, Security, Libraries, Instruction, Communication, Business Support

Research Computing and Cyberinfrastructure
Part of Information Technology Services; led by a Senior Director and advised by the Faculty Advisory Committee on Research Computing and Cyberinfrastructure.
Groups: High Performance Computing Systems; Domain Expertise and Consulting Support; Software Development and Programming Support; Visualization and Telecollaborative Systems.
RCC meets the high-end computing technology needs of scholars in their research and teaching endeavors. The group partners with faculty members and collaborates with technology companies and other research organizations.

Research Computing and Cyberinfrastructure
Provide systems services by researching current practices in operating systems, file systems, data storage, and job scheduling, as well as computational support related to compilers, parallel computation, libraries, and other software. Also support visualization of large datasets by innovative means to gain better insight from the results of simulations.
Enable large-scale computations and data management by building and operating several state-of-the-art computational clusters and machines with a variety of architectures.
Consolidate and thus significantly increase the research computing resources available to each faculty participant. Faculty members can frequently exceed their share of a machine to meet peak computing needs.
Provide support and expertise for using programming languages, libraries, and specialized data and software for several disciplines.
Investigate emerging visual computing technologies and implement leading-edge solutions in a cost-effective manner to help faculty better integrate data visualization tools and immersive facilities into their research and instruction.
Investigate emerging architectures for numerically intensive computations and work with early-stage companies, for example on interconnects, networking, graphics processors, and FPGAs for computation.
Help build inter- and intra-institutional research communities using cyberinfrastructure and grid technologies.
Maintain close contacts with NSF- and DOE-funded national centers, and help faculty members with porting and scaling of codes across multiple architectures.

Compute Engines
LION-XJ: 144 nodes, 288 quad-core processors, 16 GB memory per node, InfiniBand interconnect
LION-XK: 144 nodes, 288 quad-core processors, 32 GB memory per node, InfiniBand interconnect
Pleiades: 170 nodes, 340 processors, 340 GB RAM, 35.1 TB storage
LION-XO: 132 nodes, 368 processors, 1280 GB RAM, SilverStorm InfiniBand interconnect
Hammer/LION-XD: 16 nodes, 64 processors, 128 GB memory per node, InfiniBand interconnect
LION-XB: 16 nodes, 128 processors, 512 GB RAM, PathScale InfiniPath InfiniBand interconnect
LION-XC: 140 nodes, 560 processors, 1664 GB RAM
Unisys ES: 64 processors, 192 GB RAM, Myrinet interconnect
HPC Storage Farm: 100 TB of disk; needs to grow to 1000 TB

Programs, Libraries, and Application Codes in Support of Computational Research
Compilers and Debuggers: Absoft Pro Fortran, GNU Pascal, IBM XLF, IBM XLC/C++, Intel Fortran, Intel C/C++, Lahey/Fujitsu Fortran 95 Pro, Portland Group (PGI) compilers and tools, Java, PathScale EKO compiler suite, TotalView, DDT, Valgrind
Computational Biology: BLAST, Blastall, Cister, ClustalW, ClustalX, Dotter, FASTA, fastDNAml, GeneMachine, GENSCAN, HMMgene, MrBayes, MZEF, PHRED/PHRAP/Consed, PHYLIP, ReadSeq, RepeatMasker, SEG, sim4, Sputnik, Treetool, WU-BLAST
Computational Chemistry and Materials Science: FHI98MD, GAMESS, Gaussian 03, GaussView, SemiChem, NWChem, Jaguar, Maestro, CHARMM, WIEN2k, VASP, Thermo-Calc, Accelrys Materials Studio, ADF, TmoleX, Amber, Gromacs, NAMD, WxDragon, Molden, CPMD, Rosetta, CCP4
Finite Element Solvers: ABAQUS, ANSYS, FLUENT, GAMBIT, FIELDVIEW, LS-DYNA, MD/Nastran, OpenFOAM
Mathematical and Statistical Libraries and Applications: ATLAS, BLAS, ESSL, IMSL, LAPACK, ScaLAPACK, MASS, GotoBLAS, Intel MKL, AMD ACML, Mathematica, MATLAB, Maple, Distributed MATLAB, PETSc, NAG, Star-P, Watson Sparse Matrix Solver
Solid Modeling: MD/Patran
Statistics: R, SAS
Parallel Libraries: MPICH, optimized versions of MPICH for high-performance cluster interconnects, parallel IMSL, Distributed MATLAB, Star-P, Distributed Maple
Optimization: GAMS/CPLEX, Csim, TOMLAB
Multiphysics: COMSOL
All software installations are driven by faculty; the software stack on every system is customized and entirely shaped by faculty needs.
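To make the parallel-libraries entry above concrete, here is a minimal sketch, not a Penn State-specific example, of the kind of MPI program users build with the listed compilers and MPICH. The file name and build details are illustrative assumptions only.

```cpp
// hello_mpi.cpp -- minimal sketch of an MPI program, assuming MPICH or a
// comparable MPI implementation from the software stack above.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);                // start the MPI runtime

    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  // this process's id within the job
    MPI_Comm_size(MPI_COMM_WORLD, &size);  // total number of processes

    // Each rank reports in; output order is not guaranteed across ranks.
    std::printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                        // shut down the MPI runtime cleanly
    return 0;
}
```

A program like this would typically be compiled through the wrapper supplied by the MPI installation and launched across nodes by a cluster's batch scheduler rather than run interactively.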

Participating Research Centers
Joint research and education initiative (NSF, DOE, PSU) focused on understanding molecular issues related to environmental chemical kinetics, geochemical cycling of elements, fate and transport of contaminants, and carbon sequestration. (Dr. Susan L. Brantley, Professor of Geosciences)
Virtual center that integrates genetic, immunological, ecological, and other studies to understand how disease processes work and how they inter-relate across time and length scales. (Dr. Ottar Bjornstad, Associate Professor of Entomology and Biology; Dr. Bryan Grenfell, Alumni Professor of Biology)
Center for Gravitational Wave Physics (NSF, PSU), which fosters research of a truly interdisciplinary character, linking the highest-caliber astrophysics, gravitational wave physics, and experimental gravitational wave detection in pursuit of the scientific understanding of gravity. (Dr. Lee S. Finn, Professor of Physics, Astronomy and Astrophysics)
Interdisciplinary center (NSF, NIH, PSU) focused on identifying issues in statistics, research design, and measurement emerging in the prevention and treatment of problem behaviors, particularly drug abuse. (Dr. Linda M. Collins, Professor of Human Development and Family Studies, Statistics)

Participating Research Centers
Collaborative effort (NSF, PSU, Georgia Tech) aimed at educating the next generation of scientists and engineers in the emerging field of materials design. (Dr. Zi-Kui Liu, Professor of Materials Science and Engineering)
Large collaboration (NSF, DOE, PSU, and others) of about 150 scientists working to use the AMANDA and IceCube telescopes to detect ultra-high-energy neutrinos. (Dr. Douglas F. Cowen, Professor of Physics, Astronomy and Astrophysics)
Center aimed at describing, modeling, and understanding the Earth's climate system. (Dr. Michael E. Mann, Associate Professor of Meteorology)
A hub to connect experimental and simulation activities through the organization of collaborative projects, short courses, and workshops. (Dr. Jorge O. Sofo, Associate Professor of Physics, Astronomy and Astrophysics)

[Slide figure: two-layer ONIOM method, HF/6-311+G(d,p) (high level) / HF/3-21G* (low level); computed value of -90.4 ppm.]

Performance of LS-DYNA
Case Study: Blast Loading
Blast load modeled using the ConWep algorithm; 450,000 DOFs, spherical blast simulated for 11 ms
Study of mesh convergence for plastic strain stability
Professor: Ashok D. Belegundu; Student: Vikas Argod, Department of Mechanical and Nuclear Engineering

Visualization Services
Staff members provide consulting, teach seminars, assist faculty, and support facilities for visualization and VR. Recent support areas include:
Visualization system design and deployment
3D modeling and geometry exchange (e.g. FormZ)
Visualization development applications and programming toolkits (e.g. OpenDX, VTK)
VR development and device libraries (e.g. VRPN, VRML, Java3D, OpenGL, OpenSG, CAVELib)
Domain-specific visualization tools (e.g. VMD, SCIRun)
Telecollaborative tools and facilities support (e.g. Access Grid, VNC)
Parallel graphics and online visualization (e.g. ParaView, DCV)
Programming for graphics (e.g. C/C++, Java3D, Tcl/Tk, Qt)
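As an illustration of the toolkits listed above, the following minimal sketch shows the standard VTK pipeline (source, mapper, actor, renderer) in C++. The cone source stands in for real simulation output, and the file name is a hypothetical placeholder rather than anything from the Penn State environment.

```cpp
// cone_demo.cpp -- minimal VTK pipeline sketch: source -> mapper -> actor -> renderer.
// Assumes a standard VTK installation; link against the VTK rendering libraries.
#include <vtkSmartPointer.h>
#include <vtkConeSource.h>
#include <vtkPolyDataMapper.h>
#include <vtkActor.h>
#include <vtkRenderer.h>
#include <vtkRenderWindow.h>
#include <vtkRenderWindowInteractor.h>

int main() {
  // Geometry source: a simple cone stands in for real data.
  vtkSmartPointer<vtkConeSource> cone = vtkSmartPointer<vtkConeSource>::New();
  cone->SetResolution(32);

  // Mapper converts polygonal data into drawable graphics primitives.
  vtkSmartPointer<vtkPolyDataMapper> mapper = vtkSmartPointer<vtkPolyDataMapper>::New();
  mapper->SetInputConnection(cone->GetOutputPort());

  // Actor places the mapped geometry in the scene.
  vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
  actor->SetMapper(mapper);

  // Renderer, window, and interactor display the scene and handle input.
  vtkSmartPointer<vtkRenderer> renderer = vtkSmartPointer<vtkRenderer>::New();
  renderer->AddActor(actor);

  vtkSmartPointer<vtkRenderWindow> window = vtkSmartPointer<vtkRenderWindow>::New();
  window->AddRenderer(renderer);

  vtkSmartPointer<vtkRenderWindowInteractor> interactor =
      vtkSmartPointer<vtkRenderWindowInteractor>::New();
  interactor->SetRenderWindow(window);

  window->Render();
  interactor->Start();  // hand control to the interactive window
  return 0;
}
```

The same pipeline pattern underlies several of the domain tools named above (ParaView, for example, is built on VTK), which is why staff can support them with a common skill set.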

Visualization Facilities
Our goal is to foster more effective use of visualization and VR techniques in research and teaching across colleges and disciplines via strategic deployment of facilities and related support, by:
Locating facilities strategically across campus for convenient access by targeted disciplines and user communities
Leveraging existing applications and workflows so that visualization and VR can be natural extensions to existing work habits for the users being served
Providing outreach seminars, end-user training, and ongoing staff support in use of the facilities
Working on an ongoing basis with academic partners to develop and adapt these resources more precisely to fit their needs
Helping to identify and pursue funding opportunities for further enhancement of these efforts as they take root

Immersive Environments Lab (IEL)
In partnership with the School of Architecture and Landscape Architecture; 208 Stuckeman Family Building
Focused on teaching and research in the experiential understanding of design spaces by architecture and landscape architecture students
3-screen, 3D-stereo, multi-OS display offers a multi-modal immersive environment
Experiential design review in design studio and digital media courses
Telecollaborative studio using standard-definition video and 3D application sharing with Carleton University, Spring 2007 (lab also supports Access Grid)
Research into immersive and collaborative tools for the design professions
Development of application and data integration workflows for ARCH/LARCH (Building Information Modeling, energy and structural analyses, land use planning, etc.) in conjunction with the ICon lab under internal Bowers support

Immersive Construction Lab (ICon)
In partnership with Architectural Engineering; 306 Engineering Unit C
Focused on research and teaching in the use of immersive visualization and VR techniques for planning and management of large construction projects
3-screen, 3D-stereo, Windows desktop, immersive information environment
VR extensions to commercial construction planning applications; custom development of "VR-like" teaching modules
Industry partners provide real-world use cases for studying the practical application of tools under development
Linked SMART Board allows dynamic VR updates from scheduling applications, etc.
Laptop display sharing facilitates group collaboration among students
Building upon the ICon lab, IEL, and related work, Penn State hosted CONVR 2007, an international conference on construction applications of virtual reality

Visualization/VR Lab, 215 Osmond
In partnership with the Materials Simulation Center
VR facility for the central-campus science community (Materials Science, Molecular Biology, Physics, and more)
8' x 8' active stereo display
Tracked devices for user interaction
Linux console workstation
Complement of open-source data visualization tools (VMD, VTK, ParaView), which can be built upon in response to user needs
Seminars for teaching graduate students the use of VR tools
Initial users in Materials Science and Molecular Biology

Visualization/VR Lab, 336 IST Building
In partnership with Computer Science and Engineering
Facility targets compute-intensive applications in science, engineering, and related disciplines
Large-format 3D-stereo display (6.75 x 9 ft., 1400 x 1050 pixels) for VR applications
2 x 2 tiled display (6 x 8 ft., 2800 x 2100 pixels) for high-resolution applications
Linux console workstations
Interactive device support
Initial complement of data visualization tools (VMD, VTK, ParaView, SCIRun) to be built upon in response to user needs
Opened in October 2007; seminars and outreach activity underway

Sports Medicine VR Lab
Assisting a partnership between Kinesiology, Athletics, and Hershey Medical Center (HMC); Lasch Football Building
Special-purpose lab supports study of perception-action disruptions in posture and balance related to mild traumatic brain injury (e.g. concussions), elderly populations, etc.
Motion in the VR display is synchronized with measurement from EEG, postural tracking, and force plate instrumentation
Enhanced two-screen lab (wall and floor) in development in the Recreation Building for broader use by faculty in Kinesiology, Psychology, Engineering Science and Mechanics, SSRI, and Hershey Medical Center

Collaborative Tools in Research
Telecollaboration: Adobe Connect (site license), Vyew, Access Grid (many-to-many)
Document / source sharing: Subversion; MediaWiki (open source) and Confluence (commercial) wikis
Web-based science gateways: Materials Simulation Center gateway (with the Materials Research Institute)

Access Grid Teleconferencing Facility, Room 140 Computer Building
Allows faculty to participate in Access Grid events with the international academic community
Scalable, multi-group telecollaboration (voice, video, application sharing) using multicast Internet connections
Small number of highly satisfied users; needs greater awareness and adoption
Ongoing research collaboration: Dean Snow (ANTHY), Craig Cameron (BMB)
Research reporting: Richard Alley (GEOSCI), Donald Bryant (BMB), Mark Gahegan (GEOG)
Virtual conferences: Genomics and Bioinformatics, SC Global, SC Desktop

Industry Outreach
In partnership with the Institute for Computational Science (ICS) and the Industrial Research Office (IRO)
"... to out-compete, we must out-compute ..."
Putting Pennsylvania-based companies at a competitive advantage by helping meet their large-scale computing needs. Many small-to-medium-sized companies, and even larger ones, lack the in-house expertise and resources for large-scale computation; as a result, they have not been able to use simulation and analysis tools frequently enough to innovate faster and become more competitive.
Helping develop stronger relationships between faculty and industry by providing computational services, which may lead to closer alignment between faculty research areas and industry needs, and to a positive impact on economic development along the I-99 corridor and throughout the Commonwealth.