From Compute Intensive to Data Intensive Grid Computing
Miron Livny
Computer Sciences Department, University of Wisconsin-Madison
miron@cs.wisc.edu
http://www.cs.wisc.edu/~miron
Welcome to the 2nd Annual Condor Meeting! (Hope to see you all at the EuroGlobus+Condor meeting in southern Italy in mid-June.)
The Condor Project (Established '85)
Distributed computing research performed by a team of 26 faculty, full-time staff, and students who face software engineering challenges in a UNIX/Linux/NT environment, are involved in national and international collaborations, actively interact with academic and commercial users, maintain and support a large distributed production environment, and educate and train students.
Funding: DoD, DoE, NASA, NSF, AT&T, IBM, Intel, Microsoft, and the UW Graduate School.
State of the Project
› Extended the scope and size of the project
› Extremely busy
› Defended and preserved our unique approach to distributed/Grid computing
› Enjoyed the impact we have on "real" users
› Maintained the reputation of our research and the quality of our technology
› Worked hard to make a commercial Condor fly alongside our project
The Physicists & Us
› Some of us have our roots in physics (Derek, Doug, Miron, Raj)
› Long history of joint work (NIKHEF '91, INFN '96, PPDG '99, SMC, WA92, CHORUS, PHENIX, CMS, …)
› Grid Physics Network (GriPhyN) project funded by the Information Technology Research program of NSF ($10.9M over five years, $2M to our group)
› Joint UW-HEP and Condor project funded by the UW Graduate School
› 1M hours of Condor CPU time allocated by the National Resource Allocation Committee (NRAC) to the CMS group at Caltech
› Our team selected as the main CS group for the next phase of PPDG (DOE SciDAC proposal pending)
› The CMS group at UW is one of the four sites of the Grid Laboratory of Wisconsin (GLOW) (NSF MRI proposal pending)
Physicists Have a LOT of Data. (Others have data problems too …)
The Data Grids & Us
› Extend and enhance the existing capabilities of Condor
› Develop new storage and data-transfer management technologies (a key element in the PPDG proposal)
› Interface "legacy" applications and the Grid "fabric" with new capabilities/technologies
› Work closely with the Globus team and follow their leadership on data-transfer and replica-management tools (joint Data Grid SciDAC proposal is pending)
› Close collaboration with the EU Data Grid effort
› Make ClassAds the "Esperanto" of Grid management, as sketched after this list (joint security SciDAC proposal with Globus is pending)
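ClassAds, for readers who have not met them, are the schema-free lists of named attributes that Condor entities use to advertise themselves; the matchmaker pairs two ads when each one's Requirements expression evaluates to true against the other, and uses Rank to order the candidates. The sketch below is illustrative only: the attribute names (Requirements, Rank, Arch, OpSys, Memory, KFlops, LoadAvg) are standard Condor attributes, but every value is invented for this example.

A job ad:

  MyType       = "Job"
  Owner        = "physicist"
  Cmd          = "/usr/local/bin/cms_sim"
  Requirements = (Arch == "INTEL") && (OpSys == "LINUX") && (Memory >= 64)
  Rank         = KFlops

A machine ad it could match:

  MyType       = "Machine"
  Name         = "c01.cs.wisc.edu"
  Arch         = "INTEL"
  OpSys        = "LINUX"
  Memory       = 128
  KFlops       = 21500
  Requirements = (LoadAvg < 0.3)

Here the job asks for an Intel/Linux machine with at least 64 MB of memory and prefers the fastest match (Rank = KFlops), while the machine accepts jobs only when it is lightly loaded. Because ads are just attribute lists, the same mechanism can describe storage, network, or any other Grid resource, which is what makes ClassAds a candidate "Esperanto."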
And What About …
› Allocating network resources,
› Scheduling of storage resources,
› Co-allocation of processing, storage, and network resources,
› Resource reservation,
› Data shipping vs. function shipping,
› Data security,
› Write-behind,
› Data affinity,
› Or … ?
Challenges Ahead
› Ride the "Grid wave" without losing our balance
› Leverage the talent and expertise of our new faculty (distributed I/O, distributed scheduling, networking, security)
› Expand our UW flock into a state-wide system (WiscNet?)
› Apply the Master-Worker paradigm to domain-decomposition problems (ongoing work with JPL); see the sketch after this list
› Scale our Master-Worker framework to 10,000 workers
› Open source vs. public-domain binaries vs. a commercial version of Condor
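For readers outside the project, the sketch below shows the shape of the Master-Worker paradigm: a master owns a pool of independent tasks, workers pull tasks until the pool is empty, and results flow back to the master for aggregation. This is a minimal C++ illustration with threads standing in for workers; the actual Condor MW framework distributes workers across Grid resources rather than threads, and none of the names here come from its API.

  // Minimal master-worker sketch; threads stand in for distributed workers.
  #include <iostream>
  #include <mutex>
  #include <queue>
  #include <thread>
  #include <vector>

  int main() {
      std::queue<int> tasks;  // the master's pool of independent work units
      std::mutex m;           // guards the shared queue and the running total
      long long total = 0;

      // e.g. 1000 subdomains of a domain-decomposed problem
      for (int i = 1; i <= 1000; ++i) tasks.push(i);

      auto worker = [&] {
          for (;;) {
              int t;
              {
                  std::lock_guard<std::mutex> lock(m);
                  if (tasks.empty()) return;  // no work left: the worker retires
                  t = tasks.front();
                  tasks.pop();
              }
              long long r = static_cast<long long>(t) * t;  // stand-in for the real computation
              std::lock_guard<std::mutex> lock(m);
              total += r;  // master-side aggregation of results
          }
      };

      std::vector<std::thread> pool;
      for (int i = 0; i < 8; ++i) pool.emplace_back(worker);  // 8 here; MW aims at thousands
      for (auto& w : pool) w.join();

      std::cout << "aggregate result: " << total << "\n";
  }

The same shape scaled to 10,000 distributed workers is where the hard questions begin: task granularity, fault tolerance, and slow or lost workers.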
Don't ask "what can the Grid do for me?" Ask "what can I do with the Grid?"