1 LHCb Grid Meeting, Liverpool, 11.09.00
UK GRID Activities
Glenn Patrick, 11.09.00
Not particularly knowledgeable, just based on attending 3 meetings:
09.08.00  UK-HEP Globus Meeting (RAL)
11.07.00  UK-Grid Meeting (Cosener's)
15.06.00  UK-Grid Testbed Meeting (RAL)
The Globus meeting on 9th August was the most useful, especially from the technical point of view.

2 Globus Activities: RAL
Globus 1.1.3 installed on RAL-CSF (RedHat 6.1 + PBS) with 2 gatekeepers (Andrew will cover).
Interfacing the GASS server to the RAL DataStore (T. Folkes). Backend code has been written and various technical issues identified concerning the globus-cache, proxies and file opens. Users to access files using a pseudo path name? (A sketch of the idea follows below.)
Gaining experience with GSI-ftp and GSI-ssh (B. Saunders). Now working on PPD Linux machines.
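A minimal Python sketch of how such a pseudo path scheme might look from the backend side. Everything here (the prefix, the cache area and the staging command) is a hypothetical illustration, not the actual RAL backend code.

    # Hypothetical: resolve a GASS "pseudo path name" into a locally readable
    # file, staging it from the tape store on first access.
    import os
    import subprocess

    PSEUDO_PREFIX = "/datastore/"          # assumed prefix marking DataStore requests
    CACHE_DIR = "/var/spool/gass-cache"    # assumed local cache area

    def resolve_pseudo_path(pseudo_path):
        """Map a pseudo path to a local file, staging it if necessary."""
        if not pseudo_path.startswith(PSEUDO_PREFIX):
            return pseudo_path             # ordinary local file, pass straight through
        tape_name = pseudo_path[len(PSEUDO_PREFIX):]
        local_copy = os.path.join(CACHE_DIR, tape_name.replace("/", "_"))
        if not os.path.exists(local_copy):
            # "stage_from_datastore" stands in for whatever the real backend does.
            subprocess.run(["stage_from_datastore", tape_name, local_copy], check=True)
        return local_copy

    print(resolve_pseudo_path("/datastore/lhcb/mc/run001.dat"))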

3 Globus Activities II: Manchester (Andrew McNab)
Globus packaged in RPM format (built on RH6.2 / Globus 1.1.3). Interim measure, as Globus is supposed to adopt RPM in future.
GSI-ssh, GSI-ftp & GSI-gdm also being packaged as RPMs.
"Grid Aware ROOT": a first attempt at calling the GASS client API from ROOT by modifying the TFile class. However, ROOT already has mechanisms for remote files, and the next version will add Grid files to the list of recognised protocols (see the sketch below). Need MDS/LDAP names rather than URLs.
Standardising the GASS cache on Manchester machines. Spool area for each user.
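ROOT's existing remote-file mechanism dispatches on the protocol prefix of the name passed to TFile::Open, so "Grid files" would amount to another prefix in that dispatch. A short PyROOT sketch of the idea; PyROOT is used only for illustration here, the host names are made up, and the "gass://" scheme is hypothetical, not something ROOT recognised.

    # ROOT picks a TFile subclass from the protocol prefix of the name it is given.
    import ROOT

    local_file  = ROOT.TFile.Open("histos.root")                               # plain local TFile
    remote_file = ROOT.TFile.Open("root://server.example.ac.uk/histos.root")   # rootd server
    web_file    = ROOT.TFile.Open("http://server.example.ac.uk/histos.root")   # web server

    # A "Grid aware" ROOT would add a further prefix to the same dispatch, e.g.
    # grid_file = ROOT.TFile.Open("gass://gatekeeper.example.ac.uk/lhcb/run001.root")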

4 Globus Activities III: QMW (Alex Martin)
Globus 1.1.3 installed on several Linux boxes. PBS installed (packaged as an RPM).
Developed a simple "quote" system using the LDAP protocol and Perl scripts. Currently it only works for a single job and is based only on the CPU requirement (a hedged query sketch follows below). What is really required in the wider batch world?
TASKS
Common kit of parts / Globus distribution (e.g. RPM).
Solution for handling certificates & gridmap files.
Common UKHEP GIIS service (gateway index)? Security implications.
Next meeting: 20th September (Manchester).
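For reference, a sketch of the kind of LDAP query such a quote system might make against an MDS/GIIS to match a CPU requirement. The original QMW system used Perl scripts; this Python version, and the host, port, search base, object class and attribute names, are assumptions for illustration only.

    # Query a (hypothetical) GIIS over LDAP and return hosts meeting a CPU requirement.
    import ldap   # python-ldap

    MDS_URI = "ldap://giis.example.ac.uk:2135"   # hypothetical GIIS endpoint
    SEARCH_BASE = "o=Grid"                       # assumed MDS search base

    def quote_hosts(min_free):
        """Return (hostname, free-CPU figure) pairs meeting the requirement."""
        conn = ldap.initialize(MDS_URI)
        results = conn.search_s(
            SEARCH_BASE,
            ldap.SCOPE_SUBTREE,
            "(objectClass=MdsCpu)",                      # assumed object class
            ["Mds-Host-hn", "Mds-Cpu-Free-15minX100"],   # assumed attribute names
        )
        quotes = []
        for _dn, attrs in results:
            host = attrs.get("Mds-Host-hn", [b"?"])[0].decode()
            free = int(attrs.get("Mds-Cpu-Free-15minX100", [b"0"])[0])
            if free >= min_free:
                quotes.append((host, free))
        return sorted(quotes, key=lambda q: -q[1])

    for host, free in quote_hosts(100):
        print(host, free)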

5 LHC(UK) Proto-Centres/Testbeds?
RAL Tier 1 (ALICE/ATLAS/CMS/LHCb): submitted to JIF in February for £5.9M. Outcome in November(?).
Liverpool MAP/COMPASS (LHCb/ATLAS): funded by JREI in 1998. Operational. Upgrade requested.
Glasgow/Edinburgh Computing Centre (LHCb/ATLAS): submitted to JREI in May. Outcome known in ~December.
Manchester-UCL-Sheffield (ATLAS WW scattering?): JREI bid. 32 CPUs + 5 TB disk.
Bristol (BaBar/CMS/LHCb): 8-node Linux farm now, 32 nodes plus a storage server later.
Birmingham (ALICE): funded by JREI (1999). Farm and disk storage.
Others?

6 UK GRID Organisation
Visible people seem to fall into 2 main camps:
Potential "exploiters" for experiments & applications.
System managers installing/developing Globus, etc.
Various people are then involved in DataGrid work packages, but the organisation of UK middleware work is not clear (to me). Significant funds are supposed to go into Grid and e-science. Some (hierarchical) structures have been proposed along the lines of:
PPARC Grid Steering Committee
  Particle Physics Grid Management Board
    Technical Work Groups
      Sub-groups

7 RAL GRID Organisation
Up until now there has been a "CLRC Grid Team", originally based around PPD+ITD, which has gradually pulled in other departments/sciences. It is now about to be split into:
An e-science forum for all of CLRC.
A Particle Physics Grid Team.
It is not clear yet how this maps onto existing structures or how it affects effort for LHCb applications.

