DataTAG Work Package 4 Meeting, Bologna
Simone Ludwig, Brunel University
23rd and 24th May 2002

2 Overview
– UK Testbed Activities
– Software Releases
– Status Overview
– Testbed at RAL
– Further Activities
– PPARC Effort in WP4 DataTAG
– Work done so far
– Work Plan

3 Software Releases
– Deployment largely driven and limited by EDG software release status and cycles
– Three major releases, matching the yearly Testbeds 1, 2 and 3
– Minor releases every 2 months, with patch-level releases in between
– Currently there is a mixture of sites with (see the port-probe sketch below):
  – some version of the Globus gatekeeper
  – old Globus installations
  – Globus 2.0/2.0beta installations, including EDG installations (using 2.0beta), usually just a Computing Element
  – BaBar installations of the EDG CE
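One quick way to see what a site is actually running (a sketch of my own, not material from the talk; the host names are placeholders) is to probe the standard Globus service ports: the GRAM gatekeeper listens on 2119 and the MDS information service on 2135. Note that this only shows whether the services answer, not which Globus version is installed.

    # Minimal sketch: probe the standard Globus 2.x ports at a site.
    # Gatekeeper (GRAM) = 2119, MDS/GIIS (LDAP) = 2135.
    # Host names below are placeholders, not real testbed sites.
    import socket

    SITES = ["gate.site-a.example.ac.uk", "gate.site-b.example.ac.uk"]

    def port_open(host, port, timeout=3.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for host in SITES:
        gk = "gatekeeper up" if port_open(host, 2119) else "no gatekeeper"
        mds = "MDS up" if port_open(host, 2135) else "no MDS"
        print(f"{host}: {gk}, {mds}")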

4 Status Overview

Site          Green Dot  G1.1.3  G2.0(b)  EDG-CE  BaBar-CE
Birmingham    y          y       y
Bristol       y          y       y        y       y
Brunel        y          y
Cambridge     y
Edinburgh     y          y
Glasgow       y          y
Imperial      y          y       y
Lancaster     y          y
Liverpool     y          y
Manchester    y          y       y        y       y
Oxford        y          y
QMUL          y          y       y
RAL           y          y       y        y
RHUL          y          y
UCL           y

5 Testbed at RAL
Hardware purchase (installed March):
– 156 dual 1.4GHz nodes, 1GB RAM, 30GB disks (312 CPUs)
– 26 disk servers (dual 1.266GHz), 1.9TB disk each
– Tape robot capacity expanded by 35TB
Current EDG testbed setup:
– 14 dual 1GHz PIII nodes, 500MB RAM, 40GB disks
– Compute Element (CE), Storage Element (SE), User Interfaces (UI), Information Node (IN) + Worker Nodes (WN)
Existing central facilities (non-Grid):
– 250 CPUs, 10TB disk, 35TB tape (robot capacity 330TB)

6 Further Activities
– Testbed 1.2: integration started last week
– TB2.0, September 2002
  – several major software releases due
– TB3.0, March 2003
  – first feasible release using OGSA, even if early adopters start now
– EDG already adopted by other Grids
  – pressure to expand testbeds or integrate with others
  – DataTAG, CrossGrid, GLUE

7 PPARC Effort in WP4 DataTAG
Involved in:
– 4.1: Networked Resource Discovery Systems (Service Discovery)
– 4.3: Interworking between domain-specific collective Grid services (Interoperability)

8 Work done so far
– Induction to Grid technologies
– Development and installation of a Globus Grid lab environment
– Analysis of Grid services
– Review of analysis, modelling and reverse-engineering tools
– Investigation of schema notations
– Investigation of different resource discovery systems: Jini, SDS and LDAP (an LDAP query sketch follows below)
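As an illustration of the LDAP route (my own sketch, not material from the talk; the GIIS host is a placeholder and the python-ldap package is assumed), a resource-discovery query can be issued against a Globus 2.x MDS index, which is simply an LDAP server on port 2135 with the well-known base DN "Mds-Vo-name=local, o=grid":

    # Sketch: query a Globus 2.x MDS (an LDAP directory) and list the
    # entries it publishes. Requires python-ldap; host is a placeholder.
    import ldap

    MDS_URL = "ldap://giis.example.ac.uk:2135"  # hypothetical GIIS host
    BASE_DN = "Mds-Vo-name=local, o=grid"       # standard MDS base DN

    conn = ldap.initialize(MDS_URL)
    conn.simple_bind_s()                        # MDS accepts anonymous binds
    for dn, attrs in conn.search_s(BASE_DN, ldap.SCOPE_SUBTREE,
                                   "(objectclass=*)"):
        print(dn)

Jini and SDS discovery work differently (multicast lookup rather than a directory query), which is exactly the heterogeneity a common framework has to hide.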

9 Work Plan
– Investigation of various experiments (CMS, BaBar) to identify the requirements for Grid operation (-> Service Discovery)
– Investigation of Web Service technology as a candidate common framework
– Definition of a common framework in which interfaces and protocols can be specified and modelled (-> Interoperability), sketched below
– Development of test cases to verify and validate interoperability
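To make the common-framework idea concrete, here is a hedged sketch (class and method names are my own illustration, not a DataTAG specification) of a single discovery interface that Jini-, SDS- or LDAP-backed implementations could all satisfy, and that interoperability test cases could exercise uniformly:

    # Illustrative only: one interface, many discovery backends.
    from abc import ABC, abstractmethod

    class ServiceDiscovery(ABC):
        @abstractmethod
        def find(self, service_type):
            """Return a list of dicts describing matching services."""

    class LdapDiscovery(ServiceDiscovery):
        def __init__(self, url):
            self.url = url  # e.g. an MDS endpoint, as in the earlier sketch

        def find(self, service_type):
            # Real code would issue an LDAP search here; this stub only
            # shows the shape of the interface a test case would exercise.
            return [{"type": service_type, "endpoint": self.url}]

    # An interoperability test case can then run unchanged against any
    # backend (Jini, SDS, LDAP) that implements ServiceDiscovery:
    def check_backend(backend):
        assert backend.find("ComputingElement"), "no services discovered"

    check_backend(LdapDiscovery("ldap://giis.example.ac.uk:2135"))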