CDF computing in the GRID framework in Santander

Presentation transcript:

CDF computing in the GRID framework in Santander
Santander HEP Group, IFCA (Instituto de Física de Cantabria, CSIC-UC)
Teresa Rodrigo, Iván Vila, Alberto Ruiz, Rocío Vilar, Gervasio Portilla, Jonatan Piedra, Javier Fernández, Javier Cuevas
Jesús Marco (marco@ifca.unican.es)
ICRB Meeting, 4 September 2002

GRID activities
- IFCA Santander is present in EU-DataGrid:
  - Participation in the testbed (WP6); CA provider for Spain
  - Team (~5 people) with experience in Globus + EDG (1.2.0)
- IFCA/CSIC is a main partner of the EU-CrossGrid project:
  - Use of a distributed O/R DBMS
  - Data mining: distributed neural networks via MPI (see the sketch below)
  - Responsible for the testbed (WP4), distributed over the Géant network across 11 European countries
- National initiative: LCG-ES project
  - Close to Tier-2 level resources, CMS "oriented", including:
    - MC production farm
    - Physics analysis
  - New resources (hardware + personnel)
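To illustrate the data-parallel pattern behind such a distributed neural network, the sketch below averages per-node gradients with MPI Allreduce. It is a minimal, hypothetical example (the toy logistic model, the data shapes, and the use of mpi4py/NumPy are assumptions), not the actual CrossGrid data-mining code.

```python
# Minimal sketch of data-parallel NN-style training over MPI:
# each rank computes gradients on its own data shard, and the
# gradients are averaged across all ranks with Allreduce.
# (Illustrative assumption, not the IFCA/CrossGrid implementation.)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

rng = np.random.default_rng(seed=rank)
X = rng.normal(size=(1000, 10))            # this rank's data shard
y = (X @ np.arange(10) > 0).astype(float)  # toy labels

w = np.zeros(10)                           # model weights, kept identical on all ranks
for step in range(100):
    # local gradient of a logistic loss on this shard
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    grad = X.T @ (p - y) / len(y)
    # sum gradients over all MPI ranks, then average
    gsum = np.zeros_like(grad)
    comm.Allreduce(grad, gsum, op=MPI.SUM)
    w -= 0.1 * (gsum / size)               # same update everywhere, so weights stay in sync
```

Each rank would be launched on a separate cluster node (e.g. `mpirun -n 8 python train_nn.py`); because every rank applies the same averaged update, the model stays consistent without a central parameter server.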

Available Resources
- "New" infrastructure:
  - Cluster of ~100 IBM servers (~50% available for HEP): dual 1.26 GHz CPUs, 640 MB to 4 GB RAM and 80 GB of disk per server, plus a 4-way gatekeeper
  - Gigabit local backbone
  - Local disk and tape: 10 TB on disk, 10 TB on an LTO robot
  - Improved network connection: 155 Mbps Santander-Géant, 2.5 Gbps within Géant and to the USA
- Personnel (for the whole GRID activity):
  - 2 Grid "experts", plus support from the university computing architecture team
  - 2 DBMS "experts"
  - Several senior members with experience in HEP offline software (installation and processing)

Perspectives
- Need to use the GRID framework: synergy in activities, infrastructure use and personnel with DataGrid/CrossGrid, and also with the activity in CMS
- Funding requested for our activity in CDF computing:
  - 1 dedicated person (starting before the end of the year)
  - 50 k$/year on hardware (doubling current resources by 2004)
- Mixed model (trying to optimize funding possibilities and analysis objectives):
  - Minimal resources at FNAL: on-line disk + CPU (basically for people at FNAL)
  - Shared resources at Santander: 50% of the current server cluster operated in the GRID framework
  - Upgrade of on-line disk + additional computing nodes + tape

Interest of Collaboration
- Physics channels: B physics, top, and Higgs searches
- Possible scheme (to be considered in the discussion of the contribution to CDF):
  - MC production in the Santander GRID cluster (including limited tape storage); see the submission sketch below
  - Production data analysis jobs in Santander, also via GRID
  - Initial user analysis jobs tested at Fermilab with limited resources
  - Deploy a powerful Interactive Physics Analysis Facility in the GRID framework, including significant resources in Santander
  - Contribute to CDF Grid activities with expertise on testbeds, and by participating in EU-US initiatives
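One way such GRID-based MC production could be organized is to split a sample into segments and submit one batch job per segment through the cluster's Globus gatekeeper. The sketch below is a hedged illustration: the gatekeeper contact string, the wrapper-script path, and the segment scheme are hypothetical, and the `globus-job-submit` usage follows the Globus Toolkit 2 command-line tools of that era rather than any confirmed Santander setup.

```python
# Hedged sketch: farming CDF MC production over a Globus-era grid
# cluster, one batch job per run segment. Contact string, executable
# path and segment counts are illustrative assumptions.
import subprocess

GATEKEEPER = "gk.ifca.unican.es/jobmanager-pbs"  # hypothetical gatekeeper contact
MC_EXE = "/cdf/prod/run_mc.sh"                   # hypothetical MC wrapper script

def submit_segment(segment: int, nevents: int) -> str:
    """Submit one MC segment via GT2 tools; returns the job contact URL."""
    cmd = ["globus-job-submit", GATEKEEPER, MC_EXE,
           str(segment), str(nevents)]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return out.stdout.strip()

if __name__ == "__main__":
    # e.g. 50 segments of 10k events each; status could later be polled
    # per contact URL with globus-job-status.
    contacts = [submit_segment(s, 10000) for s in range(50)]
    for c in contacts:
        print("submitted:", c)
```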