Russian GRID for operations with large volumes of scientific data. V.A. Ilyin (SINP MSU). The LCG (LHC Computing GRID) and EGEE (Enabling Grids for E-science in Europe) projects. Tarusa, 6 February 2004.

CERN

Online system: a multi-level trigger filters out the background and reduces the data volume (online reduction ~10^7); trigger menus select the interesting events and filter out the less interesting ones.
- Level 1 (special hardware): 40 MHz (40 TB/sec) in, 75 kHz (75 GB/sec) out
- Level 2 (embedded processors): 5 kHz (5 GB/sec) out
- Level 3 (PCs): 100 Hz (100 MB/sec) out, to data recording and offline analysis
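
As a cross-check of the figures on this slide, a minimal sketch assuming an average event size of ~1 MB (our assumption, not the slide's; it is what makes 40 MHz correspond to 40 TB/sec):

```python
# Rate, data volume and cumulative reduction at each trigger stage,
# assuming ~1 MB per event (illustrative assumption).
EVENT_SIZE_MB = 1.0

stages = [
    ("collision rate (input)",        40_000_000),  # 40 MHz
    ("after level 1 (hardware)",          75_000),  # 75 kHz
    ("after level 2 (embedded CPUs)",      5_000),  # 5 kHz
    ("after level 3 (PC farm)",              100),  # 100 Hz -> recording
]

for name, rate_hz in stages:
    volume_mb_s = rate_hz * EVENT_SIZE_MB
    reduction = stages[0][1] / rate_hz
    print(f"{name:32s} {rate_hz:>11,} Hz  ~{volume_mb_s:>12,.0f} MB/s  "
          f"(reduction x{reduction:,.0f})")
```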

The Large Hadron Collider (LHC): data flows and the stages of processing and analysis (diagram). Detector → event selection and primary reconstruction (~100 MB/sec) → raw data and archival storage; event reconstruction → event summary data (ESD); event simulation; preparation of data for analysis → analysis data selected by physics channels (200 TB/year) → interactive physics analysis by thousands of scientists. Volumes of 1-6 PB/year, flows of up to ~200 MB/sec. The Russian regional computing complex for the LHC (RIVK BAK) at the 5-10% level; Tier1, Tier2.
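
A back-of-envelope check of the yearly volumes quoted above (the ~10^7 seconds of effective data taking per year is the usual HEP planning figure, assumed here, not taken from the slide):

```python
# A sustained ~100 MB/s raw-data stream over ~1e7 s of running gives
# about 1 PB/year per experiment, consistent with the 1-6 PB/year
# range on the slide.
rate_mb_per_s = 100        # ~100 MB/s out of the online system
seconds_per_year = 1e7     # effective running time (assumption)

volume_pb = rate_mb_per_s * seconds_per_year / 1e9   # MB -> PB
print(f"~{volume_pb:.1f} PB/year")                   # -> ~1.0 PB/year
```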

LHC challenges: scale (in 2006).
- Data written to tape: ~40 Petabytes/year and up (1 PB = 10**9 MBytes)
- Processing capacity: TIPS and up (1 TIPS = 10**6 MIPS)
- Typical networks: a few Gbps per link
- Lifetime of experiment: 2-3 decades (start in 2007)
- Users: ~5000 physicists
- Software developers: ~300 (four experiments)

MONARC project: the evolving LHC Computing Model of regional centres (diagram) and the opportunity of Grid technology. CERN at the centre; Tier1 regional centres in Germany, the UK, France, Italy, the USA, Russia, ...; Tier2 centres at universities and laboratories; Tier3 at physics departments; desktops.

Russian Tier2 Cluster: the Russian regional centre for LHC computing. A cluster of institutional computing centres with Tier2 functionality and summary resources at the …% level of the canonical Tier1 centre for each experiment (ALICE, ATLAS, CMS, LHCb): analysis; simulations; user data support.
Participating institutes: Moscow - ITEP, KI, MSU, LPI, MEPhI, ...; Moscow region - JINR, IHEP, INR RAS; St. Petersburg - PNPI RAS, ...; Novosibirsk - BINP SB RAS.
Coherent use of distributed resources by means of LCG (EDG, VDT, ...) technologies. Active participation in the LCG Phase 1 prototyping and Data Challenges (at the 5% level).
Planned resources through Q4 2007 (table): CPU (kSI), disk (TB), tape (TB), network (Mbps, later Gbps).

Russia in LCG. We started activity in LCG (the LHC Computing GRID project) in the autumn; Russia is now joining the LCG-1 infrastructure: first SINP MSU, then JINR, ITEP and IHEP. Goal: to have an operational segment of the worldwide LCG infrastructure in Russia by Q4 and to be ready for the Data Challenges.
Manpower contribution to LCG (started in May 2003): the Agreement is being signed by CERN, Russia and JINR officials; three tasks are under our responsibility:
1) testing new GRID middleware to be used in LCG;
2) evaluation of new-on-the-market GRID middleware (first task: evaluation of GT3 and WebSphere);
3) common solutions for event generators (event databases).

LHC Data Challenges: programmes of generation of simulated (Monte Carlo) data for the future LHC experiments. A typical load on the communication links today is the transfer of 100 GB of data from Moscow to CERN within a working day, i.e. about 50 Mbit/sec! And this is not the average load but the "peak" one:
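
A quick sanity check of that figure (assuming decimal gigabytes; the 50 Mbit/sec quoted above suggests the transfer is squeezed into roughly a 4-5 hour window rather than a full 8-hour day):

```python
# Sustained rate needed to move 100 GB within a given transfer window.
data_bits = 100e9 * 8                    # 100 GB expressed in bits

for hours in (8, 4.5):
    rate_mbit_s = data_bits / (hours * 3600) / 1e6
    print(f"{hours:>4} h window -> ~{rate_mbit_s:.0f} Mbit/s sustained")
# ->    8 h window -> ~28 Mbit/s
# ->  4.5 h window -> ~49 Mbit/s
```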

LCG - Goals. The goal of the LCG project is to prototype and deploy the computing environment for the LHC experiments. LCG is not a development project: it relies on other grid projects for grid middleware development and support. Two phases:
- Phase 1 (2002-2005): build a service prototype based on existing grid middleware; gain experience in running a production grid service; produce the TDR for the final system.
- Phase 2 (2006-2008): build and commission the initial LHC computing environment.

Building a Grid: collaborating computer centres form the virtual LHC Computing Grid organisations (e.g. the ALICE VO and the CMS VO).

An example of the use of the EDG (LCG) middleware in the CMS VO (diagram). At SINP MSU: CE/WN lhc01.sinp.msu.ru and lhc02.sinp.msu.ru; SE lhc03.sinp.msu.ru; Resource Broker + Information Index lhc20.sinp.msu.ru; the user works from lhc04.sinp.msu.ru. Remote sites: CERN (lxshare0220.cern.ch) and Padova (grid011.pd.infn.it).
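
For illustration only, a minimal sketch of how a job could be sent through such a Resource Broker from the user's machine. The JDL attributes and the edg-job-submit command are the standard EDG/LCG user-interface ones; the script name and sandbox files are invented for this example:

```python
# Write a minimal JDL job description and submit it via the EDG UI.
import subprocess

jdl = """\
Executable          = "cms_test.sh";
StdOutput           = "stdout.log";
StdError            = "stderr.log";
InputSandbox        = {"cms_test.sh"};
OutputSandbox       = {"stdout.log", "stderr.log"};
VirtualOrganisation = "cms";
"""

with open("cms_test.jdl", "w") as f:
    f.write(jdl)

# Submitted from the User Interface node (lhc04.sinp.msu.ru in the
# example above); the broker (lhc20.sinp.msu.ru) then matches the job
# to a suitable Computing Element.
subprocess.run(["edg-job-submit", "cms_test.jdl"], check=True)
```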

EGEE Timeline

Distribution of GRID service activities over Europe: Operations Management at CERN; Core Infrastructure Centres (CICs) in the UK, France, Italy, Russia and at CERN, responsible for managing the overall Grid infrastructure; Regional Operations Centres (ROCs), responsible for coordinating regional resources, regional deployment and support of services.
Russia: CIC - MSU, RRC KI, JINR, ITEP, KIAM RAS; ROC - IHEP, JINR, PNPI RAS, IMPB RAS.

Pan-European Multi-Gigabit Backbone (33 Countries) January 2004 Planning Underway for “GEANT2” (GN2) Multi-Lambda Backbone, to Start In 2005

International Committee for Future Accelerators (ICFA), Standing Committee on Inter-Regional Connectivity (SCIC). ICFA SCIC reports:
- Networking for High Energy and Nuclear Physics - Feb 2004
- Report on the Digital Divide in Russia - Feb 2004
- Network Monitoring Report - Feb 2004
- Advanced Technologies Interim Report - Feb 2003
- Digital Divide Executive Report - Feb 2003

GLORIAD: Global Optical Ring (US-Ru-Cn), also important for intra-Russia connectivity. "Little GLORIAD" (OC3) launched January 12; to OC192 in 2005.

SCIC Monitoring WG - throughput improvements. TCP bandwidth < MSS / (RTT * sqrt(loss)) [1]. Progress: throughput improves by roughly a factor of ~100 per 10 years, but the Digital Divide is mostly maintained; some regions are ~5-10 years behind, while SE Europe and parts of Asia may be catching up (slowly).
[1] Mathis et al., Computer Communication Review 27(3), July 1997.
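
To make the Mathis bound concrete, a small worked example with illustrative (not measured) path parameters:

```python
# Upper bound on TCP throughput from the Mathis et al. formula,
# BW < MSS / (RTT * sqrt(loss)), for an assumed Moscow-CERN-like path.
from math import sqrt

mss_bytes = 1460     # typical Ethernet MSS (assumption)
rtt_s     = 0.060    # ~60 ms round-trip time (assumption)
loss      = 1e-4     # 0.01% packet loss rate (assumption)

bound_bytes_s = mss_bytes / (rtt_s * sqrt(loss))
print(f"TCP throughput bound: ~{bound_bytes_s * 8 / 1e6:.1f} Mbit/s")
# -> ~19.5 Mbit/s; cutting the loss rate to 1e-6 raises the bound
#    tenfold, which is why packet loss, not raw link capacity, often
#    dominates long-distance throughput.
```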

Regional and international connectivity for Russia (High Energy Physics):
- GLORIAD: 10 Gbps
- Moscow: 1 Gbps
- IHEP: 8 Mbps (m/w); 100 Mbps fiber-optic under construction (Q1-Q2 2004?)
- JINR: 45 Mbps; … Mbps (Q1-Q2 2004); … Gbps (…)
- INR RAS: 2 Mbps + 2x4 Mbps (m/w)
- BINP: 1 Mbps; 45 Mbps (2004?), …
- PNPI: 512 Kbps (commodity) and 34 Mbps fiber-optic, but the budget covers only 2 Mbps (!)
- USA: NaukaNET, 155 Mbps
- GEANT: 155 Mbps basic link, plus a 155 Mbps additional link for GRID projects
- Japan: via the USA by FastNET, 512 Kbps, Novosibirsk (BINP) - KEK
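
As a rough illustration of what these link speeds mean for the ~100 GB/day Data Challenge transfers mentioned earlier (assuming the full nominal link rate were available for the transfer, which in practice it is not):

```python
# Time to move 100 GB over a few of the links listed above.
data_bits = 100e9 * 8   # 100 GB in bits

links_mbps = [
    ("NaukaNET / GEANT basic link", 155),
    ("JINR",                         45),
    ("PNPI (budgeted)",               2),
]

for name, mbps in links_mbps:
    hours = data_bits / (mbps * 1e6) / 3600
    print(f"{name:30s} ~{hours:7.1f} h")
# -> ~1.4 h, ~4.9 h and ~111 h respectively: the Digital Divide in practice.
```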