Status of T3 GRID site infrastructure in Egypt. Ashraf Mohamed Kasem, Department of Physics, Ain Shams University.

Machine specs: Dell Precision T1600 technical specifications

- Processor: Intel Xeon E series or 2nd-generation Intel Core processors; Turbo Boost mode and Intel integrated HD Graphics on select processors
- Chipset: Intel C206
- Memory: 4 GB, expandable to 16 GB of 1333 MHz ECC memory
- Graphics: NVIDIA Quadro 400
- Hard drive: 500 GB SATA
- Communications: integrated Intel 82579LM Gigabit LAN; Brocade Fibre Channel HBA

Machines and IP addresses

- Worker nodes: wn1.enhep.eg.net through wn7.enhep.eg.net
- cn.enhep.eg.net
- ui.enhep.eg.net
- ms.enhep.eg.net
- dpm.enhep.eg.net
- squid.enhep.eg.net
- phedex.enhep.eg.net
- bdii.enhep.eg.net (for monitoring)
- tbdii.enhep.eg.net (for monitoring)
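With fifteen hosts to keep track of, a small script can verify that every hostname in the inventory still resolves. The sketch below is not from the slides; it simply encodes the node list above and checks each name against DNS.

```python
import socket

# Node inventory taken from the slide; check_dns() is an illustrative
# helper, not part of the original site setup.
NODES = (
    [f"wn{i}.enhep.eg.net" for i in range(1, 8)]          # worker nodes
    + [f"{h}.enhep.eg.net" for h in
       ("cn", "ui", "ms", "dpm", "squid", "phedex", "bdii", "tbdii")]
)

def check_dns(nodes):
    """Return the subset of hostnames that fail to resolve in DNS."""
    failed = []
    for host in nodes:
        try:
            socket.gethostbyname(host)
        except socket.gaierror:
            failed.append(host)
    return failed
```

Running `check_dns(NODES)` after any network or DNS change gives a quick list of hosts that need attention.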

Storage capacity and allocation

Item                 Host name   FS type   Mount point   Capacity
Storage Element      SE-DPM      Ext3      se            2 TB
Computing Element    CREAM       Ext3      packages      300 GB (shared capacity)
User Interface       UI          Ext3      packages      (shared)
Worker Node (1-7)                Ext3      packages      (shared)
Storage Element      SE-DPM      Ext3      luserdata     2 TB
User Interface       UI          Ext3      user1         300 GB
Computing Element    CREAM CE    Ext3      Busers        300 GB (shared capacity)
Worker Node (1-7)                Ext3      Busers        (shared)
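The allocation above corresponds to mount points such as se, packages, luserdata, user1 and Busers. A hypothetical /etc/fstab sketch for one of the machines might look like the following; the device names and absolute paths are illustrative assumptions, not taken from the slides.

```
# Hypothetical fstab entries matching the layout above; device names
# and mount-point paths are illustrative only.
/dev/sdb1   /se          ext3   defaults   0 2   # SE-DPM data, 2 TB
/dev/sdc1   /luserdata   ext3   defaults   0 2   # SE-DPM user data, 2 TB
/dev/sda3   /packages    ext3   defaults   0 2   # shared 300 GB
/dev/sda4   /user1       ext3   defaults   0 2   # UI home area, 300 GB
```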

What is installed

- EMI middleware installation (EMI1 and EPEL): OK
- YAIM configuration for UI, WN, CREAM, BDII-top, BDII-site, SE_HEAD, SE_DISK: OK
- Squid: OK
- Testing CREAM and WNs: OK
- Testing DPM_disk and DPM_HEAD: OK
- Testing top and site BDII: registration is missing
- UI and CMSSW: OK
- Xrootd and RFIO tests: OK
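The YAIM configuration mentioned above is driven by a site-info.def file. A hypothetical excerpt for this site is sketched below; the hostnames come from the slides, while the site name, file paths and VO list are illustrative guesses.

```shell
# Hypothetical site-info.def excerpt; hostnames are from the slides,
# everything else is an illustrative assumption.
SITE_NAME=EG-ENHEP-T3
CE_HOST=cn.enhep.eg.net
SE_LIST="dpm.enhep.eg.net"
DPM_HOST=dpm.enhep.eg.net
BDII_HOST=bdii.enhep.eg.net
SITE_BDII_HOST=tbdii.enhep.eg.net
WN_LIST=/opt/glite/yaim/etc/wn-list.conf
USERS_CONF=/opt/glite/yaim/etc/users.conf
GROUPS_CONF=/opt/glite/yaim/etc/groups.conf
VOS="cms"
```

Each node type would then be configured with a command along the lines of `/opt/glite/yaim/bin/yaim -c -s site-info.def -n creamCE`, and similarly for the WN, UI, SE and BDII node types.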

1 Gb GLORIAD link

GLORIAD (Global Ring Network for Advanced Application Development) is a high-speed computer network connecting scientific organizations in Russia, China, the United States, the Netherlands, Korea, Canada, India, Singapore, Vietnam, and Egypt.

In Egypt, however, during my grid work at ASRT the link was far too slow: downloads reached at most about 4 MB/s.
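A quick back-of-the-envelope calculation (not from the slides) puts the observed speed in perspective against the nominal 1 Gb/s link:

```python
# Compare the observed download speed with the nominal link capacity.
LINK_GBPS = 1.0        # nominal link capacity, gigabits per second
OBSERVED_MBPS = 4.0    # observed download speed, megabytes per second

observed_gbps = OBSERVED_MBPS * 8 / 1000     # MB/s -> Gb/s
utilisation = observed_gbps / LINK_GBPS      # fraction of nominal capacity

# Time to move a 100 GB dataset at the observed rate:
hours = 100 * 1024 / OBSERVED_MBPS / 3600

print(f"observed: {observed_gbps:.3f} Gb/s ({utilisation:.1%} of nominal)")
print(f"a 100 GB transfer would take about {hours:.1f} hours")
```

At 4 MB/s the site was seeing only about 3% of the nominal link capacity, and moving a 100 GB dataset would take roughly seven hours.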

None of the above has been updated since Igor left. The configuration files were created by Igor, with a little help from me and Mohamed. At that time our knowledge was still very limited, so we could not manage things on our own.

But now... After visiting LLR and Bari, we have gained very good knowledge of grid site installation and configuration, and we believe we can now manage things well, needing only a little help at the upper levels of grid administration.

I am now also working with my colleague Dina on a small T3 grid site at BUE, using nine virtual machines, like the site we installed at LLR. In the last few days we faced a problem with the link speed, but this issue has been resolved.

What is the future - 1

- Bring more people into the project.
- Upgrade the T3 grid site in Egypt to run EMI3.
- Update the configurations, then test them.
- Register our site in the GOCDB database; we are in contact with colleagues in South Africa about this.

What is the future - 2

We are waiting for a suitable place to be prepared for the equipment that came from CERN, so that we can install another, larger site at ASRT. Further in the future we hope to have more than one site; I would suggest places such as Ain Shams, Helwan and El-Fayoum universities.

What is the future - 3

- In the far future we also hope to build a federation between the different grid sites.
- We hope to have a T2 site in the near future.

Many thanks to ….. Amr Radi. Igor Semeniouk. Philippe Miné. Ludwik Dobrzynski. Giuseppe Iaselli. Giorgio Maggi.