T-StoRM: a StoRM testing framework


T-StoRM: a StoRM testing framework
Elisabetta Ronchieri, INFN-CNAF, as member of the StoRM team
Munich, 28 March 2012

Overview
- Problem Statement
- T-StoRM solution
- Tests details
- T-StoRM usage
- Future work
- Conclusions

Problem Statement

What we considered
StoRM is an implementation of the SRM interface. It is characterized by:
- being usable on any POSIX file system, such as GPFS and Lustre;
- supporting several transfer protocols, such as gsiftp, file, https and http;
- supporting access to tape through GEMSS;
- publishing information using the GLUE standard;
- supporting VOMS and GSI for authentication and authorization.
Its deployment can be either easy or extremely complex, according to site requirements.

Stand-alone StoRM deployment
FrontEnd (FE)
- Exposes the web service interface
- Manages user authentication
- Sends requests to the BackEnd
DataBase (DB)
- Stores SRM requests and their status
- Stores file and space information
BackEnd (BE)
- Binds with the underlying file systems
- Enforces authorization policies on files
- Manages SRM file and space metadata
(A sketch of a basic reachability check for these components follows.)
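As a simple illustration, the reachability of these components can be probed from a test node. This is a minimal Python sketch under stated assumptions: the host name is invented, and the ports (8444 for the FrontEnd SRM endpoint, 3306 for MySQL) are common defaults, not values from the talk.

```python
import socket

# Hypothetical host and assumed default ports: adjust to the actual deployment.
SERVICES = {
    "FrontEnd (SRM endpoint)": ("storm.example.org", 8444),
    "DataBase (MySQL)": ("storm.example.org", 3306),
}

def is_listening(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in SERVICES.items():
    status = "up" if is_listening(host, port) else "DOWN"
    print(f"{name} on {host}:{port}: {status}")
```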

Common StoRM deployment (deployment diagram)

Complex StoRM deployment (deployment diagram)

What we looked for
StoRM is a multi-service software product subject to intense testing, validation and verification activities. It needed something to:
- optimize the deployment of a new StoRM version to be certified;
- simplify the reproduction of the environments where errors occurred, so as to react as quickly as possible;
- provide users who experience critical issues with suitable support in a reasonable time;
- delegate the validation and verification of remote sites to their administrators;
- reduce the time and effort needed to support users;
- improve its software development life cycle.

T-StoRM solution

What we thought: T-StoRM
T-StoRM is a StoRM testing framework that orchestrates tests by using a proper deployment and a test engine fed with a pre-built configuration file.
- It is designed to simplify the integration of new test suites.
- It verifies the StoRM implementation of the SRM interface by using all the available SRM clients.
- It tests SRM calls one by one, providing atomic tests that can be arbitrarily arranged to build more complex tests.
- It produces a report log for all the executed tests.
- It will be included in EMI 2.
This development is ongoing in collaboration with IGI. (A sketch of such a test engine follows.)
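To make the "test engine fed with a pre-built configuration file" idea concrete, here is a minimal Python sketch of such an engine. The configuration format, file names and the clientSRM flag syntax are hypothetical; T-StoRM's real engine and configuration are not described in the talk.

```python
import configparser
import subprocess

# Hypothetical configuration file, e.g.:
#   [endpoint]
#   host = storm.example.org
#   port = 8444
CONFIG_FILE = "t-storm.ini"

def run_test(name, command):
    """Run one test command, capture its output, and return pass/fail."""
    result = subprocess.run(command, capture_output=True, text=True)
    return name, result.returncode == 0, result.stdout + result.stderr

def main():
    cfg = configparser.ConfigParser()
    cfg.read(CONFIG_FILE)
    endpoint = f"httpg://{cfg['endpoint']['host']}:{cfg['endpoint']['port']}"
    # Atomic tests are (name, command) pairs that can be composed into
    # larger suites; the clientSRM invocation syntax is an assumption.
    tests = [("ping", ["clientSRM", "ping", "-e", endpoint])]
    with open("t-storm-report.log", "w") as report:
        for name, ok, output in (run_test(n, c) for n, c in tests):
            report.write(f"{name}: {'PASS' if ok else 'FAIL'}\n{output}\n")

if __name__ == "__main__":
    main()
```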

T-StoRM deployment (deployment diagram)

Tests details

Testing Levels
- Atomic testing: verifies any SRM operation (area: any SRM implementation)
- Functional testing: verifies any SRM functional specification (area: any SRM implementation)
- Sanity testing: verifies the correctness of the StoRM installation and configuration (area: StoRM specific)
- Regression testing: verifies the correctness of each bugfix in StoRM (area: StoRM specific)
- Integration testing: verifies the behaviour between different SRM implementations
- Performance testing: determines how StoRM performs in terms of responsiveness and stability under a particular workload
- Conformance testing: determines whether the SRM implementation agrees with the SRM specification and the SRM Memorandum of Understanding

Atomic Testing
This level verifies any SRM operation. It contains a set of tests that are also used at the functional, regression, and integration levels. Each operation is exercised through the corresponding SRM clients:
- ping: clientSRM, srmping, arcsrmping
- ptp
- ptg
- rm: srmrm, lcg-del
- rmdir: srmrmdir
- mkdir: srmmkdir
- ...
(A sketch of an atomic ping test follows.)
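As an illustration of driving one atomic operation through several SRM clients, here is a minimal Python sketch. Only the client names (clientSRM, srmping, arcsrmping) come from the slide; the endpoint and all command-line flags are assumptions to be checked against each client's documentation.

```python
import subprocess

# SRM endpoint of the StoRM instance under test (hypothetical host).
ENDPOINT = "httpg://storm.example.org:8444"

# The same atomic operation (ping) exercised through different SRM clients;
# the flag syntax is assumed, not taken from the talk.
PING_COMMANDS = {
    "clientSRM": ["clientSRM", "ping", "-e", ENDPOINT],  # StoRM client
    "srmping": ["srmping", ENDPOINT],                    # dCache client
    "arcsrmping": ["arcsrmping", ENDPOINT],              # ARC client
}

def atomic_ping(client, command):
    """Run one atomic ping test and report PASS/FAIL from the exit code."""
    result = subprocess.run(command, capture_output=True, text=True)
    print(f"[{client}] {'PASS' if result.returncode == 0 else 'FAIL'}")

for client, command in PING_COMMANDS.items():
    atomic_ping(client, command)
```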

Functional Testing
This level verifies any SRM functional specification.
- Each test reproduces a certain use case in order to verify the SRM functional specification.
- Each use case can be implemented by using different SRM clients, such as the dCache client, lcg-utils, and the StoRM client.
Some examples are:
- move a file from the local node to the storage element by using the gsiftp protocol;
- move a file between the local node and the storage element by using the gsiftp protocol.
(A sketch of the first example follows.)
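A minimal sketch of the first example use case, written in Python around lcg-cp from lcg-utils. The SURL, local path and command-line flags are assumptions to be verified against the installed lcg-utils version.

```python
import subprocess

# Hypothetical paths and endpoint; the SURL follows the usual
# srm://<host>:<port>/<service path>?SFN=<file name> convention.
LOCAL_FILE = "file:/tmp/testfile.txt"
SURL = "srm://storm.example.org:8444/srm/managerv2?SFN=/dteam/testfile.txt"

def copy_to_se():
    """Move a local file to the storage element over gsiftp with lcg-cp.
    The -b/-D flags (skip BDII, assume SRMv2) are assumptions."""
    command = ["lcg-cp", "-b", "-D", "srmv2", LOCAL_FILE, SURL]
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        print("FAIL:", result.stderr.strip())
    else:
        print("PASS: file copied to the storage element")

copy_to_se()
```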

Sanity Testing
This level verifies the correctness of the StoRM installation and configuration. It contains tests that are specific to StoRM. The tests of this level need to be executed before those of the other levels. (A sketch of a possible first check follows.)
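The slides do not show what the sanity tests look like. As a hedged sketch, a first sanity step could simply check that the StoRM services are running; the init-script names below are assumptions based on common EMI-era packaging, not taken from the talk.

```python
import subprocess

# Assumed service names; adjust to the actual StoRM installation.
SERVICES = ["storm-backend-server", "storm-frontend-server"]

def service_running(name):
    """Ask the init system whether a service is running (exit code 0)."""
    result = subprocess.run(["service", name, "status"],
                            capture_output=True, text=True)
    return result.returncode == 0

failures = [s for s in SERVICES if not service_running(s)]
if failures:
    print("Sanity check FAILED, services down:", ", ".join(failures))
else:
    print("Sanity check PASSED: all StoRM services are running")
```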

Regression Testing
This level verifies the correctness of each StoRM bugfix. It contains tests that are specific to StoRM.
- Each test verifies bugfixes for the current version X.Y.Z-W and for previous versions; if the StoRM version under verification is lower than X.Y.Z-W, the test should expose the issue.
- Some tests can only be executed in a given version range.
- Each test is executed on top of a test environment similar to the one that exposed the bug.
- These tests cover the atomic, functional, and sanity levels.
(A sketch of the version logic follows.)
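As a sketch of the version logic described above (illustrative only, not T-StoRM code): below the fixing release the bug should be exposed, from the fixing release onward the test should pass. The version numbers are hypothetical.

```python
def parse_version(v):
    """Turn an 'X.Y.Z-W' version string into a comparable integer tuple."""
    main, _, build = v.partition("-")
    return tuple(int(x) for x in main.split(".")) + (int(build or 0),)

def expected_outcome(version, fix_version):
    """FAIL is expected (the issue shows up) before the fix; PASS after."""
    if parse_version(version) < parse_version(fix_version):
        return "expect FAIL (bug still present)"
    return "expect PASS (bug fixed)"

# Example: a bug fixed in the hypothetical release 1.8.3-2.
for v in ["1.8.1-4", "1.8.3-2", "1.9.0-1"]:
    print(v, "->", expected_outcome(v, "1.8.3-2"))
```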

Integration Testing
A new testing level has been added to satisfy a request coming from EMI JRA1. This level verifies the transfer functionality between different SRM implementations: the designed and implemented tests verify file transfers between the EMI storage elements by using the gsiftp protocol. Even though this is outside T-StoRM's original scope, it fits well into its architecture. (A sketch of a cross-implementation transfer follows.)
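A hedged sketch of a cross-implementation transfer test, again built around lcg-cp. Both SURLs (here imagined as a StoRM and a dCache endpoint) and all flags are hypothetical.

```python
import subprocess

# Hypothetical SURLs on two different SRM implementations; the transfer
# itself goes over gsiftp.
SRC = "srm://storm.example.org:8444/srm/managerv2?SFN=/dteam/file.txt"
DST = "srm://dcache.example.org:8443/srm/managerv2?SFN=/dteam/file.txt"

def transfer():
    """Copy a file between two storage elements with lcg-cp; the flags are
    assumptions, as in the functional-test sketch above."""
    result = subprocess.run(["lcg-cp", "-b", "-D", "srmv2", SRC, DST],
                            capture_output=True, text=True)
    print("PASS" if result.returncode == 0 else f"FAIL: {result.stderr.strip()}")

transfer()
```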

Integration Testing: Use Case 1 (diagram)

Integration Testing: Use Case 2 (diagram)

Performance Testing
This level determines how StoRM performs in terms of responsiveness and stability under a particular workload. The tests are not specific to StoRM. Load and stress tests are currently supported.

Load and Stress Tests
Load tests are executed to understand the behaviour of the system under a specific load. Metrics to be gathered are, for example:
- response time of the service, measured by using an SRM client on a user interface;
- service resource usage, such as CPU and memory.
Stress tests are executed to understand the upper limits of the StoRM system and its robustness under load. They are also useful in production deployments to determine whether the system sustains the expected burst load. (A sketch of the response-time metric follows.)
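A minimal sketch of the response-time metric mentioned above, timing repeated SRM pings from a user interface. The endpoint and client flags are assumed, as in the earlier sketches.

```python
import statistics
import subprocess
import time

ENDPOINT = "httpg://storm.example.org:8444"  # hypothetical endpoint
N_REQUESTS = 20

def timed_ping():
    """Measure the wall-clock response time of one SRM ping issued with
    clientSRM (flag syntax assumed)."""
    start = time.monotonic()
    subprocess.run(["clientSRM", "ping", "-e", ENDPOINT],
                   capture_output=True, text=True)
    return time.monotonic() - start

samples = [timed_ping() for _ in range(N_REQUESTS)]
print(f"mean response time: {statistics.mean(samples):.3f} s")
print(f"max response time:  {max(samples):.3f} s")
```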

Conformance Testing
This level determines whether the SRM implementation agrees with:
- the SRM specification (https://sdm.lbl.gov/srm-wg/doc/SRM.v2.2.html);
- the SRM Memorandum of Understanding.
The most significant tests will be developed. One example: verify the srmPing behaviour, which takes authorizationID as input and returns as output:
- versionInfo, the version of the SRM specification that is implemented;
- extraInfo about the SRM implementation, such as the StoRM version.
(A sketch of such a conformance assertion follows.)
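A hedged sketch of the srmPing conformance assertion. It assumes the client echoes the returned versionInfo field in its text output, which must be verified against the actual clientSRM behaviour; endpoint and flags are assumed as before.

```python
import subprocess

ENDPOINT = "httpg://storm.example.org:8444"  # hypothetical endpoint

def check_srm_ping():
    """Assert that the srmPing reply carries a versionInfo field matching
    the implemented SRM specification (v2.2). Parsing the client's text
    output is an assumption of this sketch."""
    result = subprocess.run(["clientSRM", "ping", "-e", ENDPOINT],
                            capture_output=True, text=True)
    assert result.returncode == 0, "srmPing request failed"
    assert "versionInfo" in result.stdout, "reply is missing versionInfo"
    assert "v2.2" in result.stdout, "implementation does not declare SRM v2.2"
    print("srmPing conformance check PASSED")

check_srm_ping()
```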

T-StoRM usage

The Current Usage
Who:
- the StoRM team;
- the IGI-RTC (Release, Testing, Certification) group.
Why:
- to certify a StoRM version;
- to verify the correctness of a StoRM installation and configuration in the StoRM testbed;
- to verify new functionalities and fixed bugs.

The Test Production Procedure
The production of a new T-StoRM test follows the five steps described below:
1. the test is identified by the pair (id, name), where id is a unique identifier of 6 characters;
2. it is defined and described in the Test Plan document, where the test id is included;
3. it is developed at the correct test level;
4. it is included in the test suite;
5. finally, it is released in T-StoRM.
(A sketch of this identification scheme follows.)
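As an illustration of step 1, a minimal Python sketch of the (id, name) pair; the "AT0001"-style id format is invented for the example and is not T-StoRM's actual convention.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestCase:
    id: str     # unique 6-character identifier, as in the Test Plan document
    name: str   # human-readable test name
    level: str  # atomic, functional, sanity, regression, ...

    def __post_init__(self):
        if len(self.id) != 6:
            raise ValueError(f"test id must be 6 characters, got {self.id!r}")

# Example entry as it might appear in the Test Plan (hypothetical).
t = TestCase(id="AT0001", name="srm ping via clientSRM", level="atomic")
print(t)
```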

The Near Future Usage
Who:
- site administrators;
- SRM certifiers;
- other software certifiers.
Why:
- to validate a StoRM installation and configuration;
- to certify SRM software products;
- to certify other software products, since T-StoRM can host tests suitable for software other than StoRM;
- to monitor SRM instances.

Future work

Future Work
Web-based reporting:
- to simplify the relation among the test plan, the tests and the report logs.
T-StoRM integration with a continuous integration framework:
- to automate test execution;
- to report test results immediately.
Taking advantage of virtualization technologies:
- to automate the deployment and configuration of StoRM.

Conclusions

Conclusions
T-StoRM is making the StoRM certification process faster and simpler. It has proven able to cover several test levels, even ones not considered at design time. It is designed to be easily extensible, so that it can be used for the certification of other software.