
1 Serving society, Stimulating innovation, Supporting legislation
LPIS QA, a.k.a. quality assurance by the MS
Wim Devos

2 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

3 History
Until 2010, the EC assessed LPIS performance during audit missions: by investigating OTSC control files and repeating control procedures, LPIS issues emerged.
  + All aspects and the whole process can be investigated.
  - Only a very small sample is checked and resources are limited.
  = an example of internal quality control by an external body.
Since 2010, MS report on the observed performance of their LPIS themselves. This adds representativeness and systematic monitoring, based on external quality control by an internal body: a proactive strategy of continuous quality control and reporting. This was a very high priority.

4 Goal of the LPIS QAF
1. Provide a view on the state of the LPIS that is: harmonised, quantitative, unbiased, precise, complete, current.
2. This allows for comparison between MS and for a pan-European overview.
3. It also serves as a base for planning remedial actions by the MS (self-assessment) and for considerations about the effect of weaknesses found (audit).

5 MS Self-Assessment
1. Find the failed processes that cause anomalies or defects, and their effects:
   1. Regulatory blockage (e.g. historical GAC mask)
   2. Missed update (i.e. the land has changed)
   3. Failed upgrade (e.g. eligibility rules have changed)
   4. Incomplete processing (e.g. under a military mask)
   5. Erroneous processing (i.e. sloppy job done)
   6. Incompatible design (e.g. absence of LC delineation)
2. Analyse findings thoroughly before starting to overhaul a system!!!
3. Implement appropriate remedies.

6 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

7 Quality
The totality of characteristics of an entity that bears on its ability to satisfy stated and implied needs.
  stated: in the specification (denotation); what can be measured
  implied: in the user's expectations (connotation); what is considered fit for purpose
Question: which is the better car, an Audi A4 or a Suzuki Grand Vitara?

8 Key definitions
quality assurance (QA): the set of activities whose purpose is to demonstrate that an entity meets all quality requirements (mostly the producer's perspective)
  ISO 9000 series: Quality Management System - Requirements
quality control (QC): the set of activities or techniques whose purpose is to ensure that all quality requirements are being met (often the client's perspective)
  ISO 2859 & ISO 3951 series: sampling procedures by attributes/variables
non-conformity: non-fulfilment of a specified requirement, i.e. non-compliant with (a part of) the specification
defect: non-fulfilment of an intended usage requirement, i.e. a non-conformity so critical that the intended use is not possible
anomaly: an observed non-conformity (registered; confirmed?)
QC is the instrument to monitor your quality and drive the upkeep.

9 Quality framework (Member States / European Commission)
  Quality Policy: institution, procedure, data
  Quality Assurance: procedures, data
  Quality Control / External Quality Audit: management, test procedures, test results
  Quality Inspection (recurrent): data
  JRC side: documentation, sampling, imagery, 1st + 2nd screening

10 Testing methodology (ISO 19105: Conformance and testing)
  1. Prepare/document your LPIS implementation: FC/AS
  2. Verify the logical consistency with the EU model: ATS
  3. Verify the other data quality elements (= values): ETS
  4. Report
Diagram elements: Model Conformance Test; Conformance Statement (ICS); Abstract Test Suite (ATS); Data Conformance Test; Executable Test Suite (ETS); Additional Information for Testing; Conformance Test Report; Analysis of results. Input: the Application Schema or Feature Catalogue of the implementation under test. Frequency: ONCE YEARLY.
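
A minimal sketch of the ISO 19105-style run described above: a set of abstract tests inspects a documented implementation and yields per-test verdicts. All names here (AbstractTest, run_suite, the example test) are hypothetical illustrations, not the real ATS:

```python
# Illustrative sketch only: a minimal ISO 19105-style conformance run.
# The real ATS is defined in the LPIS QA documentation, not in this code.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AbstractTest:
    test_id: str                       # e.g. "A.1.1"
    purpose: str                       # what the test verifies
    check: Callable[[dict], bool]      # inspects the Feature Catalogue / Application Schema

def run_suite(tests: List[AbstractTest], implementation: dict) -> Dict[str, str]:
    """Return a per-test verdict: 'Conforming' or 'Non-conforming'."""
    return {t.test_id: ("Conforming" if t.check(implementation) else "Non-conforming")
            for t in tests}

# Hypothetical example: verify the implementation documents a reference parcel class.
tests = [
    AbstractTest("A.1", "implementation defines a reference parcel class",
                 lambda impl: "ReferenceParcel" in impl.get("classes", [])),
]
implementation = {"classes": ["ReferenceParcel", "PhyBlock"]}
print(run_suite(tests, implementation))  # {'A.1': 'Conforming'}
```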

11 Data Quality Elements / Data Quality Sub-elements (ISO); logical consistency is checked by the ATS, the value-based elements by the ETS
  completeness: commission / omission
  logical consistency: conceptual consistency; codelist consistency; format consistency; topological consistency
  positional accuracy: absolute or external accuracy; relative or internal accuracy; gridded data position accuracy
  thematic accuracy: classification correctness; non-quantitative attribute correctness; quantitative attribute accuracy
  temporal accuracy: accuracy of a time measurement; temporal consistency; temporal validity
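
The same hierarchy, written out as a plain data structure; a compact sketch for illustration only, with the ATS/ETS split taken from the previous slide:

```python
# The ISO data quality element hierarchy shown on the slide, as a Python mapping.
DATA_QUALITY_ELEMENTS = {
    "completeness": ["commission", "omission"],
    "logical consistency": ["conceptual", "codelist", "format", "topological"],
    "positional accuracy": ["absolute/external", "relative/internal",
                            "gridded data position"],
    "thematic accuracy": ["classification correctness",
                          "non-quantitative attribute correctness",
                          "quantitative attribute accuracy"],
    "temporal accuracy": ["time measurement accuracy", "temporal consistency",
                          "temporal validity"],
}
# Per the testing methodology slide: ATS covers logical consistency,
# ETS covers the value-based elements.
ATS_SCOPE = {"logical consistency"}
ETS_SCOPE = set(DATA_QUALITY_ELEMENTS) - ATS_SCOPE
```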

12 In practice
  1. CAP Regulations: CwRS source imagery; LPIS databases; reporting cycle
  2. GI components: LCM as core data schema; ATS for consistency check; harmonised ortho-imagery specification (INSPIRE A2); inspection = f(LCCS)
  3. Industry standards: ISO 2859 acceptance sampling

13 7 primary EC Concerns
  1. correct quantification of the truly eligible area within the LPIS as a system
  2. distribution of ineligible land over the reference parcels
  3. categorization of reference parcels regarding ineligible land
  4. occurrence of critical defects within a reference parcel
  5. proportion of declared area inside a reference parcel
  6. effectiveness of update processes regarding the LPIS system
  7. relation of LPIS quality issues with error rates observed during the on-the-spot checks

14 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

15 Source: application schema or Feature Catalogue
Name: PhyBlock
Definition: Reference parcel representing a production block: a continuous area of agricultural land, grouping together a number of neighbouring agricultural parcels cultivated by one or more farmer(s) and delineated by the most stable topographical boundaries.
Feature attributes: attributes inherited from the ReferenceParcel class, plus own attributes:
  M (mandatory): uniqueID; referenceArea; effectiveDate; status; eligibleLandType
  C (conditional): digitizedArea; farmedArea; dominantLandType
  O (optional): perimeter
Subtype of: ReferenceParcel
LCM discussion: Inside the physical block class we in fact have two different approaches. One is purely equal to the definition of a production block and contains only agricultural land. The other should rather be called a topographic block, because here non-agricultural parcels such as forest, residential or water can exist; such systems often cover 100% of the national territory. More importantly, one parcel may contain a combination of several types of agricultural/non-agricultural land cover, e.g. a forest block containing small arable or grassland patches. According to a recent LPIS survey [8], 5 of 10 physical block systems follow the latter approach. Therefore this type of physical block cannot be fully defined by a single land cover attribute. To handle this case we proposed the use of dominantLandType. However, when there are different eligibleLandTypes inside one parcel, the case is much closer to a cadastral parcel than to a production block, and could probably be handled by introducing a sub-parcel class. Nevertheless, the total area of the eligible land inside the parcel shall be stored in the attribute referenceArea.

16 ATS Structure
Abstract test suite structure: modules and tests. A module (e.g. Module A.1.1) can be assigned the value Conforming if one of its constituent tests is Conforming.
Test purpose: verify the definition of the reference parcel, e.g. a reference parcel demarcated by the farmer who cultivates it (manages/executes his tenure rights: ownership, rent etc.) on a multi-annual basis.
Test method: inspect the documentation of the application schema or feature catalogue; verify consistency with the ICS.
ATS NOTE: conformant with the Farmer's block definition.
Test reference: LCM specification.
Test type: basic test.
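
A tiny sketch of the aggregation rule stated above: a module is Conforming if at least one of its tests conforms. The test identifiers below are illustrative; the real numbering comes from the ATS itself:

```python
from typing import Dict, List

def module_verdict(test_results: Dict[str, bool], module_tests: List[str]) -> str:
    """OR-aggregation: any Conforming test makes the module Conforming."""
    if any(test_results.get(t, False) for t in module_tests):
        return "Conforming"
    return "Non-conforming"

results = {"A.1.1.1": False, "A.1.1.2": True, "A.1.1.3": False}
print(module_verdict(results, ["A.1.1.1", "A.1.1.2", "A.1.1.3"]))  # Conforming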

17 ATS-log (xls version)
Screenshot: LCM concept mapped to the corresponding national implementation.

18 ATS-log (xml)
Maps the core model into each national implementation dictionary.

19 Scoreboard
Extract from the ATS-log (formality).

20 ATS conclusion
LCM-ATS may look unimpressive but are quintessential!
The ATS scoreboard needs no independent assessment/validation: incorrect mapping will cause poor ETS results.
4 MS implemented remedial actions based on ATS findings.
LCM-ATS have remained stable since 2009; there is an urgent need for an upgrade:
  changed requirements from the reform
  best practices from the MS
An untouched pool of information:
  all questions (e.g. from the ECoA) on design choices and procedures are (or should be) addressed in the FC/AS and ICS
  allows for one common interface (cf. the study by JRC and TU Dresden)

21 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

22 Sampling
Based on ISO 2859 limiting quality (LQ) sampling; the pre-selection is prepared by the JRC from the delivered data.
E.g. Flanders: of the full reference parcel population, only parcels within the CwRS zones are candidates; the JRC selects the pre-sample; the MS inspects 800 parcels, i.e. 0.16% of the population. A very small sample!!!
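
A hedged sketch of such a pre-selection, simple random sampling without replacement from the sub-population inside the CwRS zones; this is an illustration, not the JRC procedure, and the population figures are hypothetical:

```python
import random

def preselect(parcel_ids, inside_cwrs, sample_size=800, seed=2011):
    """Draw a fixed-size random sample from the eligible sub-population."""
    eligible = [p for p in parcel_ids if inside_cwrs(p)]
    rng = random.Random(seed)          # fixed seed: reproducible selection
    return rng.sample(eligible, min(sample_size, len(eligible)))

# Toy usage: 500,000 hypothetical parcels, half of them inside CwRS zones.
parcels = range(500_000)
sample = preselect(parcels, lambda p: p % 2 == 0)
print(len(sample), f"= {len(sample) / 500_000:.2%} of the population")  # 800 = 0.16%
```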

23 Inspection procedure
Stepwise instruction to ensure all observations are correctly made:
  everything defined in the LCM
  equivalence of CAPI and field survey where possible
  observation/measurement separated from assessment & evaluation

24 Case IV
On the ground truth of the ongoing year, the MS identifies the LUI (land under inspection, represented by the RP). Background: LUI boundary on GeoEye imagery acquired in 2011.

25 Case IV, ETS 2011
Delineation performed by the MS (CAPI inspection).
Agricultural lands are measured:
  Grassland (G): agricultural land polygon, area = 9,414 m²
  Hedge (BR): landscape feature polygon, area = 317 m²
Non-agricultural areas are counted: natural bare areas.
CAPI benefits from cross-checking (2009 Bing imagery).
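
The CAPI figures above are polygon areas in projected coordinates. A minimal sketch of such an area computation (the shoelace formula); the real ETS inspection uses GIS software, not this code:

```python
def polygon_area(vertices):
    """Planar area of a simple polygon given as [(x, y), ...] in metres."""
    n = len(vertices)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]   # wrap around to close the ring
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical 100 m x 94.14 m rectangle: area comparable to the grassland polygon.
print(polygon_area([(0, 0), (100, 0), (100, 94.14), (0, 94.14)]))  # 9414.0
```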

26 Field inspection
Alternative to CAPI: survey of the vertices; 2 photos of the field, 2 photos per vertex.
Labour intensive: complementary?
Issue: invisible boundaries.

27 What is assessed at parcel level?
1. Parcels with an incorrect maximum eligible area (area non-conforming)
2. Parcels with serious functional problems (critical defects):
   1. absence of agriculture
   2. invalid perimeter
   3. invalid common boundaries
   4. incomplete block
   5. multi-parcel
   6. multi-polygon
3. Reasons for their non-conformity:
   1. poor update
   2. poor upgrade
   3. poor processing
   4. incomplete processing
   5. poor design
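
A hedged sketch of the parcel-level verdict logic this slide describes: a parcel fails if any critical defect is observed, or if its recorded maximum eligible area deviates beyond a tolerance. The field names and the tolerance rule are illustrative, not the official ETS algorithm:

```python
CRITICAL_DEFECTS = {"absence of agriculture", "invalid perimeter",
                    "invalid common boundaries", "incomplete block",
                    "multi-parcel", "multi-polygon"}

def parcel_verdict(area_recorded_m2, area_observed_m2, defects, tolerance_m2):
    """Classify a parcel from its observed defects and area deviation."""
    if set(defects) & CRITICAL_DEFECTS:
        return "critical defect"
    if abs(area_recorded_m2 - area_observed_m2) > tolerance_m2:
        return "area non-conforming"
    return "conforming"

print(parcel_verdict(10_000, 9_414, [], tolerance_m2=500))    # area non-conforming
print(parcel_verdict(10_000, 9_900, ["multi-polygon"], 500))  # critical defect
```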

28 One standard observation record (control file).

29 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

30 What?
An instrument for managing the LPIS QA operations. Not to be confused with wikiCAP. Needs IE 8 or above!
For the MS, by authorized users (managed by the MS):
  data upload by the MS (proprietary zones, RP population, control records)
  data download by the MS (pre-selection lists)
  linked to the CID portal (for CwRS zones)
  runs automatic processes by the JRC (sampling, validation)
For the EC:
  24/7 monitoring of the status of each MS
  queries on options (e.g. selected CwRS zones)
  work floor contacts
  …

31 Data exchange process
Diagram elements: LPIS implementation (Feature Catalogue / Application Schema); Abstract Test Suite (ATS); ATS reporting package (XML, PDF); random sample pre-selection, delivered as a sample pre-selection package (GML, XML); orthoimagery via WMS; Executable Test Suite (ETS); ETS reporting package (XML, PDF, GML); screening procedures.
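
Illustrative only: reading parcel identifiers out of a hypothetical sample pre-selection XML file with the Python standard library. The actual package schema is defined by the LPIS QA portal documentation, not by this sketch:

```python
import xml.etree.ElementTree as ET

# Hypothetical package content; element and attribute names are assumptions.
SAMPLE_XML = """<preselection>
  <parcel id="BE-VLG-0001"/>
  <parcel id="BE-VLG-0002"/>
</preselection>"""

root = ET.fromstring(SAMPLE_XML)
parcel_ids = [p.get("id") for p in root.findall("parcel")]
print(parcel_ids)  # ['BE-VLG-0001', 'BE-VLG-0002']
```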

32 EC interface
Screenshot of the status dashboard; status flags per MS: OK, incomplete, late, very late, ???

33 MS interface (screenshot).

34 Key Numbers
Infrastructure:
  database size (geodb + portal db): 50 GB
  number of uploaded files: 2400; size on disk: 50 GB (zipped)
  number of packages (annual + discrete): 399
  number of registered active users: 47
Content:
  number of tables
  number of parcels (points): 2010 / 2011 / 2012
  number of parcels (polygons)

35 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

36 What? Systematic inspection of the inspection records
1. Methodological screening:
   to identify and provide feedback on methodological issues
   to assure correct implementation by discouraging manipulation
2. Quality audit:
   to support or revoke the scores and conclusion (Y/N)
Neither is:
   1. a repetition of the inspection by the MS (a 2nd opinion)
   2. a calibration or mitigation of the expectations (thresholds)
   3. something that directly yields a "good LPIS" or "bad LPIS" evaluation result
Both only make sense given:
   1. presence of ALL required data (full population, zero state, imagery, neighbouring parcels)
   2. absence of manifest manipulation (timely procedure, scoping, zone selection)

37 Screening examples: copy-paste and no-effect records make the results unreliable.
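
A sketch of one such screening check: flagging suspicious "copy-paste" records where the observed geometry is byte-identical to the reference geometry (a genuine inspection practically never reproduces every vertex exactly). The record structure is hypothetical:

```python
def flag_copy_paste(records):
    """Yield IDs of records whose observed WKT equals the reference WKT."""
    for rec in records:
        if rec["observed_wkt"] == rec["reference_wkt"]:
            yield rec["id"]

records = [
    {"id": "RP1", "reference_wkt": "POLYGON((0 0,1 0,1 1,0 1,0 0))",
                  "observed_wkt":  "POLYGON((0 0,1 0,1 1,0 1,0 0))"},     # suspicious
    {"id": "RP2", "reference_wkt": "POLYGON((0 0,1 0,1 1,0 1,0 0))",
                  "observed_wkt":  "POLYGON((0 0,1 0,1 1.02,0 1,0 0))"},  # plausible
]
print(list(flag_copy_paste(records)))  # ['RP1']
```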

38 Skipping and cherry picking: a parcel that should have been skipped but wasn't; a parcel skipped without reason.

39 Incomplete packages: when parts of the population are missing, lands can be added à la carte.

40 2010 screening report: investigation of the full process flow of 10 inspected parcels.

41 2011 screening report: investigation of the package content and of 10 random parcels.

42 2011 screening findings

43 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

44 Purpose
Thresholds are values that control the decision flow! Currently the thresholds are set to trigger analysis and reporting; scores and findings must be interpreted appropriately. E.g. does a poor QE1 score imply a risk to the fund? Not necessarily.
In general:
  1. QE1 and QE5 relate to biases in land cover resp. land use
  2. QE2, QE3 and QE4 provide parcel-based information
  3. QE6 and QE7 monitor effects of LPIS operations

45 Rationale
QE1 (total area): 2%, the threshold for serious error used by the ECoA for the DAS.
QE2 (rate of area-based non-conforming parcels): 3%; this threshold is specified in Comm. Reg. 2009R1122.
  Δ (area observed vs area declared): the 5% and 7% thresholds introduce a technical tolerance for smaller parcels; 1 ha: maximum tolerance of the OTSC methodology.
QE3 (causes of non-conformities and defects): 5%, arbitrary; serves an alert function.
QE4 (rate of defects): LQ2 (in percent non-conforming parcels) is set to 2, as is the threshold for serious error by the ECoA.
QE5 (area declaration rate): informative.
QE6 (accumulated change rate): 25%, arbitrary; serves an alert function.
QE7 (rate of irregular applications): no significant effect (χ²).
No threshold is set by DG AGRI on LPIS-specific criteria!!!
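
A small sketch of how these thresholds could drive the decision flow (trigger analysis and reporting, not pass/fail). The threshold values are taken from the slide; the evaluation logic itself is illustrative:

```python
THRESHOLDS = {          # quality element -> alert threshold, as a fraction
    "QE1": 0.02,        # total-area error, ECoA serious-error level
    "QE2": 0.03,        # rate of area non-conforming parcels
    "QE3": 0.05,        # causes of non-conformities (alert function)
    "QE4": 0.02,        # rate of defects (LQ2 = 2% non-conforming)
    "QE6": 0.25,        # accumulated change rate (alert function)
}

def alerts(scores):
    """Return the quality elements whose score breaches its threshold."""
    return [qe for qe, value in scores.items()
            if qe in THRESHOLDS and value > THRESHOLDS[qe]]

print(alerts({"QE1": 0.015, "QE2": 0.041, "QE4": 0.02}))  # ['QE2']
```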

46 Beyond analysis…
Important decisions depend on LPIS quality:
  reduction of OTSC sample rates (abolition of the 4% risk sample)
  launch of LPIS refresh projects and overhauls
  estimation of the financial risks of the system
Challenges:
  are the current thresholds fit for purpose for each of these?
  is one single set of thresholds realistic?
Scoreboard (= assessment) vs quality statement (= evaluation).

47 Outline
  1. intro
  2. quality concepts and standards
  3. Abstract Test Suite
  4. Executable Test Suite
  5. LPIS QA portal
  6. screening of the ETS records
  7. thresholds
  8. results

48 Information offered by LPIS QA: score, analysis, remedies, screening.

49 ASSESSMENT REPORTS (2011 part. / 2010)
Key numbers:
  2011: 43 reports for 45 lots; 31 independent reports; 18 in English (or translations) (excl. 4 UK and 1 IE); min 2 / max 15 pages (incl. remedial action plan); ETS v? (February)
  2010: 40 / 43 expected; min 3 / max 26 pages
Possible methodological issues (2011, out of 31):
  scoping (general or specific): 11
  land cover vs eligibility: 2
  understanding causal processes: 7
  applying acceptance number Ac21: ?
  mitigating observations: 7

50 FINDINGS (2011, out of 31; 2010 out of 40)
  self-assessment non-conforming: 22
  # QE thresholds: 3
  CAPI problems (image quality): 9
  wysiNwyg: 9
  need RFV: 4 (+1)
  (these 3 problems are unrelated)
Report improvement effect, JRC first opinion: 23/31 (28/43).

51 SOME 2011 OBSERVATIONS & QUOTES
  cadastral parcels: 2 LPIS measure >90% of the sample, 4 LPIS <15%
  "report contaminated RP > 0.1 ha" vs "report RP with > 0.1 ha contamination"
  "in our AP system, all non-conformities can be attributed to errors from farmers"
  "RP bordering woodlands were skipped due to poor image quality"
  "Art 31 holds that MS may use GNSS, so ETS field activities cannot demand the use of GNSS"
  "update rates were measured differently for 2010 and 2011, so the rate for 2010 must be 0"
  "we do not regard topographic blocks consisting of 10 or more permanent fields as a risk"
  "cause for non-conformance: 1. Area observed (Aobs) is less than the area recorded (Arec)"

52 REMEDIAL ACTIONS (2011, out of 31; 2010 part., out of 40)
  do nothing: 4
  correct the found non-conformities: 2
  apply database changes: 14
  improve farmers' input: 8
  improve OTSC feedback: 6
  start/strengthen intergov collaboration: 6
  start/strengthen periodic refresh: 17
  continue acute update: 10
  do at least 2 of the last 5 above: 20
Further actions: set up quality system; improve documentation & training; strengthen IT processes; strengthen organization; substitute VHR with aerial imagery.

53 REMEDIAL ACTION PLANS (2011 part.; 2010: 40 assessed)
Driven by:
  self-assessment: 18
  audit findings: 5
Requested revisions (out of 31):
  account for farmer update: 6
  certain QE thresholds: 10
JRC first opinion: 20/31 (24/43).

54 PRELIMINARY 2011 CONCLUSIONS
Clear improvement since 2010 on all points in nearly all MS:
1. scores: apart from QE6 (change rate), scores were calculated in all reports; 50% of LPIS pass all 6 QE thresholds
2. assessment reports: much fewer serious methodological issues (scope, land cover, mitigation); better relevance and acceptance of measures and thresholds; better evidence of understanding the causes behind the scores
3. remedial action plans: 5 new types of action identified; nearly all plans tackle several processes
Remaining weaknesses, TO BE ADDRESSED BY THE MS!!!
1. version control: ETS v4.3 / draft v5.0 / v5.1
2. timing: TOO late sampling + 3 reports missing
3. acceptance numbers: e.g. a 1% expectation: 8/800, but = 10/800
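
A worked sketch of the acceptance-number point above: with n = 800 and a 1% quality level, the naive limit is 0.01 × 800 = 8 non-conforming parcels, but an ISO 2859-style acceptance number also accounts for sampling variation. Below, the binomial chance of seeing more than c non-conformities when the true rate is exactly 1% (illustrative statistics, not the official rule):

```python
from math import comb

def prob_exceed(n, p, c):
    """P(X > c) for X ~ Binomial(n, p)."""
    return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

n, p = 800, 0.01
for c in (8, 10):
    print(f"P(more than {c} non-conforming of {n}) = {prob_exceed(n, p, c):.3f}")
# Rejecting at >8 would falsely fail a truly 1% population roughly 40% of the
# time; a higher acceptance number such as 10 reduces that producer's risk.
```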

55 Conclusions of an independent panel
1. The LPIS QA is a unique and important initiative and a due step forward, involving:
   close MS-EC interaction, including digital data exchange
   comprehensive documentation and well-defined guidance
2. The documentation can be improved further.
3. The QE thresholds appear reasonable, but some need further investigation and political or scientific motivation.
4. The instrument can be used for self-assessment; for justification of LPIS-involved risks, the following potential weaknesses need addressing:
   the reference data need to be of appropriate quality
   data must be independent
   parcel shape/size and positional accuracy need further research
(Follow-up assigned to DG AGRI and the JRC.)

56 True or False? (answer each statement True / False / +-)
  A good LPIS QA score indicates a good LPIS.
  One can trust the LPIS QA scores provided by the MS.
  LPIS QA scores are not directly indicative of performance in IACS.
  LPIS QA scores are isolated snapshots in time.
  It's more important to present good LPIS QA scores than to demonstrate understanding of issues.
  Remedial plans should only be based on the analysis from the MS.
  Corrective measures are the sole responsibility of the MS.

57 Questions?
