Presentation on theme: "GIGS Overview A slide pack that can be used to present both internally and externally the GIGS process and business benefits The International Association."— Presentation transcript:

1 GIGS Overview. A slide pack that can be used to present, both internally and externally, the GIGS process and business benefits. The International Association of Oil & Gas Producers (OGP) Geomatics Committee, Report # 430

2 What is GIGS? GIGS – Geospatial Integrity of Geoscience Software
History: Initiated in 2007 as a Joint Industry Project (JIP), sponsored by OGP, in response to significant concern and documented evidence of geospatial integrity failures in geoscience software. Purpose: to provide geoscience software developers with recommended guidance concerning good industry practice regarding geospatial integrity.
GIGS started in 2007 as a Joint Industry Project, sponsored by OGP, in response to significant concern and documented evidence of geospatial integrity failures in geoscience applications critical to E&P operator analysis. The JIP comprised a significant number of major Oil & Gas E&P companies together with smaller regional operators, all with a common purpose: to provide geoscience software developers with recommended guidance concerning industry best practice in ensuring geospatial integrity. GIGS is a process, for which OGP is supplying material.

3 Why is GIGS necessary? The need for guidance concerning geospatial integrity in geoscience software comes from experience, such as: lack of uptake of international data exchange standards; use of conflicting and inappropriate terminology; use of incomplete or incorrect CRS parameters; poor documentation; ambiguity caused by lack of adequate metadata; lack of audit trail. Almost all geoscience software requires significant data management interaction to feed geospatial datasets into the applications. Where international data exchange standards exist that have standardised, unambiguous geospatial metadata associated with them, automation of the data loading process and interpretation of the appropriate CRS would be of significant business benefit, reducing time and effort and improving the geospatial integrity of the data. Within applications, many screens display different terminology to the geoscience user for what is essentially the same geospatial information. A lack of, or poor, documentation is a significant contributory factor to geospatial integrity errors made by the geoscience user when utilising geoscience software. Incorrect labelling of data and lack of metadata is a common cause of confusion and incorrect data entry, potentially leading to erroneous spatial analysis and conclusions. A comprehensive audit trail regarding data source, loading, coordinate operations etc. would significantly enhance user confidence in, and understanding of, the provenance and geospatial integrity of geoscience datasets.

4 What is Geospatial Integrity?
Geospatial integrity is defined as the adherence of geospatial data to the following criteria: • Completeness • Correctness • Consistency • Verifiability
Completeness: The coordinates of geospatial data must be associated with the appropriate coordinate reference system (CRS), which should be defined fully. Coordinates are in themselves ambiguous: they need the CRS to be identified to remove the ambiguity. Where coordinate operations are performed on the data, these operations must be defined unambiguously. Coordinate reference systems and coordinate operations are referred to as geospatial metadata.
Correctness: Applications must honour the precision of the coordinates: no apparent precision may be suggested by the addition of decimal places, and coordinate operations must be executed commensurate with the precision of the coordinates and the accuracy of the algorithm. The defining parameters of CRSs and coordinate operations must be free from numerical and terminological errors.
Consistency: The data model and terminology must be applied consistently throughout the application or application suite.
Verifiability: The user must be able to ascertain that completeness, correctness and consistency have been achieved and maintained.
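As an illustration (not part of the GIGS material), the completeness criterion can be sketched in a few lines of Python. The names `CRS` and `GeospatialPoint` are hypothetical, but the point stands: a coordinate pair only becomes unambiguous once its CRS metadata is attached.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class CRS:
    # Minimal stand-in for a fully defined coordinate reference system.
    authority: str  # e.g. "EPSG"
    code: int       # e.g. 4326
    name: str

@dataclass
class GeospatialPoint:
    x: float
    y: float
    crs: Optional[CRS] = None  # the geospatial metadata

    def is_complete(self) -> bool:
        # Completeness: coordinates are ambiguous without their CRS.
        return self.crs is not None

wgs84 = CRS("EPSG", 4326, "WGS 84")
good = GeospatialPoint(5.0, 52.0, wgs84)
bad = GeospatialPoint(5.0, 52.0)  # bare coordinates, no CRS attached
print(good.is_complete(), bad.is_complete())  # True False
```

A real implementation would carry the full CRS definition (datum, ellipsoid, coordinate system, units) rather than just a name and code.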

5 What is the objective of GIGS?
A process and guidance note intended for wide use within the E&P industry to improve geospatial integrity. It is aimed at vendors and users of any computer package used in geoscience activities, including applications, processing packages, underlying databases and user interfaces. It also includes software components or layers, such as geodetic computation engines, extensions and middleware. This Guidance Note applies especially to the software functions that address spatial data import, creation, merging, processing, coordinate operations, map projections, visualisation and export, although it is also relevant to existing product maintenance and to new product design, testing and production support. It does not address raw data processing methods (e.g. wellbore curve calculation methods) or surface engineering software, though the general principles remain valid, nor does it address the quality of the geoscience datasets themselves. The focus of this Guidance Note is on the preservation of coordinate integrity and on maintenance of the geospatial quality that is inherent within the original dataset.

6 What is the GIGS material?
The GIGS material is delivered in three parts, supplemented by a number of companion electronic files.
‘Part 1 – Guidelines’ (OGP publication order code 430-1) describes the GIGS process. It also includes a Glossary of Terms, which is extremely useful as a tool for education and training and as an aide-mémoire.
‘Part 2 – Software Review’ (OGP publication order code 430-2) contains a software review checklist to enable structured examination of the geospatial integrity of geoscience software.
‘Part 3 – User guide for the GIGS Test Dataset’ (OGP publication order code 430-3).
The companion electronic files are: the software review checklist, an MS-Excel spreadsheet intended to facilitate the execution of a software review and capture its results; the GIGS Test Dataset, a series of data files to be used for testing the algorithms and data exchange capabilities of geoscience software; and sample MS PowerPoint slides explaining the GIGS process and business benefits (this slide deck!).
The above digital documents and files are available from the OGP Geomatics Committee website.

7 The GIGS guidelines contain:
GIGS guidance. The GIGS guidelines contain: Technical background to geospatial integrity; Definition of a geospatial dataset; Key geodesy concepts; Coordinate operations – what are they?; The EPSG Geodetic Parameter Dataset – why is it important?; GIGS Software Review; Glossary of Terms.
Technical background to geospatial integrity
Definition of a geospatial dataset: A geoscience dataset is referenced to the real world by its geospatial data, which consist of two principal elements that are inextricably linked: the coordinate dataset (‘the coordinates‘) and the coordinate reference system (in this Guidance Note sometimes referred to as part of the ‘geospatial metadata’). If the coordinates are presented in the absence of their geospatial metadata, the resulting positions are ambiguous and should be considered unreliable.
Key geodesy concepts: In order to fully understand geospatial integrity, and the solutions proposed in GIGS, it is necessary to understand the conceptual background of geodetic referencing.
Coordinate operations – what are they? Coordinate operations (map projection conversions and coordinate transformations) change the coordinate values of a coordinate dataset, so that the dataset becomes referenced to a different CRS. To merge geoscience datasets, they all need to be referenced to the same CRS.
The EPSG Geodetic Parameter Dataset – why is it important? The EPSG Geodetic Parameter Dataset, or EPSG Dataset for short, is a de facto global standard repository with definitions of: coordinate reference systems and their component elements, including the definition of map projections, which are part of the definitions of projected CRSs; and coordinate transformations and conversions, including the associated parameter values and the description of the algorithm associated with each coordinate operation. Its content is vetted by experts. Historically, the parameter values found in geoscience applications have sometimes been incorrect.
GIGS Software Review – this is described on the next slide.
Glossary of Terms: The geospatial terms and acronyms used within the GIGS guidelines are defined for clarity, with additional text providing clarification of the definition or an example. The source of the definition is also indicated where relevant. This glossary can prove a valuable resource as a training and reference tool, improving software documentation and promoting appropriate and standard terminology.
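The requirement that datasets be referenced to the same CRS before merging can be sketched as follows (illustrative Python; `merge_datasets` and the EPSG codes chosen are for demonstration only, and a real workflow would transform the data to a common CRS rather than refuse):

```python
def merge_datasets(datasets):
    """Merge coordinate datasets only if they all reference one CRS.

    Each dataset is a (crs_id, points) pair, e.g. ("EPSG:23031", [(x, y), ...]).
    Refusing mixed CRSs illustrates the integrity requirement; a real
    application would run a coordinate operation to a common CRS first.
    """
    crs_ids = {crs_id for crs_id, _ in datasets}
    if len(crs_ids) != 1:
        raise ValueError(f"datasets reference different CRSs: {sorted(crs_ids)}")
    merged = [p for _, pts in datasets for p in pts]
    return crs_ids.pop(), merged

same = [("EPSG:23031", [(500000.0, 4649776.0)]),
        ("EPSG:23031", [(501000.0, 4650000.0)])]
crs, points = merge_datasets(same)
print(crs, len(points))  # EPSG:23031 2
```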

8 What is a GIGS software review?
A GIGS software review is a structured approach to evaluating the geospatial integrity aspects of geoscience software and consists of: • A qualitative evaluation of the software’s geospatial capability by means of a series of checklists; • A quantitative evaluation of the software’s capabilities by means of test data. Both software vendors/developers and clients/users may execute a GIGS review and benefit from its results. The software review process is explained on the next slide.

9 The software review process
A series of test procedures has been defined: Coordinates and their Geodetic Reference (series 0000); Documentation and Release Notes (series 1000); Pre-defined Geodetic Parameter Library (series 2000); User Defined Geodetic Parameter Library (series 3000); The User Interface (series 4000); Data Operations (series 5000); Audit Trail (series 6000); Deprecation (series 7000) – related to use of the EPSG Dataset; Error Trapping (series 8000). The individual tests are numbered for the purpose of reporting.
To facilitate ease of understanding, the review process has been split into a number of aspects:
Coordinates and their Geodetic Reference (series 0000): general aspects of geospatial integrity in software, notably the association of geodetic metadata with coordinates.
Documentation and Release Notes (series 1000): overview documentation, release notes and website documentation relating to the geospatial aspects of, and the geodetic database within, the geoscience software.
Pre-defined Geodetic Parameter Library (series 2000): the predefined geodetic parameter library within the geodetic engine of the geoscience software.
User Defined Geodetic Parameter Library (series 3000): the user-defined geodetic parameter library within the geodetic engine of the geoscience software.
The User Interface (series 4000): the nomenclature, user-oriented nature and accuracy of information of the user interface for the geodetic engine of the geoscience software.
Data Operations (series 5000): the data operations, primarily the 2D seismic, 3D seismic and wellbore survey data manipulation in the geodetic engine of the geoscience software.
Audit Trail (series 6000): the audit trail for operations carried out within the geodetic engine of the geoscience software.
Deprecation (series 7000): the deprecation of algorithms and files within the geodetic engine of the geoscience software, related to use of the EPSG Dataset.
Error Trapping (series 8000): the error trapping facilities provided for operations carried out in the geodetic engine of the geoscience software.
The review process is guided by checklists which describe what the reviewer should test for within the software (part 2 of this Guidance Note). In many places these checklists direct the reviewer to carry out test procedures using data from the GIGS Test Dataset.
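The series numbering lends itself to a simple lookup. The sketch below (illustrative Python, not part of the GIGS deliverables; the function name `series_of` is an assumption) maps an individual test number to its series name by truncating to the nearest thousand:

```python
# GIGS test series names, keyed by the base of each numbering series.
TEST_SERIES = {
    0:    "Coordinates and their Geodetic Reference",
    1000: "Documentation and Release Notes",
    2000: "Pre-defined Geodetic Parameter Library",
    3000: "User Defined Geodetic Parameter Library",
    4000: "The User Interface",
    5000: "Data Operations",
    6000: "Audit Trail",
    7000: "Deprecation",
    8000: "Error Trapping",
}

def series_of(test_number: int) -> str:
    # An individual test number such as 5103 belongs to series 5000.
    return TEST_SERIES[(test_number // 1000) * 1000]

print(series_of(5103))  # Data Operations
```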

10 Classification of software evaluation results
The interpretation of the evaluation results of the geospatial aspects of a software application will depend on the scope of the application. Different software packages may have implemented functionality and reference data relating to geospatial integrity in different ways. In order to make a meaningful comparison possible, this Guidance Note distinguishes four levels of compliance, identified by the following terms:
1. ‘Elementary’ – intended for software without the capability to perform coordinate operations, this level indicates that the software satisfies minimum requirements for this category of software.
2. ‘Bronze’ – intended for software with limited capability to perform coordinate operations, this level indicates that the software satisfies minimum requirements to achieve a basic level of geospatial integrity.
3. ‘Silver’ – intended for software with full capability to perform coordinate operations, this level indicates that the software establishes and maintains geospatial integrity to a fully satisfactory degree, based on industry best practices. The software is suitable for global deployment in the E&P industry.
4. ‘Gold’ – intended for software with extensive capability to perform coordinate operations, this level indicates software performance that exceeds the geospatial integrity capabilities of the ‘silver’ level by incorporating additional software features that expand the range of applicability and/or reduce the probability of geospatial integrity violations.
The ‘bronze’, ‘silver’ and ‘gold’ compliance levels are progressively inclusive, in the sense that ‘silver’ level implies compliance to the ‘bronze’ level and ‘gold’ implies that ‘silver’ level is also achieved. The exception is the ‘Elementary’ level, which applies to software without coordinate operation capability. Geospatial integrity was defined in an earlier slide as compliance with four aspects: completeness, correctness, consistency and verifiability. The degree of compliance at the four levels listed leads to a specific definition of each aspect, appropriate to the level of compliance; this is described in detail within the associated GIGS Guidance Note. What the ratings mean to developers and users is discussed in the next three slides.

11 The software review workflow
Five key steps in the review process can be identified: 1. Define scope (full or partial review; include/exclude integration aspects); 2. Prepare workplan (quantify time requirements); 3. Identify and obtain expertise and resources (people, equipment, software and data); 4. Execute software review; 5. Prepare report(s) (Summary Report or Full Report, Conclusions & Recommendations).
Define scope: A full GIGS software review will not always be required: a new release of software tested previously may only require an update of the existing test reports, covering what has changed. At the other end of the spectrum, a user may wish to test complex geoscience software in an integrated architecture where geospatial integrity is influenced by interactions with other software packages and data stores. GIGS does not address software testing in such integrated environments, but the methodology can easily be extended by analogy.
Prepare workplan: The levels of effort and difficulty in planning and executing a GIGS software review depend on the complexity of the software to be reviewed and the detailed scope of the review. The resources reserved for the review may have to include an element of training or familiarisation with the software.
Identify resources: Resources required for the review can be divided into: expertise; IT equipment and software licences; test data. A GIGS software review requires contributions from several subject matter experts: IT expertise related to the software architecture and systems development, network support, security privileges and software licence issues; geodetic and geospatial data management expertise (e.g. data loading); domain and workflow expertise for the deployment of the software; software support staff familiar with the software. The range and required level of skills are generally not found in a single person, so a team of several individuals will probably be required.
Execute review: The review team will carry out the evaluation based upon the specific scope and will use the checklist provided in Part 2 of the Guidance Note. The scope of the checklist may be modified – either increased or decreased – depending on the purpose of the review. Numerical testing of the software’s capabilities and behaviour, by subjecting test data to the software’s functionality, should preferably be executed before the evaluation by checklist; when executed in parallel with or after the review by checklist, the checklist will have to be revisited to record the results, including any issues identified during numerical testing. A GIGS software review should preferably be done in the context of a user’s typical operating environment, using a typical hardware platform. This would indicate that the review should be done onsite under actual installed conditions. However, there can be advantages to an offsite review, where specific corporate IT infrastructure or security issues would not influence the application’s performance. Depending on the scope of the review, specific datasets (i.e. datasets not included in the GIGS Test Dataset) should be collated and made available to the review team.
Prepare reports: Two types of report are recognised.
Summary Report: The results of the review are lists of compliant and non-compliant checklist responses for the software under test. A summary score per Test Series is automatically calculated in the spreadsheet and provides an overview of the capabilities of the geoscience software with respect to the GIGS geospatial integrity requirements. This summary result may form the basis for a Summary Report, suitable for management reporting and communications with end users.
From the perspective of the software vendor, the Summary Report can be used for marketing purposes, and it may be used to identify the strong and weak points of the software’s geospatial capabilities, underpinning an internal development programme for the software.
Full Report, Conclusions and Recommendations: After conducting a software review as described herein, a full report should be created. The report should detail conclusions drawn in the following categories, depending on which party conducted the review.
When the review has been conducted by the vendor, the following aspects need to be covered in the report: Which of the geospatial integrity requirements does the software fully meet, partially meet, or not meet? Requirements that are not met should be clearly delineated, with unresolved issues requiring user warnings and/or workarounds. It is very important that key geospatial integrity problems that might cause user errors are flagged and documented for circulation as part of the report on the product. For example, nomenclature in the software that may be incorrect or misleading should be clearly communicated to users and potential users of the geoscience software. Enhancements that may be considered by the developer for addition to subsequent release(s).
When the review has been conducted by the user of the software, the following aspects should be covered: Overall degree of geospatial integrity of the geoscience software and best practices observed. Unresolved issues requiring user warnings and/or workarounds. It is very important that key problems that might cause user errors are flagged and documented by the reviewers for circulation throughout the company’s user community. For example, nomenclature that may be wrong or misleading should be clearly communicated to users. Critical problems which may require urgent attention by the vendor, and possibly suspension of the software’s use within the company, should be clearly documented and passed to the appropriate authorities within the company. Enhancements may be proposed to the vendor for their consideration in future releases. User guidance related to any of the above should be distributed further within the company. Best practices observed in the geoscience software, and comparisons with other relevant geoscience software packages known to the reviewers, should be recorded as appropriate.

12 For Clients / Users The GIGS software review process:
May be conducted on: vendor software; company proprietary software. May assist geoscientists in establishing whether the software meets specific business and technical requirements. Can be a key tool in the establishment and maintenance of geospatial data integrity within the business by: optimising workflows; identifying both the strong and the weak points of geospatial data handling of relevant software; ensuring the appropriate tool is used to suit the purpose; identifying provision of additional guidance for the geoscience software user community.

13 For software Vendors / Developers
The GIGS software review process:
Provides a means of self-certification or self-validation of the geospatial capabilities of the software – essentially: does it work as we would like it to?
Enables more effective marketing of the product by communicating the results of the review to (prospective) clients – we can show evidence that it is able to display and perform with the correct level of geospatial integrity in the areas it is designed to work within.
Helps the vendor to identify development needs and prioritise improvements in the software – if it cannot perform a specific operation correctly, and that operation is deemed important to the target audience of the software, this helps direct programming effort to address the issue.
The structured software review provides an opportunity for education in this geodetic niche discipline and offers structure for communications with customers. It can be useful in communicating to management and the user community the importance of attention to detail in these niche areas, and the magnitude of the consequences that any failure of geospatial integrity may have.

14 The GIGS test dataset was designed:
for use in the evaluation of the geospatial integrity of geoscience software examined within the GIGS project; to remain as a test harness for future reviews of geoscience software after publication of the GIGS Guidelines. It consists of a series of files provided in a variety of formats, including industry data exchange formats and Microsoft Excel v2003 (.xls). Each file is designed for a specific GIGS test; where practical, data from one test is reused for other test procedures.
The review and evaluation process is described in part 1 of this three-part Guidance Note. The review process is guided by checklists which describe what the reviewer should test for within the software (part 2 of this Guidance Note). In many places these checklists direct the reviewer to carry out test procedures using data from the GIGS Test Dataset. This part 3 document describes the test data and procedures for using that test data. The test data can be described as being in one of three categories: geodetic data definitions, used for the series 2000 (predefined geodetic parameter library), series 3000 (user-defined geodetic parameter library) and series 7000 (deprecation) tests; conversion and transformation data, used for Data Operations series 5100 (map projections) and 5200 (other coordinate operations) tests; seismic and wellbore data, used for Data Operations series 5300 (2D seismic location data), 5400 (3D seismic location data) and 5500 (wellbore data) tests. There are no test data files for the Documentation and Release Notes (series 1000) or Error Trapping (series 8000) tests. Some User Interface (series 4000) and Audit Trail (series 6000) tests utilise test data used in the series 5000 tests.
At this time, boundary and cultural data have not been developed for inclusion in the GIGS Test Dataset.

15 FAQs
I am a Vendor / Developer: My application gets a G/S/B rating. So what? Can I advertise my rating? What happens if I am nearly G/S/B?
I am a Client / User: My application gets a G/S/B rating. What are the risks to my work/business? What should I do if the software does not reach S (silver)? Can I add more tests?
I am a Vendor / Developer
My application gets a G/S/B rating. So what? Silver indicates the software establishes and maintains geospatial integrity to a fully satisfactory degree, based on industry best practices. If my software does not reach this standard, the software review tests will indicate the areas of concern. If your software is gold, the review will indicate the specific areas where it exceeds this standard.
Can I advertise my rating? Of course. In fact, clients may ask you for your GIGS rating. However, it should be recognised that the software review is only applicable to the particular software version tested in the review. When new versions of the software are released, tests may require review and re-testing depending upon the software changes made.
What happens if I am nearly G/S/B? Two responses here. The first, as mentioned previously: the software review tests will indicate the areas of concern and where improvements can be made. Secondly, discussion with your clients regarding areas of shortfall will enable the client to mitigate these shortfalls through corrective action in their use of the software – either by avoiding use of that “feature” or by utilising an alternative approach/software package if it is determined to be a serious omission.
I am a Client / User
My application gets a G/S/B rating. What are the risks to my work/business? Close discussion with the vendor/developer will determine what geospatial integrity issues exist with that package (i.e. the GIGS test review results). Knowledge of what geospatial integrity issues there are with the software will help with the decision on how to mitigate these issues: if the software does not support a particular ProjCRS, then you know either to use an alternative software package that does, or simply to work in a different ProjCRS.
What should I do if the software does not reach silver level? Work closely with the vendor/developer to understand the points of failure. These may be in the user interface, in which case extra training may be necessary for the geoscience users, and the vendor/developer should be encouraged to improve their software in this area for future releases.
Can I add more tests? The tests are not exhaustive. They were developed based upon criteria from the original JIP and supplemented through review within the OGP Geomatics Committee. They can and should be added to and improved when dealing with specific Areas of Interest that have specialist or non-standard CRSs.

16 Backup slides

17 GIGS software review checklist
It is strongly recommended to use the MS-Excel spreadsheet version of the GIGS software review checklist to conduct the software evaluation. However, software evaluators who do not wish to do so can use the printed versions of the groups of Test Series in this document. The main advantages of the spreadsheet version are the overview it provides by using worksheets (tabs) and the automatic calculation of a summary score. Furthermore, mistakes are less likely because the spreadsheet contains (limited) functionality for checking the consistency of entries, and correction of mistakes is easier. The evaluator should enter the response of the software to the test criterion in the column marked in the header with the text “Enter Yes or No”. In the spreadsheet, the fields in this column will only permit “Yes” or “No” as valid entry values. Some tests offer several possible responses, reflecting the degree to which geoscience software may satisfy the test criterion. These reflect the Bronze, Silver and Gold classification levels explained in Section 3.3 of OGP document “Geospatial Integrity of Geoscience Software Part 1 – GIGS Guidelines”. The relevant fields are colour-coded with a colour that symbolises the classification level of the test; in addition, the classification level of each possible response is indicated by a letter in bold and in brackets. In a number of instances an informative question precedes the test. The last column of each Test Series worksheet allows the evaluator to record comments that qualify or provide additional information on the test response. As space is limited in this column, software evaluators may wish to create a separate file to record such information, entering cross-references in this worksheet column. The tests are numbered (1, 2, ...) and, where relevant, sub-numbered [i), ii), iii), ...]; the first two columns of the spreadsheets are reserved for this numbering. The test criterion is written in black font against a pale green background. Italic text in dark grey has been added to some test criteria to provide clarification of the test.
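The spreadsheet's entry validation can be mimicked in a few lines (a sketch only; the real checklist uses Excel data validation, and `validate_entry` is a hypothetical helper, not part of the GIGS material):

```python
VALID_ENTRIES = ("Yes", "No")

def validate_entry(value: str) -> str:
    # Normalise case and whitespace, then accept only "Yes" or "No",
    # mirroring the spreadsheet's restriction on this column.
    cleaned = value.strip().capitalize()
    if cleaned not in VALID_ENTRIES:
        raise ValueError(f"invalid checklist entry: {value!r} (enter Yes or No)")
    return cleaned

print(validate_entry(" yes "))  # Yes
```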

18 Consolidation of evaluation results
GIGS grading. The supplied spreadsheet contains formulas that automatically summarise the entries into a consolidated result per Test Series. This is helpful in reporting the results of any structured software review conducted with the GIGS methodology. To obtain, for example, a Silver-level score for a Test Series, all tests that list a Silver-graded response should have a score of at least Silver, while all tests that only allow a Bronze-level response should have a score at that Bronze level. If even one test achieves only a Bronze score where a Silver option was listed, the overall score reverts to Bronze; if a Bronze-rated test response is also missed, no GIGS rating is achieved at all. This method of scoring demonstrates the progressive nature of the GIGS rating: a Silver grade implies that all Bronze-rated requirements are met. The consolidated score for any given Test Series therefore shows the minimum level at which the software is rated. It is, in other words, not possible to compensate for shortcomings in one aspect of the software with superior results in other aspects. The Elementary score does not play a role in the evaluation of Bronze, Silver or Gold, as it applies to an entirely different category of software, without coordinate operation capability. For such software an Elementary grade can only be achieved if all tests containing a response for the Elementary category have received a “Yes” entry.
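The consolidation logic described above can be sketched as a minimum-level calculation (illustrative Python; the level names match GIGS, but the function and data shapes are assumptions, not the spreadsheet's actual formulas):

```python
LEVELS = ["None", "Bronze", "Silver", "Gold"]  # ascending order
RANK = {name: i for i, name in enumerate(LEVELS)}

def consolidated_score(tests):
    """tests: list of (achieved, max_available) level names per test.

    A series reaches a level only if every test reaches it, capped at the
    highest level that test offers - so a Bronze-only test cannot block a
    Silver rating, but one missed Bronze response voids the rating entirely.
    """
    for candidate in reversed(LEVELS[1:]):  # try Gold, then Silver, then Bronze
        if all(RANK[achieved] >= min(RANK[candidate], RANK[available])
               for achieved, available in tests):
            return candidate
    return "None"

tests = [("Silver", "Gold"), ("Bronze", "Bronze"), ("Silver", "Silver")]
print(consolidated_score(tests))  # Silver
```

In the example, the first test missed its Gold option but met Silver, and the Bronze-only test met Bronze, so the consolidated result is Silver: the minimum level achieved across the series.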

19 GIGS key geodetic facts. Coordinates are ambiguous unless the coordinate reference system (CRS) to which they are referenced is identified. Latitude and longitude, or geographical coordinates, are typically used to map on the curved surface of the earth or ellipsoid; a latitude and longitude graticule can however be plotted on a two-dimensional map. The choice of a reference ellipsoid does not define a geodetic datum; a specific ellipsoid can be the basis for many geodetic datums. A geodetic datum can have only one reference ellipsoid; identifying a specific datum implies that ellipsoid and no other. Easting and northing coordinates are typically used to map on a projected plane. Map projection formulae distort the true curved surface of the earth in area, shape, orientation and scale by representing it on a two-dimensional flat map surface. Map coordinates are not unique unless qualified with all parameters of the projected CRS (including specification of its base geographic CRS, its map projection and its coordinate system). Heights are not unique unless the CRS to which they are referenced, including the vertical datum, is identified. Azimuths and bearings are not unique unless qualified with a heading reference. Length and angular parameter values are not unique unless qualified with a unit identification.

