Slide 1: Benchmarking in WP 2.1
Raúl García-Castro, Asunción Gómez-Pérez
September 28th, 2004
© R. García-Castro, A. Gómez-Pérez

Slide 2: Index
1. Progress
2. Deliverable 2.1.4

Slide 3: Progress: benchmarking activities timeline
[Timeline chart over months 0 to 48, each activity marked Finished, Started, or Not started]
- D2.1.1: Benchmarking state of the art
- D2.1.4: Benchmarking methodology, criteria, test suites
- D2.1.6: Benchmarking ontology building tools
- Benchmarking querying, reasoning, annotation
- Benchmarking Semantic Web services

Slide 4: What has been done? (in D2.1.1)
- Overview of benchmarking, experimentation, and measurement (evaluation and benchmarking of ontology technology and methods: desired attributes, weaknesses, comparative analysis, continuous improvement, best practices, ...)
- State of the art of ontology-based technology evaluation
- Survey of scalability techniques for reasoning with ontologies
- Recommendations

Slide 5: What are we doing? T2.1.4: Benchmarking methodology, criteria, and test suites
Goal: provide a framework for the benchmarking activities in WP 2.1 (and possibly other WPs).
- Benchmarking methodology
- Ontology tools: ontology building tools, annotation tools, querying and reasoning services, Semantic Web Services technology
- General evaluation criteria: interoperability, scalability, robustness
- Benchmark suites for: interoperability, scalability, robustness
- Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ...
- Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices

Slide 6: What will be done? T2.1.6: Benchmarking of ontology building tools
- Tools/partners: ...
- Interoperability questions: Do the tools import/export from/to RDF(S)/OWL? Are the imported/exported ontologies the same? Is there any knowledge loss during import/export?
- Benchmark suites: RDF(S) import capability, OWL import capability, RDF(S) export capability, OWL export capability, ...
- Experiment results per partner (e.g. UPM): test 1, test 2, test 3, ..., each marked OK or NO
- Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ...
- Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices
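As an illustration of the kind of check behind the RDF(S)/OWL import/export benchmark suites, the following minimal sketch compares an original ontology with the version re-exported by a tool under test. It assumes Python with the rdflib library; the file names are placeholders for the example, not part of the deliverable.

```python
from rdflib import Graph
from rdflib.compare import isomorphic

# Hypothetical files: the benchmark ontology and the tool's re-exported version of it.
ORIGINAL = "benchmark_ontology.rdf"   # assumption: RDF(S) test ontology
EXPORTED = "tool_export.rdf"          # assumption: the tool's export of the same ontology

original = Graph().parse(ORIGINAL, format="xml")
exported = Graph().parse(EXPORTED, format="xml")

# "Are the imported/exported ontologies the same?" -> compare the RDF graphs.
if isomorphic(original, exported):
    print("OK: no knowledge loss detected in the import/export round trip")
else:
    # "Is there any knowledge loss during import/export?" -> triples missing after the round trip.
    lost = original - exported
    print(f"NO: {len(lost)} triple(s) lost during import/export")
```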

Slide 7: Index
1. Progress
2. Deliverable 2.1.4

Slide 8: D2.1.4: Deliverable outline
D2.1.4: Specification of a methodology, general criteria, and test suites for benchmarking ontology tools
1. Introduction
2. Benchmarking methodology
3. Building test suites for ontology tools
4. General supporting tools for benchmarking
5. Benchmarking ontology development tools and tool suites
6. Benchmarking ontology-based annotation tools
7. Benchmarking ontology querying tools and inference engines
8. Benchmarking Semantic Web Service technology
9. Conclusion
10. Glossary

Slide 9: D2.1.4: Benchmarking methodology
The benchmarking process is planned, collaborative, more Semantic Web oriented, and more KW oriented. Each process is described through its inputs, outputs, and tasks (Task 1 ... Task n).
Methodology processes:
Plan: 1. Goals identification; 2. Subject identification; 3. Management involvement; 4. Participant identification; 5. Planning and resource allocation; 6. Partner selection
Experiment: 7. Experiment definition; 8. Experiment execution; 9. Experiment results analysis
Improve: 10. Report writing; 11. Findings communication; 12. Findings implementation; 13. Recalibration

Slide 10: D2.1.4: Benchmarking methodology: Plan
1. Benchmarking goals identification: Goals depend on the organisation's vision, objectives, and strategies.
2. Benchmarking subject identification: Analyse the current tools in the organisation. Select, understand, and document the tool whose improvement would significantly benefit the organisation, according to end-user needs or expectations, organisational goals, etc.
3. Management involvement: Inform the organisation's management about the benefits and costs of the benchmarking study. Management support is needed to proceed and when implementing changes based on the benchmarking.
4. Participant identification: Identify and contact the members of the organisation that are involved with the selected tool. Select and train the members of the benchmarking team.
5. Benchmarking planning and resource allocation: The planning must consider time and resources, and must be integrated into the organisation's planning.
6. Benchmarking partner selection: Identify, collect, and analyse information about the tools that are considered the best. Select the tools to benchmark with and make contact with someone in their organisations. The partner organisations may not belong to KW; not all 'best in class' tools are developed by KW partners.

Slide 11: D2.1.4: Benchmarking methodology: Experiment
7. Experiment definition: Determine the experimentation plan and method and define the experiment to be performed. The experiment must collect not just data on the performance of the tools but also the reasons for that performance. Communicate the experimentation plan and method to the partners and agree on it.
8. Experiment execution: Perform the experiment according to the experimentation plan and method. The collected data must be documented and prepared for analysis.
9. Experiment results analysis: Compare the results obtained from the experiments and the practices that lead to these results. Document the findings in a report, including the best practices found (if any).
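As an illustration of how the collected data could be documented and prepared for analysis, here is a minimal Python sketch; the CSV layout, tool names, and test identifiers are assumptions made for the example, not prescribed by the methodology.

```python
import csv
from datetime import datetime, timezone

# Hypothetical per-test records from one experiment run.
results = [
    {"tool": "ToolA", "test": "rdfs-import-01", "outcome": "OK", "seconds": 0.42},
    {"tool": "ToolA", "test": "owl-export-03", "outcome": "NO", "seconds": 1.05},
]

# Store the results in a CSV file so they can later be compared across tools and partners.
with open("experiment_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "tool", "test", "outcome", "seconds"])
    writer.writeheader()
    for row in results:
        writer.writerow({"timestamp": datetime.now(timezone.utc).isoformat(), **row})
```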

Slide 12: D2.1.4: Benchmarking methodology: Improve
10. Benchmarking report writing: The benchmarking report must provide an understandable summary of the benchmarking study, with an explanation of the benchmarking process followed, the results and conclusions of the experiments, and the recommendations for improving the tools.
11. Benchmarking findings communication: Findings must be communicated to the whole organisation (including the identified participants) and to the benchmarking partners. Collect and analyse any feedback received.
12. Benchmarking findings implementation: Define a plan for implementing the benchmarking findings. Implement the changes needed to achieve the desired results. Periodically monitor the benchmarked tool.
13. Recalibration: Recalibrate the benchmarking process using the lessons learnt. The benchmarking process should be repeated continually in order to achieve continuous improvement.

Slide 13: D2.1.4: Building test suites for ontology tools
- How to develop a test suite.
- The desirable properties that a test suite should have.

Slide 14: D2.1.4: General supporting tools for benchmarking
List of tools that can be useful when performing benchmarking activities, such as:
- Test generators
- Workload generators
- Monitoring tools
- Statistical packages
- ...
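To make the idea of a test or workload generator concrete, the following minimal sketch produces a synthetic RDF(S) class hierarchy of configurable size, the sort of input a scalability benchmark might use. It assumes Python with rdflib; the namespace, sizes, and output file are illustrative assumptions.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/benchmark#")  # assumption: illustrative namespace

def generate_workload(num_classes: int, instances_per_class: int) -> Graph:
    """Generate a synthetic RDF(S) ontology: a chain of subclasses with instances."""
    g = Graph()
    g.bind("ex", EX)
    for i in range(num_classes):
        cls = EX[f"Class{i}"]
        g.add((cls, RDF.type, RDFS.Class))
        if i > 0:
            # Subclass chain, useful for stressing subsumption reasoning.
            g.add((cls, RDFS.subClassOf, EX[f"Class{i - 1}"]))
        for j in range(instances_per_class):
            inst = EX[f"Class{i}_instance{j}"]
            g.add((inst, RDF.type, cls))
            g.add((inst, RDFS.label, Literal(f"instance {j} of class {i}")))
    return g

# Example: a small workload for one scalability test run.
generate_workload(num_classes=100, instances_per_class=10).serialize("workload.rdf", format="xml")
```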

Slide 15: D2.1.4: Benchmarking ontology development / annotation / querying and inference / Semantic Web Service tools
Each of these chapters follows the same structure:
1. Candidate tools: list of candidate tools to be benchmarked, with a description and the reasons for their inclusion.
2. General evaluation criteria: the tools' functionalities, together with the general evaluation criteria that can be used when evaluating or benchmarking these functionalities, related to the WP 2.1 topics (scalability, robustness, and interoperability).
3. Test suites: test suites for the tools, related to the WP 2.1 topics (scalability, robustness, and interoperability).
4. Supporting tools: supporting tools specific to each kind of ontology tool.
5. Conclusion

Slide 16: D2.1.4: Glossary
Definitions of terms used in the deliverable: benchmark, benchmarking, benchmarking partner, best practice, interoperability, robustness, scalability, ...

Slide 17: D2.1.4: Tasks and responsibilities
D2.1.4: Specification of a methodology, criteria, and test suites for benchmarking ontology tools: Raúl García-Castro (UPM)
1. Introduction: Raúl García-Castro (UPM)
2. Benchmarking methodology: Raúl García-Castro (UPM)
3. Building test suites for ontology tools: Raúl García-Castro (UPM)
4. General supporting tools for benchmarking: Raúl García-Castro (UPM)
5. Benchmarking ontology development tools and tool suites: Raúl García-Castro (UPM)
6. Benchmarking ontology-based annotation tools: ? (Raúl asks Sheffield)
7. Benchmarking ontology querying tools and inference engines: Holger Wache
8. Benchmarking semantic web service technology: ? (Holger asks the WP 2.4 leader)
9. Conclusion: Raúl García-Castro (UPM)
10. Glossary: all contributors

Slide 18: D2.1.4: Time schedule
19 Nov: Contributions of the partners (to Raúl)
26 Nov: Draft v0, compilation of the parts (before the next meeting)
17 Dec: Draft v1, complete (to the Quality Assessor, the WP leader)
7 Jan: Draft v2, reviewed by the QA (to the Quality Controller; Holger asks Matteo Bonifacio or Roberta Cuel)
28 Jan: Draft v3, reviewed by the QC (to the Quality Assurance Coordinator)
14 Feb: Final version (to the European Commission)

Slide 19: Benchmarking in WP 2.1
Raúl García-Castro, Asunción Gómez-Pérez
September 28th, 2004

