TENCompetence Assessment Model, Related Tools and their Evaluation
Milen Petrov, Adelina Aleksieva-Petrova, Krassen Stefanov, Judith Schoonenboom, Yongwu Miao

Content
- TENCompetence Assessment Model
- Assessment Editor
- Assessment Run-time system
- Evaluation plan
- Main Results
- Conclusions

Main goals
- Design an assessment methodology for competences, through analysis of modern assessment methods, selection of appropriate methods and tools, and design of basic assessment activities.
- Outline the major principles for planning and designing effective assessment, and provide a framework and guidelines for the design of the unit of assessment.

TENCompetence Assessment Model
- Describes the life-cycle of the assessment
- Formally specified using UML diagrams
- Optimised for competence development assessments
- Developed by simplifying the existing OUNL/CITO model

Assessment model components
- Assessment design
- Item construction
- Assessment construction
- Assessment run
- Response rating

The reasons for simplifying the OUNL/CITO assessment model
- The OUNL/CITO model is extensive and complex, aiming for completeness in its coverage of all forms of assessment.
- The TENCompetence Assessment Model, as part of the TENCompetence Domain Model, need not duplicate its components.

Proof-of-concept tools
- Needed to validate the model
- Two competence assessment methods demonstrated: 360-degree feedback and portfolio assessment
- Assessment Authoring tool
- Assessment run-time tool
- Developed as Java Eclipse plug-ins

Relation between the TENCompetence assessment model, the QTI and LD specifications, and the first proof-of-concept tools (diagram slide)

TENCompetence Assessment Data Model
- Why needed: to provide the semantics required by the assessment specification
- Specified as an XML schema
- Stores the specific details of every stage of the assessment process
- Serves as the output of the editing tool and the input to the run-time tool (a validation sketch follows below)
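Because the data model is an ordinary XML schema, conformance of a tool's output can be checked with standard tooling. Below is a minimal sketch using the JDK's javax.xml.validation API; the schema and instance file names are hypothetical placeholders, not artefacts from the project.

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.File;

public class AssessmentInstanceValidator {
    public static void main(String[] args) throws Exception {
        // Load the data-model schema (file name is a hypothetical placeholder).
        SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("assessment-data-model.xsd"));

        // Validate an instance produced by the authoring tool before it is
        // handed to the run-time tool; validate() throws on the first error.
        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(new File("assessment-instance.xml")));
        System.out.println("Instance conforms to the Assessment Data Model schema.");
    }
}
```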

Functionality of the tools

Assessment Authoring tool
- Covers the first three phases of the TENCompetence Assessment Model: assessment design, item construction and assessment construction
- Implemented as an editor for the 360-degree feedback competence assessment method
- Produces an XML document conforming to the TENCompetence Assessment Data Model

The Authoring tool components
- Design phase Editor: defines the blueprint of the assessment
- Question editor (item construction phase): creates and edits different types of questions (demonstration items, construction items, and selection items)
- Assessment Test Editor: combines the assessment blueprint with the assessment items (a data-structure sketch follows below)
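As an illustration only, the sketch below shows the kind of data structures the three editors could manipulate: a design-phase blueprint, individual items of the three listed types, and a test that combines them. All class names are hypothetical; the actual editors are Eclipse plug-ins with their own internal model.

```java
import java.util.ArrayList;
import java.util.List;

enum ItemType { DEMONSTRATION, CONSTRUCTION, SELECTION }

class AssessmentItem {                  // created by the Question editor
    final ItemType type;
    final String prompt;
    AssessmentItem(ItemType type, String prompt) {
        this.type = type;
        this.prompt = prompt;
    }
}

class AssessmentBlueprint {             // defined in the Design phase Editor
    String method = "360-degree feedback";
}

class AssessmentTest {                  // assembled by the Assessment Test Editor
    final AssessmentBlueprint blueprint;
    final List<AssessmentItem> items = new ArrayList<>();
    AssessmentTest(AssessmentBlueprint blueprint) { this.blueprint = blueprint; }
}

public class AuthoringDemo {
    public static void main(String[] args) {
        AssessmentTest test = new AssessmentTest(new AssessmentBlueprint());
        test.items.add(new AssessmentItem(ItemType.SELECTION,
                "Rate this colleague's teamwork on a 1-5 scale."));
        System.out.println(test.blueprint.method + ": " + test.items.size() + " item(s)");
    }
}
```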

Assessment run-time tool
- Runtime environment for playing any non-traditional method of assessment
- Supports the TENCompetence Assessment Data Model
- Implemented to support the Portfolio Assessment method
- An assessment run is driven by an XML document conforming to the TENCompetence Assessment Data Model (a loading sketch follows below)
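A run-time player of this kind typically parses the XML instance and dispatches to a handler for the declared assessment method. The sketch below illustrates the idea with the JDK's DOM parser; the file name and the "method" attribute are hypothetical, since the real vocabulary is defined by the project's schema.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import java.io.File;

public class AssessmentRunLoader {
    public static void main(String[] args) throws Exception {
        // Parse an assessment instance produced by the authoring tool.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("assessment-instance.xml"));
        Element root = doc.getDocumentElement();

        // Dispatch to a player for the declared assessment method.
        String method = root.getAttribute("method");
        switch (method) {
            case "portfolio":
                System.out.println("Starting a portfolio assessment run ...");
                break;
            case "360-feedback":
                System.out.println("Starting a 360-degree feedback run ...");
                break;
            default:
                throw new IllegalArgumentException("Unsupported method: " + method);
        }
    }
}
```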

The Runtime tool components
- Assessment run-time simulator: loads and performs non-traditional forms of assessment (such as portfolio assessment, peer review, etc.)
- Response processing: tracks the results from the assessment (a minimal sketch follows below)
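As a rough illustration of the response-processing component, the sketch below collects rater responses during a run and aggregates them into an average score. The classes and the 1-5 rating scale are assumptions made for the example, not the project's actual implementation.

```java
import java.util.ArrayList;
import java.util.List;

class Response {
    final String raterId;
    final int score;               // e.g. a 1-5 rating (assumed scale)
    Response(String raterId, int score) {
        this.raterId = raterId;
        this.score = score;
    }
}

public class ResponseProcessor {
    private final List<Response> responses = new ArrayList<>();

    void record(Response r) { responses.add(r); }

    double averageScore() {
        return responses.stream().mapToInt(r -> r.score).average().orElse(0.0);
    }

    public static void main(String[] args) {
        ResponseProcessor rp = new ResponseProcessor();
        rp.record(new Response("peer-1", 4));
        rp.record(new Response("peer-2", 5));
        System.out.println("Average rating: " + rp.averageScore());
    }
}
```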

Evaluation goals
- Can the model be used to implement different competence assessment methods?
- Are the tools capable of demonstrating the applicability of the model?
- How complex are these tasks?

Evaluation Methodology
- Evaluate the functional quality of the tools (test case review checklist)
- Rate the interface and usability of the systems (end-user questionnaire)
- Evaluate the software code quality and complexity (expert evaluation checklist)

Users involved
- Students for the usability testing
- Educational technology experts for testing the functional quality of the tools
- Software technology experts for the evaluation of the code quality and complexity

Design principles of ISO 9241 (Part 10)
- Suitability for the task
- Self-descriptiveness
- Controllability
- Conformity with user expectations
- Error tolerance
- Suitability for individualisation

Excerpt from the user questionnaire (shown as a figure on the slide)

Evaluation procedure
Step 1: Download the evaluation bundle (user guide and assessment instruments)
Step 2: Download the corresponding proof-of-concept tool
Step 3: Unpack and install the software
Step 4: Work with the software
Step 5: Fill in the assessment instrument
Step 6: Return the completed assessment instrument

Details of the evaluation process

Description                                      Number of users         Note
Unique software downloads                        52 users                3 test users
Unique users (test and anonymous users removed)  48 users + 1 anonymous  1 anonymous
Non-unique software downloads                    73 downloads            includes other items from the site
Returned assessment instruments                  33 users                all valid
Returned feedback forms                          20 users                3 invalid; 2 blank

ISO Evaluation Results

Design principle of ISO 9241 (Part 10)   Average result
Suitability for the task                 3.69
Self-descriptiveness                     3.64
Controllability                          3.79
Conformity with user expectations        3.08
Error tolerance                          3.33
Suitability for individualisation        3.62

Main Evaluation Results
- Expert software code evaluation: good code quality, but very high complexity if the tools were to become real-life tools
- Functional quality: very good
- Usability evaluation: overall good quality, but several errors and issues identified

Evaluation Conclusions
- The tools proved the applicability of the model
- The model has sufficient modelling power
- The complexity of moving from proof-of-concept to real-life tools is very high and risky

Next steps
1) Further research into improving the mapping algorithms used to carry out the transformation to the data model
2) Revise and extend the TENCompetence Assessment Data Model, using elements functionally similar to those used in IMS LD and QTI

Questions?