
1 José Paulo Leal | Ricardo Queirós
CRACS & INESC-Porto LA, Faculdade de Ciências, Universidade do Porto
Rua do Campo Alegre, 1021, 4169-007 Porto, PORTUGAL
A programming exercise evaluation service for Mooshak

2 Outline
1. Introduction: Context, Motivation, Goal
2. Architecture: eLearning Frameworks, E-Framework, Evaluation service (service genre, expression and usage model)
3. Design
4. Conclusion

3 1. Introduction: Context
- Experience from projects with evaluation components
  - Mooshak: contest management system for ICPC contests
  - EduJudge: use of the UVA programming exercise collection in LMSs
- Emergence of eLearning frameworks
  - advocate SOA approaches to facilitate technical interoperability
  - based on a survey, the most prominent is the E-Framework (E-F)

4 1. Introduction: Motivation
- Integration of systems for the automatic evaluation of programs
  - program evaluators are complex and difficult to integrate into eLearning systems (e.g. LMS)
  - program evaluators should be autonomous services
- Modelling evaluation services
  - communication with heterogeneous systems: Learning Objects Repositories (LOR), Learning Management Systems (LMS), Integrated Development Environments (IDE)
  - conformance to eLearning frameworks improves interoperability

5 1. Introduction: Motivation
- Integration of the evaluation service in an eLearning network

6 1. Introduction: Goal
1. Architecture
   1. Integration of the evaluation service in an eLearning network
   2. Definition of an evaluation service in an eLearning framework
   3. Formalisation of concepts related to program evaluation
2. Design
   1. Extend an existing contest management system
   2. Expose evaluation functions as services
   3. Reuse existing administration functions

7 2. Architecture: eLearning frameworks
- Specialized software frameworks
- Advocate SOA to facilitate technical interoperability
- Types:
  - Abstract: specifications and best practices for eLearning systems (e.g. IEEE LTSA, OKI, IMS AF)
  - Concrete: service designs and/or components that can be integrated into implementations (e.g. SIF, E-F)
- Survey: E-F and SIF are the most promising frameworks; they are the most active projects, both with a large number of implementations worldwide

8 2. Architecture: E-Framework
- initiative established by JISC, DEEWR, NZ MoE and SURF
- aims to facilitate system interoperability via a SOA approach
- has a knowledge base to support its technical model
- components and user roles:
  - Service Genre: collection of related behaviours that describe an abstract capability (non-technical expert, e.g. IT Manager)
  - Service Expression: a specific way to realize a service genre with particular interfaces and standards (technical expert, e.g. Developer)
  - Service Usage Model: the relationships among technical components (services) used for applications (domain expert, e.g. Business Analyst)
- http://www.e-framework.org/

9 2. Architecture
- support of the online community (developers wiki)
- contribution to the E-Framework:
  - Service Genre (SG)
  - Service Expression (SE)
  - Service Usage Model (SUM)

10 2. Architecture - SG
- Text File Evaluation Service Genre
  - responsible for the assessment of a text file
  - the text file holds an attempt to solve an exercise
  - the exercise is described by a learning object
  - supports three functions: ListCapabilities, EvaluateSubmission, GetReport
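A minimal sketch of such a service genre as a Python interface; only the three function names come from the slide, the signatures and types are assumptions made for illustration.

```python
# Hypothetical interface for the Text File Evaluation Service Genre;
# the three operations come from the slide, the signatures are assumed.
from abc import ABC, abstractmethod

class TextFileEvaluationService(ABC):
    @abstractmethod
    def list_capabilities(self) -> list[str]:
        """Return the capabilities supported by this evaluator."""

    @abstractmethod
    def evaluate_submission(self, exercise_ref: str, attempt: str,
                            capability: str) -> str:
        """Request an evaluation; return a ticket or an evaluation report."""

    @abstractmethod
    def get_report(self, ticket: str) -> str:
        """Return the evaluation report identified by a ticket."""
```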

11 2. Architecture - SG
- ListCapabilities function: lists all the capabilities supported by a specific evaluator
  - capabilities depend strongly on the evaluation domain
  - computer programming evaluator: programming language compiler
  - electronic circuit simulator: collection of gates allowed in a circuit

12 2. Architecture - SG
- EvaluateSubmission function: requests an evaluation for a specific exercise
  - request includes:
    - reference to an exercise as a learning object held in a repository
    - text file with an attempt to solve that exercise
    - evaluator capability necessary for a proper evaluation of the attempt
  - response includes:
    - a ticket for a later report request, or a detailed evaluation report
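As an illustration only, the EvaluateSubmission request and response could be modelled as below; the fields follow the bullets above, while the field names themselves are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EvaluationRequest:
    exercise_ref: str   # reference to the exercise LO held in a repository
    attempt: str        # text file with an attempt to solve the exercise
    capability: str     # evaluator capability needed for a proper evaluation

@dataclass
class EvaluationResponse:
    ticket: Optional[str] = None   # ticket for a later GetReport request
    report: Optional[str] = None   # or an immediate detailed evaluation report
```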

13 2. Architecture - SG
- GetReport function: gets the report for a specific evaluation
  - the report included in the response may be transformed on the client side:
    - based on an XML stylesheet
    - able to filter out parts of the report
    - able to calculate a classification based on its data
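A minimal sketch of such a client-side transformation, assuming the stylesheet is an XSLT and that the lxml library is used; the file names are placeholders.

```python
from lxml import etree

report = etree.parse("report.xml")                    # report returned by GetReport
stylesheet = etree.XSLT(etree.parse("classify.xsl"))  # client-supplied stylesheet
transformed = stylesheet(report)                      # filter parts / compute a grade
print(str(transformed))
```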

14 2. Architecture - SE
- The Evaluate-Programming Exercise SE
  - requests: program source code, reference to the programming exercise as a Learning Object (LO)
  - resources: learning objects retrieved from a repository; LOs are archives with assets (test cases, description) and metadata
  - responses: XML document containing the evaluation report, with details of the test case evaluations
  [Diagram: source code + LO reference as input to the Evaluation Engine, the LO as its resource, and the report as output]

15 2. Architecture - SE
- The E-Framework model contains 20 distinct elements to describe a service expression (SE)
- Major E-Framework elements:
  1. Behaviours & Requests
  2. Use & Interactions
  3. Applicable Standards
  4. Interface Definition
  5. Usage Scenarios

16 1. Behaviours & Requests
- details technical information about the functions of the SE
- the 3 types of request handled by the SE:
  - ListCapabilities: provides client systems with the capabilities of a particular evaluator
  - EvaluateSubmission: allows requesting an evaluation for a specific programming exercise
  - GetReport: allows a requester to get the report for a specific evaluation using a ticket

17 2. Architecture - SE
2. Use & Interactions
- illustrates how the functions defined in the Behaviours & Requests section are combined to produce a workflow
  [Diagram: the Learning Management System sends the LO reference and attempt to the Evaluation Engine (correction and classification); the engine requests the LO from the Learning Objects Repository, receives it, and returns the evaluation report to the LMS]
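A sketch of the engine's side of this workflow, with hypothetical helper functions and placeholder bodies; only the order of the steps comes from the diagram.

```python
def evaluate_attempt(lo_reference: str, attempt: str) -> str:
    # the LMS supplies the LO reference and the attempt (step 1)
    learning_object = fetch_learning_object(lo_reference)   # steps 2-3: repository round trip
    return correct_and_classify(learning_object, attempt)   # step 4: report back to the LMS

def fetch_learning_object(lo_reference: str) -> dict:
    # placeholder: a real engine would query the Learning Objects Repository
    return {"tests": [], "description": ""}

def correct_and_classify(learning_object: dict, attempt: str) -> str:
    # placeholder: a real engine would run the attempt against the test cases
    return "<report/>"
```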

18 2. Architecture - SE
3. Applicable Standards
- enumerates the technical standards used in the SE
  - content: IMS CP, IEEE LOM, EJ MD
  - interoperability: IMS DRI

19 2. Architecture - SE
4. Interface Definition
- formalizes the interfaces of the service expression
- syntax of the requests and responses of the SE functions
- functions exposed as SOAP and REST web services

  Function             Web service   Syntax
  ListCapabilities     SOAP          ERL ListCapabilities()
                       REST          GET /evaluate/ > ERL
  EvaluateSubmission   SOAP          ERL Evaluate(Problem, Attempt, Capability)
                       REST          POST /evaluate/$CID?id=LOID ERL
  GetReport            SOAP          ERL GetReport(Ticket)
                       REST          GET $Ticket > ERL
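A sketch of a client using the REST bindings in the table, assuming the Python requests library and a hypothetical base URL; $CID and LOID are placeholders taken from the table, and the concrete capability id ("java") and attempt file name are invented for illustration.

```python
import requests

BASE = "http://evaluator.example.org"   # hypothetical service address

# ListCapabilities: GET /evaluate/ returns an ERL document
capabilities_erl = requests.get(f"{BASE}/evaluate/").text

# EvaluateSubmission: POST /evaluate/$CID?id=LOID with the attempt as the body
with open("attempt.java", "rb") as f:   # placeholder attempt file
    evaluation_erl = requests.post(f"{BASE}/evaluate/java",
                                   params={"id": "LOID"},
                                   data=f.read()).text

# GetReport: GET $Ticket, where the ticket is a URL taken from the ERL response
# report_erl = requests.get(ticket_url).text
```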

20 2. Architecture - SE
4. Interface Definition
- Evaluation Response Language (ERL)
  - covers the definition of the response messages of the 3 functions
  - formalised in XML Schema
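Since ERL is formalised in XML Schema, a client could validate a response before processing it; a sketch using lxml, with placeholder file names (the actual schema is not included in the presentation).

```python
from lxml import etree

erl_schema = etree.XMLSchema(etree.parse("erl.xsd"))   # placeholder schema file
response = etree.parse("response.xml")                 # ERL response to check
if erl_schema.validate(response):
    print("valid ERL response")
else:
    print(erl_schema.error_log)
```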

21 2. Architecture - SE
5. Usage Scenarios

  Learning                 Examples          Issues/Features
  Curricular (classes)     Self-evaluation   feedback for wrong submissions
                           Assignments       feedback & evaluation
                           Exams             computes a grade
  Competitive (contests)   IOI               points for accepted test cases
                           ICPC              penalizations for wrong submissions
                           IEEEXtreme        high number of participants

22 2. Architecture - SUM
- Text File Evaluation SUM
  - describes the workflows within a domain
  - composed of SGs or SEs
  - template diagram from the E-F
  - two business processes:
    1. Archive Learning Objects
    2. Evaluate Learning Objects

23 2. Architecture - SUM
- Archive Learning Objects (role: Teacher)
  - searches for an exercise in a Learning Objects Repository (LOR)
  - links the most appropriate one in a Learning Management System (LMS)
- Evaluate Learning Objects (role: Student)
  - gets the exercise from the LMS
  - solves the exercise in a specialized resolution environment
  - submits the resolution to an evaluation engine (EE)
  - receives a notification with an evaluation report

24 3. Design
- Evaluation Service: design principles & decisions
  - support the E-Framework architecture
  - extend an existing contest management system (Mooshak)
  - reuse existing functions rather than implementing new ones
  - create a front controller for the service
  - maintain the administration web interface
  - map service concepts to Mooshak concepts

25 3. Design
- Evaluation Service: mapping service concepts to Mooshak
  - Service -> Contest
    - only contests marked as serviceable
    - several contests can be served simultaneously
    - the same contest can be served and managed
  - Capability -> Contest + Language
    - service requests specify the contest & language (within the contest)
    - controls the evaluation context
    - produces the evaluation report (XML)

26 3. Design
- Evaluation Service: mapping service concepts to Mooshak
  - Service requester -> Team
    - IDs based on remote IP address & port
    - basis for authentication
    - also useful for auditing
  - Learning Object -> Problem
    - LOs downloaded from remote repositories and converted to Mooshak problems
    - downloaded problems used as a cache

27 4. Conclusion
- Definition of an evaluation service
- Contribution to the E-Framework with a new Service Genre, Service Expression and Service Usage Model
- Validation of the proposed model with an extension of the Mooshak contest management system
- Current and future work
  - first prototype already available
  - communication with repositories still in development
  - integration in a network of eLearning systems
  - full evaluation of this service planned for next fall

28 Questions?
Authors
José Paulo Leal: zp@dcc.fc.up.pt, http://www.dcc.fc.up.pt/~zp
Ricardo Queirós: ricardo.queiros@eu.ipp.pt, http://www.eseig.ipp.pt/docentes/raq
Thanks!

