1
FREMA: e-Learning Framework Reference Model for Assessment
Design Patterns for Wrapping Similar Legacy Systems with Common Service Interfaces
Yvonne Howard, Learning Technologies, University of Southampton, UK
2
Background
What is FREMA?
–JISC-funded project between Southampton, Strathclyde and Hull
–Aim: to produce a Reference Model of the e-Learning Assessment Domain
–To aid interoperability and the creation of Assessment Services for the e-Framework
What is the e-Framework?
–Service Oriented Architecture for e-learning systems
–Layered Web Services (Domain Services over Common Services)
–Dynamic and evolving
What is a Reference Model for Assessment?
–Assessment is a broad and complex domain
–Not enough to describe and define a set of services
–Need a proper audit trail of decision making
–Start by defining the domain
–Work up through use cases to services (and example implementations)
–An evolving model
–Allow the community to contribute at every stage
3
Anatomy of FREMA Reference Model
[Diagram: layers of the reference model – Assessment Domain Definition, Common Usage Patterns, Use Cases, Gap Analysis, Service Profiles, Reference Impl']
Domain Definition
–Overview of the domain, and how projects and standards fit within it
Identifying Common Usage Patterns
–Scoping the FREMA Project
Developing Use Cases
–Formal descriptions of usage patterns
Gap Analysis
–Mapping of use cases to the services in ELF
Service Profiles
–Formal descriptions of those services
Example Implementation
–Of key/core services
–Examples, validation, resource
4
Semantic Wiki
Used to build a semantic wiki (a wiki in which all the pages and links are typed)
Can model all the levels of the Reference Model
Enables smart searching and analysis
–Semantic search
–Dynamic gap analysis
–Concept maps
Open editing, but with administrator controls
9
Analysis Tools: Gap Analysis
10
Service Usage Model
Describes a scenario in which services work together
–Use Case Diagram
–Set of Abstract Logical Service Expressions
–Interaction Diagram
11
SUM: Description
Formal: as a Use Case Diagram
Informal: as a narrative description
12
SUM: Structure and Organisation
13
Service Expression
Logical, abstract description
14
SUM: Functionality
Workflow and processes
Semi-formalised as a UML Interaction Diagram
15
Scenario: Technical Developer
Will, Technical Developer: ‘I want to look up use cases and scenarios to help me design my application. This will help me to define my footprint in the assessment domain. I see there are some web services I could re-use, but some are missing. What standards and patterns can I use when writing my own web services to ensure that I can interoperate with the web services I’ve chosen?’
16
Service Patterns
Exemplar workflows
–Described as patterns
–Show how service interoperability can be achieved
–Solutions to common interoperability problems
Reference implementations of Service Patterns
–WS-Interaction diagrams
–WSDL
–Java code to download and install
–Running services to try
[Diagram: reference model layers – Assessment Domain Definition, Use Cases, Service Profiles, Gap Analysis, Reference Impl', Common Usage Patterns]
17
Wrapping legacy systems with a service interface
Legacy systems may contain valuable IP
Do we wrap legacy systems individually, even if they have similar functionality?
Or do we build small interfaces that legacy systems can support as appropriate?
Granularity issues
–Too small = large design overhead
–Too large = bulky and inappropriate
Goal
–Consolidate functionality into only a few interoperable services that are robust and complete
18
Design Patterns
‘Gang of Four’ description
–Describes a recurring problem
–Its solution
–The context in which it applies
Patterns capture the experience of software engineering and design experts
Three patterns
–LCD: Lowest Common Denominator
–MPI: Most Popular Interface
–NI: Negotiated Interface
19
Lowest Common Denominator
Intent
–The simplest common interface for 2 or more components which share some common methods
Motivation
–Similar legacy apps with overlapping functionality
–Wrap with a common interface
–Direct relation between the methods of the common interface and the functionality of the legacy systems
Implementation
–Strict intersection of the functionality of the legacy components
–Create interfaces for the individual components
–Normalise methods, then extract the common methods
–Components may have different data models
Applicability
–Feasible when the intersection captures meaningful core functionality
Consequences
–Simple to derive
–Value depends on the size of the intersection
–Loses functionality richness
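A minimal Java sketch of the LCD pattern, assuming two hypothetical legacy item-bank APIs with invented method names (these are not the real TOIA or E3AN interfaces). The common interface is the strict intersection of the two, and each wrapper simply delegates; anything outside the intersection is lost.

import java.util.List;

// Hypothetical native API of the first legacy item bank.
interface LegacyBankA {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
    void importQtiPackage(byte[] qtiPackage);        // bulk import: only A offers this
}

// Hypothetical native API of the second legacy item bank.
interface LegacyBankB {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
    List<String> searchByKeyword(String keyword);    // keyword search: only B offers this
}

// LCD interface: the strict intersection of the two legacy APIs.
interface ItemBankLcd {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
}

// Wrapper exposing legacy system A through the common interface by direct delegation.
class BankALcdWrapper implements ItemBankLcd {
    private final LegacyBankA delegate;

    BankALcdWrapper(LegacyBankA delegate) { this.delegate = delegate; }

    public List<String> listQuestionIds() { return delegate.listQuestionIds(); }
    public String fetchQuestion(String id) { return delegate.fetchQuestion(id); }
    // importQtiPackage is lost: it lies outside the intersection, so the LCD cannot expose it.
}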
20
Most Popular Interface
Intent
–A rounded, robust common interface for 2 or more non-identical components with some methods in common
Motivation
–Similar legacy apps with overlapping functionality
–Wrap with a common interface
–A compromise interface
–Reflects best practice
Implementation
–A set of methods, M, chosen by experts (best practice) reflects the community's expectation of functionality
–The intersection of the legacy components is a proper subset of M
Applicability
–Feasible when there is agreement on the core functionality that should be expected
Consequences
–Complex to derive
–May need to create and hold additional information in the wrapping service
–May need analysis or mapping tables in the wrapper
–May capture a broad set of functionality from the legacy systems
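A minimal Java sketch of the MPI pattern under the same assumptions: the expert-chosen interface M includes keyword search and metadata access that this particular legacy system lacks natively, so the wrapper synthesises the search and holds the extra information itself (the additional-information and mapping-table points above). All names are illustrative.

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical legacy API that can only list and fetch questions.
interface LegacyQuestionStore {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
}

// MPI interface M: the methods experts agree an item bank should offer;
// the legacy system's native functionality is a proper subset of this.
interface ItemBankMpi {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
    List<String> searchByKeyword(String keyword);
    String getMetadata(String questionId);
}

// Wrapper that fills the gap between the legacy API and the expert interface.
class LegacyStoreMpiWrapper implements ItemBankMpi {
    private final LegacyQuestionStore delegate;
    private final Map<String, String> metadataTable;   // extra information held in the wrapper

    LegacyStoreMpiWrapper(LegacyQuestionStore delegate, Map<String, String> metadataTable) {
        this.delegate = delegate;
        this.metadataTable = metadataTable;
    }

    public List<String> listQuestionIds() { return delegate.listQuestionIds(); }
    public String fetchQuestion(String id) { return delegate.fetchQuestion(id); }

    // Search is not native to the legacy system, so the wrapper synthesises it
    // by scanning fetched questions (simple, but potentially expensive).
    public List<String> searchByKeyword(String keyword) {
        return delegate.listQuestionIds().stream()
                .filter(id -> delegate.fetchQuestion(id).contains(keyword))
                .collect(Collectors.toList());
    }

    // Metadata the legacy system never stored comes from a wrapper-side table.
    public String getMetadata(String questionId) {
        return metadataTable.getOrDefault(questionId, "");
    }
}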
21
Negotiated Interface
Intent
–A flexible common interface, preserving richness, for 2 or more non-identical components with some common methods
Motivation
–Similar legacy apps with overlapping functionality
–Wrap with a common interface
–Enables all functionality of the legacy systems to be represented, but not necessarily available in all of the systems
Implementation
–Union of the functionality of the legacy components
–The interface includes methods to query which methods are supported by the wrapped legacy system
–Implemented by a contract describing the methods available, or by querying at run time for method availability
Applicability
–Advisable when novel functionality that is not universally supported is required in the interface
Consequences
–Cumbersome to define
–Avoids complex decisions about a definitive interface
–Adds runtime complexity
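A minimal Java sketch of the NI pattern, again with invented names: the interface is the union of the legacy functionality plus a supports() query, so a caller can negotiate at run time which operations a particular wrapped bank actually honours.

import java.util.List;
import java.util.Set;

// Hypothetical legacy API that supports search but has no bulk import.
interface LegacySearchableBank {
    List<String> listQuestionIds();
    String fetchQuestion(String id);
    List<String> searchByKeyword(String keyword);
}

// NI interface: the union of all legacy functionality, plus a negotiation method.
interface ItemBankNi {
    boolean supports(String operation);              // run-time query for method availability

    List<String> listQuestionIds();
    String fetchQuestion(String id);
    List<String> searchByKeyword(String keyword);    // not every wrapped bank honours this
    void importQtiPackage(byte[] qtiPackage);        // not every wrapped bank honours this
}

// Wrapper for a bank that supports search but not import.
class SearchableBankNiWrapper implements ItemBankNi {
    private static final Set<String> SUPPORTED =
            Set.of("listQuestionIds", "fetchQuestion", "searchByKeyword");

    private final LegacySearchableBank delegate;

    SearchableBankNiWrapper(LegacySearchableBank delegate) { this.delegate = delegate; }

    public boolean supports(String operation) { return SUPPORTED.contains(operation); }

    public List<String> listQuestionIds() { return delegate.listQuestionIds(); }
    public String fetchQuestion(String id) { return delegate.fetchQuestion(id); }
    public List<String> searchByKeyword(String keyword) { return delegate.searchByKeyword(keyword); }

    // Callers are expected to check supports("importQtiPackage") first;
    // the unsupported part of the union fails explicitly rather than silently.
    public void importQtiPackage(byte[] qtiPackage) {
        throw new UnsupportedOperationException("importQtiPackage is not available for this bank");
    }
}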
22
Analysis Tools: Gap Analysis
23
Interface Implementations using LCD, MPI and NI patterns
2 legacy systems from the assessment domain
–TOIA: a free-to-use question management system, including an item bank, developed in the UK for Higher Education use
–E3AN: a free-to-use, open-source item bank of QTI questions
24
Legacy Item Bank Ontologies
25
[Diagrams: LCD Interface, MPI Interface, NI Interface]
26
Web interface to wrapper service and results returned
27
Lessons Learnt
Writing wrapping services for legacy systems is non-trivial: it requires a close understanding of the data model (possibly reverse engineering)
Complexity rises in proportion to the complexity of the data model and the interface
Mapping terminology used in different systems is time-consuming and non-obvious
Implementations may interpret standards differently, which may cause unexpected mismatched behaviour
LCD is the simplest and quickest to build, but may exclude valued functionality
MPI selects core methods based on expert judgement, but may be expensive to build, as the wrapper may have to hold functionality translation
NI represents all methods in a framework and is flexible, but adds the runtime overhead of negotiation
28
…and Thank You
www.frema.ecs.soton.ac.uk
ymh@ecs.soton.ac.uk