MUSKET Middlesex University Skills & Education Planning Tool The role of XCRI on educational planning tools – transforming and comparing course documentation

1 MUSKET Middlesex University Skills & Education Planning Tool The role of XCRI on educational planning tools – transforming and comparing course documentation Business Information Systems Department School of Engineering & Information Sciences

2 Agenda
- Meet the team
- MUSKET aim
- Viewpoints on educational frameworks
- Processes supported (why MUSKET?)
- Case study

3 Meet the team
- Balbir Barn – Associate Dean, grant holder
- Geetha Abeysinghe / George Dafoulas – Principal Lecturers, project managers
- Yongjun Zheng / Vikram Beedasy – Research Associates, project officers
- Noha Saleeb – Evaluation & testing

4 MUSKET aim
- JISC – Institutional Innovation projects in lifelong learning and workforce development.
- Assist employers in accrediting their in-house training with universities through work-based learning.
- Help universities and students compare courses from two different educational institutions for credit transfer by computing the percentage similarity between the courses.

5 Viewpoints on educational frameworks
- Employer's view
- Educational institution's view
- Learner's view

6 Educational institutions' perspective
- Alignment with employers' needs – involving employers in the design of programmes
- Agile programme development – quick response to market needs through a fast-track validation process
- Maintaining quality standards

7 Learners' perspective
- Unbundling – personalised learning: the ability to select modules and create their own programmes
- Studying in a number of short periods, when it suits them

8 Employers' perspective
- Some ownership
- Involvement in the design of programmes
- Alignment of the skills gained to their needs

9 MUSKET objectives
- Import unstructured documents containing course descriptions and export them as files in a semantic mark-up standard.
- Given two or more sets of course information/data, compute their similarities and differences based on a semantic understanding.
- Identify possible applications of such tools.

10 Challenge 1
- How can restrictions be avoided when putting together academic programmes?
- Is it possible to harmonise curriculum design?
Dealing with various formats – the MUSKET tool. INPUT: Word documents; OUTPUT: XCRI-CAP.

11 Challenge 2
- How can higher education courses be aligned and consistent curriculum maps be provided?
- Is it possible to map HEI programmes to employer-based training courses?
One language – XCRI: eXchanging Course-Related Information, Course Advertising Profile. http://www.xcri.org/
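The Word-to-XCRI transform addressed by these two challenges can be pictured with a minimal sketch. Assuming fields have already been extracted from a Word course document under the programme-description headings, a simplified XCRI-CAP-style record could be emitted as below. The element names follow the XCRI-CAP profile in heavily reduced form; the real profile defined at http://www.xcri.org/ requires full namespaces and a richer schema, and the heading names and helper function here are illustrative assumptions, not MUSKET's actual implementation.

```python
# Illustrative sketch: emitting a simplified XCRI-CAP-style course
# record from fields extracted out of a Word course document.
# Element names are a reduced form of the XCRI-CAP profile; real
# output would carry the full namespaces and schema from xcri.org.
import xml.etree.ElementTree as ET

def course_to_xcri(fields):
    """fields: dict mapping a programme-description heading to its text."""
    catalog = ET.Element("catalog")
    provider = ET.SubElement(catalog, "provider")
    ET.SubElement(provider, "title").text = fields.get("Teaching institution", "")
    course = ET.SubElement(provider, "course")
    ET.SubElement(course, "title").text = fields.get("Programme title", "")
    ET.SubElement(course, "qualification").text = fields.get("Final qualification", "")
    ET.SubElement(course, "aim").text = fields.get("Aims of the programme", "")
    return ET.tostring(catalog, encoding="unicode")

xml = course_to_xcri({
    "Teaching institution": "Middlesex University",
    "Programme title": "BSc Business Information Systems",
    "Final qualification": "BSc (Hons)",
    "Aims of the programme": "Knowledge of business information systems",
})
print(xml)
```

The point of the sketch is only the shape of the mapping: one consistent XML vocabulary on the output side, whatever the input document looked like.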

12 MUSKET tool functions
1. Word 2 XCRI function (Transform)
2. Wrapper function (XCRI 2 OWL & SQL 2 OWL)
3. Semantic Similarity function (Semantic)

13 Why transformation?
(Diagram: course providers submit course documents; MUSKET translates them to XCRI; course enquiries from employers and prospective applicants receive a cohesive reply based on the knowledge base and the XML database; a course advisor uses this to advise employers about course opportunities, professional accreditation, CPD and course route planning.)

14 Why semantic similarity?

15 Possible applications
- Comparing courses from different universities
- Comparing courses from the same university
- Aligning commercial provider courses to HEI ones
- Combining HEI and commercial provider courses
- Accreditation of in-house training
- APCL / APEL decision support
- Identifying study routes
- Identifying progression routes
- Providing a course locator (most suitable course)
(Each application involves the employer, the learner and the HEI.)

16 Case Study: MDX/EIS/BIS programmes
- BSc BIM
- BSc BIS
- BSc BISM
- BSc BIT
- BSc FC
- BSc IT&BIS

17 MDX/EIS/BIS programme structures

18 Programme description headings (keywords)
- Awarding institution
- Teaching institution
- Programme accredited by
- Final qualification
- Programme title
- JACS code
- Relevant QAA subject benchmark group
- Academic year
- Reference points
- Aims of the programme
- Programme outcomes
- Computing-related cognitive abilities & learning methods
- Practical abilities & learning methods
- Additional transferable skills & learning methods
- Programme structure
- Levels and modules
- Programme design
- Curriculum map
- Criteria for admission

19 Comparison criteria 1
BIS programme aims:
- Knowledge and appreciation of issues related to the development of business information systems
- Generic computing knowledge and skills (e.g. databases, modelling, methodologies) necessary in the selection, development, evaluation and use of business information systems
- Analytical modelling and other critical skills that can be used in the development, evaluation and use of business information systems throughout a graduate's professional career

20 Comparison criteria 2
BIS programme computing-related cognitive abilities (learning outcomes):
- Demonstrate knowledge of elements of mathematics, relational algebra, logic, cognitive psychology and number systems fundamental to the development, evaluation and implementation of business information systems
- Make effective choices among various approaches in the analysis, design and implementation of business information systems
- Identify the professional issues (e.g. legal, social, ethical and cultural) and the critical attributes of successful IT project management in the development or appreciation of business information systems

21 Comparison criteria 2 (continued)
BIS programme computing-related cognitive abilities (learning outcomes):
- Employ appropriate modelling techniques, methods and methodologies for problems relevant to system development, and select suitable tools to generate solutions to these problems
- Verify stages of development and critically evaluate current and future use of business information systems
- Research and present, in writing, rational and reasoned arguments that address a range of information-handling situations and examine the impact of new technologies
- Identify the benefits of strategic alignment in an organisation and the competitive advantage that can be gained from this alignment

22 Comparison criteria 3
BIS programme module titles:
- DBMS
- Introduction to Business Computing
- Fundamentals of Multimedia and Scripting
- Discovering Interaction Design
- Database Systems: Design and Online
- Decision Support Systems
- Object-Oriented Analysis and Design
- Professional Project Development and Management
- Data Warehousing and Business Intelligence
- Strategic Management and IS
- Social, Professional and Ethical Issues in IS
- Information Systems Project

23 Comparison criteria 4
Benchmark topics:
Architecture; Artificial intelligence; Comparative programming languages; Compilers and syntax directed tools; Computational science; Computer-based systems; Computer communications; Computer hardware engineering; Computer networks; Computer vision and image processing; Concurrency and parallelism; Databases; Data structures and algorithms; Developing technologies; Distributed computer systems; Document processing; e-Business; Empirical approaches; Games computing; Graphics and sound; Human-computer interaction (HCI); Information systems; Intelligent information systems technologies; Management issues; Middleware; Multimedia; Natural language computing; Operating systems; Professionalism; Programming fundamentals; Security and privacy; Simulation and modelling; Software engineering; Systems analysis and design; Theoretical computing; Web-based computing

24 Semantic similarity – evaluation analysis
Similarity measured against:
- Programme aims
- Programme outcomes
- Module titles
- Heading vs. content
- Unweighted, weighted and perceived similarity
Results:
- Similarity between programmes
- Difference between weighted and perceived similarity
- Overall perceived similarity for each comparison criterion
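To make the idea of a per-criterion similarity score concrete: MUSKET's similarity function is semantic (it operates over OWL representations produced by the wrapper functions), but a plain lexical Jaccard overlap on keyword sets, shown below purely as a stand-in, conveys what it means to score one comparison criterion between two programmes. The keyword sets are illustrative, not the actual BIS/BISM module data.

```python
# Toy per-criterion similarity: Jaccard overlap of two keyword sets.
# MUSKET's real function is semantic (OWL-based); this lexical measure
# only stands in for the idea of scoring one criterion in [0, 1].

def jaccard(a, b):
    """Similarity of two keyword sets: |intersection| / |union|."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

# Illustrative keyword sets drawn from module titles (not real data).
bis_modules  = {"database", "systems", "decision", "support", "analysis", "design"}
bism_modules = {"database", "systems", "strategic", "management", "analysis", "design"}

print(round(jaccard(bis_modules, bism_modules), 2))  # → 0.5
```

A semantic measure improves on this by also crediting related but non-identical terms (e.g. "databases" vs "data warehousing"), which is why the project wraps the XCRI data into OWL before comparing.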

25–28 (Chart slides: similarity results; images only, no transcript text.)

29 Case Study 1: MDX programmes

30
"MODULES" weight 100%, other weights 0%:
- BIS and BISM, and BIS and BIT, are the most similar programmes – 65%
- BIS and FC are the least similar – 24%
- Similarities based on "modules" give higher results for BIS with BISM and BIT than similarities based on "learning outcomes" or "aims", which implies that the module titles are more similar between these programmes than their aims or learning outcomes.
"MODULES" weight 50%, "AIMS" 25%, "LEARNING OUTCOMES" 25%:
- The same distribution of similarity between BIS and the other programmes holds when the weights change.
- The similarity decreases as the weight of "modules" decreases, which indicates that "modules" may be the only significant category affecting similarity between the programmes.
"MODULES", "AIMS" and "LEARNING OUTCOMES" weighted equally – 33% each:
- BIS and BISM are the most similar programmes – 32%
- BIS's similarity to all the other programmes is almost the same, with the exception of BIM
- BIS and FC are the least similar – 23%
If two programme lines overlap in all diagrams, the diagrams are identical. This means that:
- If the two programmes are from different universities, they are a match, which can support the decision about where to study.
- If the two programmes are from the same university, there is redundancy and one of them should be eliminated from the courses the university provides.
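The weighting scheme explored above can be sketched as a simple weighted sum: each comparison criterion contributes its per-criterion similarity multiplied by its weight, so reducing the "modules" weight pulls the overall score down exactly as the experiments report. The per-criterion scores below are illustrative placeholders, not the project's measured values.

```python
# Sketch of the weighted overall similarity: a weighted sum of
# per-criterion similarity scores. Scores here are illustrative only.

def overall_similarity(scores, weights):
    """Weighted sum of per-criterion similarities; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[c] * weights[c] for c in weights)

scores = {"modules": 0.65, "aims": 0.30, "learning_outcomes": 0.25}

# "MODULES" at 100%, as in the first experiment:
s_modules_only = overall_similarity(
    scores, {"modules": 1.0, "aims": 0.0, "learning_outcomes": 0.0})

# 50% / 25% / 25% split, as in the second experiment:
s_mixed = overall_similarity(
    scores, {"modules": 0.5, "aims": 0.25, "learning_outcomes": 0.25})

print(s_modules_only, s_mixed)  # overall score falls as the modules weight falls
```

With these placeholder scores the 100%-modules weighting gives 0.65 and the mixed weighting gives 0.4625, mirroring the observation that lower "modules" weight lowers the overall similarity when "modules" is the dominant criterion.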

31 Case Study 1: MDX programmes

32
- The perceived similarity of the "modules" between the two programmes BIS and BISM is very high.
- The perceived similarity of the "modules" is higher than the perceived similarity of the "aims" and "learning outcomes", which means the "modules" sections of the two programmes are more similar than their "learning outcomes" and "aims" sections.
- Using benchmark topics as the search criteria, the perceived similarity of the "learning outcomes" between BIS and BISM is considerably higher than that of the "aims" and "modules".

33 Case Study 2: MDX vs HW

34
- The perceived similarity of the three criteria "aims", "learning outcomes" and "titles" compares their similarity in the two programmes RELATIVE TO EACH OTHER, not just relative to the whole document, and thus enlarges some at the expense of others. This is useful in decision making: it shows the real impact of each factor, relative to the others, on the overall programme similarity.
- When the search targets the "aims" section, the perceived similarity of the "aims" between the MDX and HW programmes is very high.
- On reducing the weight of the "aims", its perceived similarity decreases, and the perceived similarity of "learning outcomes" and "titles" increases.
- The perceived similarity of titles between MDX BIS and HW Information Systems is higher than that between the MDX programmes themselves.

35 Case Study 2: MDX vs HW

36
The overall perceived similarity across all four word-search criteria (aims, learning outcomes, modules and benchmark topics) shows that, compared to each other:
- For MDX BIS and HW Information Systems, the similarities of titles and aims are the highest categories, which can support decision making based on these criteria.
- For the two HW programmes, the perceived similarities of "aims" and "learning outcomes" are similarly high, which is expected within the same university, since aims and learning outcomes complement each other during strategic planning of curricula.

37 Questions?
- Balbir Barn (b.barn@mdx.ac.uk)
- Geetha Abeysinghe (g.abeysinghe@mdx.ac.uk)
- George Dafoulas (g.dafoulas@mdx.ac.uk)
- Yongjun Zheng (y.zheng@mdx.ac.uk)
- Vikram Beedasy (v.beedasy@mdx.ac.uk)
- Noha Saleeb (n.saleeb@mdx.ac.uk)

