1
Online Peer Evaluation System
Team Green Apple
Team Members: Ada Tse, Amber Bahl, Tom Nichols, Matt Anderson
Faculty Mentor: Prof. M. Lutz
Project Sponsor: Richard Fasse, RIT Online Learning
2
Agenda
Project Overview
Current System
Our Product Features
Requirements Process
Product Architecture and Design
Software Process
Risk Analysis and Mitigation
Metrics
Status & Future Goals
Demo
3
Problem Statement
The RIT Online Learning department needs an online peer evaluation system that enhances the collaborative learning experience.
Existing tools:
–Paper alternative
–Clipboard survey system
4
Importance
Group work is an important aspect of today's education system. The average SE graduate completes about 16 group projects.
5
Current System: Clipboard
Create, Deploy and Analyze
–Provides different views for analysis, but is more effective for analyzing surveys than peer evaluations
–Very hard to identify problem groups
Not integrated with myCourses
Survey system limitations:
–Can't deploy evaluations per group
–Hard to set up
–Reporting does not show group weaknesses
–No control over who takes the survey
6
Current System: Reporting View
View: Percentage / Graph
7
Current System: Reporting View
8
Solution: Custom Application
9
Peer Evaluation System
Integrated with the existing system
–Login pass-through
–Course and group data imported directly from myCourses
Setup workflow
–Tailored for peer evaluations
Question templates
–Reusable
–Shared between instructors
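The course and group import described above can be sketched as parsing an XML feed into group rosters. This is a minimal illustration in Python; the element names and feed shape here are assumptions, since the real myCourses feed format is not shown in the presentation.

```python
# Hypothetical sketch of importing course/group data from an XML feed.
# The feed schema below is illustrative, not the actual myCourses format.
import xml.etree.ElementTree as ET

FEED = """
<course id="SE361">
  <group name="Green Apple">
    <student>Ada Tse</student>
    <student>Amber Bahl</student>
  </group>
  <group name="Team Two">
    <student>Tom Nichols</student>
  </group>
</course>
"""

def import_groups(feed_xml):
    """Return {group_name: [student, ...]} parsed from a course feed."""
    root = ET.fromstring(feed_xml)
    return {
        g.get("name"): [s.text for s in g.findall("student")]
        for g in root.findall("group")
    }

groups = import_groups(FEED)
print(groups["Green Apple"])  # ['Ada Tse', 'Amber Bahl']
```

Importing rosters directly, rather than having instructors retype them, is what lets the system deploy one evaluation per group automatically.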
10
Application Workflow
1. Create Evaluation (Instructor Main)
2. Take Evaluation (Student Main)
3. Analyze Results (Instructor Main)
11
Instructor Main
List of global and personal question templates
Evaluation status
Evaluations listed per course
12
Solution: Create Evaluation
Select template
Evaluation setup info
13
Solution: Create Templates (Global / Personal)
14
Solution: Student View
Instructions
All students of a group
15
Solution: Reporting
Reporting is provided through multiple views:
–Multiple levels of detail: by group, by student
–Sorted by groups or individuals
–Quickly identify problem groups
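The two levels of detail described above can be sketched as simple aggregation: average each student's peer ratings, then roll the per-student averages up per group so low-scoring groups surface immediately. The data shape, rating scale (1-5), and "problem" threshold below are illustrative assumptions, not details from the slides.

```python
# Sketch of two-level reporting: per-student averages, rolled up per
# group to flag problem groups. Data and threshold are assumptions.
from statistics import mean

# (group, rated_student, rating) rows from completed evaluations
responses = [
    ("G1", "alice", 5), ("G1", "alice", 4),
    ("G1", "bob",   2), ("G1", "bob",   1),
    ("G2", "carol", 5), ("G2", "dave",  4),
]

def by_student(rows):
    """Average peer rating per (group, student)."""
    scores = {}
    for group, ratee, rating in rows:
        scores.setdefault((group, ratee), []).append(rating)
    return {key: mean(vals) for key, vals in scores.items()}

def problem_groups(rows, threshold=3.0):
    """Groups whose lowest-rated member falls below the threshold."""
    worst = {}
    for (group, _), avg in by_student(rows).items():
        worst[group] = min(worst.get(group, 5.0), avg)
    return sorted(g for g, w in worst.items() if w < threshold)

print(problem_groups(responses))  # ['G1']
```

Flagging by the weakest member, rather than the group average, matches the stated goal: a struggling student should not be hidden by strong teammates.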
16
Solution: Reporting View
18
Requirements Process
Mainly elicited through:
–In-person interviews with project sponsors, subject matter experts, and Online Learning technical staff
–UI mockups
–Evaluating RIT Clipboard peer evaluation templates
19
Requirements Analysis
Use case analysis
Workflow diagrams
–Workflow steps
Constant user feedback at the end of each Sprint
20
Product Architecture and Design
21
Entity Relationships
22
Data Model Architecture
23
Package Diagram
24
Deployment Diagram
25
Software Process
26
Process: Scrum What is Scrum? –Scrum is an iterative, incremental process for developing any product or managing any work. It produces a potentially shippable set of functionality at the end of every iteration (Sprint).
27
Scrum: Sprint
Typical team size: 2 to 4 members
Delivers working software
–Typically in 1- to 4-week iterations
Cross-functional tasks per team member
New work may be uncovered by the team during development
28
Our Methodology
A flavor of Scrum.
Differences:
–Upfront requirements
–Postponed the Sprint 1 delivery date by 2 weeks
Similarities:
–The whole project was implemented in chunks (Sprints) based on requirements prioritization (Sprint Backlogs)
–Team meetings
29
Risk Analysis and Mitigation
30
Risk
New technologies
–.NET
Integration with myCourses
–XML feeds
–Testing
LDAP authentication
Complexity of business requirements
31
Risk Mitigation: Task Planning New Technologies –Allocated tasks according to skill set –Team members started off with small/simple programs –Experienced team members educated the team
32
Risk Mitigation: Development
LDAP & myCourses integration
–Great help from the Online Learning staff
Complex business requirements
–Incremental development & comprehensive requirements gathering
33
Risk Mitigation Plan: Software Process
Use of Scrum
User feedback (allows for midcourse corrections)
Increased product visibility
Increased progress visibility
–Sprint planning
Across the Sprints, the requirements were revised many times to ensure clarity. Throughout every Sprint, each decision was evaluated to make sure it aligned with the overall goals of the project.
34
Risk Mitigation: Tooling
Subversion for revision control
Google Groups for team communication
Trac for web-based management
–View files and changesets
–Automated synchronization of project documents to the web site
–Integrated bug tracking system
35
Data Collection
36
Metrics
Backlogs
–Product
–Sprint
Number of tasks completed per Sprint (work effort distribution across Sprints)
Number of bugs
–By feature
–By severity
–Per Sprint
Total effort (man-hours) for all phases
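The bug metrics above reduce to tallying one record per bug along three dimensions. A minimal sketch, with invented bug records purely for illustration (the real bug data is summarized later in the deck):

```python
# Sketch of tallying bug counts by severity, per sprint, and by
# feature. The bug records below are illustrative, not project data.
from collections import Counter

# (feature, severity, sprint) one tuple per logged bug
bugs = [
    ("reporting", "Major",   1), ("reporting", "Minor",   2),
    ("templates", "Major",   2), ("login",     "Trivial", 3),
    ("templates", "Trivial", 3),
]

by_severity = Counter(sev for _, sev, _ in bugs)
per_sprint  = Counter(sprint for _, _, sprint in bugs)
by_feature  = Counter(feat for feat, _, _ in bugs)

print(by_severity["Major"])  # 2
print(per_sprint[3])         # 2
```

The same `Counter` pattern covers all three breakdowns, which keeps the metric collection cheap enough to run at the end of every Sprint.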
37
Effort Metrics
38
Bugs Per Feature
Total bugs: 53
–Major: 22
–Minor: 11
–Trivial: 20
39
Current Status
Key Features / Progress:
Requirements Elicitation: DONE
Requirements Analysis (SRS): DONE
High-Level Architecture: DONE
Initial Setup (DB, Environment): DONE
Requirements Prioritization: DONE
Sprint 1 (5th Week): DONE
Sprint 2 (7th Week): DONE
Sprint 3 (9th Week): DONE
Integration Testing: DONE
Final Release: 05/19/2006
40
Future Enhancements
More views for reporting
–Currently the application supports 2 views: high-level (groups + students) and detailed (team member + responders + questions)
Better support for answer types
–Currently the application supports: text, radio button
41
Reflections
Great team!
–All team members were new to the group
Appropriate software process model
Delays in Sprint 1
–Unfamiliar technologies (.NET 2.0)
42
Demo Peer Evaluation System
43
Questions Thank you!
45
Supporting Data
47
Challenges
Uniformity
–Rating system
–Question system
Faculty view
Different user types
Synchronization with myCourses