SOAR – Preparing for Launch: Task Force Information, January 2015


1 SOAR – Preparing for Launch: Task Force Information, January 2015

2 Today’s Agenda
1. Welcome
2. Methodology underlying the initiative
3. Support service & academic program lists
4. Survey questions for authors
5. Process & workflow for authors and Task Forces
6. Next steps for launch
7. Q&A

3 Greetings!

4 Coordination Committee Goals for Methodology
Goal 1: Define the overarching initiative methodology to provide clear, transparent guidelines for program definition and survey questions that will drive the data entry and program evaluation process, safeguard process integrity, and generate valid, meaningful outcomes.
Goal 2: Listen to and incorporate Task Force feedback in order to accomplish the above.
Goal 3: Facilitate efficiency and effectiveness with sensitivity to the time and effort of Task Forces, Authors, and Approvers.

5 Academic Methodology Overview (Sensible, Consistent)
Step 1: Populate the list of “programs” using Campus Connection “academic plans” for AY 2013-14, including majors, 2nd majors, minors, certificates, and pre-majors.
Step 2: Refine “program” to “a function engaged in by faculty,” leading to adjustments per Task Force input: 2nd majors removed (not distinct); Essential Studies, Service Courses, Research/Scholarly/Creative Activity, and Service added.
Step 3: Based on a list of “centers” and “institutes,” add items to the Academic Programs list if they are functions engaged in by faculty.
Step 4: Vice Presidents & Deans provide input regarding the programs identified for their unit. In no case may programs be “rolled up” (i.e., any of the above combined into one “program”).

6 Support Services Methodology Overview (Sensible, Consistent)
Step 1: Populate the list of “programs” using PeopleSoft Department Code numbers as of AY/FY 2013-14, excluding those associated with academic departments.
Step 2: Based on a list of “centers” and “institutes,” add items to the Support Services Programs list based on function.
Step 3: Vice Presidents & Deans provide input regarding the programs identified for their unit. In no case may programs be “rolled up” (i.e., multiple Department Codes combined into one “program”). Because a Code number may encapsulate multiple “functions” (i.e., a designated purpose, activity, or service that, regardless of its size, does not fundamentally change), VPs & Deans identify any additional programmatic functions within each Code.
Step 4: Guideline: anything falling outside of management, leadership, or administration (M/L/A) should be identified as a separate function.

7 Support Service & Academic Program List Overview (Task Force-driven)

8 Survey Questions for Authors (Sensible, clear, meaningful, doable)

9 Three-Step Process & Workflow Overview (Straightforward for all)
Authors (Department Heads): data & survey questions provided; electronic submission via SharePoint.
Approvers (Deans or Vice Presidents): receive electronic submissions & data files; approve & submit Authors’ program surveys.
Task Forces: receive approved program surveys; conduct reviews of program surveys.

10 Author Process & Workflow Overview (Straightforward)
Preparation: information session on survey questions & data.
Authoring: data & survey questions provided; delegate writing as appropriate.
Completion: electronic submission to Approvers via SharePoint.

11 Task Force Process & Workflow Overview (Straightforward)
Preparation: norming sessions to establish review consistency.
Review: all reviews are submitted anonymously.
Completion: data are aggregated & used to categorize each program.

12 Survey Questions for Authors (Sensible, clear, meaningful, doable)

13 Survey Questions for Authors (Easy Data Entry)

14 SOAR Process
– Generates snapshots based on collective peer evaluation
– Becomes an information tool
– Creates a common language for dialogue and discussion
– Predicated on commonly held & applied guidelines for time and effort

15 Evaluation Rubric
Straightforward; easily internalized, applied, understood, & interpreted

16 Support Service Programs
– Total number of programs: 200
– Review time per program will vary; guideline of 15 minutes per program
– Process predicated on load-sharing: rubric, norming, inter-rater reliability
– Time commitment per TF member: reviewing 1/2 of the programs ≈ 25 hours (approximately 2.5 hours/week)
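As a rough check on these figures, here is a minimal sketch of the workload arithmetic, assuming the 15-minute-per-program guideline and a roughly 10-week review window taken from the timeline slides; the function name and structure are illustrative, not part of the SOAR process itself.

```python
# Minimal sketch of the review-workload arithmetic (illustrative only).
# Assumptions: 15 minutes per program and a ~10-week review window,
# per the guideline and timeline slides; actual review time will vary.

def review_load(total_programs, share, minutes_per_program=15, weeks=10):
    """Return (total hours per member, hours per week) for a given share of programs."""
    hours = total_programs * share * minutes_per_program / 60
    return hours, hours / weeks

# Support Service programs: 200 total, each Task Force member reviews 1/2
total_hours, per_week = review_load(200, 1 / 2)
print(total_hours, per_week)  # 25.0 hours total, 2.5 hours/week
```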

17 Academic Programs
– Total number of programs: 800
– Review time per program will vary; guideline of 15 minutes per program
– Process predicated on load-sharing: rubric, norming, inter-rater reliability
– Time commitment per TF member: reviewing 1/4 of the programs ≈ 50 hours (approximately 5 hours/week)

18 Academic Programs
Total number of programs: 800
– “Service” programs: 83
– “Service course” programs: 82
– “Essential Studies” programs: 79
– “Research/Scholarly Activity” programs: 83
Each category adds approximately 20 total hours of review; for a member reviewing 1/4 of the programs, that is roughly 0.5 hour/week.
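Applying the same back-of-the-envelope arithmetic to the academic figures on the two slides above (again assuming 15 minutes per program, a ~10-week review window, and each member reviewing 1/4 of the programs; the category counts come from the slide):

```python
# Illustrative check of the academic review-load figures.
# Assumptions: 15 minutes per program, ~10-week review window, 1/4 share per member.
MINUTES_PER_PROGRAM, WEEKS, SHARE = 15, 10, 1 / 4

# Core academic programs (800 total)
core_hours = 800 * SHARE * MINUTES_PER_PROGRAM / 60
print(core_hours, core_hours / WEEKS)  # 50.0 hours per member, 5.0 hours/week

# Add-on categories listed on the slide
categories = {"Service": 83, "Service course": 82,
              "Essential Studies": 79, "Research/Scholarly Activity": 83}
for name, count in categories.items():
    category_hours = count * MINUTES_PER_PROGRAM / 60                   # ~20 hours of review overall
    member_per_week = count * SHARE * MINUTES_PER_PROGRAM / 60 / WEEKS  # ~0.5 hour/week per member
    print(f"{name}: {category_hours:.1f} h total, {member_per_week:.2f} h/week per member")
```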

19 Task Force Process Methodology
– Review Sheet: easy online access to the Program Evaluation Sheet
– Anonymous: submission automatically aggregates data for processing without identification

20 Sample Academic Program Output

21 Sample Support Service Program Output

22 Timeline for Initiative Completion
Jan. 12-15 – Process meetings with TFs and Authors
Feb. 2 – Distribute survey questions and program-specific data to Authors
Feb. 16 – Priority deadline for Author survey completion; approval by Deans & Division Heads begins
March 2 – Final deadline for Author survey completion (4 weeks total to author, assumes staggered submission); priority deadline for approved reports by Deans & Division Heads
March 16 – Final deadline for approved reports from Deans & Division Heads (4 weeks total to approve, assumes staggered submission)
May 15 – TFs complete their program evaluations (10 weeks total)
June 1 – TF co-chairs complete executive summary of TF evaluation results and recommendations

23 Timeline for SOAR Completion
Jan. 12-15 – Process meetings with TFs and Authors
Feb. 2 – Distribute survey questions and program-specific data to Authors
Feb. 16 – Priority deadline for Author survey completion; approval by Deans & Division Heads begins; rolling Task Force review begins as soon as practicable
March 2 – Final deadline for Author survey completion (4 weeks total to author, assumes rolling submission); priority deadline for approved reports by Deans & Division Heads; rolling Task Force review continues
March 16 – Final deadline for approved reports by Deans & Division Heads (4 weeks total to approve, assumes rolling submission); final deadline for Task Force review to begin
May 15 – Task Forces complete their program reviews (9-13 weeks total review time)

