Annual Performance Review (APR) and CREP Teacher & Student Survey, 21st CCLC Spring Institute, March 16, 2016.


Overview  Review of first submission  Importance of the terms  Data collected  Most common issues  Preparing for the next submission  Teacher Survey Overview  Introduction to Student Survey

Review of First Submission Thank you for your hard work!

Importance of the Terms  The data had to be entered by term. Why? GPRA indicators are intended to demonstrate the extent of improvement over time.
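The term-level requirement can be sketched in code. This is a minimal, hypothetical illustration with an assumed five-letter grade scale and made-up function names (nothing here reflects the actual APR schema); it shows why improvement over time can only be measured when each term is recorded separately:

```python
# Hypothetical sketch: improvement over time requires term-level data.
# The grade scale and names are illustrative, not the actual APR schema.
GRADE_ORDER = {"F": 0, "D": 1, "C": 2, "B": 3, "A": 4}

def improved(fall_mark: str, spring_mark: str) -> bool:
    """True when the spring mark is higher than the fall mark."""
    return GRADE_ORDER[spring_mark] > GRADE_ORDER[fall_mark]

def percent_improved(records) -> float:
    """records: iterable of (fall_mark, spring_mark) pairs."""
    pairs = list(records)
    if not pairs:
        return 0.0
    gains = sum(1 for fall, spring in pairs if improved(fall, spring))
    return 100.0 * gains / len(pairs)

print(percent_improved([("C", "B"), ("B", "B"), ("D", "C"), ("A", "A")]))  # 50.0
```

A single year-end figure could not support this comparison, which is the point of the slide: without fall and spring entered separately, there is nothing to subtract.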

Levels of Access  SEA Super User  SEA User  Grantee User (Subgrantee)

SEA Super User  Only one person may be assigned to this level  User management  Manage profile data for grantees and centers  Manage APR data  Manage state award and competition data  View visualized reports (unless delegated)  Certify data

SEA User  Two people may be assigned to this level  User management (if assigned)  Manage profile data for grantees and centers  Manage APR data  Manage state award and competition data (if assigned)  View visualized reports (if assigned)

Grantee User (Subgrantee)  Multiple people may be assigned to this level  Manage profile data for their grant and associated centers  Manage APR data for their centers

Data Collected  Center information  Whether Extended Learning Time is implemented  Feeder school information  Partner information  Activities  Staffing  Participation  Outcomes

Most Common Issues  Log-in  Personnel  Center  Activities  Staffing  Participation  Outcomes

Issues Logging In  Never received the email from the Tactile Group with login information  Password must be changed every 60 days  Browser problems  Response time from the Tactile Group was slower than stated  Link was not posted to the VDOE website  Too many requirements for the password

Suggestions for Improving Login Problems  Post the link  Allow the password to remain valid for a longer period of time  Quicker responses, or a stated response period, for questions and requests for support from USED  Reduce the number of characters required for the password

Issues with the Assigned Personnel  The person assigned by VDOE was not the person who needed to complete the survey

Suggestions for Assigning Correct Personnel  The Department will add a section to the contact information form to collect this information  The contact form will be updated yearly

Issues with Center Data  Feeder schools  Co-applicant should be considered a partner.  When checking summer data during the second round of checking in February, some data were missing.  As partners were added, they did not appear in the review.  Several times, data that had been entered had disappeared when reviewed again.  After entering the initial school information, it wasn't made clear that you had to click on the school name to proceed to enter specific data.

Issues with Center Data  Difficult to figure out how to input data, as the instructions for doing so were limited. The process to get to the data entry was counterintuitive and difficult to remember each time I logged in  Consistency across program sites  The issue was not entering the data; the data that were asked for were the issue  None, after realizing the little arrows at the top corners moved it from page to page

Suggestions for Improving Center Data Problems  Improve technical support  Having to backtrack to collect the grades; this was a new data collection  State-specific directions  Add these instructions to the system  Maybe not have to click back and forth so many times

Suggestions for Improving Center Data Problems  Really helpful to be able to view fall data when entering spring data  Data request lengthy  Make it more obvious how to get to the data input, instead of using a sidebar menu and having to click within that menu  Dates need to be highlighted at the beginning of the document so grantees know when data are due, instead of going through the entire document to find out

Issues with Activities Data  College and career readiness definition is too restrictive  Hard to locate the activities section; it is hidden in the upper right-hand corner  Difficult to find the weekly option  Confusion about what kind of activities fit in which category  Difficult to determine if data were "submitted" to VDOE

Suggestions for Improving Activities Data  Allow printing or saving a copy of what was submitted for our records  List items straight down instead of placing them to the side at the top; the layout is confusing  An additional category added under Enrichment for "Other" to allow program offerings that are not listed to be recognized and accounted for  Ability for centers to add more than one entry per category, thus allowing for further explanation of all the "wonderful" activities we do with our children

Suggestions for Improving Activities Data  Many of the activities are both literacy and English language learner support; it would be helpful to have a box for "was ___________ also English language learner support"  Data request lengthy

Issues with Staffing Data  The issue with the system for reporting data about staffing is that it doesn't give a means to account for staffing changes.

Suggestions for Improving Staff Data Entry  Specify "School Day Teachers" as "After School Teachers" and clarify that it means those teachers working with the 21st CCLC program  Provide a section for comments

Issues with Participant Data  Numbers greater than 999 (since corrected)  Very time-consuming to enter data for the different groups  Gathering the data is the challenge  Race/ethnicity categories do not match the ones used in the school system, which resulted in many students counted in the "other" category when in fact they identify with one particular race in our school data system

Issues with Participant Data  Difficult to be consistent across seasons with student turnover  Somewhat confusing  Difficult to know if data were "submitted" to VDOE; definitions of participants are always challenging to interpret  Asks for the total number of participants for all grades; however, state assessment scores can only be entered for SOL grades, causing an error. Centers may have had 30+ day students who did not take the SOL, and thus the data are skewed

Suggestions for Improving Participant Data Entry  Advance notice of data collection categories  List it in the center of the page; tell the user when each section, and all sections, have been completed  For student attendance, many students participated more than 25 days but fewer than 30. Suggest lowering the minimum-day category to "less than 25."
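The attendance-band point can be made concrete with a small sketch. The 25- and 30-day thresholds come from the comments on this slide; the band labels and function name are hypothetical, not the official APR attendance categories:

```python
# Hypothetical attendance bucketing. Thresholds (25, 30) are taken from the
# comments above; the labels are illustrative, not the official APR bands.
def attendance_band(days: int) -> str:
    if days >= 30:
        return "30+ days (regular attendee)"
    if days >= 25:
        return "25-29 days"  # the gap the comment asks the system to capture
    return "fewer than 25 days"

print(attendance_band(32))  # 30+ days (regular attendee)
print(attendance_band(27))  # 25-29 days
```

Under the reported setup, a student with 27 days would fall outside the 30+ day band and disappear from the regular-attendee counts, which is what the suggested lower minimum is meant to address.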

Suggestions for Improving Participant Data Entry  Helpful to know that spring data were going to be cumulative before logging in to enter the data  More explanation about what qualifies a student as "limited English language proficiency"  Allow viewing of fall data when entering spring data

Issues with Outcome Data  Summer outcomes – none  Fall outcomes – grades  Spring outcomes – state assessments, grades, and teacher surveys  "Needs improvement" – if the student does not have an "A," he or she is reported as needing improvement  Definition of "proficient"  Time it takes to analyze the data

Issues with Outcome Data  Outcomes ill-defined; outcomes new and may not correspond with the grant profile. If data were not collected, centers cannot respond.  Time-consuming to enter data for different groups.  Data were collected by the center; however, the information was not broken into these categories.  Asking the respondent to determine if a student needs to improve in math or reading based on the previous year's SOL test is not valid.

Issues with Outcome Data  Vagueness of the questions and the subjective interpretation of the reporter  Consistency with information from many sources; as a program, grade data are not collected because they do not reflect growth in the same way as standardized assessments  Lengthy

Issues with Outcome Data  Needed additional knowledge of what outcomes would be tracked before being asked to report them  Not knowing, when collecting data, that it would need to be broken down in the way that it was requested  Non-SOL-grade students are included in the participant data; however, there are no SOL scores to assess them, and thus the numbers do not tally
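The tally problem described here is essentially a cross-field consistency check. A hedged sketch, with made-up field names rather than the actual APR system's schema, of a validation that would flag the mismatch as a warning instead of an unexplained error:

```python
# Hypothetical consistency check for the participant/SOL mismatch described
# above. Field names are illustrative, not the APR system's actual schema.
def check_sol_tally(total_participants: int, sol_scores_entered: int,
                    non_sol_grade_students: int) -> str:
    """Warn, rather than hard-error, when entered scores plus non-SOL-grade
    students do not account for every 30+ day participant."""
    accounted = sol_scores_entered + non_sol_grade_students
    if accounted == total_participants:
        return "ok"
    return (f"mismatch: {total_participants} participants but only "
            f"{accounted} accounted for by scores + non-SOL grades")

print(check_sol_tally(100, 60, 40))  # ok
print(check_sol_tally(100, 60, 30))  # mismatch: ...
```

Asking for the non-SOL-grade count explicitly, as this sketch assumes, is one way a system could let the numbers tally without forcing K-2 students into an assessment field.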

Issues with Outcome Data  Outcomes for grades (report card marks) confusing  No parameters offered about what specific grades (subject areas) to use and how to define needing progress or making progress  Different scale in some schools for K, 1-2, and 3-5, all with their own set of letter marks  Each center defined "at risk" and "improvement" for itself

Issues with Outcome Data  Asked to address improvement in "English grades." This left the definition to each center, so results are not reflective of consistent progress for 21st CCLC participants  Not informed to collect report card marks last year; researching and collecting the data was very time-consuming and frustrating  The state assessment area was unclear. We have students in K-5 in our after school programming, so the SOL data are only reflective of a fraction of participants.

Issues with Outcome Data  Current 3rd graders did not have an SOL score from last year.  To measure improvement on the SOL test, centers had to go back two years, which was frustrating to school partners who were asked to gather data quickly.  No outcomes were asked of the K-2 grades for state assessments, even though there are other state assessments and county benchmarks used to measure their progress.

Issues with Outcome Data  Teacher survey unclear because it grouped homework completion and class participation together; some students improved in only one of the areas, and thus there was no way to document it  Counterintuitive to enter the data by semester rather than by year  The state APR and most other data collections are entered for the full year rather than by semester  Separating data into semester format was frustrating and did not relate to program goals or correlate with data already collected  Extra time spent collecting the specific data and entering it during unusual time windows

Preparing for the Next Submission of APR Data  Tentative date for submission of the data is spring 2016  Submission includes summer 2015, fall 2015, and spring 2016  This will be the yearly timeframe for USED APR data collection

Teacher and Student Surveys

Teacher Survey  School day teachers complete the survey for students with 30+ days of attendance.  Only one survey per student needs to be completed.  English teacher at the secondary level  Survey will be completed using the CREP survey system.  Teachers will receive a confirmation email, sent to themselves or to the program coordinator/site director.  Teachers will have an opportunity to complete another survey without having to log in again.

Teacher Survey  Questions have been modified to correlate with the questions that are asked in the new USED APR System.  Coordinators will receive a report by grant, eliminating tally sheets.  The report can be sorted and used to input data into the new USED APR System.

Teacher Survey  The survey will be available from March 1, 2016, through March 31, 2016.  Program coordinators should have already received an email with a username and password.  Forward school day teachers the link and the letter that explains the purpose and importance of the survey.

Teacher Survey  No action is necessary:  If your teachers have already submitted surveys for students with fewer than 30 days of attendance; OR  If more than one teacher submitted a survey for a student.

Student Survey  Evaluation results do not tell the whole story  Other ways to determine success  Grades  Teacher survey  Homework and participation  Behavior  Student survey - new

Student Survey  Purpose – to determine student perception and benefit of the 21st CCLC program  Will receive a report of the results  Results will  Not be used for the USED APR Data System  Be used to write the evaluation of the program in Virginia

Student Survey  Online survey through CREP  Questions for elementary and middle school  Questions for high school  Types of questions  Short, easy to read statements  Select all that apply  Yes/sometimes/no  Agree/not sure/disagree

Student Survey  Suggestions for student access to the survey  Email the link to teachers to write on the board or on a sheet of paper  Bookmark the webpage  Create a shortcut on the desktop  Directions in the email to coordinators  Which students will complete the survey  Time frame to consider when deciding which students  Grade levels

Survey Monkey  When would be the best time to complete Virginia’s Annual Performance Report?  At the same time we are completing the USED APR (this spring)  In the summer  In the fall of the following school year
Please complete by Friday, March 18, 2016, at 4 p.m.

Contact Information  Marsha Granderson – USED APR System, Education Specialist, (804)  Tiffany Frierson – Teacher and Student Surveys, Education Specialist, (804)