
1 TPE: Housekeeping and Updates, November 13, 2014. Dave Volrath, Planning and Development Officer, and the TPE Action Team

2 Agenda
– Appreciation and acknowledgement to LEA partners
– Grants: what's out there, status, current challenges in payment
– Brief review of the MSEB report
– Update on CTAC report and SLO-MOU
– Update on Convenings and Pipeline
– Starting to think about next year's data collection
– Q & A

3 Status of 3 Grants
Implementation
– $497K still on the table to be claimed
– 5 LEAs have claimed nothing
– 4 LEAs have claimed partial amounts
– Only funds encumbered as of September 2014 can be claimed
– Likelihood these funds can be kept for LEAs; mechanism not determined
iPad
– 12 LEAs have not claimed yet
– Grants were in excess of the required amount; plan to "replenish"
Sustaining
– Only 8 LEAs have submitted the C-125 and narrative
– Anticipate it takes at least 3 weeks to create a NOGA

4 Getting Paid
The mindset at MSDE is "audit readiness." You need to provide:
– Your summary invoice to MSDE
– The supporting invoices and proof of payment
– The supporting invoices must transparently and precisely map to your request for payment
– The AFR screenshot
For the iPads, this is a breeze. Not so easy for Implementation, with multiple parts.

5 An almost successful approach: This real example almost fits the bill. The invoices are clearly marked. This Title IIA comment confused Accounting. Disentangling Invoice 4072 from Title IIA, or a clearer narrative, would do the trick.

6 Another example, more problematic: How are these items linked? What is the PM invoice? Where is the Dell invoice? Pages and pages of sign-in sheets were provided, but not in a way Accounting was able to understand. And what does the FICA relate to?

7 Getting Paid: General Thoughts
This is not a one-off conversation; this will apply to all RTTT grants. Please anticipate that Accounting will only be comfortable with a simple set of documents that "tick and tie." Don't send 200 pages of sign-in sheets. A one-page example is fine, but provide a summary page that cleanly explains the number of units, the hourly rate, and the total bill.
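A minimal sketch of what such a "tick and tie" summary page might compute. Every invoice number, description, unit count, and rate below is hypothetical (nothing comes from an actual claim); the point is that each line multiplies units by rate and the lines sum to the total request for payment, which should reconcile to the summary invoice.

```python
# Hypothetical line items: (invoice number, description, units, hourly rate).
line_items = [
    ("INV-0001", "SLO facilitator sessions", 40, 75.00),
    ("INV-0002", "Principal academy stipends", 24, 90.00),
]

total = 0.0
print(f"{'Invoice':<10}{'Description':<30}{'Units':>6}{'Rate':>10}{'Amount':>12}")
for invoice, description, units, rate in line_items:
    amount = units * rate          # units x hourly rate = line total
    total += amount
    print(f"{invoice:<10}{description:<30}{units:>6}{rate:>10.2f}{amount:>12.2f}")
print(f"{'Total request for payment':<46}{total:>22.2f}")
```

Each printed line should map to one clearly labeled supporting invoice so Accounting can trace the total without wading through sign-in sheets.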

8 Quick Overview of TPE Ratings
– 43,805 teachers and 1,112 principals
– MSDE only provided descriptive statistics, although we did delve into some of the subtleties
– Data have gone to WestEd for the inferential analysis
– Poverty and minority slides were received with considerable interest and concern
– MSA had a small effect, and more often helped than harmed
– LEAs should have a look at:
  – Any changes to non-MSA teachers when MSA is removed
  – Performance of accrued points at rating-level transitions

9 Composition of the State (n = 43,805): the 5 largest LEAs represent 67% of teacher ratings

10 Summary view of 43,805 teacher ratings

11 Statewide distribution of teacher ratings by grade span configuration

12 Statewide distribution of teacher ratings by LEA size
Large LEAs: Anne Arundel, Baltimore City, Baltimore County, Carroll, Charles, Harford, Howard, Prince George's
Medium LEAs: Calvert, Cecil, Saint Mary's, Washington, Wicomico, Worcester
Small LEAs: Allegany, Caroline, Dorchester, Garrett, Kent, Queen Anne's, Somerset, Talbot

13 Statewide distribution of teacher ratings by LEA geographical location
Central LEAs: Anne Arundel, Baltimore City, Baltimore County, Harford, Howard
Eastern LEAs: Caroline, Cecil, Dorchester, Kent, Queen Anne's, Somerset, Talbot, Wicomico, Worcester
Southern LEAs: Calvert, Charles, Prince George's, Saint Mary's
Western LEAs: Allegany, Carroll, Garrett, Washington
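A minimal sketch of how the size groupings on the LEA-size slide above could be used to roll up ratings, assuming a hypothetical teacher-level table with lea and rating columns; the geographical grouping on this slide would work the same way with a second map.

```python
import pandas as pd

# Size groups exactly as listed on the LEA-size slide above.
size_group = {
    "Anne Arundel": "Large", "Baltimore City": "Large", "Baltimore County": "Large",
    "Carroll": "Large", "Charles": "Large", "Harford": "Large", "Howard": "Large",
    "Prince George's": "Large",
    "Calvert": "Medium", "Cecil": "Medium", "Saint Mary's": "Medium",
    "Washington": "Medium", "Wicomico": "Medium", "Worcester": "Medium",
    "Allegany": "Small", "Caroline": "Small", "Dorchester": "Small", "Garrett": "Small",
    "Kent": "Small", "Queen Anne's": "Small", "Somerset": "Small", "Talbot": "Small",
}

teachers = pd.read_csv("teacher_ratings.csv")        # hypothetical input file
teachers["lea_size"] = teachers["lea"].map(size_group)

# Row-normalized distribution of ratings within each size group.
print(pd.crosstab(teachers["lea_size"], teachers["rating"], normalize="index").round(3))
```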

14 Restoring MSA to models slightly moves teacher ratings toward Effective and has minimal effect on Ineffective

15 Delta for MSA teachers: minimal effect on "Ineffective" ratings. 86.6% of teachers stay in the same rating category. All 143 "Delta +1" teachers rose from Ineffective to Effective; 925 of 980 "Delta -1" teachers went from Highly Effective to Effective.
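A minimal sketch of how the delta described above could be tabulated, assuming a hypothetical per-teacher table with rating_with_msa and rating_without_msa columns holding the three category names (the file name and column names are assumptions, not MSDE's actual layout).

```python
import pandas as pd

order = ["Ineffective", "Effective", "Highly Effective"]
code = {name: i for i, name in enumerate(order)}     # encode categories as 0, 1, 2

df = pd.read_csv("msa_teachers.csv")                 # hypothetical input file

# Delta in category steps: +1 means the rating rose one category when MSA is restored.
df["delta"] = df["rating_with_msa"].map(code) - df["rating_without_msa"].map(code)

print((df["delta"] == 0).mean())                     # share staying in the same category
print(pd.crosstab(df["rating_without_msa"], df["rating_with_msa"]))   # full transition table
```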

16 Schools in the highest quartile for poverty have more ineffective and fewer highly effective teachers than do schools in the lowest quartile for poverty. Poverty is defined using the method for the Annual APR report: n FARMS / Enrollment, sorted into quartiles.
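A minimal sketch of the quartile construction described above, assuming hypothetical school-level columns farms_count and enrollment and a teacher-level rating column (all names are assumptions); the minority breakdown on the next slide is identical with a non-White count in place of FARMS.

```python
import pandas as pd

schools = pd.read_csv("schools.csv")             # hypothetical: school_id, farms_count, enrollment
teachers = pd.read_csv("teacher_ratings.csv")    # hypothetical: school_id, rating

# Poverty proxy per the APR method: FARMS / Enrollment, sorted into quartiles
# (Q1 = lowest-poverty quartile, Q4 = highest-poverty quartile).
schools["farms_pct"] = schools["farms_count"] / schools["enrollment"]
schools["poverty_quartile"] = pd.qcut(schools["farms_pct"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Distribution of teacher ratings within each poverty quartile (row percentages).
merged = teachers.merge(schools[["school_id", "poverty_quartile"]], on="school_id")
print(pd.crosstab(merged["poverty_quartile"], merged["rating"], normalize="index").round(3))
```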

17 Schools in the highest quartile for minority students have more ineffective and fewer highly effective teachers than do schools in the lowest quartile for minority students. Minority is defined using the method for the Annual APR report: n non-White / Enrollment, sorted into quartiles.

18 Strand 1 schools (meeting all annual indicator targets) have more highly effective teachers than do Strand 5 schools (failing to meet annual indicator targets). Strands are derived from the 2013 School Progress Index; data for 42,442 teachers were linked to an SPI Strand.

19 Distribution of OFFICIAL TPE Teacher Ratings (MSA excluded; N = 43,805)

20 Composition of the State (n = 1,112): the 5 largest LEAs represent 61% of principal ratings

21 Statewide distribution of principal ratings by grade span configuration

22 Schools in the highest quartile for poverty have more ineffective and fewer highly effective principals than do schools in the lowest quartile for poverty. Poverty is defined using the method for the Annual APR report: n FARMS / Enrollment, sorted into quartiles.

23 Schools in the highest quartile for minority students have more ineffective and fewer highly effective principals than do schools in the lowest quartile for minority students. Minority is defined using the method for the Annual APR report: n non-White / Enrollment, sorted into quartiles.

24 At the statewide level, the distribution of principal ratings is generally consistent across SPI Strands. Strand 4 schools have both the most highly effective (53.3%) and the most ineffective (2.5%) principals. Strands are derived from the 2013 School Progress Index; data for 1,066 principals were linked to an SPI Strand.

25 Distribution of OFFICIAL TPE Principal Ratings (MSA excluded; N = 1,112)

26 The TPE Team was very cautious and made no "pronouncements." The Team suggested:
– Actual differences in teacher and principal performance
– Differences in LEA evaluation model performance
– Precision in fitting cut scores

27 SLO Headlines
"Real Progress in Maryland" reaffirmed what we already know:
– It's a heavy lift
– Penetrating the classroom is hard
– Managing all the logistics is hard, and good systems are critical
– The closer folks are to the work, the better they feel about it
– SLO investments are a good place for Sustaining dollars

28 Update on CTAC/SLOs
MSDE is working with CTAC to annotate SLOs to create an LEA resource. Samples should represent:
– Various grades and subjects
– Assessed and non-assessed areas
– Teacher AND principal samples
– Alignment to the Quality Rating Rubric
No sample will be identified by LEA.

29 Update on SLO Collaboration MOU
– Fall convenings
– Overview of feedback
– December focus on communication
– Plans to close the Quality Control loop for PY 5

30 Next Year's Data Collection
– There will be no MSA strand to collect
– The APR variables we discussed last year festered with USDE again: eligible for tenure, retained, promoted, compensation
– Headwinds behind interest to link teachers to preparatory programs or strands
– MSEB interest in how 1st/2nd-year teachers fare one year later

31 Contacts
Dave Volrath, Planning and Development Officer: David.Volrath@maryland.gov, 410-767-0504
Ben Feldman, TPE Team: Ben.Feldman@maryland.gov, 410-767-0142
Today's data release on: LEA/School Teacher-Principal Evaluations

