
1 Technology Readiness Analysis

2 What We Will Cover
- Who are we?
- Why are we doing this?
- Who are we meeting?
- What are we seeking?
- What methodology are we utilizing?
- What have we found?
- What can we do for you?

3 Who Are We?
A team of four consultants:
- Rob Cardelli (Team Lead) – Education and government consultant with 20+ years of experience; consultant for the 2014–17 State of South Carolina Educational Technology Plan
- Brenda Bryant – Former school teacher in Richland 2 District, S.C.
- Bob Jones – S.C. technology consultant with 30 years of experience
- Heather Sutton – S.C. technology and management consultant from Orangeburg, S.C.

4 Who Are We?
- Completely independent and objective
- Findings presented directly to each district's superintendent and staff

5 Why Are We Doing This?
- Voluntary only: district superintendents are invited to participate.
- Hired by the S.C. Department of Education in response to the Abbeville lawsuit verdict and a General Assembly request. The legislature wants to ascertain whether districts are ready for online testing in 2017 and beyond, and to identify any needs they may have to solidify their readiness.

6 Why Are We Doing This?
- Sharing findings: we plan to offer the results of our assessments to the other districts, to help show which variables should be examined to adequately prepare their own districts for testing.

7 Who Are We Meeting With?
Wave 1: Allendale, Dillon 3, Dillon 4, Florence 4, Hampton 2, Jasper, Lee, Marion, Orangeburg 3
Wave 2: Abbeville, Bamberg 1, Bamberg 2, Barnwell 19, Barnwell 29, Barnwell 45, Berkeley, Clarendon 1, Clarendon 2, Clarendon 3, Chesterfield, Florence 1, Florence 2, Florence 3, Florence 5, Hampton 1, Laurens 55, Laurens 56, Lexington 4, Marlboro, McCormick, Orangeburg 4, Orangeburg 5, Saluda, Williamsburg
Wave 3: Colleton, Calhoun, Edgefield, Sumter, Darlington

8 Think About IT!

9 What Are We Seeking?
- The assessment's focus is to determine whether the district is ready for online testing in 2017.
- We are analyzing the possible impact if additional classes are to be tested in subsequent years.
- We are focusing on the following elements: employees/staff, students, vendors, processes, facilities, and technology.

10 Online Testing Variables – Specific Areas of Examination
- Facilities: number of test labs; issues with space; age of schools; server rooms and storage closets; ability to wire buildings; distance between labs and equipment
- Hardware: desktops; laptops; operating systems; system memory; age of hardware; brands of equipment; hardware availability; backup/spare equipment; performance issues
- Infrastructure: available bandwidth; wireless access points; switches/routers; databases; access issues
- Tech support team: number of staff; quality and availability of support staff; support strategies (e.g., VDI); tech support staff abilities; ability to educate teachers and students; impact of technologies on the support process; relationship of tech staff and teachers; tech staff morale and risk if key staff leave
- Teacher and student skills: teachers' ability to work with technology; students' knowledge of computers/keyboards; issues where English is a second language
- Other factors: maximizing E-Rate; knowledge of grants and staff to write them; how tech staff and teachers work together; availability of practice tests; collaboration between districts; which tech vendors districts work with; online testing requirements; testing schedule impact on other groups
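
For districts that want to work through this checklist systematically, the sketch below expresses the examination areas as a simple data structure with a helper that produces a blank per-district worksheet. This is purely illustrative: the structure, names, and groupings are our assumptions, not the study team's actual tooling.

# Illustrative sketch only -- not the study team's actual tooling.
# The slide's examination areas expressed as a checklist a district
# could walk through when self-assessing online-testing readiness.
EXAMINATION_AREAS = {
    "Facilities": [
        "Number of test labs", "Issues with space", "Age of schools",
        "Server rooms and storage closets", "Ability to wire buildings",
        "Distance between labs and equipment",
    ],
    "Hardware": [
        "Desktops", "Laptops", "Operating systems", "System memory",
        "Age of hardware", "Brands of equipment", "Hardware availability",
        "Backup/spare equipment", "Performance issues",
    ],
    "Infrastructure": [
        "Available bandwidth", "Wireless access points", "Switches/routers",
        "Databases", "Access issues",
    ],
    "Tech support team": [
        "Number, quality, and availability of support staff",
        "Support strategies (e.g., VDI)",
        "Ability to educate teachers and students",
        "Relationship of tech staff and teachers",
        "Morale and risk if key staff leave",
    ],
    "Teacher and student skills": [
        "Teachers' ability to work with technology",
        "Students' knowledge of computers/keyboards",
        "Issues where English is a second language",
    ],
    "Other factors": [
        "Maximizing E-Rate", "Knowledge of grants / staff to write grants",
        "Availability of practice tests", "Collaboration between districts",
        "Vendors the district works with", "Online testing requirements",
        "Testing schedule impact on other groups",
    ],
}

def blank_worksheet():
    """Return an empty worksheet: one unrated entry per examination item."""
    return {area: {item: None for item in items}
            for area, items in EXAMINATION_AREAS.items()}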

11 What Methodology Do We Utilize?
1. Send data-gathering workbook
2. Introductory call
3. Initial onsite visit
4. Initial benchmarking
5. Collaboratively edit evaluation findings
6. Draft final report
7. Collaboratively edit final report
8. Schedule final presentation
9. Conduct final presentation and improvement roadmap
The district then implements the necessary improvements as part of an ongoing, collaborative relationship.

12 Evaluation/Rating Definitions
Readiness status will be rated on the following scale:
1 – The district is unable to conduct testing based on current conditions.
2 – The district could feasibly conduct testing, but with multiple risks and a high likelihood of major issues occurring.

13 Evaluation/Rating Definitions (continued)
3 – The district could meet the online testing needs for 2017. Additional preparation work is needed, and the district may not be able to handle additional classes for online testing unless current technology capacity is enhanced.
4 – The district is in good shape for 2017 but needs to expand its testing capabilities beyond 2017's testing requirements.
5 – The district is prepared for 2017 and beyond, with no measurable risks associated with online testing.
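
To make the scale concrete, here is a minimal sketch of the five ratings in code. The roll-up rule, that a district is only as ready as its weakest examination area, is our illustrative assumption; the deck does not specify how per-area ratings combine.

from enum import IntEnum

class Readiness(IntEnum):
    """The deck's 1-5 readiness ratings (descriptions abridged)."""
    UNABLE = 1           # Cannot conduct testing under current conditions
    HIGH_RISK = 2        # Feasible, but major issues are highly likely
    READY_2017 = 3       # Meets 2017 needs; capacity must grow for more classes
    GOOD_SHAPE = 4       # Ready for 2017; should expand beyond 2017's needs
    PREPARED_BEYOND = 5  # Ready for 2017 and beyond, no measurable risks

def district_rating(area_ratings):
    # Assumption (not stated in the deck): per-area ratings roll up with
    # min(), since the weakest area gates the ability to administer tests.
    return min(area_ratings.values())

# Example: strong hardware cannot compensate for an unusable facility.
rating = district_rating({"Facilities": Readiness.UNABLE,
                          "Hardware": Readiness.PREPARED_BEYOND})
print(rating)  # Readiness.UNABLE

Taking the minimum mirrors the findings later in the deck, where a single weak point (facilities, staffing, or skills) can put online testing at risk.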

14 What Have We Found?
- Not all districts are prepared for online testing in 2017.
- Quality strategic planning directly impacts the level of preparedness.
- The number of technical support staff does NOT directly correlate with the level of preparedness; some districts use their resources more efficiently.
- There is some disparity in collaboration between technical staff and teachers. Where technical staff are constantly in "fire-fighting" mode, there is less time to strategize proactively.

15 What Have We Found?
- The level of technical knowledge varies among districts.
- Facilities' age, available space, and the proximity of testing labs affect preparedness.
- Districts that collaborate with other districts appear to be better prepared.

16 What Have We Found?
- Not all districts have the same knowledge of online testing requirements.
- There are opportunities for districts to collaborate and share the costs of limited resources.
- Most districts have little insight into the tools or services available from other state agencies.

17 What Have We Found?
- Some districts rely too heavily on their local technology vendors to provide skills and services.
- There is a need to communicate with local school boards regarding the role of technology in schools; disconnects between the school board and the school administration appear common.
- Some districts have multiple "single points of failure": if a key technology resource were to leave, the ability to conduct online testing could be at risk.

18 What Have We Found?
- Multiple districts have issues with their facilities. Limitations related to internal wiring require additional investment in wireless technology.
- Multiple districts have less-than-ideal facilities for computer rooms and networking closets. The rooms are not properly air-conditioned, and in some cases there is a risk of water damage. Poor weather conditions during the week of testing could impact a school's ability to administer electronic testing.

19 What Have We Found?
- In multiple schools, younger students do not yet have the keyboarding skills required for online testing. These faculty and their students should be given advance training to become adequately prepared.
- Some schools serve disadvantaged children in their community who lack computer experience.

20 What Have We Found?
- Many districts have help desk ticketing systems with significant backlogs. These districts have little time to focus on preparing for online testing; instead, they are focused on classroom technology, which directly impacts a teacher's ability to educate students.
- Some districts lack the staff or knowledge to apply for available grants, leaving money on the table that other districts may be leveraging.

21 What Have We Found?
- Some districts planning to use CATE labs for testing may not have considered the impact on the classes that depend on access to those labs for instruction.
- Information security awareness differs among districts, and sometimes within individual schools. District information security policies need to be standardized, communicated, and implemented evenly to ensure all stakeholders are adequately prepared.

22 What Can We Do For You?
- Share all lessons learned
- Encourage collaboration discussions between districts
- Provide a targeted analysis of your district
- Provide guidance in developing your strategic roadmap
- Help guide your team through your assessment and assist with your benchmarking activities

23 Available State Procured Contracts
- State Fiscal Accountability Authority (SFAA): http://procurement.sc.gov/PS/agency/PS-agency-sw-contracts.phtm
- Goods and Services Contracts: http://procurement.sc.gov/PS/agency/PS-agency-goods-and-services.phtm
- Information Technology Contracts: http://procurement.sc.gov/PS/agency/PS-agency-info-technology.phtm
- Division of Technology (DT): http://www.admin.sc.gov/technology/technology-operations/services

24 Study Team Contact Information
Rob Cardelli, President
Peak Performance Technologies Inc.
Cell: (678) 570-3598
rcardelli@peakperformancetech.com

