2014 UBTech Big Data and Learning Analytics SIG Anthony Bichel, Ph.D. Leading Edge Learning.


1 2014 UBTech Big Data and Learning Analytics SIG Anthony Bichel, Ph.D. Leading Edge Learning

2 Analytics: Student Success Science

3 “Big Things Have Small Beginnings”
What problems are we trying to solve?
What questions are we trying to answer?
What are the right elements to measure?
What level or mode of analysis works best for our particular problem?
How does this align with other institutional or strategic priorities?

4 Learning Analytics Defined & (Re)Defined
Learning analytics is the “use of data, statistical analysis, explanatory and predictive models to gain insights and act on complex issues.” (Educause, 2012)
“Learning analytics refers to the interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues.” (Horizon Report, 2012)
“Learning analytics is the field associated with deciphering trends and patterns from educational big data, or huge sets of student-related data, to further the advancement of a personalized, supportive system of higher education.” (Horizon Report, 2013)

5 Analytics Drivers
Institutional Compliance / Accountability – We have to
Institutional Performance – It’s good for us
Student Benefit – It’s good for them
Student Management – Because we can
“All things are subject to interpretation. Whichever interpretation prevails at a given time is a function of power and not truth.” – Friedrich Nietzsche

6 Five Steps of Analytics
“Data is the foundation of all analytics efforts.”
Capture – Selecting and Organizing – Policy Decisions
Report
Predict
Act
Refine
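
The sketch below is not from the presentation; it is a minimal, hypothetical Python illustration of how the five steps might be wired together in code. Every function name, record, and threshold is invented for the example.

```python
# Minimal sketch of the five-step analytics loop (Capture, Report, Predict, Act, Refine).
# All data, thresholds, and names are illustrative, not from the presentation.
from statistics import mean

def capture():
    """Capture: select and organize the raw events to keep (a policy decision)."""
    return [
        {"student": "S1", "logins": 12, "quiz_avg": 0.85},
        {"student": "S2", "logins": 2,  "quiz_avg": 0.40},
        {"student": "S3", "logins": 8,  "quiz_avg": 0.72},
    ]

def report(records):
    """Report: summarize what happened."""
    print(f"Cohort quiz average: {mean(r['quiz_avg'] for r in records):.2f}")

def predict(records, threshold=0.6):
    """Predict: flag students whose current performance suggests risk."""
    return [r["student"] for r in records if r["quiz_avg"] < threshold]

def act(at_risk):
    """Act: trigger an intervention (here, just a message)."""
    for student in at_risk:
        print(f"Intervention suggested for {student}")

def refine(records, at_risk):
    """Refine: feed outcomes back into the model and the policy decisions."""
    print(f"Flagged {len(at_risk)} of {len(records)} students; revisit the threshold next term.")

records = capture()
report(records)
at_risk = predict(records)
act(at_risk)
refine(records, at_risk)
```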

7 Examples of Learning Analytics Models
VLE Dashboards
Data Visualization
Social Network Analysis
Discourse Analytics
Predictive Analytics
Adaptive Learning
Disposition Analytics

8 Data Visualization

9 [Image-only slide: data visualization example]
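
Since the slide itself is image-only, here is a small, hypothetical Python/matplotlib sketch of the kind of engagement chart a data-visualization slide like this might show; the weekly login numbers are made up.

```python
# Illustrative sketch only: one way to visualize cohort engagement over a term.
# The data points are invented for the example.
import matplotlib.pyplot as plt

weeks = list(range(1, 11))
avg_logins = [5.1, 4.8, 4.2, 3.9, 4.4, 3.1, 2.8, 3.0, 2.5, 2.2]      # hypothetical cohort average
at_risk_logins = [3.0, 2.5, 2.1, 1.8, 1.5, 1.0, 0.8, 0.7, 0.5, 0.4]  # hypothetical flagged subgroup

plt.plot(weeks, avg_logins, marker="o", label="Cohort average")
plt.plot(weeks, at_risk_logins, marker="s", label="Flagged subgroup")
plt.xlabel("Week of term")
plt.ylabel("LMS logins per student")
plt.title("Weekly LMS engagement")
plt.legend()
plt.show()
```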

10 The Benefits of Analytics
“Learning analytics can help faculty improve teaching and learning opportunities for students” (Hrabowski, Suess, & Fritz, 2011; Mattingly, Rice, & Berg, 2012).
The most valuable things about data are: (1) data about behavior and (2) changes in behavior.
Ways in which analytics can help educational institutions improve student achievement:
Monitoring individual and cohort performance
Identifying outliers for early intervention
Predicting potential so that all students achieve optimally
Preventing attrition from a course or program
Identifying and developing effective instructional techniques
Analyzing standard assessment techniques and instruments
Testing and evaluation of curricula
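
As one hedged illustration of “identifying outliers for early intervention,” the short Python sketch below flags students whose scores sit well below the cohort mean using a simple z-score rule. The scores and the -1.0 cutoff are assumptions made for the example, not anything prescribed by the presentation.

```python
# Hedged sketch: flagging low outliers in cohort performance for early intervention.
# Scores and the z-score cutoff are illustrative only.
from statistics import mean, stdev

scores = {"S1": 88, "S2": 91, "S3": 54, "S4": 79, "S5": 85, "S6": 47, "S7": 90}

mu = mean(scores.values())
sigma = stdev(scores.values())

# Flag anyone more than one standard deviation below the cohort mean.
outliers = {s: v for s, v in scores.items() if (v - mu) / sigma < -1.0}
print(f"Cohort mean {mu:.1f}, sd {sigma:.1f}; flag for early intervention: {sorted(outliers)}")
```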

11 Thematic Issues Institutional Success Factors Policy Concerns Privacy & Consent Faculty Evaluation

12 Background Research
Surveyed North Texas ISDs about analytics use
Surveyed the Association of College & University Policy Administrators (ACUPA) membership
Surveyed 24 institutions implementing BbA and using either the Banner or PeopleSoft SIS

13 Institutional Success Factors
Organizational Capacity
Leadership & Vision
Tactical Readiness
Faculty Development
Learning Spaces

14 Organizational Capacity [slide diagram; legible labels: Capabilities, Priorities, Culture, Will, Future Opportunities, Strategic Choke Points, Future Challenges, Challenges]

15 Leadership & Vision [slide diagram] Key: DA = Data Analysis, SE = Statistical Expertise, CE = Content Expertise, PE = Policy Expertise

16 Tactical Readiness
Educational institutions often lack the staff needed to incorporate data into their everyday workflows: administrators and specialists who can inculcate and support a data-based culture. Examples include:
– Data analytics experts
– Data visualization specialists
– Data managers
– Instructional designers
– Graphic designers
– Digital media specialists
– App developers
– Marketing managers for analytics and technology
In addition to faculty, how many staff will require assistance, and what will the nature of that help be? How many IT support positions will be needed as analytics usage grows?

17 Models of Faculty Development
Volunteer-driven – Initiated by an individual faculty member seeking specialized assistance
Facilitator-driven – Typical classroom-style group instruction/training
Technology-driven – Mediated support services delivered online via email, blogs, wikis, streaming and interactive video, or some form of social media
Analytics-driven – Faculty receive robust, actionable student and cohort performance profiles (dashboards) that show learner performance and/or mastery in near real time, enabling dynamic adjustments to course content, delivery mode, or learning pathways

18 Faculty Development Modes
Pre-Analytics – Lower costs – Improve efficiency – Increase productivity
Post-Analytics – Change the ways that faculty think about and use information – Challenge the assumptions and bias that faculty bring to decision making – Faculty use data to generate new insights and contexts to serve students better
“It's like they say in the Internet world - if you're doing the same thing today you were doing six months ago, you're doing the wrong thing.” – Bruce Feiler

19 Faculty Development Challenges
The 2nd Law of Faculty Thermodynamics: any project that requires sustained faculty energy probably won’t succeed.
Faculty development for using analytics in the classroom involves data about faculty performance in the classroom.
Dashboards are not the nightly news: follow-up and follow-through are critical when dealing with analytics.
Individual faculty are not the only players – faculty managers and student support staff need to be involved.

20 Typical Faculty Profile (R1)
Total Faculty = 1,621
– Tenured faculty = 446 (28%)
– Tenure track = 179 (11%)
– Non-tenured = 996 (61%)
Faculty Profile by Rank
– Professor = 221
– Assoc. Professor = 234
– Assist. Professor = 170
– Other faculty = 992
– Unknown = 4
Dept. Teaching Assistants = 703
Total Instructors = 2,324
Current Faculty LMS Users = 30% (486) – 50% (810)

21 UTA Suggestions
All Common Core (CC) faculty must use Bb
All CC faculty must attend specialized training for using analytics in the classroom
All CC faculty, student support staff and department chairs must attend a joint meeting to discuss roles, responsibilities and desired outcomes

22 Learning Spaces: Collaborative Classrooms
The archetypal collaborative classroom is defined by a technologically infused and socially networked space that can be arranged, rearranged, or disarranged for whatever purpose is required during a face-to-face (f2f) class. Popular models include MIT’s “Technology Enabled Active Learning (TEAL)” and the University of Iowa’s “Transform, Interact, Learn, Engage (TILE)” classrooms.
http://www.classrooms.uiowa.edu/TILE.aspx
http://web.mit.edu/edtech/casestudies/teal.html

23 Challenges of Collaborative Classrooms
Classroom control, or lack thereof
Much more time intensive
Student-centered activities often take more time than allotted
Individual differences among students
Students must prepare outside of class
TILE / TEAL classrooms are not oriented for discussion because students can’t see one another
Lack of a focal point in the room makes lectures problematic, if not outright counterproductive
Faculty development and support are essential to success

24 McKinsey Global Institute The classroom is where many of the most valuable applications of data will evolve. Improved instruction can be enabled by developing: Personalized learning plans for students, Frequent feedback on teacher performance, and Targeted professional development programs for educators. “We estimate the potential value from improved instruction to be $310 billion to $370 billion per year worldwide.”

25 Policy Concerns

26 Policy Drivers
Creating a culture of evidence on campus requires clear policies and processes with respect to the use of data. Developing trust and normalized processes that transcend individual personalities and arbitrary decision making is key. Issues include:
Data Governance
Roles and Responsibilities
Data Politics – “Data access is perhaps one of the most important – and difficult – policy issues to clarify.”
Data Privacy and Fair Information Practices
De-Identification of Data (see the sketch below)
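
Picking up the “De-Identification of Data” item above, here is a minimal, hypothetical Python sketch of one common first step: replacing direct student identifiers with salted pseudonyms before records are shared. The salt, field names, and record are assumptions made for illustration; real de-identification policy involves far more than hashing an ID.

```python
# Hedged sketch of one de-identification step: pseudonymizing student IDs
# before records leave the institution. All values here are illustrative.
import hashlib

SALT = "rotate-and-store-this-secret-separately"  # hypothetical secret, kept out of the shared data

def pseudonymize(student_id: str) -> str:
    """Return a stable, non-reversible pseudonym for a student identifier."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

record = {"student_id": "1000234567", "course": "HIST-1301", "quiz_avg": 0.82}
shareable = {**record, "student_id": pseudonymize(record["student_id"])}
print(shareable)
```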

27 Policy Issues
“A key component of an analytics program is first to identify the policy questions to be answered and then to engage in a risk-management exercise, including a cost/benefit analysis, to determine if analytics will provide the answers or feedback needed.” – Rodney J. Petersen, Educause Review, July/August 2012
Ethics
Privacy
Ownership of data
Best practices
“All things are subject to interpretation. Whichever interpretation prevails at a given time is a function of power and not truth.” – Friedrich Nietzsche

28 Case Study: UTA Policies/Materials Impacted by Analytics Implementation
FERPA Training for Faculty and Staff
Handbook of Operating Procedures (HOP)
– Rights, Responsibilities and Duties of Faculty Members
– Annual Review and Comprehensive Evaluation of Faculty
UT System Rules and Regulations of the B.O.R.
UT System Documents

29 Privacy & Consent

30 Identity in the Analytics Age – by Nathan Jurgenson

31 Data Privacy & Fair Information Practices
Notice / Awareness
Choice / Consent
Access / Participation
Integrity / Security
Enforcement / Redress
Federal Policies (Privacy Safeguards)

32 Student Rights Under FERPA
The Family Educational Rights and Privacy Act (FERPA) affords students certain rights with respect to their education records, including the right to consent to disclosures of personally identifiable information contained in the student's education records, except to the extent that FERPA authorizes disclosure without consent. One exception which permits disclosure without consent is disclosure to school officials with legitimate educational interests. A school official is a person employed by the University in an administrative, supervisory, academic, research, or support staff position, or a person or company with whom the University has contracted.
FERPA was written before the Internet.

33 Privacy Concerns
“Someday we'll laugh about the way we used to worry about a ‘credit score,’ because the data will be so much deeper, more intrusive, and more deeply hidden than credit scores ever were.” – Blogger Tim_Sims
Students shed streams of data about their academic progress, work habits, learning styles, and personal interests as they navigate educational websites. All that data has potential commercial value: it could be used to target ads to students or their families, or to build profiles on them that might be of interest to employers, military recruiters, or college admissions officers.
The law is silent on who owns that data. Kathleen Styles, the Education Department’s chief privacy officer, acknowledged that much of the data is likely not protected by FERPA – and thus can be commercialized by the companies that hold it.

34 “Notice and consent fundamentally places the burden of privacy protection on the individual – exactly the opposite of what is usually meant by a ‘right.’”
“As a useful policy tool, notice and consent is defeated by exactly the positive benefits that big data enables: new, non-obvious, unexpectedly powerful uses of data. It is simply too complicated for the individual to make fine-grained choices for every new situation or app.”
“The purpose of notice and consent is that the user assents to the collection and use of personal data for a stated purpose that is acceptable to that individual... this framework is increasingly unworkable and ineffective.”

35 “Ultimately the vision for Knewton is that everyone should have their own learning profile that’s free, secure, hosted in the cloud, and just follows them around forever.” – Jose Ferreira, CEO, Knewton
“…We’re extremely cognizant that we hold a data set that is the most important data set in one’s life, other than maybe your healthcare data.”
“…In 30 years, the human race will be totally dominated by data science… In terms of education and healthcare, I don’t think there will be an invasion of privacy because there’s not going to be any marketing of it.”

36 Just Because It Is Accessible Doesn’t Make It Ethical
It may be unreasonable to ask researchers to obtain consent from every person, but it is unethical for researchers to justify their actions as ethical simply because the data is accessible.
Privacy does not mean the same thing to everyone, and its loss will affect people of different social and economic classes differently.

37 Faculty Evaluation

38 Sloan-C Presentation
There are many areas of data that could be used to create a faculty performance dashboard:
Student surveys
Faculty surveys
Peer surveys
Faculty issues
Professional development
Workload
GPA and grading
Defining what the organization deems important for data collection is the first step in creating faculty performance dashboards.
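
As a purely illustrative sketch, the snippet below pulls a few of the data areas named above (student surveys, timely grading, workload) into a single dashboard row per instructor. The field names, values, and layout are assumptions for the example, not the Sloan-C or UTA design.

```python
# Hedged sketch: combining a few hypothetical faculty metrics into one
# dashboard row per instructor. All names and numbers are invented.
faculty_data = [
    {"name": "Instructor A", "survey_avg": 4.4, "pct_graded_on_time": 0.96, "sections": 3},
    {"name": "Instructor B", "survey_avg": 3.7, "pct_graded_on_time": 0.78, "sections": 5},
]

def dashboard_row(f):
    """Format one instructor's metrics as a single readable dashboard line."""
    return (f"{f['name']:<14} survey {f['survey_avg']:.1f}/5  "
            f"on-time grading {f['pct_graded_on_time']:.0%}  "
            f"sections {f['sections']}")

for f in faculty_data:
    print(dashboard_row(f))
```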

39 Facets of Faculty Evaluation
Traditional Educational Setting: (1) number of assigned courses, (2) development of course curriculum, (3) number of drops, (4) GPA distribution, (5) number of refereed students in comparison to number of graduates, (6) timely grading, (7) timely final grade submission, (8) peer review
Online Environment: (1) using the defined curricula, (2) number of drops, (3) GPA distribution, (4) weekly online course activity, (5) student responsiveness, (6) timely grading, (7) timely final grade submission, (8) faculty development opportunities, (9) peer review process
Analytics-Infused Instruction: (1) individual student learning outcomes, (2) cohort performance profiles, (3) adaptive curriculum, (4) optimized modalities

40 Mapping Faculty Concerns
Instructional Concerns:
– What are faculty obligations?
– How do we handle the increase in students seeking assistance?
– Who contacts the students, under what conditions, and for how long?
– Should faculty know what resources are available to students needing help?
– What do faculty need to know about how FERPA applies?
– Are faculty obliged to create new material or to update materials based on evidence (data) that indicates consistent failings or misunderstandings?
Professional Development Concerns:
– Will there be an orientation program that clarifies roles and responsibilities?
– What resources are available to assist faculty with course redesign?
– What expertise is needed to respond to data-driven alerts?
Faculty Evaluation Concerns:
– Is there an opt-out option?
– If analytics can reveal which teaching techniques are more effective than others, are faculty obligated to adapt accordingly?
– What added time constraints might be incurred by using analytics?
– How will this new evidence of teaching effectiveness be incorporated into faculty evaluation/review?

41 Academic Freedom

