Like that of many schools across the country, Carnegie Mellon University’s first response to the COVID-19 pandemic was a flurry of action as we worked to move our 4,900 course-sections to remote instruction. In time, we realized the opportunity the crisis presented. Several months in, among other lessons, I learned that thoughtfully using learning analytics can help students learn in remote instruction.
As Executive Director of CMU’s Simon Initiative, I have been working for many years to expand access to and improve the quality of education. The pandemic highlighted the importance of something we already knew from research: active, asynchronous learning activities can be a crucial tool for instructors to make their teaching more effective and equitable. Providing asynchronous opportunities gives students mechanisms to personalize and review their learning while also bridging some of the equity-related digital divide issues that synchronous online instruction can exacerbate.
Yet these types of activities can be a challenge for educators to integrate into their instructional practice, in part because their asynchronous nature means that students’ learning and misconceptions aren’t readily apparent to educators. How can we make this work more visible, so that educators can better target their synchronous efforts to students’ demonstrated needs?
At CMU, we know that data can tremendously benefit both students and teachers, so we capture learners’ interactions to drive powerful feedback loops. Learning analytics take a wide variety of forms. The learning dashboard in our Open Learning Initiative courseware provides educators with an estimate of how well their class is meeting learning objectives and allows them to drill down to specific skills and misconceptions that can drive class discussion or examples; it can also provide insights on individual learners for targeted support and guidance. Other tools can improve the design of learning activities themselves, from pedagogical audits that help make activities more robust to learning curve analysis, which helps designers understand how well their learning model aligns with actual student performance and adjust accordingly. Still other approaches can help both students and educators gain a new, shared perspective on activities; Docuscope, for example, provides a visualization of the rhetorical moves made in student writing.
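To make the idea of learning curve analysis concrete, here is a minimal sketch of the underlying computation: the error rate on a skill at each successive practice opportunity, which should decline if the skill model matches how students actually learn. The data format and field names below are hypothetical illustrations, not the Open Learning Initiative's actual schema or tooling.

```python
# Illustrative sketch only: a toy learning-curve analysis of the kind described
# above. The interaction log format here is hypothetical, not OLI's actual data.
from collections import defaultdict

# Each record is one logged learner interaction: (student, skill, answered correctly?)
interactions = [
    ("s1", "fractions", False), ("s1", "fractions", False), ("s1", "fractions", True),
    ("s2", "fractions", False), ("s2", "fractions", True),  ("s2", "fractions", True),
    ("s1", "decimals",  True),  ("s2", "decimals",  False), ("s2", "decimals",  True),
]

def learning_curves(records):
    """Return, for each skill, the class error rate at each practice opportunity.

    A curve that fails to decline suggests the skill model is misaligned with
    how students actually learn, so the activity or model may need redesign.
    """
    attempts = defaultdict(int)  # (student, skill) -> opportunities seen so far
    tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # skill -> opp -> [wrong, total]

    for student, skill, correct in records:
        opportunity = attempts[(student, skill)]
        attempts[(student, skill)] += 1
        tally = tallies[skill][opportunity]
        tally[0] += 0 if correct else 1
        tally[1] += 1

    return {
        skill: [wrong / total for _, (wrong, total) in sorted(opps.items())]
        for skill, opps in tallies.items()
    }

for skill, curve in learning_curves(interactions).items():
    print(skill, ["%.2f" % rate for rate in curve])
```

In this toy data the error rate on "fractions" falls from 1.00 to 0.50 to 0.00 across opportunities, the declining shape a well-aligned skill model should produce.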
Automated approaches should not be the goal of learning analytics. A good system should augment and support the people engaged in teaching and learning; these human-in-the-loop systems are a hallmark of the Simon approach. Too often, automation is paired with opaque systems of only limited usefulness, and those systems are frequently focused on (or at least used for) solving problems that are at arm's length from the core work of our educational institutions, such as identifying potential plagiarism or proctoring exams, rather than providing feedback or enacting learning. Such approaches are frequently bereft of a real evidence base (substituting marketing literature for real science). And beyond the concern that these tools may work in opposition to learning rather than enacting it, educators are increasingly worried about the ways these applications gather and share data and the ways they can surveil educators and learners alike.