Can Learning Analytics Enhance Student Success In Online And Remote Learning? Two Georgia Tech Case Studies

Farahnaz Soleimani, Research Scientist, and Jeonghyun Lee, Assistant Director of Research in Education Innovation, Georgia Tech’s Center for 21st Century Universities (C21U)

With student needs and circumstances changing on what seems to be a daily basis, universities are increasingly recognizing the value of big data for solving educational problems. To explore the complex factors that shape student outcomes, the Center for 21st Century Universities (C21U) conducts data-driven studies that help faculty and administrators make informed decisions in support of student success at Georgia Tech. To that end, C21U applies a range of learning analytics and machine learning approaches, coupling them with data science methods and educational theory. Two recent cases illustrate how C21U has used learning analytics to understand and enhance students' learning experiences.

Online Master’s in Analytics (OMSA) Applicant Success Prediction

The C21U research team aims to establish an equitable model for predicting the success of applicants to Georgia Tech's Online Master's in Analytics (OMSA) program. The program has seen an exponential increase in applications since its inception in Fall 2017, with more than 10,000 applicants to date. Processing these applications imposes an immense workload on faculty and staff, and Georgia Tech has requested predictive models to support the decision process. The criteria that application reviewers use in their admissions decisions are fundamentally different from those of a traditional master's-level program. In a traditional program, the goal is to create a ranked list of the best candidates and make offers until all positions are filled; the goal of OMSA is to admit any candidate who is likely to succeed in the program. We do not want to admit a student who is unprepared for this rigorous program, but we do want to admit every student who can succeed. This is a challenge because the applicant pool is far more diverse than the on-campus pool, and reviewers may not have personal experience instructing this broader population of students.

Given these factors, using past application data to predict the success of new applicants is of great value to the OMSA program. In this project, we are interested in identifying the set of application features that have the greatest impact on program admission and completion. Specifically, we implement machine learning approaches to build models that predict whether an applicant would be admitted to the program. In the proposed models, the applicant data include demographic information, academic history (e.g., degrees earned), test scores (e.g., TOEFL, GRE), and supplemental information such as the applicant's background in computer programming. Through preliminary data processing, we identified hundreds of informative variables (e.g., college GPAs, college duration, the applicant's plan to pursue a Ph.D.) from roughly 4,000 existing variables. We also processed the PDF documents of each applicant's statement of purpose and letters of recommendation to extract metrics that capture different characteristics of these documents and measure the quality of their language. This is particularly important because these documents do not provide uniform information about applicants: there is no standardized format for evaluation, and the narratives are inherently subjective. Therefore, in addition to exploring the influence of standardized application features on applicants' success, we plan to investigate the extent to which text-based application features predict successful admission to academic programs.
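To make the document-processing step concrete, the sketch below shows how surface-level metrics might be extracted from an application essay once its text has been pulled from the PDF. The specific metrics (word count, sentence length, lexical variety) and the function name are illustrative assumptions, not the project's actual feature set:

```python
import re

def text_quality_metrics(text: str) -> dict:
    """Compute simple surface-level metrics for an application document.

    These metrics are illustrative stand-ins for the kinds of text-based
    features a model could use alongside standardized application data.
    """
    # Tokenize into lowercase words (letters and apostrophes only).
    words = re.findall(r"[A-Za-z']+", text.lower())
    # Split into sentences on terminal punctuation, dropping empty pieces.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        # Ratio of unique words to total words, a rough lexical-variety score.
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

sample = "The program fits my goals. I have five years of programming experience."
metrics = text_quality_metrics(sample)
```

Metrics like these can be appended to each applicant's feature vector so that a single model sees both the structured fields and the document-derived signals.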

We hope that the results of this project can be used to provide the admission office staff and faculty with guidance on the selection of applicants and to help them make informed decisions during the admission process. In the long run, our next steps will include using applicant data to model the successful completion of the analytics program’s three core courses, graduation, and eventual job placement. We hope that this research can be used to enhance the program and course design in meaningful ways, such as improving course content and prerequisite training.

Key Performance Indicator (KPI) Dashboard Tool in Canvas Courses

The COVID-19 pandemic has greatly impacted the role of faculty as they transitioned from face-to-face instruction to remote delivery of courses. With this context in mind, C21U set out to build an analytics tool for faculty that addresses the most salient challenges they have faced in online course settings. Based on findings from a survey administered at the end of the Spring 2020 semester, C21U observed that faculty often had difficulty receiving spontaneous feedback from their students during emergency remote teaching because of the reduced interactivity and lack of social cues in online environments. This was a critical issue because it made it difficult for faculty to adjust or redesign their instruction to improve students' learning experiences. Another common challenge was that faculty had already committed extra time and effort to adapting to remote teaching and thus had little to no time to create a new instructional tool. Taking these issues into consideration, we designed the Key Performance Indicator (KPI) tool to enable all faculty to monitor students' learning and progress quickly and easily.

“The KPI tool is designed to give faculty near real-time insight into how students are doing in a course.”

The primary goal of the KPI tool is to give faculty easy access to continuous weekly feedback from students and a snapshot of how students are performing in their course in near real time. Faculty typically gain insight into student learning only through low-frequency, high-stakes tests, and they may only get a sense of how students feel about a course from an end-of-semester survey. The KPI tool is designed to give faculty near real-time insight into how students are doing in a course. Any interested instructor can install the tool in Georgia Tech's Canvas learning management system (LMS) with minimal effort (only a few clicks), which makes the tool rapidly scalable across many Canvas courses at Georgia Tech. Once the tool is installed, both instructors and students can access it from the course navigation bar. Students are directed to a short anonymous survey that asks them to rate their learning experience along five dimensions: pace of course modules, self-mastery of learning goals, support for learning, clarity of course materials, and class participation. Instructors can then view a dashboard showing their students' average ratings on a weekly basis. These ratings are visualized in a color-coded heat map, which makes trends across dimensions and over time easier to interpret. Although we have not yet implemented it, the tool can also aggregate dashboard views so that a school chair or dean could quickly gauge how a semester is going across a discipline and address potential problem areas before they grow too large.
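The aggregation behind such a dashboard can be sketched in a few lines: anonymous per-student ratings are grouped by week and averaged per dimension, producing the week-by-dimension grid that a heat map would color. The dimension keys and data shapes below are simplified assumptions, not the KPI tool's actual implementation:

```python
from statistics import mean

# The five survey dimensions, abbreviated for illustration.
DIMENSIONS = ["pace", "mastery", "support", "clarity", "participation"]

def weekly_averages(responses: dict) -> dict:
    """Aggregate anonymous survey ratings into per-dimension weekly means.

    `responses` maps a week number to a list of rating dicts, one per
    student, each scoring the five dimensions on a 1-5 scale. The result
    is the week-by-dimension grid a heat map would render.
    """
    return {
        week: {dim: round(mean(r[dim] for r in rows), 2) for dim in DIMENSIONS}
        for week, rows in responses.items()
    }

# Hypothetical ratings: two students in week 1, one student in week 2.
responses = {
    1: [{"pace": 4, "mastery": 3, "support": 5, "clarity": 4, "participation": 3},
        {"pace": 2, "mastery": 4, "support": 4, "clarity": 3, "participation": 3}],
    2: [{"pace": 5, "mastery": 4, "support": 5, "clarity": 5, "participation": 4}],
}
dashboard = weekly_averages(responses)
```

Mapping each averaged cell to a color band (for example, red below 3, yellow from 3 to 4, green above 4) then yields the color-coded heat map described above.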

Since C21U launched the pilot of the KPI tool in the Summer 2020 semester, instructors from more than 50 courses have participated in testing it. Pilot participants have generally shared positive feedback through surveys and interviews: on average, they perceived the tool as somewhat useful and satisfactory, and many reported liking its simplicity and ease of use. Some participants indicated that they try to use students' weekly ratings to improve their instruction methods. However, students' low submission rates remain an issue. No instructor required students to complete the KPI survey, and typically only a few students volunteered to submit responses in consecutive weeks. Several instructors also pointed out that students already receive many surveys and that the KPI survey might overburden them.

Moving forward, we will release the newest version of the KPI tool in the Fall 2021 semester, which now operates in real time and provides a visually enhanced dashboard. We will continue to explore what types of faculty support and guidance will lead to increased use of this formative assessment tool. One possibility is to provide faculty with sample language for syllabi and course announcements, along with built-in reminders, to help both faculty and students, especially those in online courses, see the value of the KPI tool.

Final Thoughts

Overall, our two case studies highlight how data analytics can guide and support the learning journey of students in online learning environments. From admissions to class participation and graduation, our research helps our institution predict areas where students may struggle and empowers administrators and faculty to take actionable steps to help at-risk students get back on track. Furthermore, findings from our research inform the development of accessible and sustainable tools that can support faculty by enabling them to make informed decisions to improve instruction and learning programs.
