Since the first online courses launched, there has been some level of scrutiny over the quality and rigor of the online experience. Efforts like No Significant Difference were launched to combat negative perceptions, federal regulations were enacted, and regional accreditors unveiled distance education standards. Meanwhile, traditional/face-to-face learning continued unfettered by the same scrutiny.
Since the early 2000s, we’ve seen numerous quality standards developed in support of online learning, including Quality Matters, the OLC Scorecard, OSCQR, and the iNACOL (now Aurora Institute) standards for K-12 online learning. Yet collectively, we’ve still struggled with how to measure our effectiveness in supporting student success online. Many factors contribute to the student experience, including access to proper technology and adequate broadband, but also student and faculty confidence in the technology and the learning platform, as well as course design, class size, the student supports available, and so on.
But are these factors different from face-to-face teaching?
With an intense focus on 21st-century skills and digital and information literacy, are there any on-ground programs that can be completed without the use of a computer and the internet? If so, are those programs properly preparing our students for the workplace?
COVID-19 certainly accelerated the blurring of lines regarding learning modalities. We’ve invented new methods and labels to describe how we delivered education during this period. We created a blanket label of ‘Emergency Remote Teaching’ to distinguish rapidly developed, digitally delivered courses from the carefully designed asynchronous online courses (that have met all the federal, regional, and local quality standards), and some regulations were adjusted to accommodate those emergency measures. If there is agreement that digital literacy is core to all 21st-century learning experiences (as well as employability/career advancement), shouldn’t we develop a common set of key performance indicators (KPIs) to measure effectiveness and progress across all educational experiences?
This has been a core question and discussion amongst the members of the WCET Digital Learning Systems and Consortia Leaders Group. The WICHE Cooperative for Educational Technologies, under the Western Interstate Commission for Higher Education (WICHE), has been a leader in the “practice, policy, & advocacy of digital learning in higher education.” The digital learning systems and consortia group has been focused on the effective use of technology in the classroom for over a decade and recently developed a vision statement as a call to action:
Regardless of educational modality, focus on effective practices for the continuous improvement of teaching, learning, and student support services.
In support of this call, the group has recommended a set of Digital Education KPIs that we invite systems and institutions to adopt as they measure continuous improvement toward 21st-century, digitally delivered learning experiences, regardless of modality.
Digital Education KPI Categories
‘Readiness’ is a central theme for the KPIs listed below. We ask not only whether the student is ready for technology-driven education, but also whether faculty have the necessary skills and support, and whether the institution is ready to provide the infrastructure that supports all parties. The following list is a work in progress and is not meant to be exhaustive.
Student Readiness
• Technology Access: Do students have access to current, appropriate, and dedicated technology to be successful?
• Broadband Access: Do students have access to consistent and safe high-speed internet?
• Tech Confidence Level: Do students have the requisite competencies and confidence to use the supporting technologies to the best of their abilities?
• Online Learning Confidence: Do students have the necessary strategies and skills to succeed in a digital delivery format?
Key question: How are institutions defining and assessing student readiness?
Student Engagement
• Class Participation: How are you tracking substantive attendance, beyond the login rate?
• Content Engagement: How are you measuring content engagement, beyond views?
• Completion: How are you measuring time to completion or time on task? (A minimal sketch of these measures follows this list.)
• Technology: Which platforms/tools have better engagement? Does the device/browser use impact participation?
• Campus Activity: Are the students engaged in any campus programs beyond the course?
Key question: Are the students present and engaged/active?
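To make “beyond the login rate” and “time on task” concrete, here is a minimal sketch in Python of how such engagement KPIs might be computed from an LMS event export. The field names (student_id, event_type, minutes) and the set of “substantive” event types are illustrative assumptions, not any particular platform’s schema.

```python
from collections import defaultdict

# Hypothetical LMS event records; real field names will vary by platform.
events = [
    {"student_id": "s1", "event_type": "login",           "minutes": 0},
    {"student_id": "s1", "event_type": "discussion_post", "minutes": 12},
    {"student_id": "s1", "event_type": "quiz_submission", "minutes": 25},
    {"student_id": "s2", "event_type": "login",           "minutes": 0},
    {"student_id": "s2", "event_type": "content_view",    "minutes": 8},
    {"student_id": "s3", "event_type": "login",           "minutes": 0},
]
enrolled = {"s1", "s2", "s3", "s4"}

# What counts as "substantive" is an assumption: activity beyond logging in or viewing.
SUBSTANTIVE = {"discussion_post", "quiz_submission", "assignment_submission"}

logged_in = {e["student_id"] for e in events if e["event_type"] == "login"}
substantive = {e["student_id"] for e in events if e["event_type"] in SUBSTANTIVE}

time_on_task = defaultdict(int)
for e in events:
    time_on_task[e["student_id"]] += e["minutes"]

print(f"Login rate:             {len(logged_in) / len(enrolled):.0%}")
print(f"Substantive engagement: {len(substantive) / len(enrolled):.0%}")
avg_minutes = sum(time_on_task.values()) / len(enrolled)
print(f"Avg. time on task:      {avg_minutes:.1f} min per enrolled student")
```

Even in this toy data the distinction matters: every active student shows up in the login rate, but only those who posted or submitted work count toward substantive engagement.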
Student Assistance
• Academic Supports: What services (tutoring, writing lab) are available to students, and when?
• Resources: Are your resources (e.g., library) available in a digital format?
• Accommodations: How do you support students with disabilities remotely?
Key question: What student supports are being utilized vs. needed?
Student Demographics
• Demographic Information: How are you tracking student success against measures such as age, gender, race, ability, Pell eligibility, ZIP code, prior GPA, 1st Gen, etc.?
• Academic Load: How are you tracking academic variables such as credit/course load and their impact on student success?
• Environmental Impacts: How are you tracking outside influences (such as food/housing insecurity, medical care, child/eldercare) and their impact on student success?
Key question: What environmental factors impact student success?
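Disaggregating a success measure by any of these variables is straightforward once the data are joined. As one hedged illustration, the sketch below compares course completion rates for a single hypothetical attribute, Pell eligibility, against the overall rate; the records and field names are assumptions for the example.

```python
# Hypothetical student records: completion status plus one demographic
# attribute to disaggregate by (here, Pell eligibility).
records = [
    {"student_id": "s1", "pell_eligible": True,  "completed": True},
    {"student_id": "s2", "pell_eligible": True,  "completed": False},
    {"student_id": "s3", "pell_eligible": False, "completed": True},
    {"student_id": "s4", "pell_eligible": False, "completed": True},
]

def completion_rate(rows):
    """Share of records marked completed; 0.0 for an empty group."""
    return sum(r["completed"] for r in rows) / len(rows) if rows else 0.0

overall = completion_rate(records)
for flag in (True, False):
    group = [r for r in records if r["pell_eligible"] == flag]
    rate = completion_rate(group)
    label = "Pell-eligible" if flag else "Not Pell-eligible"
    print(f"{label:18s} completion {rate:.0%} (gap vs. overall {rate - overall:+.0%})")
```

The same pattern applies to age, race, first-generation status, ZIP code, credit load, or any of the environmental variables above, once the institution decides how to collect them.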
Student Outcomes
• Course Completion
• Grade / GPA
• Persistence (term to term)
• Time to credential/degree/graduation
• Transfer Success (credits and standing)
• Employability / Career Advancement / Earnings
• Outcomes Measures
• Student Debt
• Student Experience/Satisfaction
Key question: How do we define student success?
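Several of these outcomes reduce to simple ratios once cohorts and terms are defined. The sketch below computes term-to-term persistence from a hypothetical enrollment history, ignoring graduates, transfers, and stop-outs for simplicity; the term labels and data structure are assumptions for illustration.

```python
# Hypothetical enrollment history: which students were enrolled in each term,
# listed in chronological order.
enrollment = {
    "Fall 2023":   {"s1", "s2", "s3", "s4"},
    "Spring 2024": {"s1", "s2", "s4"},
    "Fall 2024":   {"s1", "s4"},
}

terms = list(enrollment)  # insertion order is assumed to be chronological
for prev, curr in zip(terms, terms[1:]):
    returned = enrollment[prev] & enrollment[curr]
    rate = len(returned) / len(enrollment[prev])
    print(f"{prev} -> {curr}: persistence {rate:.0%}")
```

Agreeing on the denominator, for example whether students who graduated mid-year should be excluded, is exactly the kind of definitional work the key question above points at.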
" Regardless of educational modality, focus on effective practices for the continuous improvement of teaching, learning, and student support services "
Faculty Readiness
• Technology Access
  - Mobile vs. PC device
• Tech Confidence Level
  - PC, LMS, tools
• Broadband Access
  - Minimum speed
  - Mobile vs. Wi-Fi
• Online Learning Confidence
  - Past online experience
  - Past online development
  - Training cycle
Key question: How are institutions handling faculty readiness?
Faculty Factors
• Login Rate
• Response Time / Frequency
• Workload
Key question: How engaged are our faculty?
Course Factors
• Average enrollment/class size
• Student Demographics / Diversity
• Completion rate
• Performance rate
• Student Feedback
• Quality measures/standards (UDL, QM, OLC, etc.)
• Course structure/design
• Accessibility
• Assessment / Evaluation
• Resources Utilized
Key question: How do course variables impact student success?
Institutional Readiness
• Culture
• Policy
• Regulatory Alignment
• Infrastructure
• Professional Development
• Faculty Supports
• Student Supports
• Risk Tolerance
Key question: How do institutions prepare for digital education?
By offering these Digital Education KPIs to assess student, faculty, and institutional readiness, we hope to enhance and better align quality and efficacy across all education, regardless of modality.
Fortunately, we can already see some effort to align face-to-face teaching with established online learning practices. For example, the Association of College and University Educators recently released its Effective Practices Framework, which appears to mirror many of the best practices developed in support of online learning while also incorporating ‘Understanding by Design’ (backward design) and elements of Universal Design for Learning.
We hope to support the continued shift towards a unified approach to digital education.