The Effective Use of AI for Assessment and Exam Integrity

Gladys M. Bennett, Ph.D., Director of Testing at Norfolk State University

Today’s cutting-edge technologies are reshaping higher education in many positive ways. Recent advances in artificial intelligence (AI) are being used to streamline the development, delivery, and assessment of student learning. In higher education, the primary purpose of exams and assessments is to measure students’ understanding and application of concepts and principles, articulated as learning outcomes. Establishing the reliability and validity of AI assessment tools therefore depends on the effective measurement of course learning outcomes, and exam integrity is essential to effective measurement.

Methods for maintaining exam integrity during test administration include in-person proctoring, live remote proctoring, and automated remote proctoring using AI, which ensure that students are taking their own exams and not using unauthorized aids to gain an unfair academic advantage over other test takers.

To ensure the reliability and validity of AI assessment measures in higher education, it is imperative that results be evaluated in correspondence with course learning outcomes, and that integrity guidelines are established for appropriate use of AI tools.

Defining AI

AI allows computer systems to simulate human processes in performing complex tasks, using algorithms developed by human programmers. Generative AI can create new content resembling real-world data, based on the data sets on which it was trained. It is ultimately the user’s responsibility to verify the accuracy of the output that AI produces from a given input.

Benefits and Dangers of AI for Assessment

It is widely acknowledged that AI carries both potential benefits and dangers. Potential benefits of AI for exam management include the use of algorithms, developed by human programmers, to simplify content creation, exam scheduling, exam delivery, proctoring, instant scoring, and feedback. AI also enables computer adaptive testing (CAT), which tailors exam delivery to student responses, thereby reducing exam time. These benefits can reduce the faculty time devoted to exam management and free more time for engaged student instruction. In addition to exams, faculty use authentic assessments such as presentations, projects, experiments, and role plays. For authentic assessments, AI can give students immediate access to expansive digital information for completing graded assignments.
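To make the adaptive-testing idea concrete, here is a minimal sketch of how a CAT engine might select items and update its ability estimate. It assumes a simple one-parameter (Rasch) model and a fixed shrinking step size; real CAT systems use more sophisticated item-response estimation, so all names and numbers here are illustrative only.

```python
import math

def probability_correct(ability, difficulty):
    """Rasch (1PL) model: chance a student at `ability` answers an
    item of `difficulty` correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def next_item(item_bank, ability, asked):
    """Pick the unasked item whose difficulty is closest to the current
    ability estimate -- the most informative item under the 1PL model."""
    candidates = [i for i in range(len(item_bank)) if i not in asked]
    return min(candidates, key=lambda i: abs(item_bank[i] - ability))

def run_cat(item_bank, answer_fn, n_items=5, step=0.5):
    """Administer an adaptive test: raise the ability estimate after a
    correct answer, lower it after an incorrect one, and shrink the
    step so the estimate settles."""
    ability, asked = 0.0, set()
    for _ in range(n_items):
        i = next_item(item_bank, ability, asked)
        asked.add(i)
        ability += step if answer_fn(item_bank[i]) else -step
        step *= 0.8
    return ability

# Illustrative run: a simulated student who answers correctly whenever
# the item's difficulty is below 1.0, drawn from a five-item bank.
estimate = run_cat([-2.0, -1.0, 0.0, 1.0, 2.0], lambda d: d < 1.0)
```

Because each item is chosen near the current estimate, the test converges on the student’s level with fewer questions than a fixed-form exam, which is the source of the time savings noted above.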


Potential dangers of AI include bias in the datasets used to train the software, the capacity to create misinformation, and a lack of transparency in how AI decisions are made. AI also has the potential to undermine the effectiveness of remote proctoring. A variety of remote proctoring solutions use AI to flag unauthorized devices and resources, to flag other persons in the room, or to block the use of AI tools or other websites during an assessment (Respondus LockDown Monitor, ProctorU, Honorlock, etc.). There have been several instances of false flags due to imprecision in some AI algorithms, and research has found many false positives with AI detection tools. It is therefore important not to rely on the results of these tools alone to accuse students of violating exam integrity policy.
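A short base-rate calculation shows why a flag alone should not trigger an accusation: when actual violations are rare, even a seemingly accurate detector produces mostly false flags. The numbers below are hypothetical, chosen only to illustrate Bayes’ rule; they do not describe any particular product.

```python
def flag_ppv(prevalence, sensitivity, false_positive_rate):
    """Probability that a flagged student actually violated policy
    (positive predictive value), computed with Bayes' rule."""
    true_flags = sensitivity * prevalence
    false_flags = false_positive_rate * (1.0 - prevalence)
    return true_flags / (true_flags + false_flags)

# Hypothetical figures: 5% of test takers violate policy, the tool
# catches 90% of violators, and it wrongly flags 10% of honest students.
ppv = flag_ppv(prevalence=0.05, sensitivity=0.90, false_positive_rate=0.10)
# ppv is roughly 0.32 -- about two of every three flags are false.
```

Under these assumed rates, fewer than a third of flagged students actually violated policy, which is why flags should prompt human review rather than automatic sanctions.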

“Some experts argue that AI-based learning platforms could hinder critical thinking skills and reduce human interaction, which is an essential aspect of learning.” This, they caution, “may lead to a sense of isolation or disconnection from the learning process.”

Recommendations to Increase Integrity in the Use of AI for Exams and Assessments

To increase integrity in the use of AI for exams and assessments, universities must establish policies that include strategies for addressing the following:

• Accessibility and accommodations for diverse populations (e.g., special needs, ESL, low socioeconomic status)

• “Avoiding bias, ensuring privacy of users and their data, and mitigating environmental risks”

• Dissemination of information on how and when AI can be used for homework assignments and unproctored assessments.

• Guidelines for citing the use of AI when used for graded assignments.

• Define expectations for students: when it is acceptable to use AI and when it is not.

• Specify the faculty permissions required to use AI for assignments.

• Make clear the limitations of AI and the students’ responsibility for verifying the accuracy of information produced from AI prompts.

• Procedures for obtaining students’ acceptance of AI policies and procedures (e.g., through an electronic signature on the course syllabus or another document)

• A training program for students and faculty on how to use AI appropriately and responsibly in support of student learning and faculty instruction.

• Faculty training on strategies to protect intellectual property used for testing and assessment.
