Colleges turned to software to catch cheaters during the pandemic — inside one school’s decision to ban it


Last spring, as college faculty across the country were forced to make dozens of decisions in a matter of days about what elements of their in-person courses could be translated into Zoom rooms, administrators at the University of Michigan-Dearborn made at least one choice for them: the environment in which students would be tested. 

Officials at the school told faculty that they couldn’t use remote proctoring software, programs that monitor students during tests for behavior that could indicate they’re cheating. 

Administrators didn’t want to leave a decision with major implications for students up to individual professors during such a harried time, even though they believed faculty had students’ best interests at heart, said Mitchel Sollenberger, the school’s associate provost for undergraduate programs and integrative learning. 

“A large portion of our faculty have never taught online,” he said. “We knew we were going to have faculty responding to going online in ways that would have negative ramifications for students.” 

“They might in the moment decide, ‘let me use this proctoring service,’” Sollenberger added. 

Schools and faculty raced to sign up for remote proctoring software

That’s what hundreds, if not thousands, of instructors across the country did. Between 2019 and 2020, the number of institutions partnering with Proctorio, a leading remote proctoring software company, went from 400 to 2,000, according to the company. As of May 2021, the company had about 3 million active weekly users and was working with roughly 2,500 institutions. 

“There was just this influx of ‘Hey, we need this software and we need it now,’ which was really different,” said Mike Olsen, the company’s chief executive officer. Pre-pandemic, hammering out a contract between Proctorio and a university could take up to nine months, he said, but once classes went remote, schools were calling to say, “we need it yesterday.” 

Schools that already had contracts with the company also started increasing the use of its program. “Now all of the sudden because campuses are closed, every single exam including your weekly quizzes, were now using some form of our software,” Olsen said.

Of some schools’ decision to ban the use of remote proctoring technology, like his company’s software, Olsen said, “I find it very strange.”

Some students and faculty could feel “ripped off” if they know cheating is taking place at their school, he noted, and some professions, such as working in a pharmacy where graduates would have access to drugs, “require this sort of level of integrity.”  

Academic integrity industry started taking off in 2017

The academic integrity industry first started taking off in 2017 amid an uptick in distance learning and has continued to grow in the years since. The range of services offered by the sector includes software that locks down browsers, programs that analyze papers for plagiarism and monitoring services that either make it easier for a human to watch a student take a test without physically being in the room or use artificial intelligence to keep an eye on students. 

Data from 1,751 universities collected by ListedTech, a market research firm that tracks the use of education technology, indicates that roughly 47% of these schools use one of these programs, nearly 25% use two and close to 15% use three or more. 

It’s easy to see why instructors reached for these services in the early months of the pandemic. Faculty were asked to very quickly transform their in-person courses to online classes. Changing tests that are normally monitored in the classroom to be cheat-proof when taken from a student’s bedroom can be a heavy lift. 

‘It wasn’t like these softwares got more invasive, there was just more attention.’

— Mike Olsen, CEO of Proctorio, a remote-proctoring software company

But critics of remote-proctoring software say the programs invade students’ privacy, discriminate against underserved students and don’t solve challenges related to academic integrity. Now, a movement of instructional designers, faculty and administrators is hopeful that the controversy surrounding the use of remote proctoring software during the pandemic will usher in a new approach to assessing students that persists even when students return to classes in person. 

They’re urging faculty to use tests that focus more on measuring students’ critical thinking skills and less on what they can memorize. These tests have the added benefit of being much harder to cheat on, they say. 

“People have really started to rethink ‘Why am I doing things the way that I’ve been doing them for a number of years?’” said Kim Manturuk, associate director of research, evaluation and development at Duke University. “We’ve been really encouraged by the fact that there’s a lot of attention right now to how we can carry innovations forward from the pandemic.” 

UM-Dearborn used CARES Act funds to help faculty transition to remote learning

Providing faculty and staff with the tools and support necessary to rethink their courses and the way they assess their students isn’t cheap. The ask comes at a time of financial uncertainty for colleges. Many schools’ business models were already stretched and the pandemic squeezed budgets even further. 

At Dearborn, officials used some of the nearly $7 million the school received from the federal government through the CARES Act to provide resources to instructors as they pivoted their courses. That included hiring Sarah Silverman, an instructional designer who has also engaged in activism surrounding remote proctoring. 

Some of the issues she sees with the software schools use to monitor students during exams include: they ask students to let a company into their testing space, which can be a violation of privacy; the cost to universities and, in some cases, to individual test-takers; and that the technology requirements for the software often make it difficult or impossible for some students to use.  

In addition, because the software, which in some cases watches students through the camera on their computer, sometimes flags physical movement in a student’s test-taking space as suspicious, “they tend to be very biased towards students who can produce that distraction-free environment,” Silverman said. 

At Dearborn, many of the students don’t have the luxury of taking a test in their private bedroom, Sollenberger said. More than 40% of the school’s students receive a Pell grant, the funding the government provides to low-income students to pay for college. In addition, many are caring for children, parents, siblings or other relatives. 

“When you think of higher education, you’ve got these students, who, they come from the dorm, they roll out of bed and they might have some work-study, but other than that they’re not having too many obligations,” he said. That’s not the case at University of Michigan-Dearborn, which is part of the reason why administrators felt strongly that proctoring software wouldn’t make sense for the school’s students. 

Critics have concerns about privacy, bias

Critics of the software say that in addition to putting students who live with or care for others at a disadvantage, the proctoring programs, which may measure things like eye movement, are ableist. The programs, which use machine learning, also raise questions about how students of color fare in these systems. Facial recognition technology has been shown to do a poor job of recognizing non-white faces.  

Over the course of the pandemic, other schools have grappled with the use of the technology. In March of 2020, the University of Illinois signed on to use Proctorio for a year and decided to extend the short-term license through August of 2021. In January, officials contacted instructors to remind them that they would lose access to the service starting in fall 2021. 

In the memo, the school said some had raised concerns about remote proctoring “including issues related to accessibility, privacy, data security and equity,” adding that officials “take those concerns seriously” and are investigating remote proctoring tools for longer-term use. 

At San Francisco State University, the faculty senate passed a resolution banning the use of online proctoring technology in courses taught at the school starting in spring 2021. There are a few exceptions to the restriction, including if the student is taking the test as part of earning a certification or license that requires use of the technology. For faculty to receive an exemption, they need to request approval from an appropriate dean in consultation with technology, accessibility, and teaching and learning officials at the school. 

Olsen, Proctorio’s CEO, said as more schools and classes started using the company’s technology during the pandemic, Proctorio began to hear more concerns about the software. Olsen attributes that uptick in part to the fact that students and faculty were suddenly forced to cope with learning and testing online. 

“It wasn’t like these softwares got more invasive, there was just more attention,” he said. 

In the fall 2020 semester, the company added a new feature that tells students before their exam where video and audio recording data will be stored and for how long. Proctorio only stores the data for as long as each institution has requested it be saved, the company said. 

As for questions surrounding whether remote proctoring software is biased against students who are disabled, not white or can’t produce a distraction-free environment, Olsen said instructors have discretion as to what they do with the information they get from the program. If an instructor decides they want Proctorio to record students taking a test, the software will alert the instructor to behavior the instructor has set it to highlight as suspicious, such as talking during the exam. 

“Maybe it’s their kid asking for a glass of water or maybe it’s a roommate giving them the answer to question one,” he said. The instructor can review the footage and decide how to handle what’s been identified. “It’s not going to kick you out, it’s just going to flag whatever those behaviors are,” Olsen said of the software. 

Setting aside concerns about remote proctoring technology’s implications for equity, advocates of abandoning it say using these programs may hinder instructors’ efforts to promote academic integrity. 

‘Destroys that trust relationship’

Aloha Sargent, a technology services librarian at Cabrillo Community College, trains instructors on how to teach remotely. As part of that training she emphasizes the significance of the connection between a student and teacher. She’s been active in the movement countering remote proctoring technology in part because it can undermine that student-instructor dynamic, Sargent said.  

“The most important factor for student success in an online course is their relationship with their instructor. They need to feel that there’s an actual human being that cares about them and who wants them to succeed,” she said. 

“That relationship between the student and instructor is built on trust and remote proctoring destroys that trust relationship … it’s more likely to make students want to cheat because of how they feel about having to use that software,” she said. 

The questions surrounding how best to monitor students in a remote environment have added new urgency to conversations that have been taking place among curriculum specialists for years about whether there are better ways to assess students than a closed-book multiple choice exam. 

Some best practices include having students take several low-stakes exams instead of one or two high-stakes tests and providing more flexibility with exam dates — as Manturuk noted, she came up with all of the due dates for assignments in her courses at her kitchen table with little context about how those dates line up with students’ other commitments. 

Instructional designers push for different types of tests

More broadly, these advocates are pushing instructors to move toward what they call authentic assessments: assignments and tests that evaluate a student’s ability to apply the skills and concepts they’re supposed to learn in a course. 

For example, instead of asking sociology students to define certain theories of crime and crime reduction, instructors could ask students to look up the website of an elected official and explain which theory of crime or crime reduction best fits with their platform, Manturuk said. It’s much harder to find the answer to the second question online, and to answer it, students have to know the material well enough to apply it to a real-world situation, she said. 

Though it may take less time for an instructor to come up with a multiple choice exam than an assessment of this type, the more traditional test “is very quickly in the public domain,” Manturuk said. Investing time in creating an authentic assessment pays off on the back end, she said, because instructors aren’t worried about or hunting down students who they believe may have cheated. 

At Contra Costa College in San Pablo, Calif., Maritez Apigo, distance education coordinator, open educational resources coordinator and English professor, became concerned about the use of remote-proctoring technology after students complained to her over the summer. 

“It’s really stressful, it causes a lot of anxiety and we just wanted to ask if there’s any way that there can be less use of it,” Apigo said they told her.

As chair of the academic senate’s distance education sub-committee, Apigo helped write guidance for fellow faculty that described online-proctoring software as “highly problematic” and suggested alternatives. After some debate in the academic senate over whether it could be issued campuswide, the guidance ultimately gained a majority vote, she said. 

After it was published and shared on social media, officials at other schools were contacting Apigo asking for permission to use it, she said. 

At the same time that instructors are rethinking the ways they test students, companies are positioning themselves to take advantage of what they see as the future of assessment. Some expect that even after this period of questioning surrounding data privacy, equity and the best ways to teach and test students, instructors will stick to more traditional modes of assessment, said James Wiley, Eduventures principal analyst at ACT NRCCUA, which provides data and research to higher education institutions.

But even if test-taking moves away from high-stakes closed book exams, other education technology companies will be prepared to benefit. These companies are betting “assessment will become more project-based, more essay-based,” Wiley said. “They want to position themselves accordingly.”