This 5 star Proctorio review was written by GenAI.
We're All Trying To Find The Guy Who Did This.gif
| Stand Against Proctorio's SLAPP! | https://linkletter.org |
| Canadian Privacy Library | https://privacylibrary.ca |
| Bluesky | https://bsky.app/profile/linkletter.org |
I needed to edit https://linkletter.org yesterday.
Proctorio will drag this out as long as they can. I was 36 when I was sued; now I'm 41. One day I will be free from this meritless lawsuit, but I will never be defeated, and I have the pro bono legal counsel needed to sustain the fight for as long as it takes.
New research confirms that OpenCV, the open source facial detection software used by Proctorio to control access to exams, is racially biased.
Lucy Satheesan was right! Proctorio is wrong.
Students of colour have been reporting Proctorio's inability to "see" them for years: https://vimeo.com/672407261
It is time to end this harmful practice and get rid of Proctorio's automated decision-making tool, because every day it harms more vulnerable students.
Research Link: https://racismandtechnology.center/wp-content/uploads/202506-performance-differential-with-opencv.pdf
It took almost two years, but Turnitin has stopped some of the harm of its AI detection tool.
They never disclosed the false positive rate for AI scores between 0% and 19%, but even if it was just 1%, millions of students could have been falsely accused.
I'll never forget my dinner with Brewster Kahle (and other brilliant folks) at the Internet Archive Canada headquarters. I am so inspired.
I asked ChatGPT: "Is Proctorio racially biased?"
It answered: "Research conducted by a student found that Proctorio's software failed to recognize Black faces 57% of the time, which raises significant concerns about fairness and equity in the application of this technology."
Link: https://chat.openai.com/share/8545441f-c779-4bed-9c19-3f7a1ffbfaaf
Pedagogy, not policing. Students, not spying. Antiracism, not eye-tracking. Racially biased technology such as facial detection must not be allowed to make high-stakes decisions about whether a student can access their exam that day. Defund academic surveillance software!
(image by DALL·E 3)