Ian Linkletter

@Linkletter
1,033 Followers
447 Following
308 Posts
Emerging Technology & Open Education Librarian standing for freedom of expression, fair dealing rights, and academic freedom.
Stand Against Proctorio's SLAPP! https://linkletter.org
Canadian Privacy Library: https://privacylibrary.ca
Bluesky: https://bsky.app/profile/linkletter.org

This 5-star Proctorio review was written by GenAI.

We're All Trying To Find The Guy Who Did This.gif

I needed to edit https://linkletter.org yesterday.

Proctorio will drag this out as long as they can. I was 36 when I was sued; now I'm 41. One day I will be free from this meritless lawsuit. I will never be defeated, and I have the pro bono legal counsel needed to sustain the fight as long as it takes.

New research confirms that OpenCV, the open source facial detection software used by Proctorio to control access to exams, is racially biased.

Lucy Satheesan was right! Proctorio is wrong.

Students of colour have been reporting Proctorio's inability to "see" them for years: https://vimeo.com/672407261

It is time to end this harmful practice and get rid of Proctorio's automated decision-making tool, because every day it harms more vulnerable students.

Research Link: https://racismandtechnology.center/wp-content/uploads/202506-performance-differential-with-opencv.pdf

Oh great, Microsoft Teams wants my voiceprint and faceprint. Somehow I doubt Privacy Impact Assessments have been conducted for this new collection and use of biometric data.

It took almost two years, but Turnitin has stopped some of the harm of its AI detection tool.

They never disclosed the false positive rate for AI scores between 0% and 19%, but if it was even 1%, millions of students could have been falsely accused.

Link: https://guides.turnitin.com/hc/en-us/articles/22774058814093-AI-writing-detection-in-the-new-enhanced-Similarity-Report

Racial, skin tone, and sex disparities in automated proctoring software. This study shows how facial detection software can be disastrous for students of colour being judged by AI proctoring like #RespondusMonitor and #Proctorio https://www.frontiersin.org/journals/education/articles/10.3389/feduc.2022.881449/full

I'll never forget my dinner with Brewster Kahle (and other brilliant folks) at the Internet Archive Canada headquarters. I am so inspired.

@brewsterkahle

I asked ChatGPT: "Is Proctorio racially biased?"

It answered: "Research conducted by a student found that Proctorio's software failed to recognize Black faces 57% of the time, which raises significant concerns about fairness and equity in the application of this technology."

Link: https://chat.openai.com/share/8545441f-c779-4bed-9c19-3f7a1ffbfaaf


Pedagogy, not policing. Students, not spying. Antiracism, not eye-tracking. Racially biased technology such as facial detection must not be allowed to make high-stakes decisions about whether a student can access their exam that day. Defund academic surveillance software!

(image by DALL·E 3)

This you?