Charles Logan

@charleswlogan
308 Followers
446 Following
192 Posts
Learning Sciences PhD Student at Northwestern University / Critical Digital Pedagogies and Literacies / Dad Life
Toward Abolishing Online Proctoring: Counter-Narratives, Deep Change, and Pedagogies of Educational Dignity
https://jitp.commons.gc.cuny.edu/toward-abolishing-online-proctoring-counter-narratives-deep-change-and-pedagogies-of-educational-dignity/
Applying the Baldwin Test to Ed-Tech
https://www.civicsoftechnology.org/blog/applying-the-baldwin-test-to-ed-tech
Pronouns: he/him/his

Come join me and @MMFCiccone at the upcoming Civics of Technology conference as we hear how folks practice everyday resistance and refusal of ed-tech; reflect on how we might adapt the strategies in our own contexts; and raise possibilities for emerging Luddite pedagogies.

The free, two-day, online event includes keynotes from Dr. Luci Pangrazio on Thursday, August 3rd, and Dr. Roxana Marachi on Friday, August 4th. Register at: https://www.civicsoftechnology.org/2023conference

2023 Conference — Civics of Technology

Civics of Technology

Hot(take) Chatbot Summer: Considering Value Propositions

Some thoughts on the proposed ChatGPT Business enterprise, cheating and detectors, and the "is it worth it" question six months after my last post

https://autumm.edtech.fm/2023/06/16/hottake-chatbot-summer-or-apathy-in-the-belly-of-the-beast/

They say… one of the keys to successful blogging is regular and systematic publishing of posts. Well dear reader, it has been six months since my last entry and alas I must admit I’m just not…

Is a Liminal Space
@BenPatrickWill @shannonmattern @hypervisible @funnymonkey @douglevin That one was new to me...and it's not great! (Understatement.) The Yahoo article also cites an Atlanta Public Schools board member praising the tech, which has me worried this spy shit is being used much more broadly, and/or Davista's PR team is good at planting positive reviews from other major metropolitan areas in order to drum up business.

AI machines aren’t ‘hallucinating’. Their makers are | Naomi Klein

“These models are enclosure and appropriation machines, devouring and privatizing our individual lives as well as our collective intellectual and artistic inheritances. And their goal … was always to profit off mass immiseration, which, under capitalism, is the glaring and logical consequence of replacing human functions with bots.”

https://www.theguardian.com/commentisfree/2023/may/08/ai-machines-hallucinating-naomi-klein

AI machines aren’t ‘hallucinating’. But their makers are

Tech CEOs want us to believe that generative AI will benefit humanity. They are kidding themselves

The Guardian
Another reason to resist this brand of AI snake oil: "Experts argue schools are just a cheap training ground for technology vendors to test and improve their object detection software so that they can eventually sell it elsewhere." https://theintercept.com/2023/05/07/ai-gun-weapons-detection-schools-evolv/
AI Tries (and Fails) to Detect Weapons in Schools

Companies like Evolv and ShotSpotter market their AI-powered gun detection systems to schools nationwide, but weapons still slip through.

The Intercept

This is a paper close to my heart, about how datafication plays out inequitably in education, and how educator responses make rational sense

‘Technology is not created by the sky’: datafication and educator unease

https://www.tandfonline.com/doi/full/10.1080/17439884.2023.2206137

Postprint - bit.ly/datafication_unease

‘Technology is not created by the sky’: datafication and educator unease

The pressure towards digital education is felt everywhere including in places with extreme digital divides. Resource-constrained educational environments are particularly threatened by datafication...

Taylor & Francis

Solidarity with all students fighting online proctoring and its racist, dehumanizing academic surveillance software.

"But when Pocornie, who is Black, tried to scan her face, the software kept saying it couldn’t recognize her: stating 'no face found.'"

https://www.wired.com/story/student-exam-software-bias-proctorio/

This Student Is Taking On ‘Biased’ Exam Software

Mandatory face-recognition tools have repeatedly failed to identify people with darker skin tones. One Dutch student is fighting to end their use.

WIRED
Because we were looking for more things to do when these clowns decided to write "the letter," and cite our #StochasticParrots paper while saying the opposite of what we write, we (@emilymbender, Angelina McMillan-Major, and @mmitchell_ai) wrote a statement in response.
https://www.dair-institute.org/blog/letter-statement-March2023

I see people asking: How else will we critically study GPT-4 etc then?

Don't. Opt out. Study something else.

GPT-4 should be assumed to be toxic trash until and unless #OpenAI is *open* about its training data, model architecture, etc.

I rather suspect that if we ever get that info, we will see that it is toxic trash. But in the meantime, without the info, we should just assume that it is.

To do otherwise is to be credulous, to serve corporate interests, and to set terrible precedent.

Fascinating story: https://www.politico.com/news/2023/03/07/privacy-loophole-ring-doorbell-00084979

At first the police just wanted two hours of footage from this guy's doorbell Ring cam.

"It was just the beginning.

They asked for more footage, now from the entire day’s worth of records. And a week later, Larkin received a notice from Ring itself: The company had received a warrant, signed by a local judge. The notice informed him it was obligated to send footage from more than 20 cameras — whether or not Larkin was willing to share it himself."

The privacy loophole in your doorbell

Police were investigating his neighbor. A judge gave officers access to all his security-camera footage, including inside his home.

POLITICO