AI Forensics

@aiforensics_org
157 Followers
13 Following
4 Posts

AI Forensics (previously Tracking Exposed)
is a European non-profit that investigates influential and opaque algorithms.

https://www.aiforensics.org

AI Algorithmic Auditing

Thanks @[email protected] for trusting us for two years in a row as MTF Awardee & creating a strong civil society ecosystem for digital rights!

Read our profile - AI Forensics: The Detectives Researching AI Harms🕵️‍♂️🕵️‍♀️ : https://foundation.mozilla.org/en/blog/ai-forensics-the-detectives-researching-ai-harms/

AI Forensics: The Detectives Researching AI Harms

This is a profile of Mozilla Technology Fund awardee AI Forensics, a non-profit that investigates influential algorithms. It was previously known as Tracking Exposed, a project that has been pioneering new methods to hold big tech platforms accountable since 2016.

Mozilla Foundation

📣 Repeat after me: The #DSA is not a tool for censorship!

Suggesting that blocking access to online platforms during protests could be justified under Europe's DSA is wrong and misleading! Commissioner Breton needs to clarify. Our joint statement with @accessnow @ecnl @bitsoffreedom @article19 @Freiheitsrechte @mozilla @aiforensics_org ⤵️
https://accessnow.org/press-release/dsa-internet-blocking-statement/

Civil society statement: Commissioner Breton needs to clarify comments about the DSA allowing for platform blocking - Access Now

Requesting clarification of recent comments suggesting that the blocking of online platforms could be an enforceable and justified measure under the DSA.

Access Now

🔍💼 Check out the DSA Stakeholder Event where our director Marc Faddoul sheds light on the role of Adversarial Audits in #AlgorithmicAccountability!

🎥:https://www.youtube.com/watch?v=3ruDVtlPF8w

🗣️“Most of the algorithmic harms happen at the margins & cannot necessarily be seen in aggregate statistics provided by companies' risk assessment reports. Analysing individual personalisation dynamics for the most vulnerable users is key to preventing harms.”

📩 Subscribe to our newsletter: http://eepurl.com/iqug_-/

Workshop. Conducting DSA risk assessments – algorithms in the spotlight!

YouTube

Access to public data is key for watchdogs like us working to uncover societal risks stemming from social media. But #Tech giants like Facebook and Twitter are cutting critical access to study them. 🚨 #DSA

Together with other organizations including @amnesty_digital_de, @awo, @snv_berlin, @aiforensics_org & more, we demand that the @EU_Commission step in now! ⤵️
https://algorithmwatch.org/en/dsa-empower-public-interest-research-data-access/

DSA must empower public interest research with public data access - AlgorithmWatch

Access to “public data” is key for researchers and watchdogs working to uncover societal risks stemming from social media—but major platforms like Facebook and Twitter are cutting access to important data analytics tools to study them. The EU must now step in to ensure that researchers aren’t left in the dark.

AlgorithmWatch