Gwénaëlle André

@cybergwen
22 Followers
34 Following
37 Posts
PhD Student #digital #youth #individuation #technicity
digitalyouth
digital literacies
individuations
simondon
PhD Thesis Defence - Gwénaëlle André (Zoom)

You are invited to attend Gwénaëlle André's thesis examination

Eventbrite
A common argument I come across when talking about ethics in AI is that it's just a tool, and like any tool it can be used for good or for evil. A familiar declaration: "It's really no different from a hammer." I was compelled to make a poster to address these claims. Steal it, share it, print it and use it where you see fit.

https://axbom.com/hammer-ai/

#AiEthics #DigitalEthics
If a hammer was like AI…

Computations will “estimate” your aim, tend to miss the nail and push for a different design. Often unnoticeably.

Axbom

OpenAI admits that AI writing detectors don’t work

No detectors "reliably distinguish between AI-generated and human-generated content."

https://arstechnica.com/information-technology/2023/09/openai-admits-that-ai-writing-detectors-dont-work/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social

Ars Technica
The Free Software Foundation is dying

Life stories of young immigrants. Article published. https://ojs.lib.uwo.ca/index.php/cie-eci/article/view/15206
Récits de vie de jeunes immigrants diplômés des écoles de langue française à Toronto et à Edmonton | Comparative and International Education

Facebook, Google give police data to prosecute abortion seekers

Social-media sites are inundated with police requests for user data and may cooperate even if not legally required to, one legal expert told Insider.

Business Insider
Compelling article on the human costs of teaching machines https://time.com/6247678/openai-chatgpt-kenya-workers/
Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic

A TIME investigation reveals the poor conditions faced by the workers who made ChatGPT possible

Time
Open Data isn't just about transparency, it's about equity. That's why we need it in the future of AI. #copyrightweek https://www.eff.org/deeplinks/2023/01/open-data-and-ai-black-box
Open Data and the AI Black Box

Artificial Intelligence (AI) grabs headlines with new tools like ChatGPT and DALL-E 2, but it is already here and having major impacts on our lives. Increasingly we see law enforcement, medical care, schools, and workplaces all turning to the black box of AI to make life-altering decisions, a trend we should challenge at every turn.

The vast and often secretive data sets behind this technology, used to train AI with machine learning, come with baggage. Data collected through surveillance and exploitation will reflect systemic biases and be "learned" in the process. In their worst form, the buzzwords of AI and machine learning are used to "tech wash" this bias, allowing the powerful to buttress oppressive practices behind the supposed objectivity of code.

It's time to break open these black boxes. Embracing Open Data in the development of AI would not only be a boon to transparency and accountability for these tools, but would also make it possible for would-be subjects to create their own innovative and empowering work and research. We need to reclaim this data and harness the power of a democratic and open science to build better tools and a better world.

Electronic Frontier Foundation

Today we published our newest report, "Enough Deception," on #deceptivedesign (aka #darkpatterns), with a main focus on "popular" designs targeted at consumers in Norway.

We also sent letters to ten companies, including international brands such as Ryanair, Wish & Ticketmaster.

Our report is available in English, and so are the letters we sent to the companies: https://www.forbrukerradet.no/side/companies-use-design-to-take-our-time-money-and-personal-data

Companies use design to take our time, money and personal data

Many companies use deceptive design to hold on to customers, increase sales, or acquire personal data. In many cases, this is illegal, the Norwegian Consumer Council says.

Forbrukerrådet