20 Followers
19 Following
91 Posts
Long-time choral conductor. Current data engineering manager. Passionate progressive, especially about the rights of all queer and poly people to love whom they want to love without fear.
Once again, Alexandra Petri is a national treasure.
Apologies to Britney…
Oops, indicted again
I played with your votes
Got lost in my head
Ooh, baby, baby
Oops, I think I'm above
The law, but the jury says that
I'm not that innocent

As predicted, ML models suffer irreversible damage when you train them on generated data, a phenomenon these researchers are calling "model collapse". Uncurated datasets are effectively poisonous.

Who, apart from anyone who thought about it for a few seconds, could have predicted.

https://arxiv.org/abs/2305.17493v2

The Curse of Recursion: Training on Generated Data Makes Models Forget

Stable Diffusion revolutionised image creation from descriptive text. GPT-2, GPT-3(.5) and GPT-4 demonstrated astonishing performance across a variety of language tasks. ChatGPT introduced such language models to the general public. It is now clear that large language models (LLMs) are here to stay, and will bring about drastic change in the whole ecosystem of online text and images. In this paper we consider what the future might hold. What will happen to GPT-{n} once LLMs contribute much of the language found online? We find that use of model-generated content in training causes irreversible defects in the resulting models, where tails of the original content distribution disappear. We refer to this effect as Model Collapse and show that it can occur in Variational Autoencoders, Gaussian Mixture Models and LLMs. We build theoretical intuition behind the phenomenon and portray its ubiquity amongst all learned generative models. We demonstrate that it has to be taken seriously if we are to sustain the benefits of training from large-scale data scraped from the web. Indeed, the value of data collected about genuine human interactions with systems will be increasingly valuable in the presence of content generated by LLMs in data crawled from the Internet.

arXiv.org
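The mechanism is easy to see in a toy setting. Here's a minimal sketch (my own, not from the paper; the sample size and generation count are arbitrary): fit a simple generative model to data, sample fresh "training data" from the fitted model, refit, and repeat. Finite-sample noise makes the fitted variance drift downward, so the distribution's tails vanish over generations — the same qualitative effect the paper demonstrates for VAEs, GMMs, and LLMs.

```python
# Toy illustration of "model collapse": repeatedly fit a 1-D Gaussian
# to samples drawn from the previous generation's fitted model.
# The fitted spread shrinks over generations, i.e. the tails disappear.
import numpy as np

rng = np.random.default_rng(0)

n_samples = 50        # data points per generation (arbitrary choice)
n_generations = 500   # arbitrary choice

# Generation 0: "real" data from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=n_samples)
initial_std = data.std()

for _ in range(n_generations):
    # Fit the model to the current data (MLE mean and std)...
    mu, sigma = data.mean(), data.std()
    # ...then replace the data entirely with model-generated samples.
    data = rng.normal(loc=mu, scale=sigma, size=n_samples)

final_std = data.std()
print(f"std over {n_generations} generations: "
      f"{initial_std:.3f} -> {final_std:.6f}")
```

Run it and the standard deviation craters — the model ends up confidently generating a narrow sliver of what the original distribution contained. Curating real, human-generated data back into the mix is exactly why the paper argues genuine human data becomes more valuable, not less.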

Welp. This has crossed my social timelines again today. And the ONE RULE I want to carry over from that bird place is this one: if the Tom Holland Lip Sync Battle Umbrella video arrives in your timeline, you must share the Tom Holland Lip Sync Battle Umbrella video.

Enjoy.

https://youtu.be/jPCJIB1f7jk

Lip Sync Battle - Tom Holland

YouTube
Pat Robertson, James Watt, Ted Kaczynski. That’s a pretty good three-day streak.

Can't help myself. From the indictment:

a. On August 18, 2016, Trump stated, “In my administration, I’m going to enforce all laws concerning the protection of classified information. No one will be above the law.”

b. On September 6, 2016, Trump stated, “we also need to fight this battle by collecting intelligence and then protecting, protecting our classified secrets. … we can’t have someone in the Oval Office who doesn’t understand the meaning of the word confidential or classified.”

I really should make this my new Zoom background
Pride is not a party, it is not a parade. It is a protest, it is an outpouring of rage. It is our statement that we are here, we are visible, we are not going away. We are different, we are not just like you, and we are proud of the fact that we are different. We are not going to hide our differences just to make you more comfortable, or help you avoid having conversations with your family members. We will not be silent, we will not conform. We will be our true, authentic selves.
The cruelty IS the point. Never forget that. It is not about "protecting" anyone, no one is being "protected" here. It is totally about being able to be cruel towards people you don't like. https://edition.cnn.com/2023/05/20/us/mississippi-judge-denies-transgender-high-school-graduation-dress/index.html