Washington's Lottery AI site turned user photo into porn
When Megan, a 50-year-old mother based in Tumwater, visited the new AI-powered mobile site from Washington’s Lottery on March 30, she thought she was in for some frivolous fun. Test Drive A Win allows users to digitally throw a dart at a dartboard featuring dream vacations you can pay for with the money you win in the lottery. Depending on where the dart lands, you can either upload a headshot or take one on your phone to upload, and the AI superimposes your image into the vacation spot.
Megan landed on a “swim with the sharks” dream vacation option. She was shocked at one of the AI photos Washington’s Lottery spit out. It was softcore porn.
So I can totally see this happening. The government contracts with a genAI company, and the company drops the ball: it erroneously leaves in the capacity to generate pornography, or doesn't select correctly curated training data (I'm unsure how exactly these work). It may be quite difficult for the Washington government to spot this error if the occurrence rate is very low, or if none of their test prompts happened to trigger pornographic output. Perhaps it was only keyed to make porn (when not specifically prompted to) on certain subsets of matched facial features? I'm not suggesting this, but perhaps the affected user looks a lot like a popular porn star? It could also totally be the government's fault for quickly selecting an AI package without looking into what it could do; but with government bureaucracy, there should have been quite a few people with oversight.
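To make the "hard to catch in testing" point concrete, here is a minimal sketch of a post-generation safety gate of the kind a vendor might bolt onto an image pipeline. Everything here is invented for illustration (we know nothing about the real Test Drive A Win stack); the classifier is a toy that scores text tags rather than pixels, but it shows the failure mode: if the data feeding the check is untagged, the gate sees nothing wrong.

```python
# Hypothetical post-generation safety gate. All names are invented
# for illustration; the actual vendor pipeline is unknown.

def nsfw_score(image_tags: set[str]) -> float:
    """Toy stand-in for an NSFW classifier: scores labeled tags, not pixels."""
    flagged = {"nudity", "topless", "underwear"}
    if not image_tags:
        return 0.0  # untagged data produces no signal at all
    return len(image_tags & flagged) / len(image_tags)

def passes_safety_gate(image_tags: set[str], threshold: float = 0.3) -> bool:
    """Reject a generation whose flagged-tag ratio exceeds the threshold."""
    return nsfw_score(image_tags) <= threshold

# A fully tagged risque image is caught...
assert not passes_safety_gate({"beach", "topless", "nudity"})
# ...but an image whose tags simply omit clothing information sails
# through, which is exactly the low-occurrence-rate blind spot.
assert passes_safety_gate({"beach", "vacation"})
assert passes_safety_gate(set())
```

If the QA suite only ever exercises well-tagged, innocuous inputs, every test passes and the gap never shows up until a real user hits it.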
My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it? Why wouldn’t you just take the money and buy your own? Maaaaybe if it heavily discounts the vacations or something. Seems like an unnecessary step in the lottery process.
Prior to launch, we agreed to a comprehensive set of rules to govern image creation, including that people in images be fully clothed.
Apparently they thought about it, but neglected to consider two things: some "vacation" images in the training data might not be tagged with the clothing worn, and the model might sometimes consider pants alone to be "fully clothed," because some of the training data might show topless women in public without being tagged as such. Or topless men.