@kristiedegaris Your flying example is so useful.
We need to clamber out of the developer sandbox we’re trapped in, because energy use, human exploitation and social impact really matter here too. We need to get to the point where people can evaluate and then limit their user-side AI needs in informed ways. We need far higher ethical and transparency standards on the corporate side. We need the courage to limit use and drive up safety planning.
We expect planes and flying to be held to safety standards. We exert some degree of consumer pressure on pricing and reliability. I suspect many of us who fly for reasons of geography also offset if we can, and limit our use to try to minimise harm. We think about the planet and each other.
Let’s get to this point with AI, and stop accepting that our only option is to provide endless free sandbox testing for hypesellers.