I just stumbled upon a good way to answer the question "are you afraid of AI?" from folks who are less technically inclined.

Ask them if they consider the daily weather report “artificial intelligence” and if they fear it.

Same thing: a bunch of real-world inputs that yield a prediction that may be right or wrong. There may be a tornado, or maybe it'll just be cloudy.

Of course, if we place too much reliance on those predictions, things can go haywire.

@chockenberry I’m quite technically inclined. I’m not afraid of (what we’re colloquially calling) AI, I’m afraid of people and what they might do with it.
@chockenberry I think people might be afraid of things that are not captured by this analogy, for example losing jobs.

@mwichary Yeah, it doesn't cover the repercussions of the prediction technology.

The automation of weather prediction (satellites, computer models, etc.) may have affected employment, but there's only so much you can do with a simple analogy.

@SingingNala @chockenberry And some environments are easier to predict than others. Where I live in coastal Texas, for example, I assume no prediction beyond about 2-3 days is worth anything, as the weather around here just changes too fast. In order to truly predict such a thing I think you'd need a model on the scale of the entire globe or at least this hemisphere, otherwise you can't track fronts and other developments outside the area that sweep in and change things anyway. To say nothing of changes caused by us.
@SingingNala @chockenberry Those predictions told us it was a clear and sunny day with barely any clouds *while* we were waiting in line at the county fair and it was raining heavily.
@SingingNala @chockenberry Also a good representation of systemic bias. Would you expect a model that was carefully cultivated and trained on decades of data from the U.S. Midwest to give accurate, or even close, figures in the tropics? In Scandinavia? Of course not! Why? Because the training data is completely different from the reality it's being applied to! And even if a model were accurate for its current region, what about climate change? We don't have enough data on the actual effects of climate change on weather systems, because it keeps slowly changing and we only really recognized it in the past decade or so. A model that assumes the climate is fixed and unchanging will obviously diverge farther and farther from reality. And the only way to remedy that is to build a predictor that takes both the causes and effects of climate change into account, and to feed it far more precise data than we currently have.
@SingingNala @chockenberry A more contemporary AI example of bias, unintentional or not, creeping into the system. I heard once of a camera that used AI in its autofocus and image-enhancement features in the mid 2010s. It was developed in Silicon Valley, by Silicon Valley engineers, many of whom are white men. As it approached public release, a Black person ended up in front of it, and it couldn't figure them out. It had assumed that the general skin tone of *humans* is all the same, and so it had difficulty handling this case. The shapes were correct, this looks like a face, but without the skin tone represented in its dataset it had a low confidence rating and would likely either jump erratically or just give up.
@chockenberry The main difference being that weather prediction algorithms aren’t trained on god-knows-what content vacuumed up in bulk from, say, Reddit. The algorithm in the abstract isn’t so much the issue; it’s how much of the model was trained on garbage and nonsense, categorized in whatever way by wage slaves working in terrible conditions. We need something better curated and structured than the brute-force approach we use now.
@chockenberry I’m gonna use that. I love it. I remember in the early 90s weather prediction was truly bad because it didn’t have “ai” yet.
@chockenberry it's a decent analogy but it's still off the mark, because weather predictions at least typically stay within sane bounds and have discrete possibilities; LLMs can be _wildly_ wrong and hide it subtly.

Also nobody's going around saying weather reports are going to obsolete meteorologists + revolutionise the world.

Having said all that, obviously it's impossible to capture all the nuances, so it's a good quick description!

@chockenberry I get the analogy, but it's not a great one...

Weather data isn't the result of me spending 20 years creating the weather, for AI to come and scrape it and present it as a paid product, for example.

Or suggesting it replaces everything from seismologists to umbrella salesmen.

@chockenberry What scares me about generative AI is that it uses predictions to create things. Weather predictions don’t scare me because the weather AI isn’t actually creating the weather.
@brentsimmons @chockenberry no but a bad weather forecast can conceivably get somebody killed.
@distinct @brentsimmons @chockenberry Bad weather forecasts *do* get people killed in the shipping and fishing industries, among others
@brentsimmons @chockenberry CARROT likes to pretend otherwise.
@brentsimmons @chockenberry Certainly would have made this a lot less funny: https://youtu.be/iXuc7SAyk2s

@brentsimmons @chockenberry I sense it may be worthwhile to point out that any meteorologist worth their keep will tell you they won’t predict the weather. They forecast it.

That’s why I don’t immediately see a parallel between AI and Wx.

@chockenberry weather lacks malicious inputs. Failed instrumentation, maybe.

Still a good analogy for trust models.

@chockenberry my answer has been, “I’m not afraid of AI, I’m wary of what capitalism and authoritarianism will do with it”

@chockenberry The book 'Boom Town' (my favorite nonfiction book of the past 10 years) has a chapter on a famed Oklahoma weatherman who was just a cheesy TV dude until a devastating tornado hit, and he realized he could actually save lives—him saying one thing or another on TV could mean people stay or go.

So he worked as hard as he could to get good forecasts, and he also honed his image to become 100% likable and trustworthy…

@chockenberry Weather forecasting isn't just a calculation; it's also a job in judgment, in PR, in empathy.

This is the kind of thing you lose if you "automate" forecasts, or if you see them as just "inputs and outputs."

Add that to the AI analogy as well 🙂 This is partly where the "fear" (really, it's more concern and annoyance) of AI comes from.

@neven @chockenberry See also James Spann in Birmingham. One of the greats.
@neven @chockenberry I love that book. Having grown up in Oklahoma City in Gary England's prime, I can attest to the reverence folks had for him.
@neven @chockenberry the Space City Weather guys talk about this every hurricane season. Great people and great weather forecasts if you’re ever in Houston, their blog is the only thing I read nearly daily https://spacecityweather.com/
@neven @chockenberry Isn’t that a plot point in “LA Story?”
@neven @chockenberry That sounds like a book I need to read!
@dcrooks it’s an excellent book that explains a lot about OKC and urban development and evolution.
@neven @chockenberry as an Oklahoma native, I can attest to this
@chockenberry I don’t like how we’ve redefined AI to refer to various large data models. I’m not afraid of those, but I am very concerned about real sentient AI.