This is true, by the way, though there's some context and nuance I had to leave out for space. We don't use Owen's definition much anymore, but the common modern definition of dinosaurs (the last common ancestor of Triceratops and modern birds, plus all of that ancestor's descendants) also implies the same thing.
I think we're getting used to the latter (that birds are dinosaurs), but I don't know if we will ever accept the former (that sauropods aren't dinosaurs).
Richard Owen, in 1842, defined Dinosauria as the last common ancestor of Iguanodon and Megalosaurus plus all the descendants of that ancestor. However, over the past several years, we have found evidence that ornithischians (Iguanodon's group) are more closely related to theropods (Megalosaurus's group) than either is to the sauropods, which split off earlier. If the recent evidence (e.g., Baron et al. 2017) is correct, then by Owen's original definition such classic 'dinosaurs' as Diplodocus and Apatosaurus are not true dinosaurs, but flamingos are.
The Wright brothers built the worst airplane. Imagine how much better aircraft would be today if they'd only waited until somebody else had built it first and then improved on it.
Inside me there are two wolves. One wolf loves to pick up great finds at used bookstores. The other wolf is too busy to read. They hate each other so, so much.
@tao Believe it or not, the most interesting part of your thread to me was your statement that the best collaborations probably have a roughly equal number of people from both groups. That's a statement that's much more profound than it seems at first glance, and I'll be chewing on that for a while.
I find AI doomerism annoying and overblown. I don't think many proponents of AI doomerism are really thinking for themselves; they're repeating the opinions of a small intellectual/writer class who personally don't see much value in current large language models. Many people are bad at writing and at summarizing data quickly, so for a huge number of people, LLMs do provide value. Failing to recognize this is itself a kind of brag—as if to say, "I personally don't need LLMs!"
This doesn't mean I don't think there are any downsides to the current LLM craze—I'm getting tired of the slop, too, and I don't think every new product needs an assistant button or toolbar—but there are downsides to every new technology. It's always been two steps forward, one step back. I can't think of a single past era that didn't have something I liked about it, but I still prefer to live in the present, and I don't think that will change in twenty, thirty, fifty years.
For instance, computers have been designing new computers for a long time now. Considering how many transistors modern chips have, it's impossible to lay out a new, competitive processor the way our ancestors in the 70s and 80s did, with a mechanical pencil and graph paper. We use generative AI. We've just been doing it for so long that we don't call it that, because that phrase wasn't in common usage until recently. We call it computer-aided design (and a few other things).
Most people have trouble with progress. It's always been a difficult concept to wrap one's head around. Our economy has progressed, our society has progressed, and yes, our technology has progressed and is progressing (these things are all related, they all caused friction, and they all faced, and continue to face, opposition from high-status groups to greater or lesser degrees). The actual form progress takes is always new, but as a phenomenon it's as old as humanity.