It is a misrepresentation of history to claim that people didn't believe in #AI during the "AI winters" of 1974-1980 and 1987-1993.
Even during the long quiet period for artificial neural networks before AlexNet in 2012, people working in software and machine learning generally understood that neural networks were the way forward; it was simply difficult to tune shallow networks to perform well across different domains. Making them work well took a lot of trial and error and deep experience. There were numerous successes, but they didn't translate into wider adoption because the appetite for paying for that trial and error was limited.
Similarly, practically no one claimed there were no exoplanets before the first ones were found in the early 1990s. We knew they were there, and we knew we had the technology to discover them, but there simply wasn't enough capital around to focus specifically on finding them.
When the first exoplanets were found, research funding strangely behaved as if the discovery had changed pre-existing beliefs. Suddenly there was money for missions such as Kepler to scale up the search.
Whose beliefs changed? Why did this have an effect on funding priorities at all?
We see the same pattern in deep learning, generative AI and large language models. Some findings were genuinely surprising, but capital markets are behaving as if we hadn't known these capabilities were there to dig up, even though we did. Now there is capital to do the research properly once again.
The same applies to extraterrestrial life. Pretty much everyone knows there has to be non-Earth-based life out there, even in our own galaxy, and possibly even in our solar system (e.g. microbes in the subsurface oceans of icy moons). Yet we somehow behave as if we didn't know that.
Somehow our society goes through these hype cycles and winters not because of discovery and disillusionment, but because we live in two realities: what we know to be true, and what capital thinks is true.
It seems to me this is largely because the people who know aren't generally the people who decide where to allocate funds.
And at this very moment, the relevant capital is managed by people who act as if #AGI and the #TechnologicalSingularity won't happen.
Maybe we could improve this somewhat?