The air begins to leak out of the overinflated AI bubble

https://lemy.nl/post/997100

I don’t think AI is ever going to completely disappear, but I think we’ve hit the barrier of usefulness for now.

Thank fucking god.

I got sick of the overhyped tech bros pumping AI into everything with no understanding of it…

But then I got way more sick of everyone else thinking they’re clowning on AI when in reality they’re just demonstrating an equally large misunderstanding of the technology, in a snarky, pessimistic format.

As I job-hunt, every job listed over the past year has been “AI-driven [something]” and I’m really hoping that trend subsides.
“This is a mid-level position requiring at least 7 years of experience developing LLMs.” -Every software engineer job out there.
That was cloud 7 years ago and blockchain 4 years ago.
Yeah, I’m a data engineer and I get that there’s a lot of potential in analytics with AI, but you don’t need to hire a data engineer with LLM experience for aggregating payroll data.

there’s a lot of potential in analytics with AI

I’d argue there is a lot of potential in any domain with basic numeracy. In pretty much any business or institution somebody with a spreadsheet might help a lot. That doesn’t necessarily require any Big Data or AI though.

Reminds me of when I read about a programmer getting turned down for a job because they didn’t have 5 years of experience with a language that they themselves had created 1 to 2 years prior.

I’m more annoyed that Nvidia is looked at like some sort of brilliant strategist. It’s a GPU company that was lucky enough to be around when two new massive industries found an alternative use for graphics hardware.

They happened to be making pick axes in California right before some prospectors found gold.

And they don’t even really make pick axes, TSMC does. They just design them.

They didn’t just “happen to be around”. They created the entire ecosystem around machine learning while AMD just twiddled their thumbs. There is a reason why no one is buying AMD cards to run AI workloads.
One of the reasons being Nvidia forcing unethical vendor lock-in through their licensing.

I feel like for a long time, CUDA was a laser looking for a problem.
It’s just that the current (AI) problem might solve expensive employment issues.
It’s just that C-Suite/managers are pointing that laser at the creatives instead of the jobs whose task it is to accumulate easily digestible facts and produce a set of instructions. You know, like C-Suites and middle/upper managers do.
And NVidia have pushed CUDA so hard.

AMD has ROCm, an open-source CUDA equivalent for AMD cards.
But it’s kinda like Linux vs. Windows: Nvidia’s CUDA is just so damn prevalent.
I guess it was first. CUDA also has wider compatibility across Nvidia’s cards than ROCm has across AMD’s.
The only way AMD can win is to show a performance boost at lower power with cheaper hardware. So many people are entrenched in Nvidia that the cost of switching to ROCm/AMD is a huge gamble.
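
For anyone who hasn’t touched either: at the source level the two really are near-identical, which is why the lock-in is more about ecosystem and driver maturity than syntax. A toy CUDA vector add as a sketch (illustrative only; the comments note the ROCm/HIP renames that a tool like hipify-perl does mechanically):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One GPU thread per element. The same kernel compiles unchanged under
// HIP; only the runtime calls below get renamed (cuda* -> hip*).
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    // HIP equivalents: hipMallocManaged / hipDeviceSynchronize / hipFree.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // The <<<grid, block>>> launch syntax is supported by HIP as well.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %.1f (expect 3.0)\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The gamble isn’t rewriting kernels like this; it’s betting that every library, profiler, and prebuilt package you depend on works as well on ROCm as it does on CUDA.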

Go ahead and design a better pickaxe than them, we’ll wait…
“He didn’t earn his wealth. He just won the lottery.” “If it’s so easy, YOU go ahead and win the lottery then.”

My fucking god.

“Buying a lottery ticket, and designing the best GPUs” totally the same thing, amiriteguys???

In the sense that it’s a matter of being in the right place at the right time, yes. Exactly the same thing.

The fact you don’t see it for whatever reason doesn’t make it wrong.

they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.

This is the part that’s flawed. They have actively targeted neural network applications with hardware and driver support since 2012.

Yes, they got lucky in that generative AI turned out to be massively popular, and required massively parallel computing capabilities, but luck is one part opportunity and one part preparedness. The reason they were able to capitalize is because they had the best graphics cards on the market and then specifically targeted AI applications.
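
To make “massively parallel” concrete: the workhorse op in a neural network is a matrix multiply, and a GPU can assign one thread per output element, the same shape of problem as shading pixels. A naive toy version (just a sketch of the mapping; real stacks use cuBLAS/cuDNN and tensor cores):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// C = A * B for n x n matrices, one thread per output element.
// A dense NN layer's forward pass has exactly this structure.
__global__ void matmul(const float* A, const float* B, float* C, int n) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float acc = 0.0f;
        for (int k = 0; k < n; ++k)
            acc += A[row * n + k] * B[k * n + col];
        C[row * n + col] = acc;
    }
}

int main() {
    const int n = 256;
    float *A, *B, *C;
    cudaMallocManaged(&A, n * n * sizeof(float));
    cudaMallocManaged(&B, n * n * sizeof(float));
    cudaMallocManaged(&C, n * n * sizeof(float));
    for (int i = 0; i < n * n; ++i) { A[i] = 1.0f; B[i] = 0.5f; }

    dim3 block(16, 16);
    dim3 grid((n + 15) / 16, (n + 15) / 16);
    matmul<<<grid, block>>>(A, B, C, n);  // 65,536 threads in flight
    cudaDeviceSynchronize();

    printf("C[0] = %.1f (expect %.1f)\n", C[0], n * 0.5f);
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}
```

A CPU walks those output elements a handful at a time; a GPU schedules them all at once, which is the whole trick.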

You do realize Nvidia is a lot older than 2012 right?

Like sure they saw it coming, but if they had been doing CPUs instead of GPUs, they wouldn’t have been able to capitalize on it to nearly the same degree. See Intel trying just that.

They HAPPENED to find better uses for their GPUs, that was never a given.

His engineers built it, he didn’t do anything there

They just design them.

It’s not trivial though. They also managed to lock developers in with CUDA.

That being said I don’t think they were “just” lucky, I think they built their luck through practices the DoJ is currently investigating for potential abuse of monopoly.

Yeah, CUDA made a lot of this possible.

Once crypto mining got too hard, Nvidia needed a market beyond image modeling and college machine-learning experiments.

Imo we should give credit where credit is due, and I agree: not a genius. Still, my pick is a 4080 for a new gaming computer.
The tech bros had to find an excuse to use all the GPUs they got for crypto after they bled that dry
If that’s the reason, I wouldn’t even be mad, that’s recycling right there.

The tech bros had to find an excuse to use all the GPUs they got for crypto after they ~~bled that dry~~ switched to proof-of-stake.

I’m not a fan of BTC but $50,000+ doesn’t seem very dry to me.

No, it’s when people realized it’s a scam

Well, they also kept telling investors that all they needed to simulate a human brain was to simulate the number of neurons in a human brain…

The stupidly rich loved that, because they want computer backups for “immortality”. And they’d dump billions of dollars into making that happen

About two months ago, though, we found out that the brain uses microtubules to put tryptophan into superposition, and it can maintain that for a crazy amount of time, longer than we can manage in a lab.

The only argument against a quantum component of human consciousness was that people thought there was no way to get even regular quantum entanglement in a human brain.

We’ll be lucky to be able to simulate that stuff in 50 years, but it’s probably going to be even longer.

Every billionaire who wanted to “live forever” this way, just got aged out. So they’ll throw their money somewhere else now.

I used to follow the Penrose stuff and was pretty excited about QM as an explanation of consciousness. If this is the kind of work they’re reaching for, though, it’s pretty sad. It’s not even anything. Sometimes you need to go with your gut, and my gut is telling me that if this is all the QM people have, consciousness is probably best explained by complexity.

ask.metafilter.com/…/Is-this-paper-on-quantum-pro…

Completely off topic from AI, but it got me curious about quantum effects in the brain, and I found this discussion. Either way, AI still sucks shit and is just a shortcut for stealing.

Is this paper on quantum properties of the brain bad science or not?

I have seen a lot of people interested in the idea of quantum consciousness making a big deal of this paper: Ultraviolet Superradiance from Mega-Networks of Tryptophan in Biological Architectures....

That’s a social media comment from some Ask Yahoo knockoff…

Like, this isn’t something no one is talking about, you don’t have to solely learn about that from unpopular social media sites (including my comment).

I don’t usually like linking videos, but I’m feeling like that might work better here

www.youtube.com/watch?v=xa2Kpkksf3k

But that PBS video gives a really good background and then talks about the recent discovery.

Was Penrose Right? NEW EVIDENCE For Quantum Effects In The Brain

some Ask Yahoo knockoff…

AskMeFi predated Yahoo Answers by several years (and is several orders of magnitude better than it ever was).

And that linked account’s last comment was advocating for Biden to stage a pre-emptive coup before this election…

www.metafilter.com/activity/306302/…/mefi/

It doesn’t matter if it was created before Ask Yahoo or if it’s older.

It’s random people making random social media comments; sometimes stupid people make the rare comment that sounds like they know what they’re talking about. And I already agreed no one had to take my word on it either.

But that PBS video does a really fucking good job explaining it.

Cuz if I can’t explain to you why a random social media comment isn’t a good source, I’m sure as shit not going to be able to explain anything like Penrose’s theory on consciousness to you.

Activity from The Manwich Horror | MetaFilter

It doesn’t matter if it was created before Ask Yahoo or if it’s older.

It does if you’re calling it a “knockoff” of a lower-quality site that was created years later, which was what I was responding to.

Great.

So the social media site is older than I thought, and the person who made the comment on that site is a lot stupider than it seemed.

Like, Facebook’s been around for about 20 years. Would you take a link to a Facebook comment over PBS?

My man, I said nothing about the science or the validity of that comment, just that it’s wrong to call Ask MetaFilter “some Ask Yahoo knockoff”. If you want to get het up about an argument I never made, you do you.
A.I., Assumed Intelligence
More like PISS, a Plagiarized Information Synthesis System

I’ve noticed people have been talking less and less about AI lately, particularly online and in the media, and absolutely nobody has been talking about it in real life.

The novelty has well and truly worn off, and most people are sick of hearing about it.

The hype is still percolating, at least among the people I work with and at the companies of people I know. Microsoft pushing Copilot everywhere makes it inescapable to some extent in many environments, and there are people out there who have somehow only vaguely heard of ChatGPT, are now encountering LLMs for the first time at work, and are starting the hype cycle fresh.
It’s like 3D TVs, for a lot of consumer applications tbh

Oh fuck that’s right, that was a thing.

Goddamn

3D has been a thing every 15 years or so

3D TVs were a commercial fad once and I haven’t seen them in forever.

2016 may have been the end of them

en.m.wikipedia.org/wiki/3D_television

livingetc.com/…/can-you-still-buy-3d-tvs-explaini…

Yes but 3D is always a thing periodically.

I used shutter glasses with two voodoo2 cards…

I used shutter glasses with the Sega Master System back in ’87. They were rad af.

segadoes.com/2015/02/23/sega-3d-glasses/

Sega 3D Glasses (Master System, 1987) - Sega Does

Not even Sega's bold, but uncomfortable 3D Glasses could make the American public embrace the Master System.

Yeah, now we are gonna get the reality of deep fakes; fun times.
Wall Street has already milked “the pump”; now they short it and put out articles like this.
Welp, it was ‘fun’ while it lasted. Time for everyone to adjust their expectations to much more humble levels than was promised and move on to the next scheme. After Metaverse, NFTs and ‘Don’t become a programmer, AI will steal your job literally next week!11’, I’m eager to see what they come up with next. And with eager I mean I’m tired. I’m really tired and hope the economy just takes a damn break from breaking things.

But if it doesn’t disrupt it isn’t worth it!

/s

I just hope I can buy a graphics card without having to sell organs some time in the next two years.
My RX 580 has been working just fine since I bought it used. I’ve not been able to justify buying a new (used) one.
Don’t count on it. It turns out that the sort of stuff graphics cards do is good for lots of things. It was crypto, then AI, and I’m sure whatever the next fad is will require a GPU to run huge calculations.

I’m sure whatever the next fad is will require a GPU to run huge calculations.

I also bet it will; cf. my earlier comment on rendering farms and looking for what “recycles” old GPUs (lemmy.world/comment/12221218), namely that it makes sense to prepare for it now and look for what comes next based on the current most popular architecture. It might not be the most efficient, but it will probably be the most economical.

AI is shit, but imo we have been making amazing progress in computing power; it’s just that we can’t really innovate atm, just more race to the bottom.

I thought capitalism bred innovation, did the tech bros lie?

/s

I’d love an upgrade for my 2080 Ti; really wish Nvidia hadn’t pissed EVGA off into leaving the GPU business…
If there’s even a GPU being sold. It’s much more profitable for Nvidia to just make compute-focused chips than to upgrade their gaming lineup. GeForce will just get the compute-chip rejects and laptop GPUs for the lower-end parts. After the AI bubble bursts, maybe they’ll get back to their gaming roots.

move on to the next […] eager to see what they come up with next.

That’s a point I’m making in a lot of conversations lately: IMHO the bubble didn’t pop BECAUSE capital doesn’t know where to go next. Despite reports from big banks that there is a LOT of investment for not a lot of actual returns, people are still waiting to see where to put that money next. Until there is such a place, they believe it’s still more beneficial to keep the bet going.

AI doesn’t need to steal all programmer jobs next week, but I very much doubt there will still be many available in 2044, when even just LLMs have so many things they can improve on over the next 20 years.

I’m just praying people will fucking quit it with the worries that we’re about to get SKYNET or HAL when binary computing would inherently be incapable of recreating the fast pattern recognition required to replicate or outpace human intelligence.

Moore’s law is about computing power, which is a measure of hardware performance, not of the software you can run on it.

Unfortunately it’s part of the marketing. Thanks, OpenAI, for the “Oh no… we can’t share GPT-2, too dangerous” and then… here it is. Definitely interesting at the time, but not world-shattering. Same for GPT-3… but through an exclusive partnership with Microsoft, all closed; rinse and repeat for GPT-4. It’s a scare tactic to lock down what was initially open, both directly and by closing the door behind them through regulation, at least trying to.