Large Language Model Performance Doubles Every 7 Months

https://lemmy.eco.br/post/14624591


By 2030, AI will greatly outperform humans in some complex intellectual tasks. Discover how LLMs are doubling their capabilities every seven months.

This is like measuring the increasing speeds of cars in the early years and extrapolating that they would be supersonic by now by ignoring the exponential impact that air resistance has.
Very good analogy. They’re also ignoring that getting faster and faster at reaching a 50% success rate (a totally unacceptable success rate for meaningful tasks) doesn’t imply ever achieving consistently acceptable success.
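One way to see why a 50% success rate is unusable for meaningful work: real tasks decompose into steps, and per-step success compounds multiplicatively. A minimal sketch, assuming (unrealistically) that steps succeed independently:

```python
# Sketch: how a per-step success rate compounds across a multi-step task.
# Assumes independent steps, which is a simplification.
def task_success_rate(per_step: float, n_steps: int) -> float:
    """Probability that all n_steps succeed."""
    return per_step ** n_steps

# Even 90% per step decays quickly over ten steps; 50% is hopeless.
print(task_success_rate(0.9, 10))  # ~0.35
print(task_success_rate(0.5, 10))  # ~0.001
```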
Air resistance has a cubic, not exponential, impact.
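For concreteness: drag force grows with the square of speed, so the power needed to overcome it grows with the cube. A rough sketch with assumed, car-like numbers (`RHO` and `CD_A` are illustrative values, not measurements):

```python
# Sketch: power against aerodynamic drag scales with v^3
# (P = F_drag * v, where F_drag = 0.5 * rho * Cd*A * v^2).
RHO = 1.225   # air density at sea level, kg/m^3
CD_A = 0.7    # drag coefficient * frontal area, m^2 (rough car-like guess)

def drag_power_watts(v_mps: float) -> float:
    """Power (W) needed to push against drag at speed v (m/s)."""
    force = 0.5 * RHO * CD_A * v_mps ** 2
    return force * v_mps

# Doubling speed costs 8x the power:
print(drag_power_watts(60.0) / drag_power_watts(30.0))  # 8.0
```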

My son has doubled in size every month for the last few months. At this rate he’ll be fifty foot tall by the time he’s seven years old.
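The joke works because unchecked doubling gets absurd fast. A quick sketch with hypothetical numbers (a 20-inch newborn, 84 months to age seven):

```python
# Sketch: naively extrapolating "doubles every month" to age seven.
# The 20-inch starting length is an assumed, illustrative figure.
start_inches = 20
months_to_age_seven = 7 * 12  # 84 doublings
height_inches = start_inches * 2 ** months_to_age_seven
print(height_inches / 12 / 5280)  # height in miles: astronomically large
```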

Yeah, it’s a stupid claim to make on the face of it. It also ignores practical realities. The first of those is training data, and the second is context windows. The idea that AI will successfully write a novel or code a large-scale piece of software like a video game would require it to hold that entire thing in its context window at once. Context windows are strongly tied to hardware usage, so scaling them to the point where they’re big enough for an entire novel may never be feasible (at least from a cost/benefit perspective).
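On the context-window point: in a standard transformer, self-attention scores every token against every other token, so the attention matrix alone grows quadratically with context length. A sketch with illustrative token counts (the 8k and 200k figures are assumptions for the sake of the comparison):

```python
# Sketch: quadratic cost of standard transformer self-attention.
# Each token attends to every token, so one attention matrix has n^2 entries.
def attention_matrix_entries(context_tokens: int) -> int:
    return context_tokens ** 2

# An 8k window vs. a (roughly) novel-sized 200k window:
print(attention_matrix_entries(8_000))    # 64,000,000
print(attention_matrix_entries(200_000))  # 40,000,000,000 -- 625x larger
```

A 25x longer context costs 625x more in this one matrix, which is the core of the cost/benefit problem (sparse and linear-attention variants exist precisely to attack this).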

I think there’s also the issue of how you define “success” for the purpose of a study like this. The article claims that AI may one day write a novel, but how do you define “successfully” writing a novel? Is the goal here that one day we’ll have a machine that can produce algorithmically mediocre works of art? What’s the value in that?

Or like looking at the early days of semiconductors and extrapolating that CPU speed will double every 18 months …smh these people
Since CPU speeds are still doubling every 18 months, you have a solid point!

Yup, that’s what I was alluding to. While it may no longer hold for transistors, that trend did manage to run for 50-odd years. Push the trend line in the figure out 50 years, heh (not saying you should; 5 seems much more conservative).
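For scale: an 18-month doubling sustained for 50 years is about 33 doublings, roughly a ten-billion-fold increase. A quick sketch:

```python
# Sketch: what an 18-month doubling trend implies over 50 years.
months = 50 * 12            # 600 months
doublings = months / 18     # ~33.3 doublings
growth = 2 ** doublings
print(doublings)  # ~33.3
print(growth)     # ~1e10
```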

Take a look at Nvidia’s pace with respect to Moore’s law (of FLOPS): netrouting.com/nvidia-surpassing-moores-law-gpu-i…
