PC processors entered the gigahertz era in 2000 with AMD's Athlon: AMD hit marketing gold with its 1 GHz Athlon and beat Intel by a nose

Consumer PCs have long since abandoned the multi-GHz race in favor of core count and NPU inflation.

Tom's Hardware
Isn't it that at some point more GHz just aren't useful anymore, or rather not physically possible? I think they abandoned the race for a good reason.
I think there are two parts to this. There are factors beyond clock rate; clock rate alone doesn't give a full picture. Going from, say, 166 mHz to 1 gHz brings radical performance improvements without too many drawbacks, but once you go above 3-4 gHz, each marginal increase in clock rate becomes increasingly expensive in terms of heat management.
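The heat cost can be made concrete with the classic dynamic-power relation P ≈ C·V²·f: since the supply voltage typically has to rise roughly with frequency, power ends up scaling closer to f³. A back-of-envelope sketch (the cubic scaling is a simplification; real voltage/frequency curves differ per chip):

```python
# Rough sketch of why clock-rate increases get expensive:
# dynamic power ~ C * V^2 * f, and V must rise roughly with f,
# so power scales roughly as f^3 (a simplification of real DVFS behavior).

def relative_power(f_ghz, f_base_ghz=1.0):
    """Power at f relative to power at f_base, assuming V scales linearly with f."""
    return (f_ghz / f_base_ghz) ** 3

for f in (1.0, 2.0, 3.0, 4.0, 5.0):
    print(f"{f:.0f} GHz -> ~{relative_power(f):.0f}x the power of 1 GHz")
```

Under this (crude) model, doubling the clock costs roughly 8x the power, which is why nobody ships a 10 GHz desktop chip.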

Watch out for your prefixes, 166 mHz would be one operation every 6 seconds.
I don’t think there ever has been a CPU that slow ;-)

(a lowercase "g" doesn't exist as an SI prefix, though it could be misread as the unit gram-hertz)
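The prefix arithmetic is easy to check; a tiny sketch taking the typo literally:

```python
# SI prefixes: m = milli = 1e-3, M = mega = 1e6.
MILLI, MEGA = 1e-3, 1e6

period_166_mHz = 1 / (166 * MILLI)   # "166 mHz" taken literally
period_166_MHz = 1 / (166 * MEGA)    # what was presumably meant

print(f"166 mHz -> one cycle every {period_166_mHz:.2f} s")        # ~6 s
print(f"166 MHz -> one cycle every {period_166_MHz * 1e9:.2f} ns") # ~6 ns
```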

I once made an incredibly limited CPU on paper; I basically had the whole CPU in logic gates on a single sheet. I tried to run the most basic program on it by hand, and I can assure you that it was much slower than 166 mHz XD

You get rate-limited by cache. The literal physical distance between the caches (the tiny RAMs in the processor, e.g. L3) and the execution units can't be zero, so those signals must travel that distance at the speed of signal propagation. Having multiple cores allows tasks to run simultaneously, effectively multiplying processing speed for workloads that parallelize.
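The distance limit is easy to quantify: even at the speed of light in vacuum (on-chip signals are considerably slower), a signal can only cover a few centimetres per clock cycle. A quick back-of-envelope sketch:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def distance_per_cycle_cm(f_ghz):
    """Upper bound on how far a signal can travel in one clock period."""
    period_s = 1 / (f_ghz * 1e9)
    return C * period_s * 100  # metres -> cm

for f in (1, 3, 5):
    print(f"{f} GHz: at most {distance_per_cycle_cm(f):.1f} cm per cycle")
```

At 3 GHz that's ~10 cm per cycle as an absolute ceiling, and a cache access needs a round trip plus lookup time, which is why caches sit millimetres from the cores.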

But more single-core speed is particularly useful for badly parallelized or legacy (single-threaded) software. SolidWorks is a good example.
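Amdahl's law makes this concrete: if most of a program's work is serial, extra cores barely help, and only a faster clock (or better IPC) moves the needle. A small sketch (the 20% parallel fraction is an illustrative assumption, not a SolidWorks measurement):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Ideal speedup from n_cores when only parallel_fraction of the work scales."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / n_cores)

# A mostly serial workload (here assumed 20% parallelizable)
# barely benefits from piling on cores:
print(f"{amdahl_speedup(0.2, 16):.2f}x with 16 cores")  # ~1.23x
```

So for that hypothetical 80%-serial workload, sixteen cores buy you less than a 25% speedup, while a 25% faster clock would beat them outright.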