#DearMastomind: What is it that actually makes system performance under heavy Web use slow?

Is it just memory exhaustion? Disk swapping? CPU / cores?

(Omitting any consideration of network / bandwidth here. Just looking at slow-as-fuck desktop/laptop systems.)

This with an eye to speccing out some new kit. Guides to what to look for / avoid / what's unnecessary expense would be handy.

(Main driver is an iMac 17,1, Intel Core i5, 8 GB RAM & Fusion drive, which I manage to pig out routinely.)

Thinking 16--32 GB RAM may be a minimum. Principally driving Firefox on Linux. Known high-water mark is 1750+ tabs. Yes, I know I have a problem, thank you for caring.

I've written Chrome AND Chromium, as well as anything based on them, out of my life.

Other loads are typically far smaller, though there may be some compiles, occasional large datasets (postgresql, sqlite, R, Python), and document compiles (LaTeX, pandoc), light audio/image edits. Mostly I live in bash / vim / mutt if at all possible.

#ComputerHardware #SystemPerformance #Firefox #Webbrowsing

@dredmorbius IME it's memory leaks and runaway JavaScript. I've managed to have Firefox exhaust 32GB of RAM all by itself once or twice with a far saner number of tabs open.

I'd go for lots of RAM and a CPU with high per-core performance.

@bthylafh The problem with leaks is that feeding them merely fuels the fire. They'll eat all you give 'em.

I've been a strong fan of limiting browser resources per tab (or per site) to something exceedingly limited.

As it is, I limit JS sharply, generally through uMatrix, disabled by default. Of course, it's hard to ascribe specific performance issues to specific sites.
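For anyone wanting to replicate that setup, here's a sketch of what a default-deny ruleset looks like in uMatrix's "My rules" pane. (The hostname is a placeholder, and the stock rules shipped with uMatrix vary by version; rule format is `source destination request-type action`.)

```
* * * block
* * css allow
* * image allow
* 1st-party * allow
example.com example.com script allow
```

The last line is the per-site escape hatch: JS stays off everywhere until you explicitly allow it for a site you trust.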

Why a site must remain fully in memory and consuming CPU cycles when not focused is ... a puzzler. There are a very few application sites that really must do this, but generally, no.

(Mastodon might be one of those.)

I'm also somewhat inclined to abandon Web (or Electron) apps in favour of native where possible. And of graphical Web for text (I do use w3m heavily, where possible).

@dredmorbius yes, it's memory exhaustion. 64-bit firefox likes to start up 26GB WebContent processes, even if you only have 2GB physical RAM. processor speed is barely even a factor any more, i suspect - put it this way, my main machine is a core 2 duo with 4GB RAM and it's the RAM i feel the pinch on

@dredmorbius Educated guess based on the code I've read: Make sure your CPU has a sizable cache. And, as you noted, plenty of RAM.

I do get the impression browsers are memory bound.

@alcinnz What counts as "sizeable cache" these days?

There's also the whole Intel (Spectre / Meltdown / Rowhammer class bugs) vs. AMD question. Or ARM devices (many still 32 bit from what I'm seeing).

@dredmorbius I'm sorry, I don't know hardware well. My knowledge is extremely basic there.
@dredmorbius as much ram as you can get away with. 64+gigs, why not?

@dredmorbius i'm not sure how firefox stores data but it's probably not that different in the end from chrome, where things like the windows & tabs one has open are stored in sqlite, and i think read/written from/to by a whole bunch of different threads, with some data-shuffling/updating happening on top. neither chrome nor the os is good about keeping forward progress from getting stuck waiting on some io to complete.

there's a lot of work on remedies like write-back throttling, which try to keep systems from stalling while waiting for io.

the problem compounds if one is also swapping tabs in & out, running low on memory, & thus also hitting disc to swap. so more ram helps. but it's not enough. a lot of that slowness is in io work.
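A quick way to tell the two apart on Linux, as a sketch using standard `/proc` counters (nothing browser-specific):

```shell
# SwapFree well below SwapTotal means you're already dipping into swap;
# large Dirty/Writeback values mean the kernel is backed up on disc io
grep -E '^(SwapTotal|SwapFree|Dirty|Writeback):' /proc/meminfo
```

Watching `vmstat 1` alongside this helps too: the `wa` column is time the CPU sat idle waiting on io, while `si`/`so` are pages swapped in/out per second.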

@dredmorbius
another main culprit is that most os'es let processes/threads bounce around between different cores, and that lack of core affinity & all those cold caches really inflate response times.

telling the browser to only use 2-4 cores & moving other processes on the system to other cores can sometimes really help responsiveness. alas. keep the os from turning things into an unnecessary/pointless eternal juggling act.
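On Linux that pinning can be done with `taskset` from util-linux; a minimal sketch, assuming a machine with at least four cores and a browser process named `firefox`:

```shell
# start the browser pinned to cores 0-3 (child processes inherit the mask)
taskset -c 0-3 firefox &

# or retrofit an already-running instance, threads included (-a):
for pid in $(pgrep firefox); do
    taskset -a -c -p 0-3 "$pid"
done
```

`cgroups`/`systemd-run` can do the same more permanently, but `taskset` is the quickest way to test whether affinity actually helps your workload.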

@dredmorbius

I've seen servers reduced to a crawl a few times. Notably when aaronsw and a few other folks induced me to put Dr. B's piano key sound samples up at OLPC and we did 5.6 TB of traffic that morning until my boss came over to my desk and went "HENRY!" and walked away. lolol.

When traffic overwhelms the 1TB pipe, that's true slowness.

Miss you dude [Aaron].

@dredmorbius

The servers I built (in 2009) for OLPC were 24 core Xeon boxen w/ 192 GB of ECC memory; pretty sure they would have handled much higher traffic. The problem then becomes hard interrupts from the ethernet controller I believe.

@dredmorbius

It didn't kill the pipe, it slowed the machine I/O to a crawl with massive amounts of small files being downloaded, each way of striking a key at each volume, for the whole piano, high quality samples.

All the other connections were slow-ish [for MIT] but fine.

@dredmorbius
My guess: disk swapping which comes from memory exhaustion.
When your system starts to use swap space, every task means doing 3 things instead of 1.
And if 'pig out' means your drive is full, you're making that bottleneck much worse.
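Both halves of that guess are worth checking before buying anything, using plain coreutils/util-linux:

```shell
# is the drive nearly full?  (a nearly-full SSD/Fusion drive slows down badly)
df -h /

# is anything actually sitting on swap right now, and on which device?
# (prints nothing if no swap is in use)
swapon --show
```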

My tips:
1) get at least 32GB RAM (I have 64 as I plan to use it for 5+ more years).
2) get an NVMe drive, preferably for all your data, but even a 64GB+ one for just your OS should help as well.
3) get an AMD Ryzen CPU; more cores for less money.

@dredmorbius
As Linux only falls back to swap as a last resort, memory is likely to be the most important factor in processing speed, so faster memory and more of it.
More CPU cores give you more threads, and an SSD over a HDD might help initial loading times.
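How eagerly Linux swaps is itself tunable; a sketch of the knob involved, assuming a stock kernel (default value is 60):

```shell
# 0-100 (0-200 on recent kernels): lower = keep pages in RAM longer,
# higher = push them out to swap more eagerly
cat /proc/sys/vm/swappiness

# make a RAM-heavy desktop cling to RAM (needs root, not persistent):
# sysctl vm.swappiness=10
```
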
@dredmorbius Firefox needs 8 GB RAM or more to itself. With current prices, I see no reason to settle for less than 32 GB total. Get an AMD Ryzen CPU, fastest you're willing to pay for. Somewhat fewer cores with higher clock is probably preferable for web browsing, more cores will win for large build jobs that can take advantage of them.
@mansr @dredmorbius If you're a tab person like me then Firefox will frequently exhaust 16GB easily and happily chew up any swap space.
I've gotten a bit better at doing manual GC on my tabs, but it's still not low enough to keep browsers happy.
@tbr @mansr @dredmorbius Add to this that firefox tends to crash more quickly the less RAM you have. No, swap doesn't help. And no, I don't know why, but less RAM = less time between crashes.

@attilakinali At 8 GB, those crashes are fairly rare, and even at 1500+ tabs (though def. not all loaded), performance is ... not unreasonable.

I seriously need to engage on a major purge though.

@tbr @mansr

@dredmorbius @tbr @mansr At 8GB MTBC is around 3h for me. At 32GB it's around 3-4 days. And at 64GB it's around 10 days.

While performance is otherwise ok, I consider crashes this often a ... nuisance, to put it politely. When I once complained to an ff dev at FOSDEM about ff crashing so often, he was seriously surprised that I wasn't restarting ff daily like everyone else.

@attilakinali @dredmorbius @mansr
One thing that kinda helps me is to open about:memory and do "minimize memory usage". Need to do it before it gets too slow though.
Basically it just forces several GC runs. So it doesn't help for long.