I'm still constantly baffled by just how absolutely beyond shit modern computers are
@OpenComputeDesign x86 was a mistake? :)
@OpenComputeDesign
16 bit was a mistake?
transistors were a mistake?
how modern are we talking? :)

@kabel42 @OpenComputeDesign

16-bit/early-32-bit was my favorite era. (Basically, the #68k era ;)

Computers were just becoming capable, but not too big for their britches.

@rl_dane @kabel42

Yeah, tbh, we really should have stopped at 32-bit

@OpenComputeDesign @rl_dane I had a good time with my first amd athlon 64 but sure, simpler times :)

@kabel42 @OpenComputeDesign

I think computers were honestly better when they were limited to absolutely no more than 1GB RAM, no more than 256 colors, and no more than 1024x768 screen resolution.

1GB RAM: no LLMs
256 colors: no horrid low-contrast soupy interfaces
XGA Resolution: no horrid empty spaces and bloated interfaces

I keep wanting to make that as an OS 😄

(If only I had the skillz)

@rl_dane @OpenComputeDesign
256 Colours is very limited, but I'd like to see what software would be like if hardware stopped at 1G RAM and maybe 16bit colour :)
And 2 cores, I don't miss not being able to use the computer while something is compiling

@kabel42 @OpenComputeDesign

> 256 Colours is very limited, but i'd like to see what software would be like if hardware stopped at 1G RAM and maybe 16bit colour :)

16bit color still has the problem of allowing for crappy low-contrast interfaces.
When using palette color, the interface itself must be designed to use as few colors as possible to leave more room for displaying images.

Also, with good dithering at XGA resolutions, depending on the image, it's really hard to tell 8-bit from truecolor

Source: used a computer that was limited to 8-bit color at XGA resolution for many years ;)

Actually, I kinda want to make a challenge on that. I wonder if I can come up with some test images for that. :D
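For the curious, the error-diffusion trick behind that kind of dithering can be sketched in a few lines. This is a toy 1-D diffusion over a grayscale ramp, and `dither_row` is just an illustrative helper, not any real tool:

```python
# Toy 1-D error-diffusion dither: quantize 0-255 grayscale values down to
# a handful of levels, pushing each pixel's quantization error onto its
# right-hand neighbour so runs of pixels average out to the original shade.

def dither_row(row, levels=4):
    """Quantize a row of 0-255 values to `levels` evenly spaced values,
    diffusing each pixel's error into the next pixel."""
    step = 255 / (levels - 1)
    out = []
    error = 0.0
    for v in row:
        target = v + error                       # original value + carried error
        q = round(max(0, min(255, target)) / step) * step
        out.append(int(q))
        error = target - q                       # carry the residue forward
    return out

ramp = list(range(0, 256, 16))                   # a smooth 16-step ramp
print(dither_row(ramp))                          # only values 0, 85, 170, 255
```

Each output pixel snaps to one of 4 levels, but the carried error means the local average still tracks the ramp, which is why dithered 8-bit can pass for truecolor at high resolution. Real Floyd-Steinberg does the same thing in 2-D with weighted neighbours.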

> And 2 cores, I don't miss not being able to use the computer while something is compiling

If you think you can't use your computer while it's compiling on only one core, then modern kernel schedulers are an abject failure.

I did all kinds of things on my computer while it was crunching away at stuff on Linux circa 2000, and it was more stable than today. :/

@rl_dane

Half agree but, realistically, a scheduler will only get you so far.

@kabel42 @OpenComputeDesign

@pixx @rl_dane @kabel42

I mean, surely they can get us farther than they do. Modern schedulers _SUCK_

@OpenComputeDesign

It's kinda funny that maxing cpu with audio playing results in stutter for me on linux but not on plan9

But I'm not running a datacenter so who cares what i want 😂

@rl_dane @kabel42

@pixx @OpenComputeDesign @rl_dane and schedulers have become more optimized for interactive use in the last decades :)

@kabel42
So they claim. They're just not good.

@OpenComputeDesign @rl_dane

@pixx @OpenComputeDesign @rl_dane
At least they're no longer optimizing only for compiling the Linux kernel... that much is an improvement

@kabel42 @pixx @OpenComputeDesign

Really? Even better than Con Kolivas' old schedulers (the ones that inspired the Completely Fair Scheduler)?

Because I'd rather have a scheduler that was buttery smooth than one that delivered the best performance for gaming or whatever.

@kabel42 @pixx @rl_dane

Hard disagree. (as I was starting this reply, my brother asked why I paused the game we were playing, and I said because I had to argue about schedulers. "OH MY FUCKING GOD NO ONE CAN POSSIBLY CLAIM SCHEDULERS ARE GETTING BETTER" -my brother)

Modern schedulers went from slowing down a bit sometimes, to just hard locking for seconds to even minutes on end.

@OpenComputeDesign @pixx @rl_dane
In Linux? Not in my experience.
Install a 2.6 kernel and tell me that is better for interactive use.

@kabel42 @pixx @rl_dane

I guess we have very different experience. Even my mom is constantly complaining about how much worse her Linux experience has gotten with every single upgrade, and it's almost always scheduling issues, or memory management issues

@OpenComputeDesign @kabel42 @pixx

How did you arrive at that conclusion, though?

I mean, back in 2019, I was convinced that Linux stability went to absolute crap until I realized I was likely suffering from a bad SSD.

We can say that the kernel shouldn't be able to get locked up in IO contention so easily, but it's still an extenuating circumstance.

I think you have to work a little harder to rule out hardware issues and general software brittleness vs. kernel and scheduling, TBH.

@rl_dane @kabel42 @pixx

I guess every single computer I own suffers slightly more hardware failure coincidentally every single time I upgrade software? If true, I feel like that's an even _worse_ sign for modern software :P

@OpenComputeDesign @kabel42 @pixx

When I was (I think) suffering from SSD failures back in 2019-2020, linux would lock up hard for several seconds at a time when running pacman -Syu.

I understand that (what appears to have been) media failures can really gum up the works of a computer, but for the kernel to lock up THAT hard because of bad i/o contention is just not a good sign.

And I even tried the -rt kernel builds. Scarcely any better.

@pixx @OpenComputeDesign @kabel42

> But I'm not running a datacenter so who cares what i want 😂

The sad truth of modern #Linux.

@rl_dane @OpenComputeDesign
Ok, if you say 256 colors, I think of 256-colour VESA Windows 9x :)

Cores:
I couldn't get a video to play more frames than it dropped while updating my system in 2003 :)
You could solve that with hardware accelerated video, but that would be a kind of second core? :)

@kabel42 @OpenComputeDesign

Ok, that's fair, but I wonder if BeOS would've done better at the time.

@rl_dane @kabel42

Maaaaaannnnnnn, really wish BeOS took off :(

@rl_dane @kabel42

Love Haiku. So close to dailying it, ngl

@OpenComputeDesign @kabel42

Nice! I like it a lot, but the ship on mouse-driven interfaces has sailed, for me. I'm tired of them. :P

@kabel42 @rl_dane

I'm a huge fan of hardware accelerator cards. And given that there are people that buy big fancy GPUs just for the codecs, I must not be the only one. Bring back dedicated codec cards!

@OpenComputeDesign @rl_dane I had an mpeg2 decoder in a tv card, but only in my desktop, and I think with a pentium 2 that couldn't decode/render SD tv in software :)

@kabel42 @OpenComputeDesign

Take a look at this sample I pulled from https://unsplash.com/photos/a-boat-travels-on-a-canal-in-front-of-buildings-t8OzFHgBjYk

(Remind me to delete this later to save space XD)

You might want to reply to the parent toot, not this one XD

I also made 16-bit (well, 15.96578428-bit lol) and 15-bit images, but they looked worse than the 8-bit images because there's no way to dither anything more than 256 colors in GIMP.

#rlDaneFindThis #rlDaneFindThisLater #DeleteMe

@rl_dane @OpenComputeDesign I remember 256 colors looking much worse, but that could be modern dithering?

@kabel42 @rl_dane

I adore the dithering aesthetic, ngl

@kabel42 @OpenComputeDesign

I think resolution has a lot to do with it, also the nature of the image.

I just lucked out to randomly pick an image that wasn't super-duper colorful.

8-bit color can obviously handle a monochrome (mono-hue) image perfectly.

When an image mostly uses two of the three primary colors (and their mixtures, say red, blue, magenta, etc.), a 256-entry palette covers 256/65536 of the possible colors (0.39%, or 1:256), which is definitely not great, but still workable.

When an image has a lot of all three primary hues, and is high resolution, the number of colors climbs to a large percentage of the total number of pixels in the image. So assuming an XGA resolution image with each pixel being a unique color, a 256-color palette would give you 0.033% or 1:3072 coverage of the color values, which is getting pretty rough.

A 16x16 pixel, 256-color image would by definition be a true color image, as long as the palette is picked based on the colors in the image. ;)
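The coverage arithmetic above is easy to sanity-check (assuming XGA = 1024x768 and a 256-entry palette):

```python
# Sanity-checking the palette-coverage ratios from the post above.

palette = 256

two_hue_colors = 256 * 256            # two free 8-bit channels
print(palette / two_hue_colors)       # 1/256, i.e. ~0.39%

xga_pixels = 1024 * 768               # worst case: every pixel a unique color
print(palette / xga_pixels)           # 1/3072, i.e. ~0.033%

tiny = 16 * 16                        # a 16x16 image has only 256 pixels...
print(palette / tiny)                 # ...so a 256-entry palette covers it fully
```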

@rl_dane @OpenComputeDesign
Is 256-color VESA indexed? And even if it is, you can't choose all the colors; some are already chosen by other parts of the UI, right?

@kabel42 @OpenComputeDesign

I don't think the VESA mode works any differently from any other 8-bit mode, but I don't know specifically.

The way it generally worked was that the OS tried to make the various apps displaying truecolor images share the color palette, but the foreground process got priority.

IIRC, a part of the palette would be mostly static to always have a base of safe colors to choose from, but the rest was selected based on what was onscreen.

But I don't know if the OS had a truecolor buffer, or if the applications had APIs to request a certain set of colors. I'm guessing the latter, due to memory constraints, but I'm not sure. A truecolor framebuffer is only 3x larger than an 8-bit one.
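A sketch of how an indexed framebuffer behaves, as described above (the names here are illustrative, not any real OS API):

```python
# Toy indexed (palette) framebuffer: each pixel stores a one-byte index,
# and the actual RGB value is looked up in a 256-entry palette at scanout.

palette = [(i, i, i) for i in range(256)]   # start with a grayscale palette

framebuffer = [0] * (8 * 8)                 # tiny 8x8 indexed "screen"
framebuffer[0] = 17                         # pixel 0 points at palette entry 17

def pixel_rgb(i):
    """Resolve pixel i to RGB the way the display hardware would."""
    return palette[framebuffer[i]]

palette[17] = (255, 0, 0)                   # rewrite one palette entry...
print(pixel_rgb(0))                         # ...and pixel 0 turns red: (255, 0, 0)
```

This is also why old palette tricks like color-cycling animation worked: rewriting one palette entry instantly recolors every pixel that indexes it, without touching the framebuffer at all.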

@rl_dane @OpenComputeDesign
I've never done anything with indexed colors (not by hand, anyway); my only experience with 8 bit color is 332, IIRC

@kabel42 @OpenComputeDesign

What's 332?

Ahh, nvm. XD

Ok, now I better understand what you were saying before.

Yeah, non-indexed 8-bit is rough. Not fun at all. Just not enough color to go around.

@rl_dane @OpenComputeDesign 3 bit red, 3 bit green, 2 bit blue (maybe blue and red are switched?)
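For anyone who hasn't met it, 3-3-2 squeezes RGB into one byte with no palette at all. A quick sketch (toy helpers, not any real graphics API):

```python
# RGB "332" packing: 3 bits red, 3 bits green, 2 bits blue in one byte.
# Blue gets the short end because the eye is least sensitive to it.

def pack332(r, g, b):
    """Pack 8-bit-per-channel RGB into a single 3-3-2 byte."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack332(byte):
    """Expand a 3-3-2 byte back to approximate 8-bit channels."""
    r = (byte >> 5) & 0b111
    g = (byte >> 2) & 0b111
    b = byte & 0b11
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

print(unpack332(pack332(255, 0, 0)))   # pure red survives: (255, 0, 0)
```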

@kabel42 @OpenComputeDesign

Nah, blue is definitely the least important of the three.

@kabel42
We also had lower resolutions when 256 colours was the only option (640x480 maximum usually).
@rl_dane @OpenComputeDesign

@ddlyh @kabel42 @OpenComputeDesign

Yes, initially, the only 8-bit color mode on the PC was "MCGA" 320x200.

@rl_dane @kabel42

Hear me out...

I almost kinda prefer low color depth + dithering to almost-but-not-quite-enough-colors-to-not-have-colorbanding.

4K HDR 480Hz displays seem silly when standard bit depths still color band. At least 256 colors is honest about its limitations.

@OpenComputeDesign @kabel42

Man, the color banding at 24-bit is so annoying.

Like, it's fine for almost anything UNTIL you point your camera at the sky. XD
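A back-of-the-envelope look at why skies band so badly even at 24-bit (illustrative numbers only): a subtle sky gradient spans only a few dozen distinct 8-bit luminance values, stretched across thousands of pixels.

```python
# Why a smooth sky bands at 24-bit: few representable values, many pixels.

width = 3840                      # 4K screen width in pixels
lo, hi = 100, 140                 # a subtle blue-sky luminance ramp (8-bit)

distinct = hi - lo + 1            # only 41 representable steps in the ramp
band_width = width / distinct     # each step smears across ~94 pixels
print(distinct, round(band_width))
```

At roughly 94 pixels per step, each band is wide enough to see as a hard edge, which is exactly where dithering (or 10-bit output) earns its keep.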