16-bit/early-32-bit was my favorite era. (Basically, the #68k era ;)
Computers were just becoming capable, but not too big for their britches.
I think computers were honestly better when they were limited to absolutely no more than 1GB RAM, no more than 256 colors, and no more than 1024x768 screen resolution.
1GB RAM: no LLMs
256 colors: no horrid low-contrast soupy interfaces
XGA Resolution: no horrid empty spaces and bloated interfaces
I keep wanting to make that as an OS 😄
(If only I had the skillz)
> 256 Colours is very limited, but i'd like to see what software would be like if hardware stopped at 1G RAM and maybe 16bit colour :)
16bit color still has the problem of allowing for crappy low-contrast interfaces.
When using palette color, the interface itself must be designed to use as few colors as possible to leave more room for displaying images.
Also, with good dithering at XGA resolutions, depending on the image, it's really hard to tell 8-bit from truecolor
Source: used a computer that was limited to 8-bit color at XGA resolution for many years ;)
Actually, I kinda want to make a challenge on that. I wonder if I can come up with some test images for that. :D
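To make the dithering point concrete, here's a minimal Floyd-Steinberg sketch in pure Python (the 4-level gray palette and image size are mine, just for illustration). Plain rounding of a flat mid-tone collapses to the wrong level everywhere, while diffusing the quantization error keeps the average brightness intact, which is the same effect that makes dithered 8-bit images hard to tell from truecolor at XGA resolutions.

```python
W, H = 64, 64
PALETTE = [0, 85, 170, 255]            # 4 evenly spaced gray levels

def nearest(v):
    return min(PALETTE, key=lambda p: abs(p - v))

def dither(img):
    img = [row[:] for row in img]      # work on a float copy
    out = [[0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            new = nearest(img[y][x])
            err = img[y][x] - new
            out[y][x] = new
            # diffuse the quantization error to unvisited neighbors
            # (7/16, 3/16, 5/16, 1/16 are the classic FS weights)
            if x + 1 < W:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < H:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < W:
                    img[y + 1][x + 1] += err * 1 / 16
    return out

src = [[100.0] * W for _ in range(H)]  # a flat gray not in the palette
dithered = dither(src)
plain = [[nearest(v) for v in row] for row in src]

avg = lambda im: sum(map(sum, im)) / (W * H)
print(avg(plain))                      # plain rounding collapses to 85.0
print(round(avg(dithered), 1))        # dithered average stays near 100
```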
> And 2 cores, I don't miss not being able to use the computer while something is compiling

If you think you can't use your computer while it's compiling on only one core, then modern kernel schedulers are an abject failure.
I did all kinds of things on my computer while it was crunching away at stuff on Linux circa 2000, and it was more stable than today. :/
Take a look at this sample I pulled from https://unsplash.com/photos/a-boat-travels-on-a-canal-in-front-of-buildings-t8OzFHgBjYk
(Remind me to delete this later to save space XD)
You might want to reply to the parent toot, not this one XD
I also made 16-bit (well, 15.96578428-bit lol) and 15-bit images, but they looked worse than the 8-bit images because there's no way to dither anything more than 256 colors in GIMP.
I think resolution has a lot to do with it, also the nature of the image.
I just lucked out to randomly pick an image that wasn't super-duper colorful.
8-bit color can obviously handle a monochrome (mono-hue) image perfectly.
When an image is dominated by two of the three primary colors (and their mixtures, say red, blue, magenta, etc.), a 256-entry palette covers 256 of the 65536 possible two-channel values (0.39% or 1:256), which is definitely not great, but still workable.
When an image has a lot of all three primary hues, and is high resolution, the number of distinct colors climbs to a large percentage of the total number of pixels in the image. So assuming an XGA-resolution image where every pixel is a unique color, a 256-color palette gives you 0.033% or 1:3072 coverage of the color values present, which is getting pretty rough.
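For the curious, the ratios above check out; a quick sanity pass on the arithmetic (Python just for the math):

```python
# All combinations of two 8-bit channels vs. a 256-entry palette:
two_channel = 256 * 256
print(f"{256 / two_channel:.2%}")   # -> 0.39% (i.e. 1:256)

# XGA worst case: every pixel a unique color:
xga = 1024 * 768
print(f"{256 / xga:.3%}")           # -> 0.033%
print(f"1:{xga // 256}")            # -> 1:3072
```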
A 16x16 pixel, 256-color image would by definition be a true color image, as long as the palette is picked based on the colors in the image. ;)
I don't think the VESA mode works any differently from any other 8-bit mode, but I don't know specifically.
The way it generally worked was that the OS tried to make the various apps displaying truecolor images share the color palette, but the foreground process got priority.
IIRC, a part of the palette would be mostly static to always have a base of safe colors to choose from, but the rest was selected based on what was onscreen.
But I don't know if the OS had a truecolor buffer, or if the applications had APIs to request a certain set of colors. I'm guessing the latter, due to memory constraints, but I'm not sure. A truecolor framebuffer is only 3x larger than an 8-bit one.
What's 332?
Ahh, nvm. XD
Ok, now I better understand what you were saying before.
Yeah, non-indexed 8-bit is rough. Not fun at all. Just not enough color to go around.
Nah, blue is definitely the least important of the three.
@ddlyh @kabel42 @OpenComputeDesign
Yes, initially, the only 8-bit color mode on the PC was "MCGA" 320x200.