Because it's patent-encumbered?
I'd be curious to know the other things
I've started using WebP where possible.
I wish my phone would snap pictures in that format directly.
JPEG is starting to get AARP flyers in the mail.
@RL_Dane
WebP is often worse than JPEG, especially for photographic images: it is better at low fidelity, about even at medium, and worse at high fidelity. It is decent with non-photographic images, though. This surprised me at first, since it goes against the whole purpose of WebP.
Lossless WebP, on the other hand, is fantastic. There hasn't been a better lossless format until lossless JXL.
That's really interesting, thanks for sharing. I've been fascinated by codecs for many, many years (though I'm not terribly well versed on the subject) -- initially general-purpose ones like stuffit/arc/arj/zip/compactor pro/compress/gzip/bzip2, etc., but later JPEG captivated me with its magical (for the time) powers.
I'm surprised that JPEG XL was still using DCT. I thought all the cool (video codec) kids had moved on to wavelets.
Fascinating, thank you for taking the time to answer.
So I'm guessing the codec analyses the image and decides on a unique mosaic of block sizes, or is it just one block size per image?
If almost everything is still DCT, why don't newer algorithms have the kind of chunky artifacts that old JPEG does? They just seem to get blurrier, not as artifact-y.
Interesting! I didn't realize the modern codecs were just "covering up" the "edges," so to speak.
Can you elaborate a little on what you mean by JPEG treating the blocks "independently?"
Are the newer codecs applying some kind of averaging across blocks, or making the blocks overlap or something?
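To make my question concrete, here's a toy sketch of my mental model of "independent" blocks (my own illustrative code, not any real codec -- names like `dct_matrix` and `roundtrip_block` are made up): each 8x8 block gets transformed and quantized with no knowledge of its neighbors, so a smooth gradient can round to different values on each side of a block boundary.

```python
# Toy sketch of per-block DCT coding, the JPEG-style way.
# Illustrative only: real JPEG uses a per-frequency quantization
# table, level shifting, entropy coding, etc.
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix (rows are cosine basis vectors).
    D = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            D[k, i] = np.cos(np.pi * k * (2 * i + 1) / (2 * n))
    D[0] /= np.sqrt(n)
    D[1:] *= np.sqrt(2 / n)
    return D

def roundtrip_block(block, step=40.0):
    # Transform, coarsely quantize, and reconstruct ONE 8x8 block.
    # No neighboring pixels are ever consulted -- that's the
    # "independence" I mean.
    D = dct_matrix(8)
    coeffs = D @ block @ D.T
    quantized = np.round(coeffs / step) * step
    return D.T @ quantized @ D

# A smooth horizontal ramp split into two adjacent 8x8 blocks:
ramp = np.tile(np.arange(16, dtype=float) * 8, (8, 1))
left = roundtrip_block(ramp[:, :8])
right = roundtrip_block(ramp[:, 8:])

# Each block rounds toward its own DCT approximation, so values can
# jump at the seam between column 7 and column 8 -- the classic
# "chunky" JPEG block artifact at low quality.
seam_jump = np.abs(left[:, -1] - right[:, 0]).mean()
print(f"average jump at the block seam: {seam_jump:.2f}")
```

If that mental model is right, I'd guess the newer codecs either filter across that seam after decoding or let the basis functions overlap it, which is what I'm asking about below.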
I vaguely remember seeing HW JPEG acceleration in years past (when it was still computationally quite costly for the CPUs of the era). Also, I wonder if digital cameras of the 2000s used any kind of hardware encoding for JPEG, or if they just had an optimized software stack running on their custom ARM CPUs.
...
@RL_Dane
I think smartphone ISPs (image signal processors) use hardware encoding for JPEG right now - not entirely sure, though. They definitely do for HEIC if you're on an iPhone.
The early dawn of JPEG hardware was admittedly before my time, but from what you're saying, it sounds super interesting. I was born into a world where 'jpeg' and 'image' were basically already synonymous.
@wb
Yeah, early JPEG was pretty exciting. To my knowledge, it was the first GOOD lossy algorithm of any kind -- QuickTime had lossy video codecs, but they were super simple: selectively updating the screen in rather large blocks, with no complex math or DCT, AFAIK.
There were lossy audio codecs for voice, but they were pretty rudimentary. Nothing good for full-spectrum audio (music) until MP3.
...