Because it's patent-encumbered?
I'd be curious to know the other things 😁
I've started using WebP where possible.
I wish my phone would snap pictures in that format directly.
JPEG is starting to get AARP flyers in the mail. 😂
@RL_Dane
WebP is often worse than JPEG, especially for photographic images. It is better at low fidelity, about even at medium, & worse at high fidelity. It is decent with non-photographic images, though. This surprised me initially, as it goes against the whole purpose of WebP.
Lossless WebP, on the other hand, is fantastic. There wasn't a better lossless format until lossless JXL came along.
That's really fascinating, thanks for sharing. I've been interested in codecs for many, many years (though I'm not terribly well versed on the subject), initially general-purpose ones like StuffIt/ARC/ARJ/ZIP/Compactor Pro/compress/gzip/bzip2, etc., but later JPEG fascinated me with its magical (for the time) powers.
I'm surprised that JPEG XL was still using DCT. I thought all the cool (video codec) kids had moved on to wavelets.
Fascinating, thank you for taking the time to answer.
So I'm guessing the codec analyses the image and decides on a unique mosaic of block sizes, or is it just one block size per image?
If almost everything is still DCT, why don't newer algorithms have the kind of chunky artifacts old JPEG does? They just seem to get blurrier, not as artifact-y.
Interesting! I didn't realize the modern codecs were just "covering up" the "edges," so to speak.
Can you elaborate a little bit about what you mean by JPEG treating the blocks "independently?"
Are the newer codecs applying some kind of averaging across blocks, or making the blocks overlap or something?
I vaguely remember seeing HW JPEG acceleration in years past (when it was still computationally quite costly for the CPUs of the era). Also, I wonder if digital cameras of the 2000s used any kind of hardware encoding for JPEG, or if they just had an optimized software stack running on their custom ARM CPUs.
...
@RL_Dane
I think smartphone ISPs utilize hardware encoding right now for JPEG - not entirely sure though. They definitely do for HEIC if you're on an iPhone.
The early dawn of JPEG hardware was admittedly before my time, but from what you're saying, it sounds super interesting. I was born into a world where 'jpeg' and 'image' were basically already synonymous.
@wb
Yeah, early JPEG was pretty exciting. To my knowledge, it was the first GOOD lossy algorithm of any kind -- QuickTime had lossy video codecs, but they were super simple: selectively updating the screen in rather large blocks, with no complex math or DCT AFAIK.
There were lossy audio codecs for voice, but they were pretty rudimentary. Nothing good for full-spectrum audio (music) until MP3.
...
@RL_Dane @gianni JPEG basically made digital photography possible — early CompactFlash cards would have capacities like 2 to 15 MB, so without lossy compression it wouldn't be very practical (you would only be able to store 1 or a few photos instead of dozens).
It also made network transfer of photos possible — I remember before JPEG (and GIF) we would use blocky ANSI art to get something graphical on our BBS (this was before the web took off).
By comparison, JXL is only a small improvement.
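The "1 or a few photos instead of dozens" point checks out with some quick back-of-the-envelope math. A minimal sketch, assuming an illustrative ~1.2 MP sensor, 24-bit RGB, a roughly 10:1 JPEG ratio, and a 15 MB card (all of those specific numbers are my assumptions, not from the thread):

```python
# Rough storage math for an early digital camera (illustrative numbers).
width, height = 1280, 960               # assumed ~1.2 MP sensor
raw_bytes = width * height * 3          # uncompressed 24-bit RGB frame
jpeg_bytes = raw_bytes // 10            # assumed ~10:1 lossy compression
card_bytes = 15 * 1024 * 1024           # 15 MB CompactFlash card

print(card_bytes // raw_bytes)   # uncompressed photos per card -> 4
print(card_bytes // jpeg_bytes)  # JPEG photos per card -> 42
```

So even generous assumptions give you a handful of uncompressed frames versus dozens of JPEGs on the same card.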
Yes, I remember waiting quite a while for GIFs to load -- even progressive (interlaced?? I forget the right term) GIFs weren't that much help.
I wish I still had it, but someone took a "digital" photo of me way back in 1992 with a Kodak Xapshot camera at a Mac Users' Group meeting in Austin.
...
I have a batch_de-heic script that calls #ImageMagick `convert` to deal with pictures from others' iPhones 😆
I might see if convert can handle JXL. File size is definitely more important than quality for the purposes of these particular photos.
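For what it's worth, a minimal sketch of what that batch conversion might look like (hypothetical script, not the author's actual one; assumes an ImageMagick 7 build with the HEIC and JXL delegates -- `magick -list format` will show whether yours has them):

```shell
#!/usr/bin/env sh
# Hypothetical HEIC -> JXL batch conversion via ImageMagick 7.
# Usage: ./batch_de-heic.sh *.heic
for f in "$@"; do
  out="${f%.*}.jxl"                 # photo.heic -> photo.jxl
  magick "$f" -quality 75 "$out"    # lower -quality favors size over fidelity
done
```

With ImageMagick 6 the legacy `convert` name works in place of `magick`, same arguments.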
Ahh, that makes sense. I wouldn't have thought that encoders would be more economical (because of increased complexity of encoding vs decoding), but the point about not having to support such a wide range of possible algorithms makes a lot of sense.
That also explains why cameras record at relatively high bitrates -- not just for best possible quality, but possibly also so that the encoding can be a bit simpler/"faster."