why are garbage sdr-masquerading-as-hdr photos still a thing on phones? a lot of modern phones have both true-hdr displays and true-hdr sensors. why do i need to shoot raw on an iphone and then edit in lightroom to actually make use of that hardware? why can't apple process a few extra bits of data on the fly? this feels incredibly odd because the jump in quality you get from this process is immediately apparent even to the most non-technical person, and the hardware is many years old at this point
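for a rough sense of what "a few extra bits" buys you (this is just back-of-the-envelope arithmetic, not a claim about apple's actual pipeline, which could use gain maps or other encodings):

```python
# each extra bit of tonal data doubles the number of representable levels,
# which in a linear-light encoding corresponds to roughly one extra stop
# of highlight headroom
sdr_bits = 8    # typical sdr jpeg
hdr_bits = 10   # typical 10-bit hdr pipeline

sdr_levels = 2 ** sdr_bits          # 256 levels
hdr_levels = 2 ** hdr_bits          # 1024 levels
extra_stops = hdr_bits - sdr_bits   # ~2 stops more headroom

print(sdr_levels, hdr_levels, extra_stops)  # 256 1024 2
```

so the data being thrown away isn't huge in byte terms, which is part of what makes the situation feel so odd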