Old doesn't mean outdated. TCP is ancient and we still use it for a bunch of stuff.
JPEG is good enough, not encumbered by IP concerns, and universally supported. That makes it better than an alternative that is "better" in a less important dimension but worse in broad support.
JPEG only supports up to 8-bit color per channel. Its compression is much worse than more modern standards'. It doesn't support alpha transparency. And it doesn't support the modern smartphone image features (image sequences, edited-image derivatives, short video clips embedded in the image).
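The alpha and bit-depth limits are easy to see in practice. A quick sketch using Pillow (my choice of tool for illustration, not something from this thread):

```python
# Pillow refuses to encode an alpha channel into JPEG,
# and decoded JPEGs come back as 8-bit-per-channel RGB.
from PIL import Image

img = Image.new("RGBA", (64, 64), (255, 0, 0, 128))  # half-transparent red

try:
    img.save("photo.jpg")          # JPEG has no alpha channel
except OSError as err:
    print(err)                     # "cannot write mode RGBA as JPEG"

img.convert("RGB").save("photo.jpg")   # works once alpha is dropped
print(Image.open("photo.jpg").mode)    # "RGB": 8 bits per channel
```

Formats like PNG, WebP, and HEIF accept the RGBA image directly.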
You could make the same arguments about any of the wide variety of outdated video formats. This sort of thinking leads to a lack of progress in the industry.
8 bits and no transparency is good enough for many nines' worth of the photos people take.
I have a calibrated HDR monitor. I have a camera that can shoot in 12-bit color. And what HDR support does is just cause me pain. I don't find it beneficial in games. I find it to have limited utility in video. I never need more than 8 bits when sending photos to friends or having them printed. The extra support other formats offer gives me no value.
If you want transparency, you want some other format, but no camera I know of records an alpha channel.
Transparency is immediately useful for any editing. Why is it so hard to support, even if it goes unused most of the time?
Also, the other formats' features are useful and provide value: if you send a photo to people, the "live" portion of the photo is sent automatically to compatible receivers. That's pure upside.
If that’s not enough, why wouldn’t you want similar compression quality at half the file size? That really, really adds up with tons of photos.
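As a rough sanity check of the size claim, you can compare encoder output sizes directly. A sketch using Pillow with WebP as a stand-in for a modern codec (HEIC encoders aren't universally available); the synthetic image and quality setting are my assumptions, and exact ratios depend heavily on content:

```python
# Encode the same image as JPEG and WebP at the same quality
# setting and compare the resulting byte counts.
import io
from PIL import Image

# Synthetic gradient test image; real photos will behave differently.
img = Image.new("RGB", (512, 512))
img.putdata([(x % 256, y % 256, (x + y) % 256)
             for y in range(512) for x in range(512)])

sizes = {}
for fmt in ("JPEG", "WEBP"):
    buf = io.BytesIO()
    img.save(buf, format=fmt, quality=80)
    sizes[fmt] = buf.getbuffer().nbytes

print(sizes)
```

On typical photographic content, WebP and HEIC commonly come out substantially smaller than JPEG at comparable visual quality, though the exact ratio varies per image.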
> And what HDR support does is just cause me pain. I don't find it beneficial in games. I find it to have limited utility in video.
Oh boy. You must either be using Windows or have never seen photographs/video taken on modern phones/cameras viewed side by side on a good modern display (such as an iPad Pro or MacBook Pro) vs. a random "gOoD eNoUgH" display.
I regularly edit (RAW) photos from a Nikon Z8 on both an M2 MacBook Pro and a Windows machine (either a desktop with a good monitor or a Surface Laptop Studio 2) and do not notice a difference.
And? Much like more modern video compression algorithms take longer to encode/decode, this problem is largely solved by adding hardware accelerators, since the format is basically guaranteed to be used. We already do it for HEVC. I see little reason you couldn't do the same here, especially since the HEIC content iOS produces is derived from, and very similar to, HEVC.
JPEG is almost as outdated as SMS.