We're debating semantics, but if I reshaped an RGB image into component arrays, i.e. u8[yn][xn][3] → u8[3][yn][xn], would you still view that as a 24-bit format? If those 24-bit values were Huffman or run-length encoded, would it still be an n-bit format? And if your Y′CbCr luma plane has a legal range of 16..235 and the chroma planes 16..240, would it be a 23.40892-bit format?
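To spell out where that last number comes from, here's the arithmetic as a quick Python check:

    from math import log2

    # "Video range" Y'CbCr: luma uses codes 16..235 (220 values),
    # each chroma plane uses codes 16..240 (225 values).
    print(log2(220) + 2 * log2(225))  # 23.40892...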
I'm arguing about non-compressed, possibly padded data types that make learning Unicode (or any other applicable data format) easier, because of the equivalence: 1 atomic unit ("character", pixel) = 1 smallest addressable unit of memory (byte). This requires the byte size to be at least as large as the atom size.
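A quick Python sketch of what that equivalence buys you (the strings are just illustrative):

    # When 1 character == 1 byte, indexing bytes IS indexing characters:
    t = "hello".encode("ascii")
    print(len(t), chr(t[1]))   # 5 e  (the 2nd byte is the 2nd character)

    # With a variable-width encoding like UTF-8, the equivalence breaks:
    s = "héllo".encode("utf-8")
    print(len(s), hex(s[1]))   # 6 0xc3  (the 2nd byte is only half of 'é')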
And it's particularly important to have this property for text, because not only is data overwhelmingly stored as text (in importance, not by "weight"), but computer programs themselves are written as text.
Well, I figured that since you feel strongly about using a type wider than 8 bits for RGB, you must have a really good display that actually lets you perceive the colors it enables you to encode. Most PC displays are garbage, including the expensive ones: first, sRGB specifies only a small portion of the light we can perceive, and second, any display maker who builds something better is going to get complaints about how terrible Netflix looks, because a better display reveals things like banding (which you mentioned) that otherwise wouldn't be perceivable. So I was hoping you could recommend a better monitor so I can get into >8-bit RGB, because I've found it exceedingly difficult to shop around for this kind of thing.
Ok, so you weren't being sarcastic and/or misunderstanding my use of "atomic".
Sadly, I kind of gave up on getting an "HDR" display, at least for now, because:
- AFAIK neither Linux nor Windows has good enough "HDR" support yet. (macOS supposedly does, but I'm not interested.)
- I'm happy enough with my HP LP2475w, which I got for dirt cheap just before "HDR" became a thing. I consider the 1920x1200 resolution to be perfect for now (as a bonus, I can manually scale various old resolutions like 800x600 to be pixel-perfect). Too many programs/OSes still have issues with auto-scaling on higher-resolution screens (which would come with "HDR"). I'm also particularly fond of the 16:10 ratio, which seems to have gone extinct.
- Maybe I'll be able to run this monitor properly in wide gamuts (though with banding), or maybe even in some kind of "HDR-compatibility" mode, though it seems that the current sellers of "HDR" screens aren't going to make that easy. I might be able to get a colorimeter soon to properly calibrate it.
If you have a $200 monitor then it probably struggles to make proper use of 8-bit formats. I have a display that claims to simulate DICOM, but it's not enough; I want more. However, I'm not willing to spend $3000 on a display that ships without engineering specs, only to send it back because it doesn't work. I don't care about resolution. I care about being able to see the unseen. I care about edge cases like yellow and blue making pink; that was the first significant finding Maxwell reported when he invented RGB. Yet nearly every monitor ever made mixes those two colors wrong, as gray, due to subpixel layout issues, and nearly every scaling algorithm mixes them wrong too, due to the way sRGB was designed. It's amazing how poorly color is modeled on personal computers. https://justine.lol/maxwell.png
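To make the scaling point concrete, here's a minimal Python sketch using the standard sRGB transfer function (IEC 61966-2-1). Most scalers average gamma-encoded values directly, which lands far from the average computed in linear light. (Both results are neutral here; the pink-versus-gray question needs a perceptual/opponent model, not just linearization.)

    def srgb_to_linear(c):   # c in 0..1
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    yellow, blue = (1.0, 1.0, 0.0), (0.0, 0.0, 1.0)

    # What most scalers do: average the gamma-encoded values directly.
    naive = [(y + b) / 2 for y, b in zip(yellow, blue)]

    # What physics says: decode to linear light, average, re-encode.
    correct = [linear_to_srgb((srgb_to_linear(y) + srgb_to_linear(b)) / 2)
               for y, b in zip(yellow, blue)]

    print([round(c * 255) for c in naive])    # [128, 128, 128]  too dark
    print([round(c * 255) for c in correct])  # [188, 188, 188]  correct lightness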
Well, when it was released in 2008 it was a $600 monitor; I got it second-hand for 80€.
I'm not sure what DICOM has to do with color reproduction quality? Also, it seems to be a standard quite a bit older than sRGB...
By definition, you can't "see the unseen". "Yellow" and "blue" are opponent "colors", so, by definition, a proper mixture of them is going to give you grey.
Also, when talking about subtle color effects, you have to consider that personal variation might come into play (for instance red-green "colorblindness" is a spectrum).
It looks like this is the thing I want to buy: https://www.apple.com/pro-display-xdr/ There's plenty of light that's currently unseeable; look at the chromaticity chart for sRGB. If your definition of color mixes yellow and blue as grey, then you've defined color wrong, because nature has a different definition where it's pink. For example, the CIELAB colorspace will mix the two as pink. Also, I'm not colorblind; if I'm on the spectrum at all, I'd be on the "sees more colors, more accurately" end of it. When designing charts I'm actually very good at choosing colors that accommodate people who are colorblind while still looking stylish, because I feel inclusive technology is important.
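For the curious, here's a minimal, dependency-free Python sketch of that CIELAB claim, using the standard D65 sRGB <-> CIELAB formulas and a naive per-channel average in Lab (a simplification, not a full gamut-mapping pipeline):

    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        c = min(max(c, 0.0), 1.0)
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    XN, YN, ZN = 0.95047, 1.0, 1.08883   # D65 reference white

    def rgb_to_lab(rgb):
        r, g, b = (srgb_to_linear(c / 255) for c in rgb)
        x = 0.4124 * r + 0.3576 * g + 0.1805 * b
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        z = 0.0193 * r + 0.1192 * g + 0.9505 * b
        f = lambda t: t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
        fx, fy, fz = f(x / XN), f(y / YN), f(z / ZN)
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    def lab_to_rgb(lab):
        L, a, b = lab
        fy = (L + 16) / 116
        fx, fz = fy + a / 500, fy - b / 200
        finv = lambda t: t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
        x, y, z = finv(fx) * XN, finv(fy) * YN, finv(fz) * ZN
        r = 3.2406 * x - 1.5372 * y - 0.4986 * z
        g = -0.9689 * x + 1.8758 * y + 0.0415 * z
        bl = 0.0557 * x - 0.2040 * y + 1.0570 * z
        return tuple(round(linear_to_srgb(c) * 255) for c in (r, g, bl))

    yellow, blue = rgb_to_lab((255, 255, 0)), rgb_to_lab((0, 0, 255))
    mix = tuple((p + q) / 2 for p, q in zip(yellow, blue))
    print(lab_to_rgb(mix))   # ~(202, 138, 170): a pinkish mauve, not grey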