Maybe it has drifted, but when I take samples from my own display and re-display them on the same display, the result looks nearly identical to my eyes, maybe just a tiny bit different. I bought it (edit: used) for around 35 dollars on eBay, so it might not be the best tool.
(Edit) That’s part of what I’m saying. Another thing is that consumer models will move white somewhere else while keeping the area around red, yellow, and green reasonably accurate without adaptation, at the expense of other colors getting dragged towards blue. By extension, I might assume that professional units did the correction without changing white, likely using Neal’s derivation from 1975, or perhaps some other improved derivation. At least, the professional units must have done that during SMB1’s era. Consumer models, however, kept doing this up to the 2000s.
The 1985 and 1989 models are RF-only, and the 1997 and 2000 ones have composite. For the 2000 one, I did take the samples through the composite input.
The 1997 TV uses the CXA2025AS chip, which does not have separate demodulation settings for composite versus RF. It does have a US/JP switch for the demodulation settings, and it does have a “Dynamic Color” feature which isn’t explained in the documentation, but I’m guessing that’s another switch for demodulation settings and white balance combined.
What you're saying makes sense to me. I assumed in your original screenshots you were expanding the gamut from SMPTE C to 1953 primaries rather than vice versa. It's not possible to expand gamuts electronically, and the 1953 gamut is wider than the SMPTE C gamut. The primaries are fixed for a given display, but the effective gamut can be electronically 'shrunk'.
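To put it concretely, the 'shrink' direction is just a 3x3 matrix between the two sets of primaries. Here's a rough numpy sketch using the published chromaticities (illustrative values only):

```python
# Sketch of an electronic gamut "shrink": a display with wide 1953 NTSC
# phosphors reproducing the smaller SMPTE C gamut through a 3x3 matrix.
# Going the other way (SMPTE C phosphors asked to show 1953 colors) produces
# drive values outside [0,1] that no real tube can display, hence no expansion.
import numpy as np

def rgb_to_xyz(prims, white):
    """RGB->XYZ matrix from xy chromaticities of the primaries and white."""
    xyz = np.array([[x, y, 1.0 - x - y] for x, y in prims]).T   # columns = R, G, B
    wx, wy = white
    white_XYZ = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])  # white at Y = 1
    return xyz * np.linalg.solve(xyz, white_XYZ)                # scale each gun

NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]        # Illuminant C white
SMPTE_C   = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # D65 white

M_wide   = rgb_to_xyz(NTSC_1953, (0.310, 0.316))
M_target = rgb_to_xyz(SMPTE_C, (0.3127, 0.3290))

# Linear-light SMPTE C RGB -> drive values for the wide 1953 display.
# Because the spec whites differ, equal-RGB white maps to unequal drives;
# whether to adapt for that is a separate question.
shrink = np.linalg.inv(M_wide) @ M_target
print(shrink)
```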
White point is determined simply by changing the strength of the individual electron guns for a given voltage. Theoretically, a display can show any white point within the gamut of its three primaries, so changing the white point of a display is straightforward. There were actually people you could pay to calibrate your CRT through the service menu in the 90s and 2000s. Most people didn't do this, but the service existed. They would optimize contrast (called Picture back then), black level, color saturation, hue, overscan, white point, and other settings, within the limits of the display. As a consumer, I did it myself with test images; we shared the remote codes to get into the service menu on mailing lists.
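To put some rough numbers on that, the three gun strengths fall straight out of the phosphor chromaticities and the chosen white point. A small numpy sketch (the SMPTE C chromaticities are the published ones; the 9300K-class point is just an illustrative choice):

```python
# Sketch: the per-gun strengths that put R=G=B=1 on a chosen white point,
# given fixed phosphor chromaticities. Moving the white point only changes
# these three scalars, which is why it was a routine service-menu tweak.
import numpy as np

def gun_strengths(prims, white_xy):
    xyz = np.array([[x, y, 1.0 - x - y] for x, y in prims]).T  # columns = R, G, B
    wx, wy = white_xy
    white_XYZ = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])
    return np.linalg.solve(xyz, white_XYZ)

SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
print("D65 gains:  ", gun_strengths(SMPTE_C, (0.3127, 0.3290)))
print("9300K gains:", gun_strengths(SMPTE_C, (0.281, 0.311)))  # illustrative 9300K-class point
```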
TVs have a limited gamut (SMPTE C is smaller than sRGB); wider-gamut sets, if they existed, would cost several thousand (up to 75,000) dollars more.
In addition, it is possible that it used a P3 or Adobe RGB profile.
I may use the camera’s internal light meter, or I may use an accessory such as these to adjust the white point.
I believe most TVs were calibrated closer to 9300K in reality. If you do an sRGB->SMPTE-C 9300K conversion on a good sRGB monitor, the colors come pretty close to how my CRT looks.
sRGB and SMPTE-C are close; a good sRGB monitor gets rid of a lot of the green tint in Super Metroid without doing anything.
They were probably not exactly 9300K on CRTs out of Japan (not that cool, even though it looks much more pleasant at 9300K), but somewhere in between 9300K and 6500K, like 7500K or higher. That doesn't mean all TVs are like that; some could still look like something else. There is simply too much diversity.
@PlainOldPants I see you have a colorimeter. Have you taken ICC profile measurements or HCFR measurements with your colorimeter? If you have that data, we can probably figure out what the TVs are doing.
EDIT: @Nesguy I think everyone needs to get on the same page with standards and what they are for. The standards were never a prescriptive thing; they were more a description of what broadcasters/engineers were already doing. Even the original NTSC was a decision on which color system to use based on competing systems already made by manufacturers, not a prescriptive formulation. Standards would usually only be introduced to solve a problem. BT.1886 took as long as it did because all CRTs pretty much had the same gamma; there was no standard because there didn't need to be one. Broadcasters could also end up doing something the standards didn't describe and just never update the standard if no perceived problem arose from it. Lots of standards are technically still in effect in various industries and are ignored without consequence.
So, for example, when SMPTE issued the C standard in 1987, it wasn't a directive telling all broadcasters to convert to the SMPTE C primaries overnight. It was rather informing broadcasters that they could color correct on SMPTE C phosphor monitors without needing any other correction circuitry. The vast majority of broadcasters had already been doing that for years, which is why SMPTE settled on that standard, but there was still uncertainty about whether the practice was correct, and that's why they issued the standard. The same thing applies to the standard on encoding video, 170M: the method described there was already being used in practice for VHS, LaserDisc, etc., and the standard was defined so that creators could rely on those established practices for new technologies going forward.
Standards can also be abandoned or ignored. The FCC YIQ system for NTSC was abandoned for YPbPr. There is a good argument to be made that D93 for NTSC-J wasn't followed (and that D65 wasn't followed for NTSC-U, plus color temperature inconsistencies in general). Even sRGB considered D50 early on. The warmer temperature was better for matching colors to print (something actually relevant to game developers, who did a lot of print material and scanning), while cooler temperatures were better for matching grayscale to the black-and-white P4 phosphor (something broadcasters and TV manufacturers in the 1950s and 1960s would have cared about). Another example is how game consoles didn't bother to include setup for NTSC-U until the PS1, etc.
In conclusion, standards are useful, but it's important to understand the context around them and why they were developed. A lot of this context has been altered by repeated myths and misinformation.
In case you skimmed over this, in my original reply to you, you have to click on the line that says “Summary” to view the full details.
Yes, I’ve used HCFR to take measurements. Copying and pasting from my Nesdev post about the 1985 Toshiba Blackstripe, since I’ve followed this same procedure on the 1989 RCA ColorTrak and 2000 Panasonic as well:
The procedure is like this:
Wait the standard 30 minutes for the CRT to fully warm up.
Sample the phosphors. Now, any sampled color can be separated into a weighted combination of the 3 phosphors. For good measure, I sampled around the entire RGB triangle and did linear regressions on the triangle edges. Somehow, I messed something up around green, but it should be okay as long as I'm in the vicinity of the correct points. https://www.desmos.com/calculator/rzyeouzpzk
Sample the grayscale. The sweet spot is probably about 85 samples, a perfect interval of 3 between sample values from 0 to 255. Now, any weighted combination of the 3 phosphors can be converted into electrical R'G'B' values. https://www.desmos.com/calculator/kmc4hust5j
Keeping Y and sqrt(I^2 + Q^2) constant, sample a full 360-degree cycle of chroma. Now, the demodulation offsets and gains for R-Y, G-Y, and B-Y can be determined by a sinusoidal regression (a rough sketch of this fit is just below). https://www.desmos.com/calculator/rxauy2wqnl
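Roughly, the regression in that last step looks like this (a quick numpy sketch; the demo numbers are made up for illustration, not my actual data):

```python
# Rough sketch of that sinusoidal regression: sweep the chroma phase with Y
# and chroma amplitude held constant, then fit each color-difference channel
# to  v(theta) = a*cos(theta) + b*sin(theta) + c  by least squares.
import numpy as np

def fit_demod(theta_deg, samples):
    """Returns (gain, angle_deg, dc_offset) for one channel's sweep."""
    t = np.radians(np.asarray(theta_deg, dtype=float))
    A = np.column_stack([np.cos(t), np.sin(t), np.ones_like(t)])
    (a, b, c), *_ = np.linalg.lstsq(A, np.asarray(samples, dtype=float), rcond=None)
    return np.hypot(a, b), np.degrees(np.arctan2(b, a)) % 360.0, c

# Self-check with made-up numbers (not my measurements):
theta = np.arange(0, 360, 10)
fake_ry = 0.83 * np.cos(np.radians(theta - 112.0)) + 0.02
print(fit_demod(theta, fake_ry))   # ~ (0.83, 112.0, 0.02)
```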
That’s all I did for the 1989 RCA and 2000 Panasonic. For the 1985 Toshiba, I could get demodulation settings from the TA7644BP chip’s datasheet, which was close to the result I got from sampling. For the 1997 Sony, I don’t even have that TV, but I could get phosphors and demodulation settings for it from the internet.
In theory, once we have the primaries, the full grayscale (which we treat as an EOTF and OETF), and the nonstandard R-Y/G-Y/B-Y demodulation angles and gains, that should be all we need to emulate the CRT's colors. The Panasonic and RCA CRTs both had all 3 of those things. The Toshiba and Sony CRTs only had demodulation settings and primaries, because the Toshiba CRT's grayscale was clearly desync'd very badly, and the Sony CRT's white balance can't be found anywhere online.
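To spell out what I mean by emulating the colors, here's a simplified version of the pipeline (placeholder numbers only, a plain power law standing in for the measured grayscale, and a generic encoder; the shader's internals differ in the details):

```python
# Simplified pipeline with placeholder numbers (none of these are my measured
# values): input R'G'B' -> Y and color-difference signals -> the CRT's
# nonstandard demodulation angles/gains -> measured EOTF -> phosphor
# primaries -> CIE XYZ (then XYZ -> sRGB for display, not shown).
import numpy as np

def encode(rgb_prime):
    """One reasonable Y/B-Y/R-Y encoder; the exact matrix and scaling are a choice."""
    y = 0.299 * rgb_prime[0] + 0.587 * rgb_prime[1] + 0.114 * rgb_prime[2]
    return y, rgb_prime[2] - y, rgb_prime[0] - y          # Y, B-Y, R-Y

DEMOD = {            # angle (degrees from the B-Y axis), gain -- placeholders
    "R-Y": (112.0, 0.83),
    "G-Y": (252.0, 0.30),
    "B-Y": (0.0,   1.00),
}

def decode_crt(y, by, ry):
    """Re-demodulate the chroma the way the measured CRT does."""
    amp, phase = np.hypot(by, ry), np.arctan2(ry, by)     # R-Y sits at 90 degrees
    diff = {k: g * amp * np.cos(phase - np.radians(a)) for k, (a, g) in DEMOD.items()}
    return np.array([y + diff["R-Y"], y + diff["G-Y"], y + diff["B-Y"]])

def eotf(rgb_prime, gamma=2.4):
    """Stand-in for the sampled grayscale: a plain power law."""
    return np.clip(rgb_prime, 0.0, 1.0) ** gamma

# Placeholder phosphor matrix (SMPTE-C-like), linear RGB -> XYZ.
M_PHOSPHORS = np.array([[0.3935, 0.3653, 0.1916],
                        [0.2124, 0.7011, 0.0866],
                        [0.0187, 0.1119, 0.9582]])

def crt_color(rgb_prime):
    y, by, ry = encode(np.asarray(rgb_prime, dtype=float))
    return M_PHOSPHORS @ eotf(decode_crt(y, by, ry))

print(crt_color([1.0, 0.5, 0.25]))
```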
Once we have that, we have to figure out what the CRT is doing, and for two of them, we have to guess-and-check several different white points too. This crappy RetroArch shader called White Trash https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file , which I'll pretend I didn't make, can emulate the demodulation offsets/gains along with any primaries, using a simple power law for gamma. It can also display test patterns to check how close the CRT's approximation gets to some standard primaries/white point, with or without accounting for chromatic adaptation. The downside is that this shader is crap. Here's what the output looks like if I use the Toshiba Blackstripe with its white balance set to x=0.281, y=0.311 and with the datasheet's demodulation settings (unlike the NesDev post, which uses approximate sampled demodulation settings), and try to match against 1953 NTSC with Illuminant C, without accounting for chromatic adaptation:
(The bottom-right three triangles are for setting the global hue and saturation. Others are self-explanatory.)
My initial reply to you includes links to the CRTs’ data, and what white point each CRT had (or, for the two that don’t have a known white point, what white point it most likely had). As far as I can tell, they all appear to be trying to approximate the red/yellow/green area according to 1953 NTSC color without accounting for chromatic adaptation. Trying anything else gives nasty-looking results. You can see all this if you click on the line of text that just says “Summary”.
The main point that I was trying to make with that reply was about the timeline. The standards being published in 1987 and 1994 didn’t immediately affect what the CRTs were doing, and the beginning of HDTV in 1998 (?) may or may not have caused many CRTs to switch to D65. The CRTs’ behaviors aren’t new either; they’re based on papers from 1966 and 1975, sometimes even with the same nonstandard white balance as in the 1966 paper.
I can post the Excel spreadsheets, Desmos graphs, and a few White Trash tests sometime else, but not now.
Something to keep in mind.
A professional CRT monitor loses its calibration after 5 months. A TV that is more than 25 years old may lose it monthly.
Old components (swollen capacitors, broken solder joints, worn cables, even a slightly burnt tube) can all change the result.
I did see the expanded summaries, but it wasn't clear exactly what data you were working with. This post really makes it clear. Thank you for that. What do you mean by 'chromatic adaptation'?
I think you're putting too much emphasis on the exact year for standards. The standards weren't something that applied to the CRT manufacturers; they were for broadcasters and video masterers. You shouldn't think of the year a standard came out as a cutoff or starting point for some kind of technology. That's just not how it worked. For example, D65 started being used in Japan before their own version of HDTV. You mention HDTV as being in 1998, but that's just in America. Japan introduced HDTV in the late 80s, and it specified a D65 white point. So the momentum in Japan to move from D93 to D65 was in place long before 1998.
Here is an old document I was able to find with the Japanese primaries; the translated label is "JAPAN Specific Phosphor". The document also clearly states that it's D93, that it's slightly different from the SMPTE-C used in the USA, and that NTSC 1953 was never used.
@DariusG I used those Japanese phosphor coordinates and the D93 white point in Scanline Classic to create some screenshots. The S-Video shader I made allows you to cut out the chroma and just look at grayscale. I decided to compare it to D65 US NTSC grayscale as well as the grayscale made by a P4 phosphor. Old black and white TVs used this P4 phosphor which has a bluish white color. The extra sharpness of the P4 screenshot is intended because those displays were sharper. The S-Video shader is beta and may have errors, but I think it’s more or less correct for what I’m demonstrating here.
Imagine the 60s, when most programs were still in black and white, especially the 'serious' programs like Perry Mason. Compared side-by-side with P4, D65 would look dull. Imagine wanting to sell a color TV: your customer asks 'will it work with B&W?' 'Of course!' you say, and switch over to a station playing a B&W program. You'd want it to look as close to the B&W displays as possible, right? So as to not scare off the customer? We can see that D93 is closer to the P4 characteristic.
When publishers and artists started doing work on computers, they preferred the warmer colors. The (monochrome) Macintosh was designed with publishing in mind and used something called a 'paper white' phosphor. I haven't been able to determine its chromaticity, but it may be D65, D50, or something else. Video colorists may have preferred working long hours with the less straining D65 white point as well.
The CRTs have a different white balance than they presume the input signal to have. According to what I’ve been hearing elsewhere, whenever you’re forced to move white from one position to another, the “proper” thing to do is a chromatic adaptation transform, which is done by converting into an LMS-like space (such as the Bradford space which is “sharpened”) and adjusting the ratios in that space.
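For anyone unfamiliar, that 'proper' route looks roughly like this (the matrix is the published Bradford matrix; everything else is illustrative):

```python
# Minimal Bradford sketch: build a 3x3 XYZ->XYZ matrix that re-scales the
# "sharpened" LMS-like responses from the source white to the destination white.
import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def bradford_cat(src_white_xy, dst_white_xy):
    lms_src = BRADFORD @ xy_to_XYZ(*src_white_xy)
    lms_dst = BRADFORD @ xy_to_XYZ(*dst_white_xy)
    return np.linalg.inv(BRADFORD) @ np.diag(lms_dst / lms_src) @ BRADFORD

# e.g. adapting from Illuminant C (the 1953 reference white) to D65:
print(bradford_cat((0.310, 0.316), (0.3127, 0.3290)))
```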
These consumer CRTs, however, don’t care about that. Instead, they re-saturate the red, yellow, and green area back to where they were with the original reference white, at the expense of the rest of the colors.
I have only educated myself through random websites and documents on the internet over the past year or so, but it seems like, while color scientists prefer adapting the LMS cone response ratios, video masterers prefer to keep the entire image unadapted/unmodified, even if their viewing environment’s lighting is very different.
I have seen this document too, but I think Grade’s “P22_J_ph” is probably more accurate. If I understand right, what this Japanese document is saying is that professional monitors (not necessarily just in Japan) generally have phosphors near these chromaticity coords. Grade’s P22-J comes from averaging several different CRTs’ chromaticity points. Those different CRTs’ primaries are all conveniently compiled together in this program’s constants: https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h
Meanwhile, the 3 consumer sets I have from Toshiba (1985), RCA (1989), and Panasonic (2000) are closer to the SMPTE C green, and both Toshiba and Panasonic also picked deeper reds. Weirdly, Toshiba picked a blue that’s closer to the NTSC spec, and this is the only brand I’ve seen doing this. Most other brands stay close to the same blue primary across the board.
My completely unproven opinion is that the SMPTE C green helps make yellows deeper (which is important), while the "Japan Specific Phosphor" green and red are both more efficient, putting less load on the electron guns to reach bright colors or the desired white balance. This next part is a very far stretch, but borrowing an opinion from this document about using nonstandard CRT white balances https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 , maybe these phosphors could result in more even current ratios across the 3 electron guns, which means less uneven wear on the guns, less uneven burn-in, less vertical fringing, and better sharpness. I have no proof of this at all, and I have no actual experience in the industry, so please just take this as random conjecture, not anything serious. But it would make sense if consumer units preferred deeper yellows at the expense of more burn-in and decreased longevity overall.
This document also has some suggestions for why a bluer white balance might be chosen https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 . My opinion, though, is that x=0.281, y=0.311 originally appeared because it was closer to the black-and-white TVs’ white balances, and that it stuck around because people liked the oversaturated blues better and because it didn’t have to drive the red phosphor as hard to reach the desired white balance. Even my modern TV has a “Dynamic Color” setting that does this blue-stretch trick.
I forgot to mention: how would a typical person set up a VCR with a CRT that has a composite input? Wouldn't they connect the antenna feed from the wall to the VCR's RF input, and plug the VCR's composite output into their TV's composite input? In that case, the correction on composite video would need to be based on 1953 colorimetry.
FWIW, I grew up with B&W TVs, and they definitely had a cool white point. My family also had early Macintoshes, as my mother did desktop publishing for a living, so I have some context there, too.