Maybe it has drifted, but when I take samples from my own display and re-display them on the same display, the result looks almost exactly the same to my eyes, maybe just a tiny bit different if not identical. I bought it (edit: Used) for something around 35 dollars on eBay, so it might not be the best tool.
(Edit) That’s part of what I’m saying. Another thing is that consumer models will move white somewhere else while keeping the area around red, yellow, and green reasonably accurate without adaptation, at the expense of other colors getting dragged towards blue. By extension, I might assume that professional units did the correction without changing white, likely using Neal’s derivation from 1975, or perhaps some other improved derivation. At least, the professional units must have done that during SMB1’s era. Consumer models, however, kept doing this up to the 2000s.
The 1985 and 1989 models are RF-only, and the 1997 and 2000 ones have composite. For the 2000 one, I did take the samples through the composite input.
The 1997 TV uses the CXA2025AS chip, which does not have separate demodulation settings for composite versus RF. It does have a US/JP switch for the demodulation settings, and it does have a “Dynamic Color” feature which isn’t explained in the documentation, but I’m guessing that’s another switch for demodulation settings and white balance combined.
What you’re saying makes sense to me. I assumed in your original screenshots you were expanding the gamut from SMPTE C to 1953 primaries rather than vice versa. It’s not possible to expand a gamut electronically, and the 1953 gamut is wider than the SMPTE C gamut. The primaries are fixed for a given display, but the gamut can be electronically ‘shrunk’.
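In case it helps to see the math, here’s a quick sketch with textbook primaries (numpy, nothing measured from any particular TV):

```python
# A generic sketch with textbook primaries: mapping linear RGB between two sets of
# phosphors is just a 3x3 matrix through XYZ, and the negative numbers that fall out
# show why a display can only "shrink" a gamut, never expand it.
import numpy as np

def rgb_to_xyz(primaries, white_xy):
    """primaries = [(xr, yr), (xg, yg), (xb, yb)], white_xy = (xw, yw), white Y = 1."""
    P = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
    xw, yw = white_xy
    W = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])
    S = np.linalg.solve(P, W)          # scale each gun so R = G = B = 1 lands on white
    return P * S

ntsc_1953 = rgb_to_xyz([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)], (0.3101, 0.3162))   # Illuminant C
smpte_c   = rgb_to_xyz([(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)], (0.3127, 0.3290))  # D65

to_smpte_c = np.linalg.inv(smpte_c) @ ntsc_1953     # 1953-encoded linear RGB -> SMPTE C gun drive
print(to_smpte_c @ np.array([0.0, 1.0, 0.0]))
# Pure 1953 green comes out needing negative red and blue light on SMPTE C phosphors,
# which is physically impossible; the wider colors can only be clipped or desaturated
# toward what the tube's own triangle can actually produce.
```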
White point is determined simply by changing the strength of the individual electron guns for a given voltage. Theoretically, a display can show any white point within the gamut of its three primaries, so changing the white point of a display is straightforward. There were actually people you could pay to calibrate your CRT through the service menu in the 90s and 2000s. Most people didn’t do this, but the service existed. They would optimize contrast (called Picture back then), black level, color saturation, hue, overscan, white point, and other settings, within the limits of the display. As a consumer, I did it myself with test images; we shared the remote codes to get into the service menu on mailing lists.
TVs have a limited gamut (SMPTE C is smaller than sRGB); wider-gamut sets, if they existed at all, would have cost several thousand (up to 75,000) dollars more.
In addition, it is possible that it used a P3 or Adobe RGB profile.
I may use the camera’s internal light meter, or I may use an accessory such as these to adjust the white point.
I believe most TVs were calibrated closer to 9300K in reality; if you do sRGB->SMPTE-C 9300K on a good sRGB monitor, the colors are pretty close to how my CRT looks.
sRGB and SMPTE-C are close, and a good sRGB monitor gets rid of a lot of the green tint in Super Metroid without doing anything.
CRTs out of Japan were probably not exactly 9300K (not that cool, even though it looks much more pleasant at 9300K), but somewhere between 9300K and 6500K, like 7500K or more. That doesn’t mean all TVs are like that; some could still look like something else. There is simply too much diversity.
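To put rough numbers on those temperatures, here’s a little sketch using the CIE daylight-locus approximation. Real CRT whites sit near this locus rather than on it, so treat it as a ballpark only:

```python
# xy chromaticity of the CIE daylight locus for a few correlated color temperatures,
# to give a feel for 6500K vs 7500K vs 9300K white. Real CRT whites (e.g. the Japanese
# 9300K+27MPCD point, roughly x=0.283, y=0.297) sit near but not exactly on this locus.

def daylight_xy(cct):
    """CIE daylight-locus approximation, valid roughly 4000K..25000K."""
    t = float(cct)
    if t <= 7000.0:
        x = -4.6070e9 / t**3 + 2.9678e6 / t**2 + 0.09911e3 / t + 0.244063
    else:
        x = -2.0064e9 / t**3 + 1.9018e6 / t**2 + 0.24748e3 / t + 0.237040
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

for cct in (6500, 7500, 9300):
    x, y = daylight_xy(cct)
    print(f"{cct}K -> x={x:.4f}, y={y:.4f}")
# ~6500K -> x=0.3128, y=0.3292 (close to D65); higher CCTs drift toward blue.
```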
@PlainOldPants I see you have a colorimeter. Have you taken ICC profile measurements or HCFR measurements with it? If you have that data, we can probably figure out what the TVs are doing.
EDIT: @Nesguy I think everyone needs to get on the same page about standards and what they are for. The standards were never a prescriptive thing; they were more a description of what broadcasters/engineers were already doing. Even the original NTSC was a decision on which color system to use, based on competing systems already made by manufacturers, not a prescriptive formulation. Standards would usually only be introduced to solve a problem. BT.1886 took as long as it did because all CRTs had pretty much the same gamma; there was no standard because there didn’t need to be one. Broadcasters could also end up doing something the standards didn’t describe and just never update the standard if no perceived problem arose from it. Lots of standards are technically still in effect in various industries and are ignored without consequence.
So, for example, when SMPTE issued the C standard in 1987, it wasn’t a directive telling all broadcasters to convert to the SMPTE C primaries overnight. It was rather informing broadcasters that they could color correct on SMPTE C phosphor monitors and not need any other correction circuitry. The vast majority of broadcasters had already been doing that for years, which is why SMPTE settled on that standard, but there was still uncertainty about whether the practice was correct, and that’s why they issued the standard. The same thing applies to the standard on encoding video, 170M: the method described there was already being done in practice for VHS, LaserDisc, etc., and the standard was defined to ensure creators could rely on those established practices for new technologies going forward.
Standards can also be abandoned or ignored. The FCC YIQ system for NTSC was abandoned for YPbPr. There is a good argument to be made the D93 for NTSC-J wasn’t followed (and D65 not followed for NTSC-U/color temp inconsistencies in general). Even sRGB considered D50 early on. The warmer temperature was better for matching colors to print (something actually relevant to game developers who did a lot of print material and scanning), cooler temperatures were better for matching grayscale to the Black & White P4 phosphor (something broadcasters and TV manufacturers in the 1950s and 1960s would have cared about). Another example is how game consoles didn’t bother to include setup for NTSC-U until the PS1, etc.
In conclusion, standards are useful, but it’s important to understand the context around them, why they were developed. A lot of this context has been altered by repeated myths and misinformation.
In case you skimmed over this, in my original reply to you, you have to click on the line that says “Summary” to view the full details.
Yes, I’ve used HCFR to take measurements. Copying and pasting from my Nesdev post about the 1985 Toshiba Blackstripe, since I’ve followed this same procedure on the 1989 RCA ColorTrak and 2000 Panasonic as well:
The procedure is like this:
1. Wait the standard 30 minutes for the CRT to fully warm up.
2. Sample the phosphors. Now any sampled color can be separated into a weighted combination of the 3 phosphors. For good measure, I sampled around the entire RGB triangle and did linear regressions on the triangle edges. Somehow I messed something up around green, but it should be okay as long as I’m within the vicinity of the correct points. https://www.desmos.com/calculator/rzyeouzpzk
3. Sample the grayscale. The sweet spot is probably about 85 samples, a perfect interval of 3 between samples from 0 to 255. Now any weighted combination of the 3 phosphors can be converted into electrical R’G’B’ values. https://www.desmos.com/calculator/kmc4hust5j
4. Keeping Y and sqrt(I^2 + Q^2) constant, sample a full 360-degree cycle of chroma. Now the demodulation offsets and gains for R-Y, G-Y, and B-Y can be determined by a sinusoidal regression. https://www.desmos.com/calculator/rxauy2wqnl
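In case anyone wants to see what I mean by a sinusoidal regression, here’s a minimal sketch of the idea (not my actual code; the example data is fake, and the angle is only meaningful relative to whatever axis you reference your chroma phase to):

```python
# Fit diff ~= gain * cos(theta - angle) + offset by ordinary linear least squares,
# using the identity K*cos(theta - phi) = a*cos(theta) + b*sin(theta).
import numpy as np

def fit_demodulation(theta_deg, diff):
    """Return (gain, angle in degrees, DC offset) for one color-difference axis."""
    th = np.radians(np.asarray(theta_deg, dtype=float))
    A = np.column_stack([np.cos(th), np.sin(th), np.ones_like(th)])
    (a, b, c), *_ = np.linalg.lstsq(A, np.asarray(diff, dtype=float), rcond=None)
    gain = float(np.hypot(a, b))
    angle = float(np.degrees(np.arctan2(b, a))) % 360.0
    return gain, angle, float(c)

# Fake example: an "R-Y" axis at 112 degrees with gain 0.9, plus a little noise.
theta = np.arange(0, 360, 10)
fake = 0.9 * np.cos(np.radians(theta - 112)) + 0.01 * np.random.randn(theta.size)
print(fit_demodulation(theta, fake))   # ~ (0.9, 112.0, ~0.0)
```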
That’s all I did for the 1989 RCA and 2000 Panasonic. For the 1985 Toshiba, I could get demodulation settings from the TA7644BP chip’s datasheet, which was close to the result I got from sampling. For the 1997 Sony, I don’t even have that TV, but I could get phosphors and demodulation settings for it from the internet.
In theory, once we have the primaries, the full grayscale (which we treat as an EOTF and OETF) and the nonstandard R-Y/G-Y/B-Y demodulation angles and gains, that should be all we need to emulate the CRT’s colors. The Panasonic and RCA CRTs both had all 3 of those things. The Toshiba and Sony CRTs only had demodulation settings and primaries, because the Toshiba CRT’s grayscale was clearly desync’d very badly, and the Sony CRT’s white balance can’t be found anywhere online.
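To make the forward direction concrete, here’s a rough sketch of how those three pieces chain together. This is not the shader itself; the demodulation numbers below are just the by-the-book values standing in for a real set’s measurements, and the power law stands in for the measured grayscale:

```python
# Simplified forward model: R'G'B' in -> nonstandard demodulation -> gamma -> linear gun drive.
import numpy as np

DEMOD = {          # axis: (gain, angle in degrees), measured against the B'-Y' axis
    "R-Y": (1.00,  90.0),
    "G-Y": (0.55, 249.0),   # roughly what the standard luma weights imply; consumer sets deviate
    "B-Y": (1.00,   0.0),
}
GAMMA = 2.4        # placeholder for the measured EOTF

def encode(rgb_prime):
    """Source R'G'B' -> luma plus the two color differences the signal carries."""
    r, g, b = rgb_prime
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, b - y, r - y                             # Y', B'-Y', R'-Y'

def crt_linear_rgb(rgb_prime):
    """What the tube's three guns end up driving, before the phosphor matrix."""
    y, by, ry = encode(rgb_prime)
    out = []
    for name in ("R-Y", "G-Y", "B-Y"):
        gain, ang = DEMOD[name]
        a = np.radians(ang)
        diff = gain * (by * np.cos(a) + ry * np.sin(a))   # demodulate along this axis
        out.append(np.clip(y + diff, 0.0, 1.0))
    return np.array(out) ** GAMMA                      # gamma -> linear light per gun

# The last step would be multiplying by the phosphor RGB->XYZ matrix from the measured
# primaries and white, then XYZ -> sRGB for viewing on a modern monitor.
print(crt_linear_rgb([1.0, 0.5, 0.25]))
```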
Once we have that, we have to figure out what the CRT is doing, and for two of them, we have to guess-and-check several different white points too. This crappy RetroArch shader called White Trash https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file , which I’ll pretend I didn’t make, can emulate the demodulation offsets/gains along with any primaries and a simple power-law gamma, and can display test patterns to check how closely the CRT’s approximation gets to some standard primaries/whitepoint, with or without accounting for chromatic adaptation. The downside is that this shader is crap. Here’s what the output looks like if I use the Toshiba Blackstripe with its white balance set to x=0.281, y=0.311 and the datasheet’s demodulation settings (unlike the NesDev post, which uses approximate sampled demodulation settings), and try to match against 1953 NTSC with Illuminant C, without accounting for chromatic adaptation:
(The bottom-right three triangles are for setting the global hue and saturation. Others are self-explanatory.)
My initial reply to you includes links to the CRTs’ data, and what white point each CRT had (or, for the two that don’t have a known white point, what white point it most likely had). As far as I can tell, they all appear to be trying to approximate the red/yellow/green area according to 1953 NTSC color without accounting for chromatic adaptation. Trying anything else gives nasty-looking results. You can see all this if you click on the line of text that just says “Summary”.
The main point that I was trying to make with that reply was about the timeline. The standards being published in 1987 and 1994 didn’t immediately affect what the CRTs were doing, and the beginning of HDTV in 1998 (?) may or may not have caused many CRTs to switch to D65. The CRTs’ behaviors aren’t new either; they’re based on papers from 1966 and 1975, sometimes even with the same nonstandard white balance as in the 1966 paper.
I can post the Excel spreadsheets, Desmos graphs, and a few White Trash tests sometime else, but not now.
Something to keep in mind.
A professional CRT monitor loses its calibration after 5 months. A TV that is more than 25 years old may lose it monthly.
Old components, swollen capacitors, broken solder joints, worn cables, even a slightly burnt tube, can change the result.
I did see the expanded summaries, but it wasn’t clear exactly what data you were working with. This post really makes it clear; thank you for that. What do you mean by ‘chromatic adaptation’?
I think you’re putting too much emphasis on the exact year for standards. The standards weren’t something that applied to the CRT manufacturers; they were for broadcasters and video masterers. You shouldn’t think of the year a standard came out as a cutoff or starting point for some kind of technology. That’s just not how it worked. For example, D65 started being used in Japan before their own version of HDTV. You mention HDTV as being in 1998, but that’s just in America. Japan introduced HDTV in the late 80s, and it specified a D65 white point. So the momentum in Japan to move from D93 to D65 was in place long before 1998.
An old document I was able to find with the Japanese primaries; the translation is “JAPAN Specific Phosphor”. The document also clearly states that it’s D93, that it’s slightly different from the SMPTE-C used in the USA, and that NTSC 1953 was never used.
@DariusG I used those Japanese phosphor coordinates and the D93 white point in Scanline Classic to create some screenshots. The S-Video shader I made allows you to cut out the chroma and just look at grayscale. I decided to compare it to D65 US NTSC grayscale as well as the grayscale made by a P4 phosphor. Old black and white TVs used this P4 phosphor which has a bluish white color. The extra sharpness of the P4 screenshot is intended because those displays were sharper. The S-Video shader is beta and may have errors, but I think it’s more or less correct for what I’m demonstrating here.
Imagine the 60s, when most programs were still in black and white, especially the ‘serious’ programs like Perry Mason. Comparing D65 with P4 side by side would make the D65 set look dull. Imagine wanting to sell a color TV: your customer asks ‘will it work with B&W?’ ‘Of course!’ you say, and switch over to a station playing a B&W program. You’d want it to look as close to the B&W displays as possible, right? So as to not scare off the customer? We can see that D93 is closer to the P4 characteristic.
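If you want to put a number on ‘closer’, you can compare distances in the CIE u’v’ diagram, something like this (the P4 chromaticity here is only a placeholder; swap in a data-sheet or measured value):

```python
# D65 and the usual Japanese 9300K+27MPCD white are textbook values; the P4 entry is
# only a placeholder, so substitute the chromaticity from a phosphor data sheet or a
# measurement before trusting the comparison.
def uv_prime(x, y):
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

def uv_dist(a, b):
    (u1, v1), (u2, v2) = uv_prime(*a), uv_prime(*b)
    return ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5

D65 = (0.3127, 0.3290)
D93 = (0.2831, 0.2971)   # 9300K+27MPCD
P4  = (0.27, 0.30)       # PLACEHOLDER, not a trusted value

print("D93 -> P4:", uv_dist(D93, P4))
print("D65 -> P4:", uv_dist(D65, P4))
```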
When publishers and artists started doing work on computers, they preferred the warmer colors. The (monochrome) Macintosh was designed with publishing in mind and used something called a ‘paper white’ phosphor. I haven’t been able to determine the chromaticity, but it may be D65, D50, or something else. Video colorists may have preferred working long hours with the less straining D65 white point as well.
The CRTs have a different white balance than they presume the input signal to have. According to what I’ve been hearing elsewhere, whenever you’re forced to move white from one position to another, the “proper” thing to do is a chromatic adaptation transform, which is done by converting into an LMS-like space (such as the Bradford space which is “sharpened”) and adjusting the ratios in that space.
These consumer CRTs, however, don’t care about that. Instead, they re-saturate the red, yellow, and green area back to where they were with the original reference white, at the expense of the rest of the colors.
I have only educated myself through random websites and documents on the internet over the past year or so, but it seems like, while color scientists prefer adapting the LMS cone response ratios, video masterers prefer to keep the entire image unadapted/unmodified, even if their viewing environment’s lighting is very different.
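For reference, the “proper” Bradford adaptation mentioned above boils down to this bit of textbook math (a bare sketch, not anything these TVs do):

```python
# Bare-bones Bradford chromatic adaptation: adjust the ratios in a sharpened LMS-like space.
import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])     # white Y normalized to 1

def bradford_adapt(src_white_xy, dst_white_xy):
    """3x3 matrix taking XYZ relative to the source white to XYZ relative to the target white."""
    lms_src = BRADFORD @ xy_to_XYZ(*src_white_xy)
    lms_dst = BRADFORD @ xy_to_XYZ(*dst_white_xy)
    scale = np.diag(lms_dst / lms_src)                    # rescale the sharpened cone responses
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD

# e.g. adapting D65-referred colors to a 9300K-ish white:
M = bradford_adapt((0.3127, 0.3290), (0.2831, 0.2971))
print(M @ xy_to_XYZ(0.3127, 0.3290))   # the old white lands exactly on the new white's XYZ
```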
I have seen this document too, but I think Grade’s “P22_J_ph” is probably more accurate. If I understand right, what this Japanese document is saying is that professional monitors (not necessarily just in Japan) generally have phosphors near these chromaticity coords. Grade’s P22-J comes from averaging several different CRTs’ chromaticity points. Those different CRTs’ primaries are all conveniently compiled together in this program’s constants: https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h
Meanwhile, the 3 consumer sets I have from Toshiba (1985), RCA (1989), and Panasonic (2000) are closer to the SMPTE C green, and both Toshiba and Panasonic also picked deeper reds. Weirdly, Toshiba picked a blue that’s closer to the NTSC spec, and this is the only brand I’ve seen doing this. Most other brands stay close to the same blue primary across the board.
My completely unproven opinion is that the SMPTE C green helps make yellows deeper (which is important), while the “Japan Specific Phosphor” green and red both are more efficient with less load on the electron guns needed to reach bright colors or the desired white balance. This one is a very far stretch, but copying an opinion from this document about using nonstandard CRT white balances https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 , maybe these phosphors could result in more even current ratios across the 3 electron guns, which means less uneven wear on the guns, less uneven burn-in, less vertical fringing, and better sharpness. I have no proof of this at all, and I have no actual experience in the industry, so please just take this as a random conjecture, not anything serious. But it would make sense if consumer units preferred deeper yellows at the expense of more burn-in and decreased longevity overall.
This document also has some suggestions for why a bluer white balance might be chosen https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 . My opinion, though, is that x=0.281, y=0.311 originally appeared because it was closer to the black-and-white TVs’ white balances, and that it stuck around because people liked the oversaturated blues better and because it didn’t have to drive the red phosphor as hard to reach the desired white balance. Even my modern TV has a “Dynamic Color” setting that does this blue-stretch trick.
I forgot to mention, how would a typical person set up a VCR with a CRT that has a composite input? Wouldn’t they connect the VCR’s RF input into the wall, and plug its composite output into their TV’s composite input? In that case, the correction on composite video would need to be based on 1953 colorimetry.
FWIW, I grew up with B&W TVs, and they definitely had a cool white point. My family also had early Macintoshes, as my mother did desktop publishing for a living, so I have some context there, too.
Some of you may remember that I had pretty tense arguments on color with Nesguy a few years back.
Now I never liked D65 regardless of its status as a standard. I don’t like it in movies, and I find it particularly jarring when applied to videogames, especially retro stuff, since color depth was limited then and the excessive warmth becomes even more apparent. It looks to me like a dirty, yellow filter. So of course 9300 is the way to go for me when it comes to playing old games.
But besides my personal preference, it’s pretty clear that Japanese TV output from the old days (which is what most people had in their homes, globally) is quite a bit cooler than D65. This is due to the fact that D93 was the broadcast standard white in Japan back then, a fact that has been documented quite extensively. This pisses off D65 zealots, obedient to the film industry, who don’t seem to grasp that their idea of white is not universal and absolute.
So yeah, I firmly believe that Japanese devs and artists made games with that standard in mind, that 9300K is the intended, correct temperature for Japanese games from that era, and since that is perfectly aligned with my personal preferences, I configure all my shader presets accordingly.
Does that mean that D65 is objectively wrong? Absolutely not, and if someone wants to apply it to their games, all the power to them. It is simply not the way that stuff was displayed in Japan in the 80s and 90s. Get over it.
Hey Squalo, been a while, hope you’ve been well. I think we’ve pretty much covered this- to recap:
I’m willing to concede that D93 may have been generally (weakly) intended by Japanese devs, for a certain era of content, but that it in no way “breaks the image” for it to be displayed at a warmer color temp - over half of Nintendo’s customers were in America, after all, where warmer color temps were the standard, and Nintendo would have tested their games on American televisions… It seems most likely to me that they would have tested on a variety of displays to ensure everything stayed within acceptable limits.
My second point is that what was generally intended regarding color temperatures on CRT displays no longer really applies when we’re talking about LCDs… On an LCD, the backlight itself is biased toward 6500K, with the intent of displaying a 6500K temperature. A CRT has a white point that is far more fine-tunable; an LCD has a native white point, while a CRT does not. An LCD at 9300K just isn’t going to look the same as a CRT at 9300K; in fact, they’re vastly different, as I’ve personally confirmed with side-by-side comparisons. The LCD gets desaturated in a way that the CRT doesn’t. At some point we may also need to consider the fact that LCDs are not CRTs, and so we may have to do things slightly differently. Score a point for OLED, since it doesn’t have this drawback.
@Nesguy hello my friend, a while indeed! I too hope you are fine. I thought our color fight was over; we did trade some heavy blows back then, didn’t we? Haha.
Regarding your first point, the tricky part is that I’m pretty sure those TVs were manufactured and exported under the very same cooler standard. Think about the big CRT brands and where they’re from. So yeah, of course you could calibrate to D65 or any other profile, but they were inherently D93 devices made for a D93 society.
The NES was pretty peculiar with regards to color with its funky different palettes, but let’s take arcades, which in best case scenarios were the pro-spec, image quality summit in videogames back then (besides P-BVMs, that is). Do you think US versions of japanese machines looked different from the original ones? The roms were not color-corrected (more like color-wronged lol), that’s for sure, and I seriously doubt the screens themselves were modified either. As far as I’m concerned, Japan made games for their D93 universe and didn’t give a flying F about what we gaijin did to their beautiful 9300K white balance as long as we bought them. If you grew up playing Japanese games and you didn’t have a nerdy dad that D65’d your TVs at home, you probably grew up in glorious D93! And going full circle, I think that’s actually one of the reasons why I don’t like D65. I strongly associate a cooler picture with better image quality, and I spent a LOT of time in arcades as a kid. There could be connections between the two.
As for the second point, yes, of course the technologies are different and their color behaviour can’t be compared apples to apples; we definitely agree on that. As accurate as CRT emulation is getting (which is extremely impressive), it will always be discrete vs continuous.
RCA, Zenith, General Electric (GE), Admiral, and Magnavox TVs were all manufactured in the USA and going strong in the 1980s, though beginning to decline. I get your point, though.
It raises the question of whether this was a deliberate artistic choice rather than simply the way things were done. That’s why I think at most we can say D93 is generally intended, but not image-breaking (in the way viewing a movie at D93 is).
If you pull up the 1990 Sony Trinitron “W0006311M.pdf” manual, for displays which included Sony’s “Trinitone” system that adjusted the color temperature, it specifies that the bluer (presumably D93) setting was the factory default at that time, even in the US.
This presumably applied to all Trinitrons from at least the 1984 XBR line, until either 1992 when the early HD MUSE displays launched, supposedly with a D65 factory default, or c. 1996 when they started including “game” modes that automatically switched to D65.
That said, the Trinitone function also meant you didn’t need a nerdy dad to calibrate for D65. It was just the press of a button on the remote.