I believe most TVs were calibrated closer to 9300K in reality. If you do an sRGB->SMPTE-C 9300K conversion on a good sRGB monitor, the colors come out pretty close to how my CRT looks.
sRGB and SMPTE-C are close; a good sRGB monitor gets rid of a lot of the green tint in Super Metroid without doing anything.
CRTs out of Japan were probably not exactly 9300K (not that cool, even though it looks much more pleasant at 9300K), but somewhere between 9300K and 6500K, like 7500K or higher. That doesn’t mean all TVs are like that; some could still look like something else. There is simply too much diversity.
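To make that concrete, the conversion I mean is roughly the following (a minimal numpy sketch, not any particular tool's implementation; it assumes sRGB primaries with D65 on the source side, SMPTE-C primaries with a 9300K-ish white on the target side, and no chromatic adaptation, and the exact "9300K" coordinates vary between sources):

```python
import numpy as np

def rgb_to_xyz(primaries, white):
    """Linear RGB -> XYZ matrix from xy primaries and an xy white point."""
    P = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    W = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])   # white at Y = 1
    return P * np.linalg.solve(P, W)                     # scale each primary column

SRGB   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
SMPTEC = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
D65 = (0.3127, 0.3290)
D93 = (0.2838, 0.2984)   # assumed "9300K" white; x=0.281, y=0.311 is another historical variant

# Absolute (non-adapted) conversion: sRGB/D65 -> XYZ -> SMPTE-C/9300K
M = np.linalg.inv(rgb_to_xyz(SMPTEC, D93)) @ rgb_to_xyz(SRGB, D65)

def srgb_to_smptec_9300(rgb):
    rgb = np.asarray(rgb, dtype=float)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)  # sRGB decode
    out = np.clip(M @ lin, 0.0, 1.0)
    return np.where(out <= 0.0031308, 12.92 * out, 1.055 * out ** (1.0 / 2.4) - 0.055)  # re-encode
```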
@PlainOldPants I see you have a colorimeter. Have you taken ICC profile measurements or HCFR measurements with your colorimeter? If you have that data, we can probably figure out what the TVs are doing.
EDIT: @Nesguy I think everyone needs to get on the same page with standards and what they are for. The standards were never a prescriptive thing. They were more a description of what broadcasters/engineers were already doing. Even the original NTSC was a decision on which color system to use based on competing systems already made by manufacturers, not a prescriptive formulation. Standards would usually only be introduced to solve a problem. BT.1886 took as long as it did because all CRTs pretty much had the same gamma. There was no standard because there didn’t need to be one. Broadcasters could also end up doing something the standards didn’t describe and just never update the standard if no perceived problem arose from it. Lots of standards are technically still in effect in various industries and are ignored without consequence.
So, for example, when SMPTE issued the C standard in 1987, it wasn’t a directive telling all broadcasters to convert to the SMPTE C primaries overnight. It was rather informing broadcasters that they could color correct on SMPTE C phosphor monitors and not need to use any other correction circuitry. The vast majority of broadcasters had already been doing that for years, and that is why SMPTE settled on that standard, but there was still uncertainty about whether the practice was correct, and that’s why they issued the standard. The same thing applies to the standard on encoding video, 170M: the method described there was already being done in practice for VHS, LaserDisc, etc., and the standard was defined to ensure that creators could rely on those established practices for new technologies going forward.
Standards can also be abandoned or ignored. The FCC YIQ system for NTSC was abandoned for YPbPr. There is a good argument to be made that D93 for NTSC-J wasn’t followed (and that D65 wasn’t followed for NTSC-U, plus color temperature inconsistencies in general). Even sRGB considered D50 early on. The warmer temperature was better for matching colors to print (something actually relevant to game developers, who did a lot of print material and scanning), while cooler temperatures were better for matching grayscale to the black-and-white P4 phosphor (something broadcasters and TV manufacturers in the 1950s and 1960s would have cared about). Another example is how game consoles didn’t bother to include setup for NTSC-U until the PS1, etc.
In conclusion, standards are useful, but it’s important to understand the context around them, why they were developed. A lot of this context has been altered by repeated myths and misinformation.
In case you skimmed over this, in my original reply to you, you have to click on the line that says “Summary” to view the full details.
Yes, I’ve used HCFR to take measurements. Copying and pasting from my Nesdev post about the 1985 Toshiba Blackstripe, since I’ve followed this same procedure on the 1989 RCA ColorTrak and 2000 Panasonic as well:
The procedure is like this:
Wait the standard 30 minutes for the CRT to fully warm up.
Sample the phosphors. Now, any sampled color can be separated into a weighted combination of the 3 different phosphors. For good measure, I sampled around the entire RGB triangle and did linear regressions on the triangle edges. Somehow, I messed something up around green, but it should be okay as long as I’m within the vicinity of the correct points. https://www.desmos.com/calculator/rzyeouzpzk
Sample the grayscale. The sweet spot is probably about 85 samples, a perfect interval of 3 between samples from 0 to 255. Now, any weighted combination of the 3 phosphors can be converted into electrical R’G’B’ values. https://www.desmos.com/calculator/kmc4hust5j
Keeping Y and sqrt(I^2 + Q^2) constant, sample a full 360-degree cycle of chroma. Now, the demodulation offsets and gains for R-Y, G-Y, and B-Y can be determined by a sinusoidal regression (sketched below). https://www.desmos.com/calculator/rxauy2wqnl
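Here is roughly what that regression looks like (a sketch with hypothetical variable names, not my actual spreadsheet; the recovered angle is relative to whatever axis the phase sweep was defined against). Each demodulated color-difference output should follow gain * cos(phase - angle), so a linear least-squares fit on a cos/sin basis recovers both numbers:

```python
import numpy as np

def fit_demodulator(phase_deg, response):
    """Fit response ~ gain * cos(phase - angle); return (gain, angle_deg).
    phase_deg: chroma phase of each test color in the sweep.
    response:  the corresponding R-Y (or G-Y, B-Y) value recovered from the
               phosphor decomposition and grayscale steps above."""
    t = np.radians(np.asarray(phase_deg, dtype=float))
    basis = np.column_stack([np.cos(t), np.sin(t)])      # response = a*cos(t) + b*sin(t)
    (a, b), *_ = np.linalg.lstsq(basis, np.asarray(response, dtype=float), rcond=None)
    gain = float(np.hypot(a, b))
    angle_deg = float(np.degrees(np.arctan2(b, a)) % 360.0)
    return gain, angle_deg
```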
That’s all I did for the 1989 RCA and 2000 Panasonic. For the 1985 Toshiba, I could get demodulation settings from the TA7644BP chip’s datasheet, which was close to the result I got from sampling. For the 1997 Sony, I don’t even have that TV, but I could get phosphors and demodulation settings for it from the internet.
In theory, once we have the primaries, the full grayscale (which we treat as an EOTF and OETF) and the nonstandard R-Y/G-Y/B-Y demodulation angles and gains, that should be all we need to emulate the CRT’s colors. The Panasonic and RCA CRTs both had all 3 of those things. The Toshiba and Sony CRTs only had demodulation settings and primaries, because the Toshiba CRT’s grayscale was clearly desync’d very badly, and the Sony CRT’s white balance can’t be found anywhere online.
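To sketch what that emulation boils down to per pixel (hypothetical function and variable names, assuming the demodulation angles are measured from the B-Y axis with R-Y at 90 degrees, and collapsing the measured grayscale to a single power law):

```python
import numpy as np

def emulate_crt(y, by, ry, gains, angles_deg, gamma, rgb_to_xyz):
    """Rough per-pixel model: nonstandard demodulation -> R'G'B' -> power-law
    EOTF -> XYZ. rgb_to_xyz is a 3x3 matrix built from the sampled phosphor
    chromaticities and the chosen white balance."""
    angles = np.radians(np.asarray(angles_deg, dtype=float))   # order: R-Y, G-Y, B-Y
    gains = np.asarray(gains, dtype=float)
    # Each demodulator projects the chroma vector (B-Y, R-Y) onto its own axis, with its own gain.
    diffs = gains * (by * np.cos(angles) + ry * np.sin(angles))
    rgb_prime = np.clip(y + diffs, 0.0, 1.0)                   # nonlinear R'G'B'
    return rgb_to_xyz @ (rgb_prime ** gamma)                   # display-referred XYZ
```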
Once we have that data, we have to figure out what the CRT is doing, and for two of them, we also have to guess-and-check several different white points. This crappy RetroArch shader here called White Trash https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file , which I’ll pretend I didn’t make, can emulate the demodulation offsets/gains along with any primaries using a simple power law for gamma, and can display test patterns to check how close the CRT’s approximation gets to some standard primaries/whitepoint, with or without accounting for chromatic adaptation. The downside is that this shader is crap. Here’s what the output looks like if I use the Toshiba Blackstripe with its white balance set to x=0.281, y=0.311 and with the datasheet’s demodulation settings (unlike the NesDev post, which uses approximate sampled demodulation settings), and try to match against 1953 NTSC with illuminant C, without accounting for chromatic adaptation:
(The bottom-right three triangles are for setting the global hue and saturation. Others are self-explanatory.)
My initial reply to you includes links to the CRTs’ data, and what white point each CRT had (or, for the two that don’t have a known white point, what white point it most likely had). As far as I can tell, they all appear to be trying to approximate the red/yellow/green area according to 1953 NTSC color without accounting for chromatic adaptation. Trying anything else gives nasty-looking results. You can see all this if you click on the line of text that just says “Summary”.
The main point that I was trying to make with that reply was about the timeline. The standards being published in 1987 and 1994 didn’t immediately affect what the CRTs were doing, and the beginning of HDTV in 1998 (?) may or may not have caused many CRTs to switch to D65. The CRTs’ behaviors aren’t new either; they’re based on papers from 1966 and 1975, sometimes even with the same nonstandard white balance as in the 1966 paper.
I can post the Excel spreadsheets, Desmos graphs, and a few White Trash tests some other time, but not now.
Something to keep in mind.
A professional CRT monitor loses its calibration after 5 months. A TV that is more than 25 years old may lose it monthly.
Old components, swollen capacitors, broken solder joints, worn cables, or even a slightly burnt tube can change the result.
I did see the expanded summaries, but it wasn’t clear exactly what data you were working with. This post really makes it clear. Thank you for that. What do you mean by ‘chromatic adaptation’?
I think you’re putting too much emphasis on the exact year for standards. The standards weren’t something that applied to the CRT manufacturers. They were for broadcasters and video masterers. You shouldn’t think of the year a standard came out as a cutoff or starting point for some kind of technology. That’s just not how it worked. Like D65 started being used in Japan before their own version of HDTV. You mention HDTV as being in 1998, but that’s just in America. Japan introduced HDTV in the late 80s, and it specified a D65 white point. So the momentum in Japan to move from D93 to D65 was in place long before 1998.
An old document I was able to find has the Japanese primaries; the translation is "JAPAN Specific Phosphor". The document also clearly states that it’s D93, that it’s slightly different from the SMPTE-C used in the USA, and that NTSC 1953 was never used.
@DariusG I used those Japanese phosphor coordinates and the D93 white point in Scanline Classic to create some screenshots. The S-Video shader I made allows you to cut out the chroma and just look at grayscale. I decided to compare it to D65 US NTSC grayscale as well as the grayscale made by a P4 phosphor. Old black and white TVs used this P4 phosphor which has a bluish white color. The extra sharpness of the P4 screenshot is intended because those displays were sharper. The S-Video shader is beta and may have errors, but I think it’s more or less correct for what I’m demonstrating here.
Imagine the 60s, when most programs were still in black and white, especially the ‘serious’ programs like Perry Mason. Comparing D65 with P4 side by side would make the D65 look dull. Imagine wanting to sell a color TV: your customer asks ‘will it work with B&W?’ ‘Of course!’ you say, and switch over to a station playing a B&W program. You’d want it to look as close to the B&W displays as possible, right? So as to not scare off the customer? We can see that D93 is closer to the P4 characteristic.
When publishers and artists started doing work on computers, they preferred the warmer colors. The (monochrome) Macintosh was designed with publishing in mind and used something called a ‘paper white’ phosphor. I haven’t been able to determine the chromaticity, but it may be D65, D50, or something else. Video colorists may have preferred working long hours with the less straining D65 white point as well.
The CRTs have a different white balance than they presume the input signal to have. According to what I’ve been hearing elsewhere, whenever you’re forced to move white from one position to another, the “proper” thing to do is a chromatic adaptation transform, which is done by converting into an LMS-like space (such as the Bradford space which is “sharpened”) and adjusting the ratios in that space.
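For reference, here is a minimal sketch of that kind of transform using the published Bradford matrix (the white points at the bottom are just example values):

```python
import numpy as np

# Published Bradford "sharpened" cone-response matrix (XYZ -> LMS-like space).
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def xy_to_xyz(xy):
    x, y = xy
    return np.array([x / y, 1.0, (1.0 - x - y) / y])      # white point at Y = 1

def bradford_cat(src_white_xy, dst_white_xy):
    """XYZ -> XYZ chromatic adaptation matrix from one white point to another."""
    lms_src = BRADFORD @ xy_to_xyz(src_white_xy)
    lms_dst = BRADFORD @ xy_to_xyz(dst_white_xy)
    scale = np.diag(lms_dst / lms_src)                     # von Kries scaling of the cone ratios
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD

# Example: adapt colors graded against a D93-ish white for display at D65.
M_adapt = bradford_cat((0.2838, 0.2984), (0.3127, 0.3290))
```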
These consumer CRTs, however, don’t care about that. Instead, they re-saturate the red, yellow, and green area back to where they were with the original reference white, at the expense of the rest of the colors.
I have only educated myself through random websites and documents on the internet over the past year or so, but it seems like, while color scientists prefer adapting the LMS cone response ratios, video masterers prefer to keep the entire image unadapted/unmodified, even if their viewing environment’s lighting is very different.
I have seen this document too, but I think Grade’s “P22_J_ph” is probably more accurate. If I understand right, what this Japanese document is saying is that professional monitors (not necessarily just in Japan) generally have phosphors near these chromaticity coords. Grade’s P22-J comes from averaging several different CRTs’ chromaticity points. Those different CRTs’ primaries are all conveniently compiled together in this program’s constants: https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h
Meanwhile, the 3 consumer sets I have from Toshiba (1985), RCA (1989), and Panasonic (2000) are closer to the SMPTE C green, and both Toshiba and Panasonic also picked deeper reds. Weirdly, Toshiba picked a blue that’s closer to the NTSC spec, and this is the only brand I’ve seen doing this. Most other brands stay close to the same blue primary across the board.
My completely unproven opinion is that the SMPTE C green helps make yellows deeper (which is important), while the “Japan Specific Phosphor” green and red both are more efficient with less load on the electron guns needed to reach bright colors or the desired white balance. This one is a very far stretch, but copying an opinion from this document about using nonstandard CRT white balances https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 , maybe these phosphors could result in more even current ratios across the 3 electron guns, which means less uneven wear on the guns, less uneven burn-in, less vertical fringing, and better sharpness. I have no proof of this at all, and I have no actual experience in the industry, so please just take this as a random conjecture, not anything serious. But it would make sense if consumer units preferred deeper yellows at the expense of more burn-in and decreased longevity overall.
This document also has some suggestions for why a bluer white balance might be chosen https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 . My opinion, though, is that x=0.281, y=0.311 originally appeared because it was closer to the black-and-white TVs’ white balances, and that it stuck around because people liked the oversaturated blues better and because it didn’t have to drive the red phosphor as hard to reach the desired white balance. Even my modern TV has a “Dynamic Color” setting that does this blue-stretch trick.
I forgot to mention: how would a typical person set up a VCR with a CRT that has a composite input? Wouldn’t they connect the wall’s RF feed to the VCR’s RF input and plug the VCR’s composite output into their TV’s composite input? In that case, the correction on composite video would need to be based on 1953 colorimetry.
FWIW, I grew up with B&W TVs, and they definitely had a cool white point. My family also had early Macintoshes, as my mother did desktop publishing for a living, so I have some context there, too.
Some of you may remember that I had pretty tense arguments on color with Nesguy a few years back.
Now I never liked D65 regardless of its status as a standard. I don’t like it in movies, and I find it particularly jarring when applied to videogames, especially retro stuff, since color depth was limited then and the excessive warmth becomes even more apparent. It looks to me like a dirty, yellow filter. So of course 9300 is the way to go for me when it comes to playing old games.
But besides my personal preference, it’s pretty clear that Japanese TV output from the old days (which is what most people had in their homes, globally) is quite a bit cooler than D65. This is due to the fact that D93 was the broadcast standard white in Japan back then, a fact that has been documented quite extensively. This pisses off D65 zealots, obedient to the film industry, who don’t seem to grasp the fact that their idea of white is not universal and absolute.
So yeah, I firmly believe that Japanese devs and artists made games with that standard in mind, that 9300K is the intended, correct temperature for Japanese games from that era, and since that is perfectly aligned with my personal preferences, I configure all my shader presets accordingly.
Does that mean that D65 is objectively wrong? Absolutely not, and if someone wants to apply it to their games, all the power to them. It is simply not the way that stuff was displayed in Japan in the 80s and 90s. Get over it
Hey Squalo, been a while, hope you’ve been well. I think we’ve pretty much covered this, so to recap:
I’m willing to concede that D93 may have been generally (weakly) intended by Japanese devs, for a certain era of content, but that it in no way “breaks the image” for it to be displayed at a warmer color temp. Over half of Nintendo’s customers were in America, after all, where warmer color temps were the standard, and Nintendo would have tested their games on American televisions… It seems most likely to me that they would have tested on a variety of displays to ensure everything stayed within acceptable limits.
My second point is that what was generally intended regarding color temperatures on CRT displays no longer really applies when we’re talking about LCDs… On an LCD, the backlight itself is biased toward 6500K, with the intent of displaying a 6500K temperature. A CRT has a white point that is far more fine-tunable. An LCD has a native white point, while a CRT does not. An LCD at 9300K just isn’t going to look the same as a CRT at 9300K; in fact, they’re vastly different, as I’ve personally confirmed with side-by-side comparisons. The LCD gets desaturated in a way that the CRT doesn’t. At some point we may also need to consider the fact that LCDs are not CRTs, and so we may have to do things slightly differently. Score a point for OLED, since it doesn’t have this drawback.
@Nesguy hello my friend, a while indeed! I too hope you are fine. I thought that our color fight was over; we did trade some heavy blows back then, didn’t we, haha.
Regarding your first point, the tricky part is that I’m pretty sure those TVs were manufactured and exported under the very same cooler standard. Think about the big CRT brands: where are they from? So yeah, of course you could calibrate to D65 or any other profile, but they were inherently D93 devices made for a D93 society.
The NES was pretty peculiar with regard to color, with its funky different palettes, but let’s take arcades, which in the best-case scenarios were the pro-spec, image-quality summit in videogames back then (besides P-BVMs, that is). Do you think US versions of Japanese machines looked different from the original ones? The ROMs were not color-corrected (more like color-wronged lol), that’s for sure, and I seriously doubt the screens themselves were modified either. As far as I’m concerned, Japan made games for their D93 universe and didn’t give a flying F about what we gaijin did to their beautiful 9300K white balance as long as we bought them. If you grew up playing Japanese games and you didn’t have a nerdy dad that D65’d your TVs at home, you probably grew up in glorious D93! And going full circle, I think that’s actually one of the reasons why I don’t like D65. I strongly associate a cooler picture with better image quality, and I spent a LOT of time in arcades as a kid. There could be a connection between the two.
As for the second point, yes, of course the technologies are different and color behaviour can’t be compared apples to apples; we definitely agree on that. As accurate as CRT emulation is getting (which is extremely impressive), it will always be discrete vs. continuous.
RCA, Zenith, General Electric (GE), Admiral, and Magnavox all manufactured TVs in the USA and were going strong in the 1980s, though beginning to decline. I get your point, though.
It raises the question of whether this was a deliberate artistic choice rather than simply the way things were done. That’s why I think at most we can say D93 is generally intended, but not image-breaking (in the way viewing a movie at D93 is).
If you pull up the 1990 Sony Trinitron “W0006311M.pdf” manual, for displays which included Sony’s “Trinitone” system that adjusted the color temperature, it specifies that the bluer (presumably D93) setting was the factory default at that time, even in the US.
This presumably applied to all Trinitrons from at least the 1984 XBR line, until either 1992 when the early HD MUSE displays launched, supposedly with a D65 factory default, or c. 1996 when they started including “game” modes that automatically switched to D65.
That said, the Trinitone function also meant you didn’t need a nerdy dad to calibrate for D65. It was just the press of a button on the remote.
Yep, that was my point. Trinitrons (like the vast majority of Japanese CRTs from that time, if not all) are D93 devices. And the way I see it, if all the screens were D93, then all the games were made in D93 and meant to be displayed in D93.
@Squalo saying Trinitrons are ‘D93 devices’ isn’t really fair when they can be adjusted to 6500 K. Plenty of models have a setting to control temperature directly as well. Even modern TVs come out of the box blue to look brighter at the store.
It’s better to look at things from the perspective of content creation. We already have plenty of evidence that Japanese media was mastered at D65 because Japanese LaserDiscs don’t look overly warm on D65 displays compared to other regions (some of the most coveted LaserDiscs by film enthusiasts are Japanese and they do not report a bias towards D93). There’s also plenty of conjecture, which probably isn’t even true, saying that the Japanese market preferred the blue D93. But if content is mastered at D93, it won’t look blue. It will just look neutral. You only get a blue boost if you master/broadcast at a lower temperature.
So were the television broadcasters and the video masterers mastering at different white points? I suppose that’s possible, but wouldn’t people notice that and complain about one being too blue or the other being too red?
It has to be one way or the other. Either the masterers were making video in D65 and the Japanese were getting a blue boost on their D93 sets, or the masterers were making video in D93 and the result is neutral.
But of course we have PVMs and BVMs with D93 modes, so certainly they were using these modes for some reason. Perhaps just as a spot check to make sure it didn’t look bad in D93.