Back then, the differences between CRTs were even more vast than between current displays. I remember we gamed on an '80s TV model and used it even into the PS3 era, where fonts would look tiny and unreadable.
I think the importance of getting the correct colors displayed on screen is generally overstated. I used to be very adamant about determining the correct output colors in emulators; however, I have found that the viewing environment has a greater impact on color perception than the screen itself. Therefore, objectively comparing colors requires not only identical hardware, but also a controlled lighting environment.
In my opinion, color temperature does not matter for the whole screen content as long as it's in a reasonable range and the screen is much brighter than any surrounding ambient light. Our eyes adjust for the difference in temperature. If you have a preference for a certain temperature, it's better to set it with your display controls rather than with a shader, in order to preserve dynamic range. I always calibrate to D65.
CRT gammas are pretty much universally fixed to 2.4 because it is an intrinsic part of how the signal is translated into the electron beam. For some reason, LCDs originally had a 2.2 gamma, perhaps because of their poor black levels. As with color temperature, it's better to adjust to the desired gamma on the TV itself rather than in processing, to avoid losing dynamic range. Gamma has a much more tangible effect than color temperature because our eyes don't adapt to changes in it as strongly. Gamma 2.4 is supposed to be correct in a dim room (really dim, like too dim to read a book), 2.2 is better for typical lighting, and 2.6 is for a completely dark room. However, even professional Blu-ray masters still have issues with gamma consistency. From a practical standpoint, ensure you can perceive differences in dark gray at 5% intervals.
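To make that last point concrete, here's a minimal sketch (mine, not from the post) that prints where 5% dark-gray signal steps land in relative luminance under 2.2, 2.4, and 2.6 display gammas; the higher the gamma, the dimmer the lowest steps get, which is why 2.6 only works in a completely dark room:

```python
# Minimal sketch: relative luminance of dark-gray signal steps under different display gammas.
steps = [i / 100 for i in range(0, 26, 5)]          # 0%..25% signal in 5% increments
for gamma in (2.2, 2.4, 2.6):
    lum = [round(v ** gamma * 100, 3) for v in steps]
    print(f"gamma {gamma}: {lum}  (% of peak white)")
```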
For the exact color palettes in systems like the NES, I still don't think it actually matters that much. While I believe the light purple sky in Super Mario Bros. is the correct color, I don't think it matters much if someone prefers it blue; it's rather inconsequential either way. Colors in art don't need to match colors in real life, and the colors of real-life things depend on the lighting environment anyway. The sky is blue on a sunny day but gray on a rainy one, for example. Water is blue under sunlight, but water in a cave might look black, brown, or gray instead. We can reliably measure the signals from the consoles directly with a scope or capture card, and there isn't really much magic between that and the TV that would greatly change those results. For example, FirebrandX's composite direct NES palette looks nearly identical to the official NES Classic palette, as well as to his own "hand-tuned" palette. The "composite direct" approach is, therefore, the best preservation method in my opinion, with shaders allowing users to tweak the colors to their personal preference.
This is what I normally use, but also check out "Magnum," FirebrandX's latest palette. I added a feature request to get it bundled with Nestopia, but we'll see. I think it's an evolution of the accurate palettes, with a bit of artistry to make the image a bit more exciting on digital displays and high-end CRT setups; it's based on a PVM with +20% saturation, I think.
It occurs to me that to get "process accurate" composite video emulation with the NES and "correct color," we would actually need to build the shader for the NES exclusively, starting with its weird internally generated composite video output, and use the RAW palette. Some recent composite video shaders still in development are trying to do this.
If we try to use the composite direct palette AND a composite video shader, are we basically doubling up on composite color artifacting? (I'm not sure about this.) Isn't this the case for basically any palette based on screen captures of composite video? Or, since we established in the other thread that the desaturation/color changes are entirely local and related to chroma bleed, is it entirely safe to combine a composite video shader with a composite video palette?
I always liked the weird, otherworldly quality of the purple color, and felt it suited the setting.
The current NTSC shaders simply make an assumption about the composite video output and show the artifact colors based on that. That's why there are 256px and 320px shaders in RetroArch. We can't recover the composite signal from the source pixels alone; we need more information, so those 256px and 320px shaders operate under different assumptions. My shader attempts to reconstruct the raw output using parameters. An NES emulator could display the raw video output directly as a 682x262 monochrome frame. I'm not sure how Mesen's raw output works or whether it provides sufficient info to fully decode it without assumptions.
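To illustrate why that assumption is needed (this is my own hedged sketch, not how any actual RetroArch NTSC shader is written): turning pixels back into a composite-style waveform requires picking a pixel rate, and the subcarrier phase advanced per pixel differs between the roughly 5.37 MHz "256px" consoles and the roughly 6.71 MHz "320px" consoles, which changes where the artifact colors land.

```python
import numpy as np

# Hypothetical sketch of why an NTSC shader must assume a pixel clock: the subcarrier
# phase each pixel lands on depends on the ratio of pixel rate to Fsc. Setup levels,
# the 33-degree burst offset, sync, etc. are all omitted here.
FSC = 3.579545e6  # NTSC color subcarrier, Hz

def subcarrier_phase_per_pixel(pixel_rate_hz):
    """Degrees of subcarrier phase advanced during one pixel."""
    return 360.0 * FSC / pixel_rate_hz

def encode_line(yiq, pixel_rate_hz, samples_per_pixel=8):
    """Naively modulate one scanline of (Y, I, Q) pixels into a composite-style waveform."""
    y, i, q = (np.repeat(c, samples_per_pixel) for c in np.asarray(yiq, float).T)
    t = np.arange(y.size) / (pixel_rate_hz * samples_per_pixel)
    ph = 2 * np.pi * FSC * t
    return y + i * np.cos(ph) + q * np.sin(ph)

print(subcarrier_phase_per_pixel(5.369318e6))  # ~240 deg/pixel for 256px-class consoles
print(subcarrier_phase_per_pixel(6.711647e6))  # ~192 deg/pixel for 320px-class consoles
sig = encode_line(np.full((256, 3), [0.5, 0.1, -0.05]), 5.369318e6)  # made-up flat test color
```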
But yes, you’re safe combining the composite palette with an NTSC shader. It ends up working exactly the same as with SNES.
Not exactly. All CRTs were inherently power-law gamma, but 2.4 is just the modern shorthand/matching/"compatibility" setting, chosen because that is roughly what the CRT reference monitors used in content production were.
Actual CRTs ranged anywhere from something like 2.35ish to 2.6ish depending on the model and the unit. My FW900 was approx 2.35 when it was in good working order, for example.
I’ve been meaning to post some things here, but time isn’t on my side lately.
Starting with the original issue of the sky in Super Mario Bros.
I just quickly added an approximate row-skew to my NES composite emulation, so here's how the sky looks in it with different color settings. All of these use a row-skew of 2.5 (the purplest option; it's based on a specific PPU revision), with the global hue rotation set to perfectly balance the highest and lowest skews.
These are in order from most purple to least purple.
Plain sRGB D65.
US NTSC standard - NTSC 1953 primaries with illuminant C, approximated in sRGB space using C. Bailey Neal’s method with a rate parameter of 80%:
Japan NTSC standard - NTSC 1953 primaries, D93 (x=0.283, y=0.298) or in other words NTSC-J, approximated in sRGB space using C. Bailey Neal’s method with a rate parameter of 75%:
US region, manufactured August 1989, RCA ColorTrak Remote E13169GM-F02; uses a whitepoint around 7500-8000K and alters the red/yellow/green area to approximate NTSC primaries / illuminant C without accounting for chromatic adaptation; approximate re-creation directly with sRGB primaries:
US region, manufactured September 1985, Toshiba FST Blackstripe CF2005 (similar to the Sony KV-27S22 manufactured in 1997); uses a whitepoint of 9300K+27MPCD (x=0.281, y=0.311) and alters the red/yellow/green area to approximate NTSC primaries / illuminant C without accounting for chromatic adaptation; approximate re-creation directly with sRGB primaries:
I forgot to include the possibility that the display was matched using this famous method here https://pub.smpte.org/latest/rp167/rp0167-1995_stable2004.pdf where you input standard color bars, switch to blue only, and adjust the hue and color settings to get four solid blue bars. This method makes sense for SMPTE C color, but it isn’t necessarily correct for NTSC color. I would have to check, but I believe this generally causes some loss of saturation and some slight hue error, which would then make the sky more purple. Consumers match their TV by eye when watching actual TV shows, while the game developers may have either used this method or just centered all their knobs.
You all can make your own opinions on how purple the “purplish” sky should be.
You can find my measurements of my RCA ColorTrak and Toshiba FST Blackstripe in ChthonVII's program here: https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h Don't forget the nonstandard whitepoint and the nonstandard chroma demodulation offsets and gains, and don't forget to match the red/yellow/green area reasonably well against NTSC with illuminant C.
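For anyone who wants to play with these numbers themselves, here's a plain textbook sketch of the underlying machinery (mine; it is not C. Bailey Neal's approximation with a rate parameter, and not the gamutthingy code): build RGB-to-XYZ matrices from the primary and white chromaticities, then map linear 1953 NTSC / Illuminant C RGB into linear sRGB / D65 with a full Bradford adaptation and simple clipping. The final triplet is just a made-up example value.

```python
import numpy as np

def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
    """Build the 3x3 RGB->XYZ matrix for the given primary and white chromaticities."""
    prim = np.array([[rx / ry,             gx / gy,             bx / by            ],
                     [1.0,                 1.0,                 1.0                ],
                     [(1 - rx - ry) / ry,  (1 - gx - gy) / gy,  (1 - bx - by) / by ]])
    white = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    return prim * np.linalg.solve(prim, white)   # scale columns so RGB=(1,1,1) hits the white point

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def adapt(src_white_xyz, dst_white_xyz):
    """Bradford chromatic adaptation matrix from one white point to another."""
    s, d = BRADFORD @ src_white_xyz, BRADFORD @ dst_white_xyz
    return np.linalg.inv(BRADFORD) @ np.diag(d / s) @ BRADFORD

ntsc53 = rgb_to_xyz_matrix(0.67, 0.33, 0.21, 0.71, 0.14, 0.08, 0.310, 0.316)    # Illuminant C white
srgb   = rgb_to_xyz_matrix(0.64, 0.33, 0.30, 0.60, 0.15, 0.06, 0.3127, 0.3290)  # D65 white

m = np.linalg.inv(srgb) @ adapt(ntsc53 @ [1, 1, 1], srgb @ [1, 1, 1]) @ ntsc53
print(np.clip(m @ [0.42, 0.55, 0.95], 0, 1))   # hypothetical linear "sky" triplet, clipped to sRGB
```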
9300K was also brought up.
There are two different 9300K points: 9300K+8MPCD or D93 (either x=0.2831, y=0.2971 according to https://www.jstage.jst.go.jp/article/itej1978/33/12/33_12_1013/_pdf/-char/ja , or x=0.2838, y=0.2984 according to https://www.jstage.jst.go.jp/article/itej1954/31/11/31_11_883/_article/-char/ja/ ), and 9300K+27MPCD (x=0.281, y=0.311, appearing in https://ieeexplore.ieee.org/document/4179914 ).
Notice how D93 was proposed in 1968 and agreed upon for the Japanese standard in 1972, whereas 9300K+27MPCD predates it, appearing in 1966. I would guess, then, that 9300K+27MPCD is what manufacturers had already agreed on, while D93 was derived from it.
I've posted before that 9300K+27MPCD appears in US TVs from 1985 and 1997 (see the "New CRT shader - Any interest?" thread). I don't have the data on hand right now, but I believe I recall the Sony CXA2025AS's JP mode looking better with a 9300K+27MPCD whitepoint (instead of 9300K+8MPCD) as well. I believe I remember hearing about both 9300K+27MPCD and 9300K+8MPCD appearing in US computer monitors.
I’m going to leave off here for now. I’ll post more info later.
Why are you converting to the 1953 NTSC primaries and Illuminant C? Those were replaced by the SMPTE C primaries and D65 long before the NES. While it's true that the 1953 standard was technically never abandoned, CRT manufacturers didn't actually use phosphors matching those primaries. P22 phosphors (correlating with SMPTE C) were almost universally used by Japanese and American manufacturers, and European manufacturers used EBU phosphors. The only other common phosphors for color TVs were used in projectors, and they were meant to look similar to P22. sRGB is a compromise between those two standards and didn't factor in the 1953 primaries at all. Had the 1953 primaries been in common use, they certainly would have been a factor in sRGB's development. I don't think the 1953 definition is relevant for emulation.
Have you noticed that the only community researching and seeking solutions for the correct appearance of games is the RetroArch community?
@Nesguy I found definitive proof that games are made with temperature in mind… This is Mario in (1) Japan, (2) the USA in Miami, (3) Europe in spring, and (4) Europe in summer.
Don’t get angry, it’s a joke…

Before SMPTE C, they were all D93. After D65 was adopted as the standard in the US, Japan kept D93 and Europe did whatever it wanted.
Actually, the one in the photo looks to me like a special edition of the C1 for Nintendo, or perhaps one from the same line, because they were very high-end TVs.
How did you convert to NTSC?
Applying a color temperature to the image itself doesn't work, because that's a different thing entirely.
The temperature in an image is used for artistic purposes.
The white point on a console is used to define pure white, and after the black level is applied, it should remain as white as possible.
Correctly defining the white point is vitally important, because if the tone is wrong it can change a scene in a movie.
It is true that there is no pure white on a CRT. And because of D93, it tends to look bluish, but the effect is minimal, the eye adapts easily, and it also depends a lot on the manufacturer.
The Japanese experience the same thing as the rest of the world with emulation. The colors are not the same, nor are the contrasts.
To give an example, on a CRT, black is not completely black; it is the color of the screen when turned off (dark gray), and white is not completely white.
Since the mid-1980s (?), CRTs have not differentiated between NTSC SMPTE C and NTSC System M (Japan) because they are very similar; the gamut and white point are set automatically.
But this is a pure Japanese model, the Sharp SF1. You can see the correct contrast, the color tones are not affected, and the white is slightly blue. (Although it should be noted that the room is RED, and perhaps the photograph compensates with blue.)
We see it as black because the human eye is an impressive camera and adjusts sensitivity without us noticing, but on an LCD screen it looks awful. That’s why we always see corrected images, which obviously aren’t a good reference for color.
Such images definitely aren't a good reference for exact colors under any circumstance, but I am much more bullish than some on the idea that it is possible to get an approximate sense of the color temperature from images like these.
It's not about the precise tint of the white so much as the way the camera captures the highlights, if that makes sense? Higher color temperatures result in a certain glint. It's certainly not an exact science though, and I would love to see more of these displays analyzed with spectrometers or colorimeters while they still function somewhat well.
It simply isn't useful to compare things with photographs. Pictures of my own TV that I take with my iPhone look wildly different from real life all the time and are not consistent. The only time photographs are useful is when the environmental lighting conditions are fixed and the camera is operated in a manual mode.
The effect of the environment is analogous to audio: how we perceive speakers is affected by the room in addition to the speakers themselves. For TVs, the ambient lighting, reflections, temporal response of the screen, etc., all outweigh color temperature in importance. And we already know that 6500K, 9300K, and everything in between were available as options on TVs in the 1980s.
And what would be the reason? I don’t know much about this and I would like to understand why.
Yes, it is, and it may be the best way to create a shader that simulates the real image and add options that simulate 3D effects.
Is that why photos taken with an iPhone always come out differently? Is it in automatic mode?
If I understand correctly, the comment was meant to provide a better reference than the initial idea, but in any case, a professional photographer can handle the situations you mention, or at least that’s what I suspect.
Environmental lighting and camera settings affect the color in the final image.
Additionally, all color CRTs were capable of displaying colors that fall outside of the sRGB gamut, so the picture would have to be WCG to properly match the colors in all circumstances.
It is entirely possible to produce images that could be used as a color reference with the right sort of camera, but it is a thing that has to be done intentionally.
People keep bringing up the year 1987 for this, but don’t forget the year 1994 for SMPTE 170M. SMPTE C (a.k.a. RP 145) is six sentences long and only specifies the CRT phosphor chromaticity values and D65 white balance, without specifying that that’s what the RGB values in a video signal (or in anything) represent. SMPTE 170M, on the other hand, is a specification for the composite signal, which allows encoding directly for SMPTE C primaries without disallowing 1953 primaries, and never mentions illuminant C at all (only ever D65).
The reason why I’m correcting for 1953 NTSC primaries and illuminant C is because I know of four different real US TVs (three of which I measured myself, and one of which I found the data for online) that all appear to be correcting for 1953 primaries and illuminant C. At least, I know for certain that all four of these TVs are doing a color correction. The TVs also happen to be manufactured in years that straddle the different “updates” to video standards. The approximate correction is done only by using nonstandard chroma demodulation angles and gains, with no mechanism to properly account for gamma.
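To show what "nonstandard chroma demodulation angles and gains" means in matrix terms, here's a small sketch of my own. The "textbook" row uses the usual idealized NTSC demodulation axes and gains; the "nonstd" numbers are placeholders I made up, not values from the TA7644BP, the CXA2025AS, or any of the sets measured here.

```python
import numpy as np

# A chroma demodulator with axis angle theta (relative to burst) and gain g outputs
# g * (U*cos(theta) + V*sin(theta)). Using different angles/gains for the R-Y, G-Y and
# B-Y demodulators than the textbook ones bakes a color correction into the decoder.

def demod_matrix(angles_deg, gains):
    """Rows map (U, V) chroma components to the (R-Y, G-Y, B-Y) demodulator outputs."""
    a = np.radians(angles_deg)
    return np.array([[g * np.cos(t), g * np.sin(t)] for t, g in zip(a, gains)])

textbook = demod_matrix([90.0, 236.0, 0.0], [1.14, 0.70, 2.03])   # idealized NTSC axes/gains
nonstd   = demod_matrix([102.0, 241.0, 0.0], [1.00, 0.62, 2.03])  # placeholder "chip" values

# The effective 2x2 correction such a TV applies to (U, V), relative to textbook decoding:
print(np.linalg.pinv(textbook) @ nonstd)
```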
Going through the whole thing in chronological order, because I need to spread the word:
1966 - “An Analysis of the Necessary Decoder Corrections for Color Receiver Operation with Non-Standard Receiver Primaries” by N. W. Parker
This seems to be the original paper describing this method for correcting for NTSC primaries. It works by choosing two chromaticity coordinates (other than white), namely a "flesh tone" at x=0.444, y=0.399 and a "green" at x=0.309, y=0.494, where you want the exact right color, and building the correction around those. The paper also shows how, if the CRT's white balance is at x=0.281, y=0.311, you can still cause those two points to be exactly correct, without taking chromatic adaptation into account. The errors in chromaticity are plotted in this diagram:
1975 - “Computing Colorimetric Errors of a Color Television Display System” by C. Bailey Neal
This paper is published later, showing that Parker’s method is still being used and improved upon in the 1970s, over 20 years after the original 1953 NTSC standard was published.
September 1985 - Toshiba FST Blackstripe CF2005
I got this TV myself this past February. Note how it predates SMPTE C by two years. My original post explaining the measurements and the resulting NES palette is on the NesDev forums here: https://forums.nesdev.org/viewtopic.php?t=26093
- Primaries - Measured via X-Rite i1Display colorimeter. Not sure how precise this is, but visually indistinguishable.
- Demodulation offsets/gains - Taken from the TA7644BP datasheet. The TA7644BP chip is listed in the service manual; I happened to see an eBay listing for the service manual with a picture of the exact page that showed the chip. As you can see in the post, I also used a series of colorimeter samples to approximate the demodulation offsets/gains, and the result came close to the datasheet's values (a sketch of that fitting idea follows below).
- White balance - Unable to obtain from hardware because of its bad condition. Guessed and checked using this piece of shit program that I hate https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file . It took me a long time to finally try both ignoring chromatic adaptation and using the 9300K+27MPCD whitepoint that isn't part of any video standard. I finally decided that this TV must be based on x=0.281, y=0.311, and not doing chromatic adaptation, just as the paper described.
Using this shitty program, I made that same kind of diagram as with N. W. Parker’s paper. The result is similar. Still, this TV’s white is about 95% of the brightness of the outer near-perfect area, which means this might be using some improved method, not exactly the same thing.
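As a tangent from me (this is not PlainOldPants' actual procedure, and every number below is made up for illustration), the "approximate the offsets/gains from colorimeter samples" idea can be sketched as a least-squares fit: show patches of known luma and chroma, measure the linear RGB the set produces, fit the matrix mapping (U, V) to the color-difference outputs, and read the angles and gains off each row.

```python
import numpy as np

def fit_demodulator(uv_in, rgb_measured, y_in):
    """uv_in: (N, 2) chroma inputs; rgb_measured: (N, 3) linear RGB; y_in: (N,) luma."""
    color_diff = rgb_measured - y_in[:, None]            # (R-Y, G-Y, B-Y) per patch
    m, *_ = np.linalg.lstsq(uv_in, color_diff, rcond=None)
    m = m.T                                              # rows: R-Y, G-Y, B-Y demod axes
    gains = np.hypot(m[:, 0], m[:, 1])
    angles = np.degrees(np.arctan2(m[:, 1], m[:, 0])) % 360
    return angles, gains

# Fake "measurements": 6 hues at constant luma, generated with textbook demodulation,
# so the fit should recover roughly (90, 236, 0) degrees and (1.14, 0.70, 2.03).
hues = np.radians(np.arange(0, 360, 60))
uv = 0.2 * np.column_stack([np.cos(hues), np.sin(hues)])
y = np.full(len(uv), 0.5)
ideal = np.array([[0.0, 1.14], [-0.394, -0.581], [2.03, 0.0]])   # textbook (U,V)->(R-Y,G-Y,B-Y)
rgb = y[:, None] + uv @ ideal.T
print(fit_demodulator(uv, rgb, y))
```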
1987 - SMPTE C
As stated before, it’s only 6 sentences long, and while it specifies chromaticity coordinates for a display, it doesn’t specify that the input video signal or anything else is really based on those. My interpretation is that the display is expected to physically have these phosphors/white and do a color correction similar to the consumer TVs.
1989 - RCA ColorTrak Remote E13169GM-F02
I own this CRT myself. Notice how this one is manufactured well after SMPTE C in 1987 and well before SMPTE 170M in 1994.
- Primaries - Sampled via i1Display 2 colorimeter.
- Demodulation offsets/gains - Approximated using a series of colorimeter samples. Because I don’t know what chip is used, I can’t search up a datasheet for the exact values.
- White balance - Sampled via i1Display 2 colorimeter. The grayscale tracking is not perfectly consistent, but it mostly stays around 7500K-8000K, or roughly x=0.301, y=0.308.
Using that same stupid program from before, it turned out to be correcting the red/yellow/green area for 1953 primaries and illuminant C without chromatic adaptation, the same idea as the 1985 Toshiba.
1994 - SMPTE 170M
This one specifies the composite video signal, encoding directly for SMPTE C primaries and D65.
1997 - Sony KV-27S22
I do not own this TV, but I was able to get data about it from the internet.
- Primaries - A bunch of samples of Sony’s primaries can be found here https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h . I picked the official values, from color.org.
- Demodulation offsets/gains - The CRT Database page states that this TV had the CXA2025AS. I took the demodulation offsets/gains from that chip’s data sheet.
- White point - Guessed and checked in the same manner as with the Toshiba CF2005. Turned out to be the same x=0.281, y=0.311 point, and also about 95% as bright as the approximated region.
Judging by the CXA2025AS's datasheet, this couldn't have had a YPbPr component input; at best it may have had S-Video. That means it would need a converter for HDTV.
It is still correcting for 1953 primaries with illuminant C, not for SMPTE C with D65. It’s also still using a white balance of x=0.281, y=0.311 without accounting for chromatic adaptation.
Around 1998 or something, HDTV begins
Wikipedia says this with no good source: "High-definition television (HDTV) in the United States was introduced in 1998 and has since become increasingly popular and dominant in the television market."
2000 - Panasonic CT-36D30B
I own this TV. Note that this is manufactured after HDTV began, and it has a YPbPr component input for that.
- Primaries - Sampled with i1Display 2
- Demodulation - Derived from samples. Sampled three times, once for each of the TV’s color temperature options, and it came out the same (or close to the same) each time.
- White - The on-screen display lets you pick warm, normal, or cool. Warm is closer to D65, while Normal is closer to 9300K. (Cool is somewhere near 14000K.)
I should try messing with this again, but it looks like it’s still approximating the red/yellow/green region based on 1953 primaries and illuminant C, while the CRT’s white balance is set to D65, without accounting for chromatic adaptation.
It makes sense that D65 would be used, because that point is needed for HDTV through the YPbPr component input, which doesn't seem to be doing any color correction, at least not for 1953 primaries.
Feel free to prove me wrong about this. I won’t be surprised to hear if Sony PVMs had their composite video based on SMPTE C / D65, but for now, it looks like consumer TVs weren’t majorly affected by any of these changes, except in D65 being required by HDTV.
As far as I understand: if it's new enough to have glass filters, it should generally give valid enough results outside of a professional/high-stakes environment. (Professional environments require that all colorimeters be regularly profiled against spectrometers regardless.)
But if you mean the Eye-One Display 2, it unfortunately has organic filters, and is old enough that it has surely drifted out of spec by now.
Are you saying that the CRTs presume the input RGB values (after decoding from YIQ) to be in 1953 NTSC (FCC) space and transform the input for the P22 primaries? I can see that being plausible. Although if it’s happening on the composite input (as opposed to RF), that would seem to be erroneous. However, simply reusing the same circuitry for RF on composite would be an easy shortcut for manufacturers, and I have direct experience of needing to adjust the tint control when using RF vs composite on one of the TVs I had. It wasn’t a high quality model.
Maybe it has drifted, but when I take samples from my own display and re-display them on the same display, the result looks almost exactly the same to my eyes, maybe just a tiny bit different if not identical. I bought it (edit: Used) for something around 35 dollars on eBay, so it might not be the best tool.
(Edit) That’s part of what I’m saying. Another thing is that consumer models will move white somewhere else while keeping the area around red, yellow, and green reasonably accurate without adaptation, at the expense of other colors getting dragged towards blue. By extension, I might assume that professional units did the correction without changing white, likely using Neal’s derivation from 1975, or perhaps some other improved derivation. At least, the professional units must have done that during SMB1’s era. Consumer models, however, kept doing this up to the 2000s.
The 1985 and 1989 models are RF-only, and the 1997 and 2000 ones have composite. For the 2000 one, I did take the samples through the composite input.
The 1997 TV uses the CXA2025AS chip, which does not have separate demodulation settings for composite versus RF. It does have a US/JP switch for the demodulation settings, and it does have a “Dynamic Color” feature which isn’t explained in the documentation, but I’m guessing that’s another switch for demodulation settings and white balance combined.
What you’re saying makes sense to me. I assumed in your original screenshots you were expanding the gamut from SMPTE C to 1953 primaries rather than vice versa. It’s not possible to expand gamuts electronically, and the 1953 gamut is wider than the SMPTE C gamut. The primaries are fixed for a given display, but can be electronically ‘shrunk’.
White point is determined simply by changing the strength of the individual electron guns for a given voltage. Theoretically, a display can show any white point within the gamut of its three primaries, so changing the white point of a display is straightforward. In the '90s and 2000s there were actually people you could pay to calibrate your CRT for you via the service menu. Most people didn't do this, but the service existed. They would optimize contrast (called "Picture" back then), black level, color saturation, hue, overscan, white point, and other settings, within the limits of the display. As a consumer, I did it myself using test images. We shared the remote codes to get into the service menu on mailing lists.
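As a quick illustration of that point (my own sketch, using published SMPTE C phosphor chromaticities and the white points discussed in this thread; the resulting numbers are only illustrative), finding the relative gun drives that move a fixed phosphor set from a 9300K+27MPCD white to D65 is just a small linear system:

```python
import numpy as np

def xyz(x, y, Y=1.0):
    """Chromaticity (x, y) plus luminance Y -> XYZ tristimulus."""
    return np.array([Y * x / y, Y, Y * (1 - x - y) / y])

# SMPTE C phosphor chromaticities as matrix columns (R, G, B).
phosphors = np.column_stack([xyz(0.630, 0.340), xyz(0.310, 0.595), xyz(0.155, 0.070)])

old_drive = np.linalg.solve(phosphors, xyz(0.281, 0.311))    # 9300K+27MPCD white
new_drive = np.linalg.solve(phosphors, xyz(0.3127, 0.3290))  # D65 white
print("relative gun gains (R, G, B):", new_drive / old_drive)
```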
TVs have a limited gamut; SMPTE C is smaller than sRGB. If wide-gamut CRTs had existed, they would have cost several thousand (up to 75,000) dollars more. In addition, it is possible that the picture used a P3 or Adobe RGB profile.
I may use the camera’s internal light meter, or I may use an accessory such as these to adjust the white point.
I believe most TVs were actually calibrated closer to 9300K; if you do an sRGB->SMPTE-C conversion at 9300K on a good sRGB monitor, the colors are pretty close to how my CRT looks.
sRGB and SMPTE-C are close; a good sRGB monitor gets rid of a lot of the green tint in Super Metroid without doing anything.
CRTs out of Japan were probably not exactly 9300K (not that cool, though it looks much more pleasant at 9300K), but somewhere in between 9300K and 6500K, like 7500K or more. That doesn't mean all TVs were like that; some could still look like something else. There is simply too much diversity.
@PlainOldPants I see you have a colorimeter. Have you taken ICC profile measurements or HCFR measurements with your colorimeter? If you have that data, we can probably figure out what the TVs are doing.
EDIT: @Nesguy I think everyone needs to get on the same page about standards and what they are for. The standards were never a prescriptive thing; they were more a description of what broadcasters/engineers were already doing. Even the original NTSC was a decision on which color system to use among competing systems already made by manufacturers, not a prescriptive formulation. Standards would usually only be introduced to solve a problem. BT.1886 took as long as it did because all CRTs pretty much had the same gamma; there was no standard because there didn't need to be one. Broadcasters could also end up doing something the standards didn't describe and just never update the standard if no perceived problem arose from it. Lots of standards are technically still in effect in various industries and are ignored without consequence.
So, for example, when SMPTE issued the C standard in 1987, it wasn't a directive telling all broadcasters to convert to the SMPTE C primaries overnight. Rather, it informed broadcasters that they could color correct on SMPTE C phosphor monitors without needing any other correction circuitry. The vast majority of broadcasters had already been doing that for years, and that is why SMPTE settled on that standard; there was still uncertainty about whether the practice was correct, and that's why they issued the standard. The same thing applies to the standard on encoding video, 170M: the method described there was already being done in practice for VHS, LaserDisc, etc., and the standard was defined to ensure creators could rely on those established practices for new technologies going forward.
Standards can also be abandoned or ignored. The FCC YIQ system for NTSC was abandoned in favor of YPbPr. There is a good argument to be made that D93 for NTSC-J wasn't followed (and that D65 wasn't followed for NTSC-U; color temperature was inconsistent in general). Even sRGB considered D50 early on. The warmer temperature was better for matching colors to print (something actually relevant to game developers, who did a lot of print material and scanning), while cooler temperatures were better for matching grayscale to the black-and-white P4 phosphor (something broadcasters and TV manufacturers in the 1950s and 1960s would have cared about). Another example is how game consoles didn't bother to include setup for NTSC-U until the PS1, etc.
In conclusion, standards are useful, but it's important to understand the context around them and why they were developed. A lot of that context has been obscured by repeated myths and misinformation.