"Correct" Color: Some Evidence Related to the NES

Have you noticed that the only community researching and seeking solutions to the correct appearance of games is RetroArch? :blush:

@Nesguy I found definitive proof that games are made with color temperature in mind… This is Mario in (1) Japan, (2) the USA in Miami, (3) Europe in spring, and (4) Europe in summer.


Don’t get angry, it’s a joke… :face_with_hand_over_mouth:

Before SMPTE C, they were all D93. After D65 was adopted as the standard in the US, Japan kept D93 and Europe did whatever it wanted.

Actually, the one in the photo looks to me like a special edition of the C1 for Nintendo, or perhaps one from the same line, because they were very high-end TVs.

How did you convert to NTSC?
Applying a temperature shift to the image doesn’t do the job, because the white point is something different.

The temperature in an image is used for artistic purposes.

The white point on a console defines pure white, which, after the black level is applied, should remain as white as possible.
Defining the white point correctly is vitally important, because an incorrect tone can change the look of a scene in a movie.

It is true that there is no pure white on a CRT. And because of D93 it tends to look bluish, but the effect is minimal, the eye adapts easily, and it also depends a lot on the manufacturer.

The Japanese experience the same thing as the rest of the world with emulation: the colors are not the same, nor are the contrasts.
To give an example, on a CRT black is not completely black; it is the color of the screen when turned off (dark gray), and white is not completely white.

Since the mid-1980s (?), CRTs have not differentiated between NTSC SMPTE C and NTSC System M (Japan) because they are very similar; the gamut and white point are set automatically.

But this is a pure Japanese model, the Sharp SF1. You can see the correct contrast, the color tones are not affected, and the white is slightly blue. (Although it should be noted that the room is RED, and perhaps the photograph compensates with blue.)

We see it as black because the human eye is an impressive camera and adjusts sensitivity without us noticing, but on an LCD screen it looks awful. That’s why we always see corrected images, which obviously aren’t a good reference for color.


Such images definitely aren’t a good reference for exact colors under any circumstance, but I am much more bullish than some on the idea that it is possible to get an approximate idea of the color temperature from images like these.

It’s not about the precise tint of the white so much as the way the camera captures the highlights, if that makes sense? Higher color temperatures produce a certain glint. It’s certainly not an exact science though, and I would love to see more of these displays analyzed with spectrometers or colorimeters while they still function somewhat well.

It simply isn’t useful to compare things with photographs. Pictures of my own TV that I take with my iPhone look wildly different from real life all the time and are not consistent. The only time photographs are useful is when the environmental lighting conditions are fixed and the camera is operated in manual mode.

The effect of the environment is analogous to audio: how we perceive speakers is affected by the room as well as by the speakers themselves. For TVs, the ambient lighting, reflections, temporal response of the screen, etc., all outweigh color temperature in importance. And we already know that 6500K, 9300K, and everything in between were available as options on TVs in the 1980s.


And what would be the reason? I don’t know much about this and I would like to understand why.

Yes, it is, and it may be the best way to create a shader that simulates the real image and to add options that simulate 3D effects.

Is that why photos taken with an iPhone always come out differently? Is it in automatic mode?

If I understand correctly, the comment was meant to provide a better reference than the initial idea, but in any case, a professional photographer can handle the situations you mention, or at least that’s what I suspect.


Environmental lighting and camera settings affect the color in the final image.

Additionally, all color CRTs were capable of displaying colors that fall outside of the sRGB gamut, so the picture would have to be WCG to properly match the colors in all circumstances.
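For anyone who wants to check the gamut claim numerically, here is a small sketch (the function name is my own; the chromaticity values are the standard published ones) that builds a linear RGB→XYZ matrix from primaries plus a white point, then shows that a fully saturated 1953-NTSC green lands outside the sRGB gamut:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a linear RGB -> XYZ matrix from chromaticity coordinates.
    primaries: [(xr, yr), (xg, yg), (xb, yb)]; white: (xw, yw).
    RGB = (1, 1, 1) is made to reproduce the white point at Y = 1."""
    # Unit-luminance XYZ of each primary, as matrix columns.
    cols = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    # Scale each primary so the three of them sum to the white point.
    return cols * np.linalg.solve(cols, white_xyz)

ntsc1953 = rgb_to_xyz_matrix([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)],
                             (0.3101, 0.3162))  # illuminant C
srgb = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
                         (0.3127, 0.3290))      # D65

# Pure 1953-NTSC green expressed in linear sRGB: at least one component
# falls outside [0, 1], i.e. the color is out of the sRGB gamut.
green_in_srgb = np.linalg.inv(srgb) @ ntsc1953 @ np.array([0.0, 1.0, 0.0])
```

The same matrix construction works for any of the measured phosphor sets discussed in this thread, which is why sampling the primaries is the first measurement step.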

It is entirely possible to produce images that could be used as a color reference with the right sort of camera, but it is a thing that has to be done intentionally.

People keep bringing up the year 1987 for this, but don’t forget the year 1994 for SMPTE 170M. SMPTE C (a.k.a. RP 145) is six sentences long and only specifies the CRT phosphor chromaticity values and D65 white balance, without specifying that that’s what the RGB values in a video signal (or in anything) represent. SMPTE 170M, on the other hand, is a specification for the composite signal, which allows encoding directly for SMPTE C primaries without disallowing 1953 primaries, and never mentions illuminant C at all (only ever D65).

The reason why I’m correcting for 1953 NTSC primaries and illuminant C is because I know of four different real US TVs (three of which I measured myself, and one of which I found the data for online) that all appear to be correcting for 1953 primaries and illuminant C. At least, I know for certain that all four of these TVs are doing a color correction. The TVs also happen to be manufactured in years that straddle the different “updates” to video standards. The approximate correction is done only by using nonstandard chroma demodulation angles and gains, with no mechanism to properly account for gamma.
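To make "nonstandard chroma demodulation angles and gains" concrete, here is a minimal sketch of what a demodulator axis does. The specific angle/gain numbers below are purely illustrative and not from any datasheet; the point is that rotating the axes and changing the gains re-stretches the color-difference plane, which is the entire correction mechanism these chips use:

```python
import numpy as np

def demodulate(u, v, axes):
    """Demodulate a chroma phasor (u = B-Y direction, v = R-Y direction)
    into color-difference outputs, one per demodulation axis.
    Each axis is (angle in degrees from the B-Y axis, relative gain)."""
    amp = np.hypot(u, v)
    phase = np.arctan2(v, u)
    return {name: gain * amp * np.cos(phase - np.radians(angle))
            for name, (angle, gain) in axes.items()}

# Textbook axes: R-Y at 90 degrees, B-Y at 0 degrees (gains illustrative).
standard = {"R-Y": (90.0, 1.0), "G-Y": (236.0, 0.35), "B-Y": (0.0, 1.0)}

# A made-up nonstandard set in the style of a jungle-chip datasheet:
# rotated axes and boosted gains shift hues and saturation per-axis.
nonstandard = {"R-Y": (112.0, 1.15), "G-Y": (252.0, 0.45), "B-Y": (0.0, 1.0)}
```

Because this operates on the gamma-compressed composite signal, there is indeed no mechanism here to account for gamma, as noted above.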

Going through the whole thing in chronological order, because I need to spread the word:

Summary

1966 - “An Analysis of the Necessary Decoder Corrections for Color Receiver Operation with Non-Standard Receiver Primaries” by N. W. Parker

This seems to be the original paper describing this method for correcting for NTSC primaries. It works by choosing two chromaticity coordinates (other than white), namely a “flesh tone” x=.444,y=.399 and “green” x=0.309,y=0.494, where you want the exact right color, and building the correction around that. The paper also shows how, if the CRT’s white balance is at x=0.281, y=0.311, you can still cause those two points to be exactly correct, without taking chromatic adaptation into account. The errors in chromaticity are plotted in this diagram:

Screenshot from 2025-10-19 19-33-13

1975 - “Computing Colorimetric Errors of a Color Television Display System” by C. Bailey Neal

This paper is published later, showing that Parker’s method is still being used and improved upon in the 1970s, over 20 years after the original 1953 NTSC standard was published.

September 1985 - Toshiba FST Blackstripe CF2005

I got this TV myself this past February. Note that it predates SMPTE C by 2 years. My original post explaining the measurements and the resulting NES palette is on the NesDev forums here: https://forums.nesdev.org/viewtopic.php?t=26093

  • Primaries - Measured via X-Rite i1Display colorimeter. Not sure how precise this is, but visually indistinguishable.
  • Demodulation offsets/gains - Taken from the TA7644BP datasheet. The TA7644BP chip is in the service manual, and I happened to see an eBay listing for the service manual with a picture of that exact page that showed the chip. As you see in the post, I used a series of samples from the colorimeter to approximate the demodulation offsets/gains as well, and the result came close to the datasheet’s values.
  • White balance - Unable to obtain from hardware because of its bad condition. Guessed and checked using this piece of shit program that I hate https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file . It took me a long time to finally try both ignoring chromatic adaptation and using the 9300K+27MPCD whitepoint that isn’t part of any video standard. I finally decided that this TV must be based on x=0.281, y=0.311, and not doing chromatic adaptation, just as the paper described.

Using this shitty program, I made that same kind of diagram as with N. W. Parker’s paper. The result is similar. Still, this TV’s white is about 95% of the brightness of the outer near-perfect area, which means this might be using some improved method, not exactly the same thing.

1987 - SMPTE C

As stated before, it’s only 6 sentences long, and while it specifies chromaticity coordinates for a display, it doesn’t specify that the input video signal or anything else is really based on those. My interpretation is that the display is expected to physically have these phosphors/white and do a color correction similar to the consumer TVs.

1989 - RCA ColorTrak Remote E13169GM-F02

I own this CRT myself. Notice how this one is manufactured well after SMPTE C in 1987 and well before SMPTE 170M in 1994.

  • Primaries - Sampled via i1Display 2 colorimeter.
  • Demodulation offsets/gains - Approximated using a series of colorimeter samples. Because I don’t know what chip is used, I can’t search up a datasheet for the exact values.
  • White balance - Sampled via i1Display 2 colorimeter. The grayscale is not perfectly in sync, but it mostly stays around 7500K-8000K, or roughly x=0.301, y=0.308.
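As a sanity check on that 7500K-8000K figure, McCamy's well-known cubic approximation converts an xy chromaticity to a correlated color temperature, and it lands right around 7500K for the measured white:

```python
def mccamy_cct(x, y):
    """McCamy's cubic approximation of correlated color temperature (K)
    from CIE 1931 xy; reasonable roughly between 2000K and 12500K."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.301, 0.308)))    # the measured white, ~7500K
print(round(mccamy_cct(0.3127, 0.3290)))  # D65 for comparison, ~6500K
```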

Using that same stupid program from before, it turned out to be correcting the red/yellow/green area for 1953 primaries and illuminant C without chromatic adaptation, the same idea as the 1985 Toshiba.

1994 - SMPTE 170M

This one specifies the composite video signal, encoding directly for SMPTE C primaries and D65.

1997 - Sony KV-27S22

I do not own this TV, but I was able to get data about it from the internet.

  • Primaries - A bunch of samples of Sony’s primaries can be found here https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h . I picked the official values, from color.org.
  • Demodulation offsets/gains - The CRT Database page states that this TV had the CXA2025AS. I took the demodulation offsets/gains from that chip’s data sheet.
  • White point - Guessed and checked in the same manner as with the Toshiba CF2005. Turned out to be the same x=0.281, y=0.311 point, and also about 95% as bright as the approximated region.

Judging by the CXA2025AS’s datasheet, this couldn’t have had a YPbPr component input, but it may have had S-Video at best. That means it would need a converter for HDTV.

It is still correcting for 1953 primaries with illuminant C, not for SMPTE C with D65. It’s also still using a white balance of x=0.281, y=0.311 without accounting for chromatic adaptation.

Around 1998 or something, HDTV begins

Wikipedia says this with no good source: " High-definition television (HDTV) in the United States was introduced in 1998 and has since become increasingly popular and dominant in the television market."

2000 - Panasonic CT-36D30B

I own this TV. Note that this is manufactured after HDTV began, and it has a YPbPr component input for that.

  • Primaries - Sampled with i1Display 2
  • Demodulation - Derived from samples. Sampled three times, once for each of the TV’s color temperature options, and it came out the same (or close to the same) each time.
  • White - The on-screen display lets you pick warm, normal, or cool. Warm is closer to D65, while Normal is closer to 9300K. (Cool is somewhere near 14000K.)

I should try messing with this again, but it looks like it’s still approximating the red/yellow/green region based on 1953 primaries and illuminant C, while the CRT’s white balance is set to D65, without accounting for chromatic adaptation.

It makes sense why D65 would be used, because that point is needed for HDTV through the YPbPr component input, which seems to not be doing any color correction, at least not for 1953 primaries.

Feel free to prove me wrong about this. I won’t be surprised to hear if Sony PVMs had their composite video based on SMPTE C / D65, but for now, it looks like consumer TVs weren’t majorly affected by any of these changes, except in D65 being required by HDTV.

As far as i understand: if it’s new enough to have glass filters, it should generally give valid enough results outside of a professional/high stakes environment. (Professional environments require that all colorimeters be regularly profiled against spectrometers regardless.)

But if you mean the Eye-One Display 2, it unfortunately has organic filters, and is old enough that it has surely drifted out of spec by now.

Are you saying that the CRTs presume the input RGB values (after decoding from YIQ) to be in 1953 NTSC (FCC) space and transform the input for the P22 primaries? I can see that being plausible. Although if it’s happening on the composite input (as opposed to RF), that would seem to be erroneous. However, simply reusing the same circuitry for RF on composite would be an easy shortcut for manufacturers, and I have direct experience of needing to adjust the tint control when using RF vs composite on one of the TVs I had. It wasn’t a high quality model.

Maybe it has drifted, but when I take samples from my own display and re-display them on the same display, the result looks almost exactly the same to my eyes, maybe just a tiny bit different if not identical. I bought it (edit: Used) for something around 35 dollars on eBay, so it might not be the best tool.

(Edit) That’s part of what I’m saying. Another thing is that consumer models will move white somewhere else while keeping the area around red, yellow, and green reasonably accurate without adaptation, at the expense of other colors getting dragged towards blue. By extension, I might assume that professional units did the correction without changing white, likely using Neal’s derivation from 1975, or perhaps some other improved derivation. At least, the professional units must have done that during SMB1’s era. Consumer models, however, kept doing this up to the 2000s.

The 1985 and 1989 models are RF-only, and the 1997 and 2000 ones have composite. For the 2000 one, I did take the samples through the composite input.

The 1997 TV uses the CXA2025AS chip, which does not have separate demodulation settings for composite versus RF. It does have a US/JP switch for the demodulation settings, and it does have a “Dynamic Color” feature which isn’t explained in the documentation, but I’m guessing that’s another switch for demodulation settings and white balance combined.

What you’re saying makes sense to me. I assumed in your original screenshots you were expanding the gamut from SMPTE C to 1953 primaries rather than vice versa. It’s not possible to expand gamuts electronically, and the 1953 gamut is wider than the SMPTE C gamut. The primaries are fixed for a given display, but can be electronically ‘shrunk’.

White point is determined simply by changing the strength of the individual electron guns for a given voltage. Theoretically, a display can show any white point within the gamut of its three primaries, so changing the white point of a display is straightforward. There were actually people you could pay to calibrate your CRT via the service menu in the 90s and 2000s. Most people didn’t do this, but the service existed. They would optimize contrast (called Picture back then), black level, color saturation, hue, overscan, white point, and other settings, within the limits of the display. As a consumer, I did it myself using test images. We shared the remote codes to get into the service menu on mailing lists.

TVs have a limited gamut (SMPTE C is smaller than sRGB); wide-gamut sets, if they had existed, would have cost several thousand (up to 75,000) dollars more. In addition, it is possible such a set would have used a P3 or Adobe RGB profile.

I may use the camera’s internal light meter, or I may use an accessory such as these to adjust the white point.

I believe most TVs were really calibrated closer to 9300K; if you do an sRGB → SMPTE-C 9300K conversion on a good sRGB monitor, the colors are pretty close to how my CRT looks.

sRGB and SMPTE-C are close; a good sRGB monitor gets rid of a lot of the green tint in Super Metroid without doing anything.

CRTs out of Japan were probably not exactly 9300K (not that cool, even though it looks much more pleasant at 9300K), but somewhere between 9300K and 6500K, like 7500K or more. That doesn’t mean all TVs are like that; some could still look like something else, there is simply too much diversity.

@PlainOldPants I see you have a colorimeter. Have you taken ICC profile measurements or HCFR measurements with your colorimeter? If you have that data, we can probably figure out what the TVs are doing.

EDIT: @Nesguy I think everyone needs to get on the same page about standards and what they are for. The standards were never a prescriptive thing; they were more a description of what broadcasters/engineers were already doing. Even the original NTSC was a decision on which color system to use among competing systems already made by manufacturers, not a prescriptive formulation. Standards would usually only be introduced to solve a problem. BT.1886 took as long as it did because all CRTs pretty much had the same gamma; there was no standard because there didn’t need to be one. Broadcasters could also end up doing something the standards didn’t describe and just never update the standard if no perceived problem arose from it. Lots of standards are technically still in effect in various industries and are ignored without consequence.

So, for example, when SMPTE issued the C standard in 1987 it wasn’t a directive telling all broadcasters to convert to the SMPTE C primaries overnight. It was rather informing broadcasters that they could color correct on SMPTE C phosphor monitors and not need to use any other correction circuitry. The vast majority of broadcasters were already doing that for years and that is why SMPTE settled on that standard, but there was still uncertainty about if the practice was correct and that’s why they issued the standard. The same thing applies to the standard on encoding video, 170M, the method described there was already being done in practice for VHS, LaserDisc, etc. and the standard was defined to ensure the creators could rely on those established practices for new technologies going forward.

Standards can also be abandoned or ignored. The FCC YIQ system for NTSC was abandoned for YPbPr. There is a good argument to be made the D93 for NTSC-J wasn’t followed (and D65 not followed for NTSC-U/color temp inconsistencies in general). Even sRGB considered D50 early on. The warmer temperature was better for matching colors to print (something actually relevant to game developers who did a lot of print material and scanning), cooler temperatures were better for matching grayscale to the Black & White P4 phosphor (something broadcasters and TV manufacturers in the 1950s and 1960s would have cared about). Another example is how game consoles didn’t bother to include setup for NTSC-U until the PS1, etc.

In conclusion, standards are useful, but it’s important to understand the context around them, why they were developed. A lot of this context has been altered by repeated myths and misinformation.


In case you skimmed over this, in my original reply to you, you have to click on the line that says “Summary” to view the full details.

Yes, I’ve used HCFR to take measurements. Copying and pasting from my Nesdev post about the 1985 Toshiba Blackstripe, since I’ve followed this same procedure on the 1989 RCA ColorTrak and 2000 Panasonic as well:

The procedure is like this:

  • Wait the standard 30 minutes for the CRT to fully warm up.
  • Sample phosphors. Now, any sampled color can be separated into a multiple of the 3 different phosphors. For good measure, I sampled around the entire RGB triangle and did linear regressions on the triangle edges. Somehow, I messed something up around green, but it should be okay as long as I’m within the vicinity of the correct points. https://www.desmos.com/calculator/rzyeouzpzk
  • Sample the grayscale. The sweet spot is probably about 85 samples, a perfect interval of 3 between samples from 0 to 255. Now, any multiple of the 3 different phosphors can be converted into electrical R’G’B’ values. https://www.desmos.com/calculator/kmc4hust5j
  • Keeping Y and sqrt(I^2 + Q^2) constant, sample a full 360-degree cycle of chroma. Now, the demodulation offsets and gains for R-Y, G-Y, and B-Y can be determined by a sinusoidal regression. https://www.desmos.com/calculator/rxauy2wqnl
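For that last step, the sinusoidal regression can be done with plain linear least squares, since g·cos(p − a) expands into a linear combination of cos p and sin p. A sketch of the idea (my own function and variable names, not taken from the Desmos graphs):

```python
import numpy as np

def fit_demod_axis(phases_deg, samples):
    """Fit samples of one color-difference output, taken over a full
    chroma phase sweep at constant amplitude, to g*cos(phase - angle) + dc.
    Returns (gain, angle_deg, dc). Uses the identity
    g*cos(p - a) = g*cos(a)*cos(p) + g*sin(a)*sin(p)."""
    p = np.radians(phases_deg)
    A = np.column_stack([np.cos(p), np.sin(p), np.ones_like(p)])
    (ca, sa, dc), *_ = np.linalg.lstsq(A, np.asarray(samples), rcond=None)
    return np.hypot(ca, sa), np.degrees(np.arctan2(sa, ca)) % 360.0, dc

# Synthetic check: a made-up axis at 112 degrees with gain 0.83
phases = np.arange(0, 360, 10)
fake = 0.83 * np.cos(np.radians(phases - 112.0)) + 0.02
gain, angle, dc = fit_demod_axis(phases, fake)
```

With noiseless synthetic data the fit recovers the gain, angle, and DC offset exactly; with real colorimeter samples the residuals give a rough sense of measurement noise.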

Samples and linear matrix multiplication are in this excel spreadsheet. It doesn’t display correctly in Google Drive. https://docs.google.com/spreadsheets/d/ … ue&sd=true

I posted the results of that in this issue on gamutthingy in early March. https://github.com/ChthonVII/gamutthing … 2700255641

That’s all I did for the 1989 RCA and 2000 Panasonic. For the 1985 Toshiba, I could get demodulation settings from the TA7644BP chip’s datasheet, which was close to the result I got from sampling. For the 1997 Sony, I don’t even have that TV, but I could get phosphors and demodulation settings for it from the internet.

In theory, once we have the primaries, the full grayscale (which we treat as an EOTF and OETF) and the nonstandard R-Y/G-Y/B-Y demodulation angles and gains, that should be all we need to emulate the CRT’s colors. The Panasonic and RCA CRTs both had all 3 of those things. The Toshiba and Sony CRTs only had demodulation settings and primaries, because the Toshiba CRT’s grayscale was clearly desync’d very badly, and the Sony CRT’s white balance can’t be found anywhere online.
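As an aside on the grayscale step: if all you need is the simple power law that a basic shader uses, it can be fitted from the grayscale samples in a few lines. This is a sketch under the assumption of zero black level; real CRT measurements need an offset term:

```python
import numpy as np

def fit_gamma(levels, luminances):
    """Fit measured grayscale luminance to a pure power law L = (v/255)^g
    by least squares in log space (log L = g * log v through the origin).
    A crude stand-in for a full EOTF; assumes black level is zero."""
    v = np.asarray(levels, dtype=float) / 255.0
    L = np.asarray(luminances, dtype=float)
    L = L / L.max()                  # normalize so white = 1.0
    mask = (v > 0) & (L > 0)         # log() needs positive values
    lv, lL = np.log(v[mask]), np.log(L[mask])
    return np.sum(lv * lL) / np.sum(lv ** 2)

# Synthetic check: data generated with gamma 2.4 should fit back to ~2.4
levels = np.arange(0, 256, 3)
lum = (levels / 255.0) ** 2.4
g = fit_gamma(levels, lum)
```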

Once we have that, we have to figure out what the CRT is doing, and for two of them, we have to guess-and-check several different white points too. This crappy RetroArch shader here called White Trash https://www.mediafire.com/file/n7zwx1f89r2uzvq/white-trash-2025-10-19.slang/file , which I’ll pretend I didn’t make, can emulate the demodulation offsets/gains along with any primaries with a simple power law for gamma, and can display test patterns to check how close the CRT’s approximation gets to some standard primaries/whitepoint, with or without accounting for chromatic adaptation. The downside is that this shader is crap. Here’s what the output looks like if I use the Toshiba Blackstripe with its white balance set to x=0.281,y=0.311 and with the datasheet’s demodulation settings (unlike the NesDev post which uses approximate sampled demodulation settings), and try to match against 1953 NTSC with illuminant C, without accounting for chromatic adaptation:

(The bottom-right three triangles are for setting the global hue and saturation. Others are self-explanatory.)

My initial reply to you includes links to the CRTs’ data, and what white point each CRT had (or, for the two that don’t have a known white point, what white point it most likely had). As far as I can tell, they all appear to be trying to approximate the red/yellow/green area according to 1953 NTSC color without accounting for chromatic adaptation. Trying anything else gives nasty-looking results. You can see all this if you click on the line of text that just says “Summary”.

The main point that I was trying to make with that reply was about the timeline. The standards being published in 1987 and 1994 didn’t immediately affect what the CRTs were doing, and the beginning of HDTV in 1998 (?) may or may not have caused many CRTs to switch to D65. The CRTs’ behaviors aren’t new either; they’re based on papers from 1966 and 1975, sometimes even with the same nonstandard white balance as in the 1966 paper.

I can post the Excel spreadsheets, Desmos graphs, and a few White Trash tests some other time, but not now.

Let me know if I need to clarify something.


Something to keep in mind.
A professional CRT monitor loses its calibration after 5 months. A TV that is more than 25 years old may lose it monthly.
Old components, swollen capacitors, broken solder joints, worn cables, even a slightly burnt tube, can change the result.


I did see the expanded summaries, but it wasn’t clear exactly what data you were working with. This post really makes it clear. Thank you for that. What do you mean by ‘chromatic adaptation’?

I think you’re putting too much emphasis on the exact year for standards. The standards weren’t something that applied to the CRT manufacturers; they were for broadcasters and video masterers. You shouldn’t think of the year a standard came out as a cutoff or starting point for some kind of technology. That’s just not how it worked. For example, D65 started being used in Japan before their own version of HDTV. You mention HDTV as beginning in 1998, but that’s just in America. Japan introduced HDTV in the late 80s, and it specified a D65 white point. So the momentum in Japan to move from D93 to D65 was in place long before 1998.


An old document I was able to find with the Japanese primaries; the translation is “JAPAN Specific Phosphor”. The document also clearly states that it’s D93, that it’s slightly different from the SMPTE-C used in the USA, and that NTSC 1953 was never used.

sRGB

to those primaries D93

source

Page 26


@DariusG I used those Japanese phosphor coordinates and the D93 white point in Scanline Classic to create some screenshots. The S-Video shader I made allows you to cut out the chroma and just look at grayscale. I decided to compare it to D65 US NTSC grayscale as well as the grayscale made by a P4 phosphor. Old black and white TVs used this P4 phosphor which has a bluish white color. The extra sharpness of the P4 screenshot is intended because those displays were sharper. The S-Video shader is beta and may have errors, but I think it’s more or less correct for what I’m demonstrating here.

P4

NTSC-J

NTSC-U

Imagine the 60s, when most programs were still in black and white, especially the ‘serious’ programs like Perry Mason. Compared with P4 side-by-side, D65 would look dull. Imagine wanting to sell a color TV: your customer asks ‘will it work with B&W?’ ‘Of course!’ you say, and switch over to a station playing a B&W program. You’d want it to look as close to the B&W displays as possible, right? So as to not scare off the customer. We can see that D93 is closer to the P4 characteristic.

When publishers and artists started doing work on computers, they preferred the warmer colors. The (monochrome) Macintosh was designed with publishing in mind and used something called a ‘paper white’ phosphor. I haven’t been able to determine its chromaticity, but it may be D65, D50, or something else. Video colorists may have preferred working long hours with the less straining D65 white point as well.


The CRTs have a different white balance than they presume the input signal to have. According to what I’ve been hearing elsewhere, whenever you’re forced to move white from one position to another, the “proper” thing to do is a chromatic adaptation transform, which is done by converting into an LMS-like space (such as the Bradford space which is “sharpened”) and adjusting the ratios in that space.
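To make the "proper" version concrete, here is a minimal sketch of a Bradford chromatic adaptation transform. The matrix is the standard published Bradford matrix; the function names are mine:

```python
import numpy as np

# Bradford "sharpened" cone-response matrix (XYZ -> rho/gamma/beta)
BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def xy_to_xyz(x, y):
    """Chromaticity (x, y) to XYZ at unit luminance (Y = 1)."""
    return np.array([x / y, 1.0, (1 - x - y) / y])

def bradford_adapt(xyz, white_src, white_dst):
    """Von Kries-style adaptation in Bradford space: scale the cone-like
    responses by the ratio of the destination and source whites."""
    lms_src = BRADFORD @ xy_to_xyz(*white_src)
    lms_dst = BRADFORD @ xy_to_xyz(*white_dst)
    scale = np.diag(lms_dst / lms_src)
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD @ xyz

# Example: adapt illuminant C's own white toward x=0.281, y=0.311;
# by construction the source white maps exactly onto the destination white.
adapted = bradford_adapt(xy_to_xyz(0.3101, 0.3162),
                         (0.3101, 0.3162), (0.281, 0.311))
```

The TVs described above skip this entirely, which is exactly the "without accounting for chromatic adaptation" caveat repeated throughout the measurements.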

These consumer CRTs, however, don’t care about that. Instead, they re-saturate the red, yellow, and green area back to where they were with the original reference white, at the expense of the rest of the colors.

I have only educated myself through random websites and documents on the internet over the past year or so, but it seems like, while color scientists prefer adapting the LMS cone response ratios, video masterers prefer to keep the entire image unadapted/unmodified, even if their viewing environment’s lighting is very different.

I have seen this document too, but I think Grade’s “P22_J_ph” is probably more accurate. If I understand right, what this Japanese document is saying is that professional monitors (not necessarily just in Japan) generally have phosphors near these chromaticity coords. Grade’s P22-J comes from averaging several different CRTs’ chromaticity points. Those different CRTs’ primaries are all conveniently compiled together in this program’s constants: https://github.com/ChthonVII/gamutthingy/blob/master/src/constants.h

Meanwhile, the 3 consumer sets I have from Toshiba (1985), RCA (1989), and Panasonic (2000) are closer to the SMPTE C green, and both Toshiba and Panasonic also picked deeper reds. Weirdly, Toshiba picked a blue that’s closer to the NTSC spec, and this is the only brand I’ve seen doing this. Most other brands stay close to the same blue primary across the board.

My completely unproven opinion is that the SMPTE C green helps make yellows deeper (which is important), while the “Japan Specific Phosphor” green and red both are more efficient with less load on the electron guns needed to reach bright colors or the desired white balance. This one is a very far stretch, but copying an opinion from this document about using nonstandard CRT white balances https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 , maybe these phosphors could result in more even current ratios across the 3 electron guns, which means less uneven wear on the guns, less uneven burn-in, less vertical fringing, and better sharpness. I have no proof of this at all, and I have no actual experience in the industry, so please just take this as a random conjecture, not anything serious. But it would make sense if consumer units preferred deeper yellows at the expense of more burn-in and decreased longevity overall.

This document also has some suggestions for why a bluer white balance might be chosen https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/1/1/art00047 . My opinion, though, is that x=0.281, y=0.311 originally appeared because it was closer to the black-and-white TVs’ white balances, and that it stuck around because people liked the oversaturated blues better and because it didn’t have to drive the red phosphor as hard to reach the desired white balance. Even my modern TV has a “Dynamic Color” setting that does this blue-stretch trick.

I forgot to mention: how would a typical person set up a VCR with a CRT that has a composite input? Wouldn’t they connect the antenna/cable feed from the wall to the VCR’s RF input, and plug the VCR’s composite output into the TV’s composite input? In that case, the correction on composite video would need to be based on 1953 colorimetry.


FWIW, I grew up with B&W TVs, and they definitely had a cool white point. My family also had early Macintoshes, as my mother did desktop publishing for a living, so I have some context there, too.

I recall it being more neutral.
