Probably a little bit of both?
Yep, that was my point. Trinitrons (like the vast majority of Japanese CRTs from that time, if not all) are D93 devices. And the way I see it, if all the screens were D93, then all the games were made in D93 and meant to be displayed in D93.
@Squalo saying Trinitrons are ‘D93 devices’ isn’t really fair when they can be adjusted to 6500 K. Plenty of models have a setting to control temperature directly as well. Even modern TVs come out of the box blue to look brighter at the store.
It’s better to look at things from the perspective of content creation. We already have plenty of evidence that Japanese media was mastered at D65 because Japanese LaserDiscs don’t look overly warm on D65 displays compared to other regions (some of the most coveted LaserDiscs by film enthusiasts are Japanese and they do not report a bias towards D93). There’s also plenty of conjecture, which probably isn’t even true, saying that the Japanese market preferred the blue D93. But if content is mastered at D93, it won’t look blue. It will just look neutral. You only get a blue boost if you master/broadcast at a lower temperature.
So were the television broadcasters and the video masterers mastering at different white points? I suppose that’s possible, but wouldn’t people notice that and complain about one being too blue or the other being too red?
It has to be one or the other: either the masterers were making video in D65 and the Japanese were getting a blue boost on their D93 sets, or the masterers were making video in D93 and the result is neutral.
But of course we have PVMs and BVMs with D93 modes, so certainly they were using these modes for some reason. Perhaps just as a spot check to make sure it didn’t look bad in D93.
You could use either the VHS RF Out or Composite Out. The RF Out was a passthrough, so you could just use the channel changer on the TV, except on channel 3 or 4, which would come from the Composite In if a signal was present. A lot of people just used RF, and you might prefer it if you had something else on the Composite input (most TVs had only one or two, if any).
By the time most TVs had composite (in America at least), the broadcasters were not supposed to be using the 1953 space anymore.
Weren’t Trinitrons also famous for their red push? As I understand it, this was done intentionally to help keep the image from looking washed out… 
That was primarily a thing later on, during the 00s/Wega era if I remember right?
Red push was definitely a thing, but I think people also conflate it with how red bleeds more than the other primaries against certain backgrounds due to chroma being bandlimited in YUV space. That is not the same thing as red push. The same thing actually happens to blue, but our eyes aren’t very sensitive to blue, so it isn’t noticeable.
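To make the bandlimiting point concrete, here is a rough sketch (Python/NumPy, with an arbitrary filter width and test pattern, not any particular TV’s chroma filter): it encodes a scanline to YUV, low-passes only the chroma, and decodes it back. The chroma of both bars smears the same way; the red fringe is just the one our eyes pick up.

```python
# Rough sketch: encode an RGB scanline to Y/U/V, low-pass only the chroma
# channels to mimic limited composite chroma bandwidth, and decode back.
# Both colour-difference signals smear at the edges; the visibility
# difference between red and blue fringing is perceptual, not in the math.
import numpy as np

def rgb_to_yuv(rgb):
    m = np.array([[ 0.299,  0.587,  0.114],   # Y  (BT.601 luma)
                  [-0.147, -0.289,  0.436],   # U ~ 0.492*(B-Y)
                  [ 0.615, -0.515, -0.100]])  # V ~ 0.877*(R-Y)
    return rgb @ m.T

def yuv_to_rgb(yuv):
    m = np.array([[1.0,  0.000,  1.140],
                  [1.0, -0.395, -0.581],
                  [1.0,  2.032,  0.000]])
    return yuv @ m.T

# One scanline: black, then a red bar, then a blue bar.
line = np.zeros((300, 3))
line[100:150] = [1.0, 0.0, 0.0]   # red bar
line[200:250] = [0.0, 0.0, 1.0]   # blue bar

yuv = rgb_to_yuv(line)
kernel = np.ones(15) / 15          # crude low-pass; width is arbitrary
for c in (1, 2):                   # filter U and V only, leave Y sharp
    yuv[:, c] = np.convolve(yuv[:, c], kernel, mode='same')

decoded = yuv_to_rgb(yuv)
# The transition regions around both bars now contain out-of-place colour.
print("max red spill just before the red bar:  ", decoded[95:100, 0].max())
print("max blue spill just before the blue bar:", decoded[195:200, 2].max())
```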
Isn’t “red push” just the NTSC color fix? It would just cancel out for signals that were based on that colorimetry, but with SMPTE C it oversaturates, pushing yellow toward orange and green toward yellow.
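To illustrate roughly what I mean by that “fix”, here is a short sketch using the textbook chromaticities for NTSC 1953 (white C) and SMPTE C (white D65): it builds the matrix that re-renders 1953-encoded RGB on SMPTE C primaries, then applies it to colors that are already SMPTE C. No chromatic adaptation step, just to keep it short, so treat the numbers as illustrative.

```python
# Sketch of the "NTSC color fix": matrix that renders NTSC-1953-encoded RGB
# on SMPTE C primaries, applied to material that is *already* SMPTE C.
# Primaries/whites are the textbook values; no chromatic adaptation step.
import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white):
    """Columns scaled so RGB = (1, 1, 1) maps exactly to the white point."""
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    S = np.linalg.solve(P, xy_to_XYZ(*white))
    return P * S

NTSC_1953 = rgb_to_xyz_matrix([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)], (0.310, 0.316))
SMPTE_C   = rgb_to_xyz_matrix([(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)], (0.3127, 0.3290))

fix = np.linalg.inv(SMPTE_C) @ NTSC_1953   # 1953 RGB -> SMPTE C RGB

for name, rgb in (('yellow', [1.0, 1.0, 0.0]),
                  ('green',  [0.0, 1.0, 0.0]),
                  ('red',    [1.0, 0.0, 0.0])):
    out = fix @ np.array(rgb)
    print(f"{name:6s} SMPTE C {rgb} decoded as if 1953 -> {np.round(out, 3)}")
```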
It’s true that the red phosphor on Sony Trinitrons is less deep, at x=0.621, y=0.340 (though it might be closer to the SMPTE C red primary on some PVMs), but I don’t think this is far off enough to warrant a “red push” just for SMPTE C or EBU color. That Japanese document that was linked earlier with the “JAPAN Specific Phosphor” said that mixing up the different standard primaries other than NTSC 1953 doesn’t cause any major problems.
I want to point out that the datasheets for Sony’s decoder chips (or “jungle” chips) have a feature called “Dynamic Color” which can be toggled on or off. I don’t have a real chip or anything else to prove this, but I believe that just the nonstandard demodulation offsets and gains stated in the decoder chip datasheets are in themselves the full dynamic color behavior (or at least the US mode is), while the demodulation offsets and gains for non-dynamic color are undocumented. I might buy a CXA decoder chip on eBay and set it up on a breadboard.
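For anyone wondering what “demodulation offsets and gains” actually mean, here is a toy sketch of how a decoder’s per-axis angle and gain turn into an effective (U, V) to color-difference matrix. The “tweaked” numbers are placeholders I made up for illustration, not values from any real CXA datasheet.

```python
# Toy model of an NTSC colour demodulator: each colour-difference output is
# recovered by sampling the chroma phasor (U, V) along an axis at angle phi
# with gain g. The standard axes give the textbook B-Y/R-Y/G-Y decode; the
# "tweaked" numbers below are made-up placeholders, NOT from any datasheet,
# just to show how offsets/gains reshape the decode matrix.
import numpy as np

def decode_matrix(axes):
    """axes: {name: (angle_deg, gain)} -> rows of the (U, V) -> difference matrix."""
    rows = {}
    for name, (deg, gain) in axes.items():
        phi = np.radians(deg)
        rows[name] = gain * np.array([np.cos(phi), np.sin(phi)])  # [U coeff, V coeff]
    return rows

# Standard demodulation (U = 0.492*(B-Y), V = 0.877*(R-Y)).
standard = {
    'R-Y': ( 90.0, 1.14),   # pure V axis
    'B-Y': (  0.0, 2.03),   # pure U axis
    'G-Y': (235.8, 0.70),   # mixes U and V to give about -0.19(B-Y) - 0.51(R-Y)
}

# Hypothetical "tweaked" decoder: R-Y axis rotated and boosted slightly.
tweaked = dict(standard)
tweaked['R-Y'] = (95.0, 1.25)   # placeholder numbers only

for label, axes in (('standard', standard), ('tweaked', tweaked)):
    print(label)
    for name, coeffs in decode_matrix(axes).items():
        print(f"  {name} = {coeffs[0]:+.3f}*U {coeffs[1]:+.3f}*V")
```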
No, the red push was a specific thing on the late model CRTs. Unfortunately the CNET article that defined it is long gone. It was noticeable because other TVs didn’t have it.
It could be controlled somewhat by gain and offset parameters baked into the TV’s NVRAM. Sometimes you could dial it down through the service menu, but not on all TVs.
Supposedly Sony started it and others followed.
But people conflate the red push with any color error involving red. Like if you raise the Color control too high, red is the first color to bloom.
Well, the wording was perhaps not strictly correct, in the sense that the hardware was not locked to one standard or another (although there is the remote possibility that they were designed to work not exclusively but optimally at 9300k; that I do not know). But the point still stands: D93 was more than likely what most game studios worked with and what most people had in their homes. So it’s pretty safe to assume that, at least when it comes to videogames, the type of content that matters the most in the context of RetroArch, a D93 presentation will be more faithful to the artistic intention.
Regarding TVs and their historical and undeniable blue bias in stores (which in many cases goes well beyond 9300K), I have a theory. Brightness is most definitely a factor, sure. But there’s more. Light temperature for commercial settings (around 5000K) is pretty cool relative to living rooms (around 2500K) or movie theaters (in which all the light comes from the screen itself). Your own brain’s white balancer is going to make a D65 screen look hideously yellow in such light. The same content will look good in an incandescent environment or a dark place. Ambient light is a key component when it comes to color perception. Hide yourself in a cave illuminated only by very strong green LEDs. Stay in there for a couple hours. Now run out of that cave. For a few seconds, the entire world will be magenta.
Except this is happening on an LCD which can’t properly display 9300k. The backlight itself is 6500k.
To get it to display 9300k you are filtering out a ton of red and green light, limiting the peak brightness of the display, and since LCDs depend on their brightness to achieve their contrast ratios, you’re also doing bad things to the contrast ratio in the process. The only way an LCD can display 9300k is by limiting its peak performance and making it less efficient. The desaturation with 9300k is very pronounced on LCD, but it’s still a thing even with CRT, hence the Sony red push.
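A back-of-the-envelope sketch of that claim, assuming BT.709/sRGB primaries and the commonly quoted D93 studio white (9300K + 8 MPCD, x = 0.2831, y = 0.2971): to hit a D93 white on a D65-native display you can only attenuate channels, and the script below puts the resulting loss at roughly a quarter of peak white luminance under those assumptions. Real panels will differ.

```python
# Back-of-the-envelope check: shift a display whose native white is D65 to a
# D93 white by attenuating channels, and see what that costs in peak white
# luminance. Assumes BT.709/sRGB primaries and the commonly quoted D93
# studio white (9300K + 8 MPCD, x = 0.2831, y = 0.2971).
import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1 - x - y) / y])

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Columns scaled so RGB = (1, 1, 1) maps exactly to the white point."""
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries_xy])
    S = np.linalg.solve(P, xy_to_XYZ(*white_xy))
    return P * S

BT709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
D65   = (0.3127, 0.3290)
D93   = (0.2831, 0.2971)   # 9300K + 8 MPCD, the usual "Japanese" studio white

M = rgb_to_xyz_matrix(BT709, D65)          # native display model
gains = np.linalg.solve(M, xy_to_XYZ(*D93))
gains /= gains.max()                       # can only attenuate, never boost

white_Y = (M @ gains)[1]                   # Y of the new white; native white = 1.0
print("per-channel gains (R, G, B):", np.round(gains, 3))
print("peak white luminance vs native white: {:.0%}".format(white_Y))
```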
One might also point to the total dearth of artist/developer quotes regarding color temperature…. If it was that critical, someone would have mentioned it by now, right?
This is a great bit of evidence and why I’ve kept referring to it as “the broadcast standard”
Which is one of the reasons why OLED is so amazing.
Well for me it is indeed critical when it comes to achieving CRT-like presentation in the context of vintage gaming. And even though you say you don’t really care, here’s this thread 
And here’s an interesting read too
https://www.reddit.com/r/MAME/comments/1f27z11/d93_colour_correction_luts_for_mame
https://github.com/danmons/colour_matrix_adaptations?tab=readme-ov-file
How an image looks on a screen is dependent on many things. The biggest influences on the colour especially come down to two main features. Firstly, the regional standards that display used, such as various US/NTSC vs EU/PAL vs Japanese standards. The often-debated feature here is whether or not the authors of the content intended to use western-standard D65 white points, or 1990s-Japanese-standard D93. For more on what this means, see the
FAQ_D93.md file.
Unless we find a Japanese developer/artist who was active at the time and is willing to talk about it, we will probably never know for sure. Boiling it down to personal preference is likely the healthiest approach.
Is there a reliable source for this? The D93 white point itself lies well within the sRGB gamut, just to mention it. Or is it more that the general color translation of all relevant colors isn’t reliable or even possible?
It’s more the latter, yeah, but it has to do with the performance of the display. It can display 9300k “properly,” but only by limiting the peak performance of the display, and in ways which are particularly bad for our shader purposes. In short, a 6500k backlight display can only display 9300k by limiting the amounts of red and green light that it naturally produces. It’s literally limiting the range of the subpixels and lowering the peak brightness. It’s difficult to find a source for this other than forum knowledge, since it’s all just based on how the technology works, but I’ll see what I can find. Calibrators might be a good source: they will know what happens to contrast when you try to use 9300k on a 6500k display.
There are LCDs that can be switched natively. Some TVs have two ‘expert’ settings you can save, one could be for D65, the other another temp. I’ve never tried a D93 calibration, not sure how it holds up in linearity vs a D65 cal in general.
Transforming to D93 within sRGB space definitely goes out of gamut. When I developed my current phosphor shader model, I used magenta to test for out of gamut colors. In ‘normal’ content they popped up all over when translating to D93 even though the primaries are the same.
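For anyone who wants to reproduce that, here is a minimal sketch (assuming linear sRGB values, the standard Bradford matrix, and D93 at x = 0.2831, y = 0.2971): it adapts a few test colors from D65 to D93 while keeping the sRGB primaries, and flags anything that lands outside [0, 1].

```python
# Minimal sketch of a D65 -> D93 Bradford adaptation plus an out-of-gamut
# check: adapt linear sRGB colours to a D93 white while keeping the
# sRGB/BT.709 primaries, then look for channel values outside [0, 1].
# D93 is taken as x = 0.2831, y = 0.2971 (9300K + 8 MPCD).
import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],   # linear sRGB -> XYZ (D65)
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])
XYZ_TO_RGB = np.linalg.inv(RGB_TO_XYZ)

def white_XYZ(x, y):
    return np.array([x / y, 1.0, (1 - x - y) / y])

def bradford_adaptation(src_xy, dst_xy):
    src = BRADFORD @ white_XYZ(*src_xy)
    dst = BRADFORD @ white_XYZ(*dst_xy)
    return np.linalg.inv(BRADFORD) @ np.diag(dst / src) @ BRADFORD

adapt = bradford_adaptation((0.3127, 0.3290), (0.2831, 0.2971))  # D65 -> D93

# Full transform in linear sRGB: RGB -> XYZ -> adapt -> back through the
# SAME primaries/white decode, which is what a shader targeting an sRGB
# display ends up doing. A real shader would rescale and clip what is left.
rgb_adapt = XYZ_TO_RGB @ adapt @ RGB_TO_XYZ

tests = {
    'magenta': np.array([1.0, 0.0, 1.0]),
    'red':     np.array([1.0, 0.0, 0.0]),
    'white':   np.array([1.0, 1.0, 1.0]),
}
for name, rgb in tests.items():
    out = rgb_adapt @ rgb
    flag = 'OUT OF GAMUT' if (out < 0).any() or (out > 1).any() else 'ok'
    print(f"{name:8s} -> {np.round(out, 3)}  {flag}")
```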
There isn’t enough evidence to say this. And there are too many open questions. You assume that the artists used TVs for their work but they could have used computer monitors or arcade monitors instead. You also assume that all the TVs were in D93, but TVs come out of the box at all kinds of temperatures, and you assume they didn’t alter the behavior to use a different temperature. That’s too many assumptions.
We have Sony TVs with a ‘game mode’ setting that lowers the temperature. Why would they do this if the intended temperature is 9300 K? The answer doesn’t have to make sense, but the question is enough to cast doubt on D93 being ‘more than likely’. There are too many open questions.
There are indeed. Quoting myself:
It can go beyond the personal-preference point I made earlier: use the setting on your TV that allows for good linearity and gamma. If your TV can handle D65 well, use that; if it can handle D93, then use that. If both give good results, then I guess it does come down to preference. Ideally watch content in a dim environment where the TV can be much brighter than the surround to allow chromatic adaptation to work its magic. Then your eyes will compensate for the white point naturally.
Transformation via Bradford is simple to do, but an adequate LUT could give better results.
The only problem is that some LUT tools use “bradford” for their calculations. 
Does any of this correspond to reality or is ChatGPT drunk again? Maybe @anikom15 knows?
9300 K “Sony look” is mostly a marketing / display mode label
- In the late 1980s–1990s, many Sony Trinitrons had a user-selectable color temperature setting labeled “9300 K” (or “cool” mode) in addition to “6500 K” (“normal”) and sometimes “5500 K” (“warm”).
- 9300 K mode was never truly 9300 K in absolute colorimetry; it was a visually cool setting intended to appear brighter and sharper in stores or under bright ambient lighting.
- On most consumer Trinitrons, measured white point for 9300 K mode was more like 7000–7500 K, not a literal 9300 K.
Factory Defaults Were Warmer
- Sony TVs typically shipped with a default “Standard” or “Normal” mode that was actually warmer than D65 (6500 K).
- This default mode is where the red channel boost (+3–6%) comes in, producing a white point closer to 6000–6200 K.
- This is what gives the classic “Trinitron glow” — slightly warm whites, slightly enhanced reds and yellows.
Why People Associate Sony with 9300 K
- The “9300 K” label was on the remote/menu, so hobbyists often equated it with all Sony TVs.
- Marketing materials for CRTs sometimes touted “9300 K for crisp store-quality whites,” giving the impression that Sony TVs ran extremely cool by default.
- In reality, most consumers never used the 9300 K mode, because the picture looked too blue/cold. Default mode = warmer than D65.
What it Means for Red Channel
- Red push (~+5% gain) existed in default/standard mode (~6200 K).
- Switching to the 9300 K mode:
- Red bias is reduced (to make the picture look cooler)
- Blue channel is boosted
- Whites look “bluish white” rather than warm
- So the red boost and the “9300 K” label are actually opposites: red is stronger in default mode, weaker in the 9300 K mode.
So when people say “Sony TVs are 9300 K,” they’re usually referring to a menu setting — the real factory default mode was warmer (~6200 K) with a red boost, which is why NTSC composite and arcade RGB looked vivid.