"Correct" Color: Some Evidence Related to the NES

There probably is a warm shift to the photo, but that’s not enough to account for the colors we’re seeing on the monitor.

There’s no way you get that color yellow background and that color blue for the logo with a cool color temp.

That doesn’t necessarily mean that they were targeting the standards that are popularly assumed to be what they were following.

I’d love to see some direct quotes or solid evidence regarding color temperatures. The connection between broadcasting standards in place at the time and whatever pixel artists were doing always seemed a bit tenuous to me.

I was speculating that one of the reasons for the shift to a blue sky in later iterations of SMB could be that they were using photographs of CRTs as a reference: every time I try to take a photo of that purple sky, it always comes out blue. That and just, you know, the sky is blue.

1 Like

O_O

If you didn’t understand the above, it’s going to be too complicated to explain why, but…

The photo has flash, goodbye TV colors.

The tones in the rest of the image aren’t skewed nearly enough to account for the kind of correction it would take to turn cool colors on a monitor into the warm colors we’re seeing. Whatever shift the camera is introducing, that image is warm to start with.

The purple sky quote is even more damning, though: it’s purple with a warm color temp.

It’s kinda moot, though: as I’ve acknowledged, we have multiple “official” palettes now, so pick whatever you like. I like the original colors :smiley:

Monitors are not calibrated for temperature. NTSC-J has a standard of 9300K. And the TVs shown in the photo are not professional models; one is a Trinitron and the other is a Sharp.

Could it be that they are not additive colors, like a TV screen’s?

This isn’t about me; I’m trying to explain why that image isn’t a good color reference.

But I think you’re overcomplicating things. That image is from the production stage; they could have put it there and then changed it.

2 Likes

So you’re assuming 9300k was the standard. I know this is retro gaming gospel among some, but show me the solid evidence. Just because this was a broadcasting standard doesn’t mean it was in use by the pixel artists. It’s a huge leap without much to support it. A single quote from a developer would be great. I haven’t been able to find anything.

Meanwhile, we have the purple color quote (you don’t get that color with a cool temp), and the photo can be considered corroborating evidence of warm temperature usage.

I can see the shifts happening to the monitor but I think it’s a stretch, and too convenient, to claim that we started with 9300k and the camera somehow made it 6500k. :smiley:

Anyway, I don’t think retro gaming is a very color-sensitive thing in the way movies are.

1 Like

What’s even messier is trying to judge these things using photos. Have you ever gotten the white balance to match exactly what you’re seeing on the screen when trying to take a photo or make a video recording of CRTs or CRT shaders running on a modern display?

That alone skews things at least slightly.

True, and that’s why it has to be done professionally, with post-processing adjustments so the image shows the same colors as the CRT does to the eye. It might also be a good idea to include two images: one without any post-processing and one with.
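
For what it’s worth, the per-channel correction being described amounts to something like this minimal sketch (the “measured white” numbers are made up; the idea is that you photograph the CRT showing a full-white field and use that patch as the reference):

```python
import numpy as np

def white_balance(photo, measured_white, target_white=(1.0, 1.0, 1.0)):
    """Per-channel gain: scale R, G, B so the patch photographed off the
    CRT's full-white field comes out neutral."""
    gains = np.asarray(target_white, dtype=float) / np.asarray(measured_white, dtype=float)
    return np.clip(photo * gains, 0.0, 1.0)

# Made-up numbers: the camera recorded the CRT's white field as a slightly
# warm (240, 225, 205) instead of neutral.
measured = np.array([240, 225, 205]) / 255.0

photo = np.random.rand(4, 4, 3)              # stand-in for the captured frame, values in [0, 1]
corrected = white_balance(photo, measured)   # the "with post-processing" version
```

Real calibration is obviously more involved than a single gain per channel, but even this much shows how far a camera’s white balance can move things.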

1 Like

IMO they tested for three major markets, that’s Japan, the USA and the rest (European countries, Australia…). So it might depend on which game version is used, since the cartridges contained different “content”.

3 Likes

A camera shot is not a good reference, for sure. That photo could have been taken from a magazine that did extra processing etc. that altered the colors. You’d only capture what’s really going on with a real Japanese NES, a Japanese cartridge and pro equipment (that won’t alter the colors again).

2 Likes

If anything the shift would be toward blue, because that’s what always happens when you try to photograph CRTs. It’s very rare that I’ve seen CRTs appear warmer in a photo compared to IRL.

I think a lot of people are just very invested in the idea that “9300k is correct,” with no hard evidence to support this. It’s all pure speculation among retro gamers. It was a broadcasting standard and it was how games happened to be displayed in that region. Whether it was part of an intentional look is highly speculative.

The OP was meant to be somewhat tongue in cheek, guess that didn’t come across.

I’m still very confident that that’s a warm color temp on that monitor. Starting with a cool color temp and warm-shifting the photo simply will NOT produce that color blue for the logo, period. I’ll die on this hill.

Have fun, do what feels right.

2 Likes

In all the posts you open asking questions, when people answer you, you just say, “I don’t know, I don’t think so, I’m not sure, convince me.”

Son, do what we all do and search the internet.

I have no idea what you’re talking about. What other posts are you even referring to? Maybe you’ve misinterpreted or misunderstood something that was said to give you this impression.

And I have searched the internet, extensively; that’s part of the point I’m making. The hard proof that 9300k is part of an intended look simply doesn’t exist. Show me otherwise and I’ll happily eat my words. I know you’re very invested in being right about this 9300k thing but it’s all speculative.

In fact, I never post a question without first Googling the topic, and only if I think it will lead to a fruitful discussion (and it often does). I think there are others who would agree. Do you seriously think composite video is “solved?” (I assume that’s the topic you’re referring to?) There’s currently no clear consensus in the emulation community on how all this stuff works; these are HIGHLY relevant and important conversations to be having.

From what is widely known, NTSC-J used 9300K, and that white point was applied by the Japanese monitor itself; e.g. if you use a Japanese N64 in the USA it will produce the exact same image but at 6500K (the US monitor’s white point). I think NTSC-U has a lower black level too (you’d need to bump up the brightness a bit to match NTSC-J black). That 6500K should really be ~7000K on most mid-to-late-90s TVs.
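
To make that 9300K vs 6500K difference concrete, here’s a rough sketch of what “the same RGB values on a D93-balanced set” works out to, assuming sRGB-ish primaries on both displays and ignoring eye adaptation. The example color is made up, not a measured NES palette entry:

```python
import numpy as np

# CIE xy chromaticities
SRGB_PRIMARIES = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
D65 = (0.3127, 0.3290)
D93 = (0.2831, 0.2971)   # ~9300K on the CIE daylight locus

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white_xy):
    # Columns are the primaries' XYZ, scaled so RGB = (1,1,1) lands on the white point.
    prim = np.array([xy_to_XYZ(x, y) for x, y in primaries]).T
    white = xy_to_XYZ(*white_xy)
    return prim * np.linalg.solve(prim, white)

M_D65 = rgb_to_xyz_matrix(SRGB_PRIMARIES, D65)   # an ordinary sRGB monitor
M_D93 = rgb_to_xyz_matrix(SRGB_PRIMARIES, D93)   # same primaries, balanced to ~9300K

def srgb_to_linear(c):
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    c = np.clip(c, 0.0, 1.0)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1.0 / 2.4) - 0.055)

def simulate_d93(rgb8):
    """Re-encode a color so a D65 screen shows roughly what a D93 set would emit."""
    lin = srgb_to_linear(np.asarray(rgb8, dtype=float) / 255.0)
    xyz = M_D93 @ lin                       # light actually produced by the 9300K set
    lin65 = np.linalg.inv(M_D65) @ xyz      # express that light in D65 sRGB (no adaptation)
    return np.round(linear_to_srgb(lin65) * 255.0)

# Made-up purple-ish sky color, NOT a measured NES palette entry.
print(simulate_d93([0x58, 0x48, 0xB8]))    # comes out noticeably bluer/cooler
```

Every color, whites included, drifts toward blue, which is exactly the “cooler look” being argued about; the black-level difference would be a separate brightness tweak on top of that.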

I’m not disputing that- I’m contesting the idea that the existence of the broadcasting standard proves definitively that it was always rigorously adhered to by all or even most pixel artists. Particularly in the early days which weren’t very “standardized.” I mean just look at the way the NES handles color, it’s wild. They just did whatever worked back then.

Many Japanese games were developed with a global market in mind. Developers may have designed the graphics to look correct on both NTSC-J (9300K) and North American NTSC (6500K) systems. Some may have simply disregarded the color temperature differences, assuming the result would still be acceptable.

There was too much variation, and different TV manufacturers used different primaries; it was a complete mess. I had a Trinitron, and the colors in an Amiga Monkey Island scene that showed some sea waves were purple-ish on the Trinitron and blue-ish on a cheap 20" set from another brand.

2 Likes

I’m inclined to think that because of how wildly different TVs were, the pixel artists kind of did whatever with color. One of my main points is that we have AFAIK literally 0 developer quotes regarding color, the way we have quotes for things like scanlines, crts, composite video, etc.

Just a hint: Android devices use a larger gamut, or at least most of them do. When I run some games, the colors look exactly like those sRGB-to-NTSC-U matrices I was using in shaders on PC, without doing anything in a shader to alter the colors. It seems they use (or at least some do) a gamut that is as wide as those old CRTs and reproduce colors as they should.
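
For anyone wondering what those matrices actually do, here’s a minimal sketch, in Python rather than GLSL for readability: a 3×3 conversion derived from the 1953 NTSC primaries and illuminant C versus sRGB/D65. Whether a given shader also adds chromatic adaptation, or applies the matrix in the other direction, varies, so treat the numbers as illustrative:

```python
import numpy as np

def xy_to_XYZ(x, y):
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white_xy):
    # Columns are the primaries' XYZ, scaled so RGB = (1,1,1) lands on the white point.
    prim = np.array([xy_to_XYZ(x, y) for x, y in primaries]).T
    white = xy_to_XYZ(*white_xy)
    return prim * np.linalg.solve(prim, white)

# sRGB / Rec.709 primaries with D65 white
M_srgb = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                           (0.3127, 0.3290))
# 1953 NTSC primaries with illuminant C white (the usual meaning of "NTSC-U" here)
M_ntsc = rgb_to_xyz_matrix([(0.670, 0.330), (0.210, 0.710), (0.140, 0.080)],
                           (0.3101, 0.3162))

# Treat the game's RGB as NTSC-primary RGB and re-express it for an sRGB display.
ntsc_to_srgb = np.linalg.inv(M_srgb) @ M_ntsc
srgb_to_ntsc = np.linalg.inv(ntsc_to_srgb)     # the opposite direction

linear_pixel = np.array([0.2, 0.4, 0.8])       # hypothetical linearized pixel value
print(np.clip(ntsc_to_srgb @ linear_pixel, 0.0, 1.0))
```

That’s all a “gamut matrix” in a shader is: one matrix multiply per pixel in linear light.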

1 Like

Okay, let’s try it another way…

If 9300K is not used as the white point in Japan, then what is used in Japan?

There is a reason they put this there; that’s the NTSC-J default on a Japanese monitor. There’d be no reason to include that D93 setting if it had no use, as Japan is the only region using it anyway. Those are high-end monitors that cost thousands of dollars, from one of the most reputable brands.

https://crtdatabase.com/crts/sony/sony-pvm-20s1wu

9300k was part of the broadcasting standard - this was never in dispute. The existence of the broadcasting standard does not mean that 9300k is part of the “intended look” for these games.

But now I’m just repeating myself:

I’m contesting the idea that the existence of the broadcasting standard proves definitively that it was always rigorously adhered to by all or even most pixel artists. Particularly in the early days which weren’t very “standardized.” I mean just look at the way the NES handles color, it’s wild. They just did whatever worked back then.

Many Japanese games were developed with a global market in mind. Developers may have designed the graphics to look correct on both NTSC-J (9300K) and North American NTSC (6500K) systems. Some may have simply disregarded the color temperature differences, assuming the result would still be acceptable.

I’m inclined to think that because of how wildly different TVs were, the pixel artists kind of did whatever with color. One of my main points is that we have AFAIK literally 0 developer quotes regarding color, the way we have quotes for things like scanlines, crts, composite video, etc.

Now, are you going to respond to any of those points?

If 9300k was essential for the intended look, do you think we’d have maybe at least one quote from a developer about it? If you can find it, I’ll retract everything. I’d actually be happy if we had some better evidence relating to this, either way.