From what I could find, all it does is load the gamma tables that comprise white point and grey balance; it doesn’t do any gamut mapping, because that’s the task of the applications. So yes, you still need to convert sRGB to SMPTE 170M. If the verification of the profile is satisfactory you can use the default SMPTE 170M primaries for the sRGB to SMPTE 170M conversion; otherwise you’d need to use your calibrated display primaries for the transform.
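In case it’s useful, here is a minimal sketch of how that conversion matrix can be derived (my own illustration, not grade’s actual code; it assumes the standard sRGB and SMPTE-C chromaticities and a shared D65 white point):

```python
import numpy as np

def xy_to_XYZ(x, y):
    # chromaticity (x, y) at luminance Y = 1 -> XYZ tristimulus
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz(primaries, white):
    # columns are the unscaled XYZ vectors of the R, G, B primaries
    P = np.column_stack([xy_to_XYZ(x, y) for x, y in primaries])
    # scale each column so RGB = (1, 1, 1) lands on the white point
    S = np.linalg.solve(P, xy_to_XYZ(*white))
    return P * S

D65     = (0.3127, 0.3290)
SRGB    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # SMPTE 170M

# same white point on both sides, so no chromatic adaptation is needed
m = np.linalg.inv(rgb_to_xyz(SMPTE_C, D65)) @ rgb_to_xyz(SRGB, D65)
print(m)  # apply to linear-light sRGB to get linear-light SMPTE 170M
```

For the calibrated-display variant you’d swap the SMPTE_C chromaticities for your measured primaries.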
Good observation. I’ve had some success with using white-red tint to replicate the red push on Sony TVs, and it definitely improves the saturation.
Thanks. I looked through the ColorNavigator guide (this is apparently Eizo’s “DisplayCAL” that comes with their monitors) and it seems a custom gamut can be set in advanced mode. See page 26 of this manual for example:
CG 2730 27" (68 cm) Hardware Calibration LCD Monitor
It says:
Gamut: Specify the color gamut for the monitor. Use the monitor’s color gamut as is, use the standard value, or set the value manually, and enter a value as needed
It’s also interesting to see on page 27 that it apparently supports EBU and SMPTE-C modes out of the box.
Anyway, given that hardware calibration to custom gamut primaries seems possible, would that change things regarding the conversion of sRGB to SMPTE 170M on the shader side?
Edit: if I understood you correctly on the clamping, it’s because the primaries are outside of sRGB, right? So in that sense, even with a custom color gamut calibration stored in a monitor hardware slot/profile, one would still need to set the custom space in the shader to prevent unnecessary clamping from occurring (because the shader thinks it’s in sRGB, while the monitor hardware gamut is SMPTE 170M). Maybe I’m starting to get to grips with it now, or not?
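For what it’s worth, the clamping is easy to see numerically. A quick sketch using the published RGB→XYZ matrices (rounded standard values; not code taken from the shader):

```python
import numpy as np

# published D65 RGB -> XYZ matrices, rounded to 4 decimals
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_SMPTE_C = np.array([[0.3935, 0.3653, 0.1916],
                      [0.2124, 0.7011, 0.0866],
                      [0.0187, 0.1119, 0.9582]])

# express the pure SMPTE-C red primary in linear sRGB
red = np.linalg.inv(M_SRGB) @ M_SMPTE_C @ np.array([1.0, 0.0, 0.0])
print(red)                 # ~[0.939, 0.018, -0.002]: blue goes negative
print(red.clip(0.0, 1.0))  # clamping silently discards that part
```

The error is tiny for SMPTE-C, but the same mechanism bites harder with gamuts that sit further outside sRGB.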
I don’t know what that Gamut setting refers to. It could be as simple as some RGB saturation sliders, or proper matrix transforms, but this is not detailed.
The EIZOs are top monitors when it comes to color management; it might be possible that they apply full gamut mapping to unmanaged apps as well through that software. Just check to confirm whether that’s true.
If the gamut mapping is done through the monitor LUTs, I believe they’d use proper gamut mapping like CIECAM or something of that sort, which is something I haven’t added to the shader. But you would lose some precision, because all the grade operations would be performed in sRGB space.
Great, thanks. So the bottom line seems to be that even if it were possible to have custom gamut mapping through the monitor’s hardware LUTs, you would still want to use a custom XYZ to RGB matrix, so that all of grade’s operations happen within that same custom space for the most accurate results?
So for the last question, does that imply that for the (theoretically) most accurate simulation of a CRT phosphor gamut with the grade shader, the following steps would be needed:
1. Set a preferred CRT phosphor gamut in the shader
2. Calibrate the monitor to that specific CRT phosphor gamut (preferably, if possible, via the monitor hardware LUT)
3. Create a custom XYZ to RGB matrix for that specific CRT phosphor gamut (see the sketch after this list)
4. Replace (for example) the DCI matrix in the shader code with the one created in step 3
5. Set the Display Color Space in the shader to “DCI” (which is now actually the custom matrix from step 3)
6. Enjoy the most accurate possible representation of the CRT phosphor gamut?
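Regarding step 3, a rough sketch of that computation (the chromaticities below are placeholders for whatever phosphor gamut you pick, and I don’t know the exact matrix layout grade expects, so treat it as illustrative):

```python
import numpy as np

def xy_to_XYZ(x, y):
    # chromaticity (x, y) at luminance Y = 1 -> XYZ tristimulus
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

# placeholder chromaticities -- substitute your target phosphor gamut
PHOSPHOR = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
WHITE    = (0.3127, 0.3290)  # D65

P = np.column_stack([xy_to_XYZ(x, y) for x, y in PHOSPHOR])
S = np.linalg.solve(P, xy_to_XYZ(*WHITE))
xyz_to_phosphor = np.linalg.inv(P * S)  # XYZ -> phosphor RGB
print(xyz_to_phosphor)  # candidate replacement for the DCI matrix
```

Depending on how the shader stores its matrices you may need the transpose.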
I’m not at all sure whether the above effort would materially improve visuals compared with sRGB, since if I understood you correctly sRGB is more or less an average of the various CRT phosphor gamuts. So one could also read this as: with sRGB you’re getting the average of many CRTs, whereas with steps 1-6 you’re choosing the look of one specific CRT. The latter is not per se better than the former?
I assume that the monitor LUT box maps sRGB to the display gamut, so you need to output sRGB from the shader, which defeats the purpose of computing everything in grade in your display space.
In my opinion it’s better to have 100% control of the gamut mapping. That means the ideal scenario is to use a calibrated wide-gamut monitor and a custom matrix transform from XYZ to your display gamut, so that all the operations within grade are performed in that wider gamut. This is better than calibrating to SMPTE-C, at least for grade, and since you are expanding (not compressing) you run into fewer mapping issues.
If you insist on SMPTE-C in step 3, you can also use the SMPTE-C matrix, given that your calibration is good enough.
sRGB is a standard, like SMPTE-C. P22 is more the average of consumer CRTs’ color response, which is why I swapped those recently. It’s worth it if you can currently see differences between those gamuts and consider them worthwhile, I guess.
Presumably if one’s display is calibrated to Rec709, bypass is the best option (default settings?)
Yes, sRGB as Display Color Space.
Was wondering if you had any suggestions specifically for the NES; I find that a lower CRT gamma “looks right” for this system, like 2.20 or 2.25. What do you think?
There’s so much good info in this thread, thanks @Dogway, @Rafan, @Syh and @c9f5fdda06 for a great discussion.
Yes, I found that too, as well as using a lower white point temperature, around 8000 K. These are my NES settings; later I do some artistic things with the color mangler though.
g_gamma_in = "2.222000"
g_signal_type = "1.000000"
g_gamma_type = "1.000000"
g_crtgamut = "2.000000"
g_space_out = "-1.000000"
g_hue_degrees = "0.000000"
g_I_SHIFT = "0.000000"
g_Q_SHIFT = "0.000000"
g_I_MUL = "0.707107"
g_Q_MUL = "0.707107"
wp_temperature = "8205.000000"
Are you saying you use color mangler on top of grade, or that you are using the color mangler section of grade?
Yes, that’s it, the color mangler portion of grade. Even if I use a custom matrix transform for my display, I further adjust greens and blues.
How are you guys seeing this? Just by eyeballing it?
What I find odd is that I can’t remember changing TV settings back in the day depending on whether the SNES or the NES was connected… Is changing the gamma per emulated system a shortcoming of the shader? Or is it the way the emulator authors emulate things that makes this per-system gamma adjustment necessary?
Yeah well said.
Big shout out also to @Dogway for his great grade shader (can’t do without it anymore!) and @Nesguy and the others for discussing and finetuning stuff!
Pretty much
I had a large-ish collection of NES games back in the day, for what that’s worth.
Different consoles had different integrated circuits that processed images in different ways, and none of that is actually emulated, so you have to recreate it with shaders. This is a really bad explanation, sorry. Maybe someone with more technical knowledge can give a better answer.
Just a heads up for anyone looking: the shader is now in the official repository, both slang and glsl, in the misc folder ` 0 ´
This is a first impression, but after some tests I think that guest’s LUT shader could be parsing the tone response curve wrongly. I still have to test against official Reshade over RetroArch, which I have never done.
Ok, I took the plunge and tested with Reshade.
Original
Reshade LUT_guest (grade’s implementation is older/different)

BTW never mind, LUT_guest uses a depth of 32 internally; I changed that to 64 and it works fine. Guess I need to update grade’s LUT with the new guest LUT.
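For anyone wondering why the depth parameter matters: it drives the texel addressing into the LUT strip. A rough sketch of the usual layout math (my own approximation, not LUT_guest’s exact code):

```python
def lut_uv(rgb, lut_size):
    # texture coordinate inside a (lut_size^2 x lut_size) 2D LUT strip:
    # blue picks the tile, red/green index within that tile
    r, g, b = (c * (lut_size - 1) for c in rgb)
    tile = round(b)  # nearest tile; real shaders blend two tiles
    u = (tile * lut_size + r + 0.5) / (lut_size * lut_size)
    v = (g + 0.5) / lut_size
    return u, v

# sampling a 64-deep LUT with the size hardcoded to 32 reads the
# wrong texels, which shows up as a skewed tone response
print(lut_uv((1.0, 0.5, 0.25), 64))
print(lut_uv((1.0, 0.5, 0.25), 32))
```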
I updated grade in my repo to sync with updated LUT shader.
It’s much better, but it still doesn’t mimic what Reshade does, which is the correct look.
Source
Reshade
LUT_guest (apparently does a D65 to D50 adaptation; see the sketch below)
LUT_grade (new)
RA Reshade LUT
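On the “D65 to D50” remark: if that’s what’s happening, it would be a chromatic adaptation step. For reference, a sketch of the standard Bradford version (speculative; I haven’t confirmed this is what LUT_guest actually does):

```python
import numpy as np

# Bradford chromatic adaptation, D65 -> D50
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])
D65 = np.array([0.95047, 1.0, 1.08883])  # white point XYZ
D50 = np.array([0.96422, 1.0, 0.82521])

gain = (BRADFORD @ D50) / (BRADFORD @ D65)  # per-cone scaling
adapt = np.linalg.inv(BRADFORD) @ np.diag(gain) @ BRADFORD
print(adapt)  # XYZ (D65-relative) -> XYZ (D50-relative)
```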