Success! Thanks again
Just fired up my PVM with SNES 240p Test Suite via MiSTer and the color bars are not clipped at all (i.e., clear divisions in each column all the way from 0 to F) and the gray ramp has clearly defined divisions all the way across as well. I’m not going to bother taking any pics because it wouldn’t really show anything useful.
Awesome, thanks! This is what I expected to happen.
@Dogway This should be of interest to you.
You made sure to change that alias0 to an alias1 as well, right?
I’m glad that it’s working for you now.
I don’t know how the test was set up. The safest option is to load the test suite on a real SNES and connect via composite; the SNES is what converts the signal to a composite-compliant (and hence SMPTE-C legal levels) signal.
Also, PVMs are very well calibrated from the factory, but you might already know that lol. For the PVM look, use 2.222 SMPTE-C gamma with the corresponding PVM phosphor matrix.
By the way, if you are not using the HALD LUTs, I recommend leaving the LUT16 and LUT64 lines as hunterk recommended before; otherwise you might forget which LUT size is in use and run into errors.
LUT_Size1 = "16.000000"
LUT1_toggle = "0.000000"
LUT_Size2 = "64.000000"
LUT2_toggle = "0.000000"
textures = "SamplerLUT1;SamplerLUT2"
SamplerLUT1 = "..\reshade\shaders\LUT\16.png"
SamplerLUT1_linear = "true"
SamplerLUT1_wrap_mode = "clamp_to_border"
SamplerLUT1_mipmap = "false"
SamplerLUT2 = "..\reshade\shaders\LUT\64.png"
SamplerLUT2_linear = "true"
SamplerLUT2_wrap_mode = "clamp_to_border"
SamplerLUT2_mipmap = "false"
Yeah, I’ll need to get an EverDrive and a generic CRT, then. These are things I’ve been meaning to get anyway. Might be on the backburner for a while, but I’ll let you know when/if it happens.
With regards to the white point for NTSC, there’s some interesting information in an ITU research paper that I found referenced in the Wikipedia page on NTSC. The link is here:
https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.470-6-199811-S!!PDF-E.pdf
Timewise, the ITU research paper may be the pinnacle CRT reference, as it was updated over the course of the 1970s-1990s, with the last update in 1998. Anyway, it caught my attention because of the previous talk by @Nesguy and @Dogway on the white point used for NTSC, debating whether it was 9300K or something lower. Maybe we’ve got our answer right here; first see point 1 in Annex 2 on page 35:
- In 1953, when the NTSC colour television system was adopted for transmission in the United States of America, the colorimetry of the system was based on three specific primary colours and a reference white. The coordinates of the primaries were (given in the CIE 1931 system): Red x=0.67, y=0.33; Green x=0.21, y=0.71; Blue x=0.14, y=0.08. The reference white chosen was standard White C: x=0.310, y=0.316.
Dogway already said something about this NTSC standard not being widespread, I believe.
But then, interestingly, on the white point specifically, point 6 notes:
6 In Japan, the colorimetry of the system is based upon the primary chromaticities and white point given in § 1. Studio monitors are adjusted to a white point of D, 9300 K
Given that the last update of the article was in 1998, it seems that for NTSC-J TVs the white point was “White C”, and “White D” was only used for studio monitors. To me this implies White C is very probably what Japanese developers referenced their games against, as that was what most Japanese had in their homes! (Which family would have had studio monitors in their home??!)
Anyway @Nesguy, maybe this helps with your doubt about the 9300K white point. If I look up White C on Wikipedia, it is actually 6770K, which is only slightly colder than D65!
@Dogway Given the above information, I’m thinking about your comments about calibration.
Suppose we could calibrate a wide-gamut monitor to a custom target for RGB primaries and white point (maybe DisplayCAL can do this). So not sRGB, but something custom like NTSC with white point C, or EBU D65, or any of the other CRT gamuts. I assume that if we could calibrate a monitor to such custom CRT primaries and white point, we would have a de facto CRT gamut on an LCD?!? Would this not be the most accurate solution?
Not sure if this is possible, but if so, what would the grade shader settings be? (I assume just leave it at sRGB, since the color primaries are then not converted in the shader but just output to the monitor, which uses the custom calibrated color primaries?)
Man, I love these papers haha. That’s where most of my time went.
After reading so much, my conclusion was that NTSC-FCC (1953) was never used much. It was very saturated (test CRT gamut #-3); that is the one with White C. So if it wasn’t used much, what was used instead? SMPTE 170M (1987) with white D65. That’s the standard; in practice the white point was cooler on uncalibrated CRTs.
For the Japanese white point, I found a Japanese PDF with the phosphor matrix; when applied, it had a D93 tint, which is why I reverted it in the shader so you are not bound to a certain WP. I was looking for that PDF and couldn’t spot it, but I found this other one from 1988; they used 9300K.
There are two official 9300K white points, it seems:
- Standard CRT “9300K+27 MPCD”: CIE x=0.281, y=0.311*
- Master CRT “9300K+8 MPCD”: CIE x=0.2838, y=0.2984*
- Correlated D93 “9300”: CIE x=0.2848, y=0.2932 (my version: 0.2830637, 0.2970177)
*Computed the CCT: 8941.864K for the first case and 9176.195K for the second.
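If you want to reproduce those CCT figures yourself, a sketch like this using McCamy’s well-known approximation gets close (it’s a curve fit, so it lands near, not exactly on, the numbers quoted above, which were presumably computed with a more exact method):

import math  # not strictly needed; pure arithmetic below

def mccamy_cct(x, y):
    """Approximate correlated colour temperature (K) from CIE 1931
    (x, y) chromaticity, via McCamy's formula."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(mccamy_cct(0.2810, 0.3110))  # Standard CRT: ~8915 K
print(mccamy_cct(0.2838, 0.2984))  # Master CRT:   ~9122 K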
As for the last question: yes, if you have a dedicated monitor you can calibrate to a CRT white point, but the gamut is limited by the monitor’s native gamut; that means you are trying to gamut map sRGB content to something totally different / bigger, and hence you are hardware limited. This is the same as doing nothing. You normally calibrate to a smaller gamut than what your display is capable of (e.g. sRGB) so you can engulf all colors accurately relative to each other. In other words, you want to see CRT colorimetry on your display, so you calibrate to sRGB and color manage RetroArch so that grade’s CRT phosphor emulation is translated to your display gamut.
In your wide-gamut case, it might be possible if the primaries are big enough (typically blue is anchored at the same point). Yes, you would use phosphor gamut #0 (none), but you have to provide a custom XYZ_to_RGB matrix for your display gamut.
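Deriving that custom matrix is mechanical once you know your display’s primaries and white point. A minimal sketch of the standard derivation (the helper name is mine, and the wide-gamut chromaticities below are placeholders; substitute whatever your profiling run reports):

import numpy as np

def xyz_to_rgb_matrix(xr, yr, xg, yg, xb, yb, xw, yw):
    """XYZ-to-RGB matrix from the CIE 1931 (x, y) chromaticities
    of the three primaries and the white point."""
    def xyz(x, y):  # xyY -> XYZ with Y = 1
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    M = np.column_stack([xyz(xr, yr), xyz(xg, yg), xyz(xb, yb)])
    S = np.linalg.solve(M, xyz(xw, yw))  # scale so RGB(1,1,1) lands on the white point
    return np.linalg.inv(M * S)          # M * S scales each primary column by S

# Placeholder wide-gamut primaries (P3-like here) with D65 white:
print(xyz_to_rgb_matrix(0.680, 0.320, 0.265, 0.690,
                        0.150, 0.060, 0.3127, 0.3290))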
EDIT: Updated the NTSC-J white point to the one noted above instead of my correlated one, as well as the Sony and PAL matrices.
Thanks, I had a look at the tables in the Japanese PDF, and it seems that 9300K was indeed common practice. Nice!
If you say SMPTE 170M (1987), was that the universal standard for NTSC primaries? I tried looking it up via Google, but I’m not sure about the “1987” primaries; could you post those?
I now understand your point about relative colorimetry better. It’s better to have slightly less saturated primaries with the CRT relative colorimetry correct than to have more saturated primaries (e.g. somewhat closer to the CRT primaries, but not precise) with incorrect CRT relative colorimetry.
With regards to the wide gamut, could you elaborate a bit on the part for needing a custom XYZ_to_RGB matrix?
I think I understand case a): you would need one if, for example, the emulator outputs sRGB, the shader has SMPTE-C primaries as its target, and the monitor is wide gamut, e.g. DCI-P3. In this case the shader must know that it is “living” in DCI-P3 space on the hardware side, so that it can correctly map sRGB -> SMPTE-C in that space.
But now suppose case b), where you can calibrate a wide-gamut monitor to a custom set of primaries (through DisplayCAL or something), so the calibration target is not one of the usual sRGB/DCI-P3/Rec2020 gamuts but instead the SMPTE 170M primaries. Suppose the monitor is capable of this and the calibration profile maps the primaries to the SMPTE 170M primaries with deltaE < 3. Do we in this case still need a custom XYZ_to_RGB matrix? I would think that since the calibration profile already makes the monitor space SMPTE 170M accurate, one would not need the custom XYZ_to_RGB in this case, basically because the emulator’s sRGB output is mapped via the OS-side calibration color profile to SMPTE 170M directly?
Yes, as far as I could find, it was more towards 9000K for home receivers and 9200K for master monitors.
You are making me recheck my references (which I don’t usually keep) lol. Wikipedia says that in 1987 the SMPTE adopted the SMPTE-C phosphors for general use in RP 145.
The phosphors are well known:
R: 0.630, 0.340
G: 0.310, 0.595
B: 0.155, 0.070
They were actually standardized based on inspection of the most common phosphors used in commercial TV sets at the time, which were of the P22 type:
R: 0.640, 0.352
G: 0.282, 0.620
B: 0.146, 0.061
About gamut mapping: there are several “intents” for remapping colors. Relative or perceptual will desaturate colors to achieve a relatively accurate representation of the source colorimetry, while another intent, called Saturation, will try to maximize saturation at the cost of hue inaccuracies. Absolute just clips out-of-range values, which is what I (and old systems) do; it maximizes saturation as well, but there’s no attempt to scale the gamut colors down relative to each other.
For your examples of display gamut mapping, you can do your calculations in any space, really. I decided to do all the color work in your display space (Rec.2020, DCI-P3 and so on) to maximize colorimetry; remember we are clamping extensively within the shader. You could instead get the output of all the operations in sRGB and convert that to DCI-P3, which is what other shaders do, but you lose some benefits.
There’s a thing in color management called the color system; in CIE it is XYZ, and it serves as a pivot space to convert from/to other spaces. It is an abstract color space, so it doesn’t matter at which point you do your conversions, given you know where you come from and where you are going. What a profile does is reach where the calibration stage couldn’t, so your display can be within deltaE < 3. It doesn’t do any conversion whatsoever; that’s the task of the Windows Color Management System (does it even work?) or of each program by itself. We know that RetroArch is not color managed, it has no awareness of the system’s color profile, so you need to expressly convert colors to your display space. I found that using a matrix conversion is what worked for me; LUTs are also possible, but I didn’t have much luck.
In your case, if you calibrated (and profiled) to SMPTE 170M, you need to convert the color output of RA with the above matrix as a cheap alternative, or find the profiling matrix, compute the conversion matrix, and use that for XYZ_to_RGB (recommended method).
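To make the cheap alternative concrete, here is a sketch (reusing the illustrative xyz_to_rgb_matrix() helper from the earlier sketch) that converts linear sRGB output to the SMPTE 170M primaries quoted above, pivoting through XYZ and clipping anything out of gamut per the absolute intent described earlier:

import numpy as np

# sRGB and SMPTE 170M (SMPTE-C) primaries, both with D65 white.
XYZ_TO_SRGB = xyz_to_rgb_matrix(0.640, 0.330, 0.300, 0.600,
                                0.150, 0.060, 0.3127, 0.3290)
XYZ_TO_170M = xyz_to_rgb_matrix(0.630, 0.340, 0.310, 0.595,
                                0.155, 0.070, 0.3127, 0.3290)

def srgb_to_smpte170m(rgb_linear):
    """Linear sRGB -> XYZ -> linear SMPTE 170M, hard-clipping
    out-of-gamut values ("absolute" intent)."""
    xyz = np.linalg.inv(XYZ_TO_SRGB) @ rgb_linear
    return np.clip(XYZ_TO_170M @ xyz, 0.0, 1.0)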
I was checking some NES screenshots and found that they are a bit more desaturated than what I had. I tried a 3 dB attenuation and it was spot on. I also tested this on the SNES and really liked the results; I dialed back the Q shift a bit, and 9300K no longer looks so blue (better than lowering the temperature alone). It feels more analogue-like:
g_I_SHIFT = "0.000000"
g_Q_SHIFT = "-0.030000"
g_I_MUL = "0.707107"
g_Q_MUL = "0.707107"
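(For reference: 0.707107 is 1/√2, and the 3 dB attenuation mentioned above works out to 10^(−3/20) ≈ 0.7079, so the I/Q multipliers are effectively that 3 dB cut.)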
I love you, lol!
I’ll check this out later tonight when I have nothing going on. Super excited about this! Could you post a comparison of the old version and this one with the same settings being used?
I’m a bit busy, but it’s pretty straightforward; check with the color bars and forget about the old implementation lol.
It’s cool, don’t worry about it lol.
Does it still function the same way via settings? Increasing the hue vs. sat adds saturation to a hue section, whereas decreasing it removes saturation from that hue section?
Yes, that’s it. Easy peasy.
Thanks, I just wanted to confirm so I wasn’t blindly fiddling with those settings, and to save me from needing to bother you later about it.
Hope you have a great day Dogway!
I think the desaturation you are seeing has to do with the high white point (~9300K). This is why many Sony TVs had “red push”, i.e. high red saturation, to ameliorate the seemingly low saturation.
Yes, I did some comparisons with the analogue color adjustments vs. the post color adjustments. There’s actually a shift in hue that I find pleasing (more earthy) instead of the typical vivid colors of digital sources. This doesn’t work at all for Genesis; not sure why, but I’m biased since I only test with Sonic and its shades of blue.
Just reading up more on calibration, I’m seeing that some vendors (Eizo and Dell, among others) provide a hardware calibration option with some of their monitors, meaning the calibration profile is written to a hardware slot in the monitor that can be chosen with a preset button. The big benefit, of course, is that the profile is active in any application, like RetroArch (as opposed to software calibration profiles, which don’t work with fullscreen applications).
For my understanding, suppose we’re talking about such a solution, where a monitor is calibrated to SMPTE 170M and the calibration is saved in a hardware slot on the monitor. When using this hardware preset, would it need any special setting in the grade shader? Could the step of converting RA’s color output be skipped, since the SMPTE 170M gamut is then an actual monitor hardware profile/setting?
Edit: here’s a quick link from the Eizo site: https://www.eizoglobal.com/support/glossary/h/hardware_calibration/
Hardware Calibration
Hardware calibration is the method of adjusting color directly by adjusting the settings inside the monitor. With hardware calibration, the target color is not reproduced through the graphics card output, where all or a certain combination of white point, gamma, and brightness are reduced. EIZO’s ColorNavigator software, which is included with ColorEdge monitors, employs hardware calibration, making it possible to maximize the monitor’s capabilities to achieve very precise calibration. A monitor that is equipped with a look-up table (LUT) of 10 bits or larger for each color is required for hardware calibration.