That’s some interesting information…
I’d love to hear @hunterk’s and @guest.r’s input on this as well.
Yeah, gamma is a bit tricky to understand. Let’s say most of the non-CRT and some of the CRT shaders don’t use any input/output gamma transformations, and the image still looks correct.
But it’s very nice to do the transformation with CRT shaders, because then the horizontal filtering is done “CRT properly”, scanlines look OK, and masks aren’t that heavy (speaking of 0.3-0.4 CGWG masks, 0.5-1.5 Lottes masks, etc.).
Depending on the gamma setting of your final display, the output gamma can be set a bit lower, by about 10%. This produces stronger contrast and more saturated colors.
A combo of 2.4 in and 2.2 out is very common, but it could also be something like 3.0 and 2.7, etc., if you like it.
In some of my shaders the scanline functions already add extra saturation and contrast, so the combo can be neutral, like 2.4/2.4. It’s fine to treat this as a personal preference.
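Roughly, in shader terms, the round trip looks like this (just a sketch; GAMMA_INPUT and GAMMA_OUT are illustrative names, not any particular shader’s parameters):

```glsl
// Minimal sketch of the input/output gamma round trip around a CRT effect.
// GAMMA_INPUT / GAMMA_OUT are illustrative names for this example only.
const float GAMMA_INPUT = 2.4;
const float GAMMA_OUT   = 2.2;

vec3 crt_pass(vec3 color)
{
    // Linearize with the assumed CRT gamma before any filtering.
    color = pow(color, vec3(GAMMA_INPUT));

    // ... horizontal filtering, scanlines and mask get applied here,
    // in linear light, so their darkening behaves "CRT properly" ...

    // Encode for the output display. A lower output gamma (2.2 vs. 2.4)
    // is what gives the extra contrast and color saturation kick.
    return pow(color, vec3(1.0 / GAMMA_OUT));
}
```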
By the way, I like how we can control saturation, gamma, color profile, etc. in your shaders; for me it’s important to have the possibility to adjust those things. Yes, I agree.
Edit: crt-guest-dr-venom https://pasteboard.co/IV0JTJE.png
I dig the look, but I can’t really achieve it. Could you please tell me which settings you changed here?
It requires some tweaking. First you need to lower the interlacing trigger resolution, so you get “interlacing” instead of scanlines.
Then you can select interlacing mode 2.0 or 3.0.
Next, bloom should be increased, preferably with masks 1.0 to 4.0.
Personally, I like this setup with less horizontal sharpness.
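In preset terms that would be something like the following override lines for crt-guest-dr-venom (parameter names and values here are from memory and purely illustrative; check the Shader Parameters menu in your version, they may differ between releases):

```
# Hypothetical .glslp override sketch for crt-guest-dr-venom.
parameters = "inter;interm;bloom;shadowMask;h_sharp"
inter = "200.0"       # interlace trigger res, lowered below the content's height
interm = "2.0"        # interlace mode 2.0 (or 3.0)
bloom = "0.50"        # bloom strength, increased from default
shadowMask = "2.0"    # mask types 1.0-4.0 suit this look
h_sharp = "1.20"      # less horizontal sharpness
```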
Inspired by this post I started to test washing out the blacks in my shaders and found a few surprises. Similar to the D93 temperature discussion we had a while back, I think that retro games also accounted for the extra brightness of CRT displays and made the content darker in compensation. Until now I was lowering the CRT gamma from 2.50 to 2.35 by feel, but that didn’t deal with the crushed blacks; now gamma is back at 2.50 and blacks are lifted by 0.1 points.
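By “lifted” I mean a simple linear remap of the range before the output gamma; roughly this, as a sketch (`lift` is just my name for it here):

```glsl
// Sketch of the black lift: remap [0,1] to [lift,1], so "CRT black"
// sits slightly above display black instead of being crushed.
const float lift = 0.1; // the 0.1 points mentioned above

vec3 lift_blacks(vec3 color)
{
    return vec3(lift) + (1.0 - lift) * color;
}
```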
A few examples:
That’s an interesting idea. I know I used to play a lot of games with the brightness turned up since it let me see things that would normally be hidden, like secret paths that lead off of the map in JRPGs, but until they started including brightness guides and settings within the games (like later Resident Evil games), I never really knew what the intended brightness would be.
ff4 maybe? (the crippled US version of course)
I like the old RPGs, but not the super-low-res look, and also not the clean output of upscalers. I am using a 10W Nvidia Jetson Nano.
Current favorite is 4.0 scale, 4xbrz + tweaked fakelottes, which brings me up to 96% GPU usage. http://0x0.st/icOR.glsl
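The chain is just a two-pass preset, something like this (paths are illustrative, following the glsl-shaders repo layout as I remember it; adjust to your install):

```
shaders = "2"

shader0 = "xbrz/shaders/4xbrz.glsl"
scale_type0 = "source"
scale0 = "4.0"
filter_linear0 = "false"

shader1 = "crt/shaders/fakelottes.glsl"
scale_type1 = "viewport"
filter_linear1 = "true"
```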
Thanks and respect to Hyllian and hunterk.
I know many classic gamers don’t like the upscaled look - no need to tell me. Cheers.
I found this article that seems to detail the LG’s OLED pixel structure. Thoughts?
https://www.cnet.com/news/phone-oled-vs-tv-oled-whats-the-difference/
I think rather than trying to address the subpixel structures of every display ever made (some of which I don’t even know how we would address), we just need to rely on simpler methods, like royale’s basic LUT tiling or Lottes’ rotated masks, to get a “good enough” result.
I still don’t know how the white subpixel works. It’s adding a certain percentage of white to obtain the correct luminance and color, and without knowing exactly what it’s doing, it’s impossible to say which mask pattern would work.
However, it seems like the OLED needs the white subpixel in order to display the correct luminance and color, so shutting off the white subpixel with a mask pattern to obtain “correct” subpixel spacing is going to seriously mess with the picture. It’s not like an LCD where you can just crank up the backlight to compensate, unfortunately.
With a high enough resolution you should be able to get a good enough result with Lottes-type masks that ignore subpixel spacing. That might be your best option.
I think the collection of masks in the mask mega-pack shader does a good job of addressing the most commonly used subpixel patterns. For less common subpixel types, the Lottes-type patterns that ignore subpixels are probably the best option. At a high enough resolution, you can still get a fairly convincing result with the aperture grille or dotmask patterns. Slotmask really needs a subpixel-respecting pattern AND a high resolution to look good, though.
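For anyone curious, a subpixel-ignoring Lottes-style aperture grille boils down to roughly this (a sketch loosely following his maskDark/maskLight naming, not his exact code):

```glsl
// Sketch of a Lottes-style aperture-grille mask that ignores the display's
// actual subpixels: every on-screen pixel is treated as one whole phosphor.
vec3 lottes_mask(vec2 frag_pos, float maskDark, float maskLight)
{
    vec3 m = vec3(maskDark);
    float x = fract(frag_pos.x / 3.0); // cycle R, G, B across pixel columns
    if      (x < 1.0 / 3.0) m.r = maskLight;
    else if (x < 2.0 / 3.0) m.g = maskLight;
    else                    m.b = maskLight;
    return m; // multiply into the color; higher output res = finer triads
}
```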
My reading comprehension is probably not the best, but as far as I can understand, each point on the grid includes a white subpixel that increases brightness. Therefore, it’s possible that there is no real difference to consider between RGB and White+RGB.
I’ve got an LG OLED that is WRGB and although I don’t have a slot mask CRT to compare it to, the Guest-SM shader never looks off to me with the stock mask.
You’re missing some context. Regular subpixel spacing is important if you want mask emulation to be as accurate as possible. The magenta-green “stock” mask was created with this in mind. It might look good enough to you on your display but it’s definitely not working as intended with RGBW subpixels; it needs standard RGB subpixels to have the intended effect. To understand why this is, you have to understand that an LCD pixel is not just a tiny square that displays any color.
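To make that concrete: the stock mask alternates between passing red+blue (magenta) and green on consecutive pixel columns, which only resembles phosphor triads when the panel’s subpixels are actually laid out R, G, B in order. A sketch:

```glsl
// Sketch of the magenta/green "stock" mask: even columns pass R+B (magenta),
// odd columns pass G. On an RGB-ordered panel this carves each pair of pixels
// into something like a finer phosphor triad; on RGBW or pentile layouts
// that alignment assumption breaks down.
vec3 stock_mask(float x, float mask_dark)
{
    bool even_column = mod(floor(x), 2.0) < 1.0;
    return even_column ? vec3(1.0, mask_dark, 1.0)       // magenta
                       : vec3(mask_dark, 1.0, mask_dark); // green
}
```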
If something looks good enough to you, that’s all well and good, but we’re talking about accurately recreating the CRT’s phosphor structure down to the smallest level of detail possible. There’s a lot of good discussion on this elsewhere in the shaders category.
There’s definitely a difference. The subpixel-respecting masks simply won’t work as intended with RGBW subpixels. Hate to be a buzzkill but there’s no solution that doesn’t introduce more problems.
Your best solution will probably be to use one of the Lottes-type masks that disregard the subpixels. You can get a decent result with a high enough resolution, but it won’t be ideal.