ff4 maybe? (the crippled US version of course)
I like the old RPGs, but not the super-low-res look, and not the clean output of upscalers either. I am using a 10W Nvidia Jetson Nano.
My current favorite is 4.0 scale, 4xBRZ + a tweaked fakelottes, which brings me up to 96% GPU usage. http://0x0.st/icOR.glsl
Thanks and respect to Hyllian and hunterk.
I know many classic gamers don’t like the upscaled look - no need to tell me. Cheers.
I found this article that seems to detail the LG’s OLED pixel structure. Thoughts?
https://www.cnet.com/news/phone-oled-vs-tv-oled-whats-the-difference/
I think that rather than trying to address the subpixel structure of every display ever made (some of which I don't even know how we would address), we just need to rely on simpler methods, like royale's basic LUT tiling or Lottes' rotated masks, to get a "good enough" result.
I still don't know how the white subpixel works. It's presumably adding a certain percentage of white to obtain the correct luminance and color, and without knowing exactly what it's doing, it's impossible to say what mask pattern would work.
However, it seems like the OLED needs the white subpixel in order to display the correct luminance and color, so shutting off the white subpixel with a mask pattern to obtain “correct” subpixel spacing is going to seriously mess with the picture. It’s not like an LCD where you can just crank up the backlight to compensate, unfortunately.
With a high enough resolution you should be able to get a good enough result with Lottes-type masks that ignore subpixel spacing. That might be your best option.
I think the collection of masks in the mask mega-pack shader does a good job of addressing the most commonly used subpixel patterns. For less common subpixel types, the Lottes-type patterns that ignore subpixels are probably the best option. At a high enough resolution, you can still get a fairly convincing result with the aperture grille or dotmask patterns. Slotmask really needs a subpixel-respecting pattern AND a high resolution to look good, though.
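For anyone wondering what "Lottes-type patterns that ignore subpixels" means in practice: the idea is just to tint whole screen pixels in a repeating pattern rather than trying to line up with the display's physical subpixels. A rough sketch (this is a simplified illustration, not the actual code from crt-lottes; `maskDark`/`maskLight` are tuning values I made up for the example):

```glsl
// Sketch of a Lottes-style aperture-grille mask that ignores the
// display's physical subpixel layout: it tints whole screen pixels
// in a repeating 3-column R/G/B pattern. Multiply the scanline
// color by the returned mask.
vec3 ApertureGrilleMask(vec2 fragCoord)
{
    float maskDark  = 0.5;  // attenuation for "off" phosphor channels
    float maskLight = 1.5;  // boost for the "on" phosphor channel
    vec3 mask = vec3(maskDark);
    float col = mod(fragCoord.x, 3.0);
    if      (col < 1.0) mask.r = maskLight;
    else if (col < 2.0) mask.g = maskLight;
    else                mask.b = maskLight;
    return mask;
}
```

Because the pattern is defined on whole pixels, it behaves the same on RGB, RGBW, PenTile, etc., which is why it degrades gracefully on unusual subpixel layouts as long as the output resolution is high enough.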
My reading comprehension is probably not the best, but as far as I can understand, each point on the grid includes a white subpixel that increases brightness. So it's possible there's no real difference to account for between RGB and white+RGB.
I’ve got an LG OLED that is WRGB and although I don’t have a slot mask CRT to compare it to, the Guest-SM shader never looks off to me with the stock mask.
You’re missing some context. Regular subpixel spacing is important if you want mask emulation to be as accurate as possible. The magenta-green “stock” mask was created with this in mind. It might look good enough to you on your display but it’s definitely not working as intended with RGBW subpixels; it needs standard RGB subpixels to have the intended effect. To understand why this is, you have to understand that an LCD pixel is not just a tiny square that displays any color.
If something looks good enough to you, that’s all well and good, but we’re talking about accurately recreating the CRT’s phosphor structure down to the smallest level of detail possible. There’s a lot of good discussion on this elsewhere in the shaders category.
There’s definitely a difference. The subpixel-respecting masks simply won’t work as intended with RGBW subpixels. Hate to be a buzzkill but there’s no solution that doesn’t introduce more problems.
Your best solution will probably be to use one of the Lottes-type masks that disregard the subpixels. You can get a decent result with a high enough resolution, but it won’t be ideal.
That’s fine, I wasn’t suggesting that it was working as intended and as stated I don’t have anything to compare it to. I usually use royale, so I’ll probably go back to that but it still looks good to me.
I wanted to share a shader I have been cooking for some time. It started with GenesisPlusGX: adding a dedither shader before the NTSC simulation block limited the number of shaders I could stack. Unsure whether this is a RetroArch limit on the number of passes or a limitation of the NTSC filters, I started embedding effects into a single ubershader: vignetting, temperature, black level, and recently LUTs. Now I am able to have all of this under crt-royale.
[Dedither]
00 = gdapt-pass0-stripes.glsl
01 = gdapt-pass1.glsl
[Grade]
02 = grade.glsl (GenesisPlusGX LUT fix, Black Level, Vignetting, Color Mangler, Temperature, LUT2 to display profile)
[NTSC]
03 = ntsc-pass1-svideo-2phase.glsl
04 = ntsc-pass2-2phase.glsl
[Scanlines]
05 = crt-royale-first-pass-linearize-crt-gamma-bob-fields.glsl
06 = crt-royale-scanlines-vertical-interlacing.glsl
07 = crt-royale-bloom-approx.glsl
08 = blur9fast-vertical.glsl
09 = blur9fast-horizontal.glsl
10 = crt-royale-mask-resize-vertical.glsl
11 = crt-royale-mask-resize-horizontal.glsl
12 = crt-royale-scanlines-horizontal-apply-mask.glsl
13 = crt-royale-brightpass.glsl
14 = crt-royale-bloom-vertical.glsl
15 = crt-royale-bloom-horizontal-reconstitute.glsl
16 = crt-royale-geometry-aa-last-pass.glsl
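For anyone who hasn't hand-built a chain like this before: a pass list maps onto RetroArch's plain-text `.glslp` preset format, one numbered `shaderN` entry per pass. A sketch of the structure (the paths below are illustrative, not the actual locations in the shader repo, and each pass normally also needs its own scale/filter settings):

```ini
# Sketch of a .glslp preset for a 17-pass chain like the one above.
# Adjust the paths to wherever these shaders live in your
# shaders_glsl directory.
shaders = "17"

shader0 = "dithering/shaders/gdapt/gdapt-pass0-stripes.glsl"
scale_type0 = "source"
scale0 = "1.0"

shader1 = "dithering/shaders/gdapt/gdapt-pass1.glsl"

shader2 = "misc/grade.glsl"

# ... shader3 through shader16 continue with the NTSC and
# crt-royale passes, each with its own scale_type/scale/filter keys.
```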
That screen is pretty nice!
I’ve been doing something similar with guest-dr-venom as a base. Adding vignetting, noise, etc.
I'll check this out when I get on my PC.
EDIT: That grade shader seems pretty cool. I haven't tested it yet, but I went through the code for it (saw some stuff that'll be super helpful for me; I was having issues introducing black level to color-mangler). Thanks for sharing @Dogway
Thanks! Actually, the screenshot is from before the shader refactor I did yesterday with the LUTs implementation (it allows 2 LUTs and toggling them through the GUI). I'm trying to target those colors but didn't have enough time to play with the values. I hope I got the gamma handling correct; everything should happen in linear space.
Honestly I didn’t even really pay much attention to the 2 LUT thing when I originally looked over it… Like I saw the LUT toggles but didn’t really read any of it so I guess I just assumed you had separate toggles for the LUTs for some reason that didn’t involve both of them being used in tandem, lol.
I was super curious about the black level situation and the vignette, I tend to parse through sections of code, so I end up looking over a shader multiple times before I get the whole picture of what is going on.
Now I have questions, lol.
How does that work, like does it blend both of the LUTs to get a “new LUT/color profile”?
Everything except the LUTs was done last year, so the novelty for me was the LUT integration. My goal was to color-manage RetroArch so that colors can be reproduced correctly on a calibrated display; that's the main purpose of LUT2. I guess not everybody will calibrate their display or use 2 LUTs, so it can be disabled. LUT1 is for CRT color-emulation LUTs, or in this case the GenesisPlusGX color fix.
Black level I did a few days ago (I posted about it then); I wanted to wash out the blacks a bit, without realising that the color mangler already had a Lift parameter (per R/G/B channel, though).
The vignette was from last year; I think I just copied a chunk over from another shader. For Slang I had to be more creative, since I didn't have any reference whatsoever.
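To make the two-LUT idea concrete: the LUTs don't get blended into a new LUT, they're applied in sequence, each one a color-to-color remap. A simplified sketch (real presets usually pack 3D LUTs into 2D textures and the names here are made up for illustration; this is not the actual grade.glsl code):

```glsl
// Sketch of chaining two LUTs: LUT1 remaps the source colors
// (e.g. a console/CRT color fix), then LUT2 maps the result into
// the calibrated display's profile. sampler3D is used for clarity;
// shipping shaders typically sample a 2D-packed LUT texture instead.
uniform sampler3D LUT1; // e.g. GenesisPlusGX color fix
uniform sampler3D LUT2; // display calibration profile

vec3 applyLUTs(vec3 color, bool useLUT1, bool useLUT2)
{
    if (useLUT1) color = texture(LUT1, color).rgb;
    if (useLUT2) color = texture(LUT2, color).rgb;
    return color;
}
```

Since each stage is independent, either toggle can be off and the other still does its job, which matches the "disable LUT2 if you don't calibrate" design.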
Excuse my ignorance, but how exactly do I use this? It doesn’t seem like other shader presets. I don’t think I can just copy and paste that, rename it to .glslp, etc. If I’m supposed to put it in manually, that’s fine, but as far as I can tell, “gdapt-pass0-stripes.glsl” simply doesn’t exist. I’m just going to assume I’m doing something wrong here. Shader looks amazing in the screenshot, though.