Other shaders always have this “fake”-looking softness, which I could not get rid of despite spending hours playing with the settings. I wonder what the reason behind it might be.
I read here that this shader works on a “subpixel level”, which may be the reason behind it… But honestly I have no clue what this exactly means, or how it differs, programming-wise, from other shaders.
Can you share your Peak and Paper White Luminance settings please, as well as which TV model you have?
I’d like to start a database of settings from users who got things working and looking right, to make things easier for new users and also to start gathering more data about how the shader works in the real world on various displays.
Yes, sure. I set the peak luminance to 700 and the paper white to 600.
My TV is an LG 55GX OLED.
Normally the paper white should be much lower, as OLEDs don’t have that much brightness on a full-field white screen. But this setting works best for me: there is no clipping in the grey ramp near 100% white, and below 600 the brightness slowly drops and the colors start to get duller. Paper white at 200, for example, looks very bad.
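In case it helps anyone copying these values: once they are dialed in under Quick Menu > Shaders > Shader Parameters, saving a preset writes them into the .slangp. A minimal sketch of what that looks like, assuming the parameter names are hcrt_max_nits and hcrt_paper_white_nits (that’s how I remember them being called, so check your Shader Parameters list for the exact names) and with a placeholder path for the base preset:

```
# Path below is just an example - point it at whichever Megatron preset you actually load
#reference "shaders_slang/hdr/crt-sony-megatron.slangp"
hcrt_max_nits = "700.000000"
hcrt_paper_white_nits = "600.000000"
```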
So from what I understand, accurate masks make the picture very dark, and therefore most shaders optimize for brightness instead of accuracy. The black pixels between the phosphors do add sharpness if I think about it, and this may be one of the reasons why the Megatron masks look so realistic.
What I also realized is that the distinct separation of black and lit pixels somehow generates this contrasty CRT glow we are all after, without using any tricks (besides the HDR brightness boost from the TV side). The picture now has a depth and punch which I miss in other shaders.
The Megatron masks are indeed very dark in SDR mode (unusable on my OLED) compared to other shaders, but in HDR they are absolutely fine. Thank goodness HDR exists, otherwise there would be no way to have such an accurate CRT simulation.
The newer LG G3, which has almost double the brightness of my GX model, should be able to use black frame insertion for CRT motion clarity on top.
Ah, I’m glad you’re enjoying and using it - great to hear such good feedback. To be honest, when I had the idea of using HDR to simulate CRTs I had no idea I was going to end up writing a shader for it. I just added HDR support to RA and then sat back for six months thinking someone else was going to harness it to make a stripped-down shader, but then no one did (kind of), and I thought, well, I may as well try.
I say ‘kind of’ as I’ve been led to believe you can get other shaders to look like this by turning off most of the bells and whistles, setting the masks to 100% and then switching on HDR in RA.
The really major thing we need now is TVs that are bright enough to handle backlight strobing as well. Clarity in motion is really lacking. This shader should help there too, as we can run it at high refresh rates and so hopefully get that clarity up, if the brightness is there to support it. By the way, try this shader on your mobile if you have an Android - on my OnePlus 8 Pro it works a treat in SDR mode, and in motion it’s the best LCD/OLED I’ve seen.
Are you using HDR Game Mode or HDR Bright Mode in Picture Settings?
For me, I think I used similar settings when using HDR Game Mode, but I could have gotten away with a much lower Paper White Luminance when using the HDR Bright Picture Settings.
Of course I wouldn’t game on any setting other than HDR Game but it’s clear that this setting is much darker than HDR Bright by default.
I think I managed to get things looking pretty decent. My only issues were the reds in Bonk’s Revenge looking more brownish/rust coloured than red, and the blue high score text looking a bit pale compared to virtually all other colours on the screen.
The thing is, other than those very specific things, everything looked amazing!
I’m using an LG 55OLEDE6P by the way.
You can read this post to see what my last experience using the shader was like as well as some photos from that session.
The difference in colors is even more obvious in reality than in the picture, as the camera may be compensating a little bit.
Absolutely, this is the most authentic shader I have tried yet, and I have been playing with shaders (and before that, scanlines) for over 15 years now.
As you guys mentioned before, the black pixels between the phosphors are one reason for it. A CRT tube does something similar if we think about it, but instead of black pixels it has metal wires (Aperture Grille) attached to the glass inside the tube, or a metal grid with tiny holes (Slot Mask). Where the wires/grid block the light, the picture also goes black at that point, but now we do this digitally by switching pixels off to black. In conclusion, this means that CRT tubes also produce a lot of light which gets eaten up by the mask. Without the mask they might produce over 500 nits on a full white screen instead of 100 nits.
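To put rough numbers on that (just a back-of-the-envelope guess, nothing measured from the shader): an RGB mask at 100% strength lights roughly one of every three subpixel columns, so a full-field white keeps only about a third of the panel’s light, and the scanline gaps take away even more:

$$700~\text{nits (peak)} \times \tfrac{1}{3} \approx 233~\text{nits full-field white}$$

which already lands in the same region as the ~100 nits a real CRT manages on a full white screen.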
Thanks. You should see how easy it is to get things looking perfect using HDR Bright mode, albeit at the cost of possible added latency.
At first I forgot my TV was on HDR Bright and was able to use a relatively low Paper White value, more in line with what @MajorPainTheCactus usually recommends.
Upon realizing my error, I had to start over using the HDR Game setting.
This is exactly what I think is needed here: more discussion and collaboration, so that eventually users with similar displays will better know what to expect, learn about the little quirks and how to overcome them.
Are you able to use Manual/Pro settings? If so, you can try ISO 100/200, or as high as you need depending on whether you’re capturing dark or light areas of the screen, or whether you want to capture individual phosphors without natural blooming. Use a shutter speed of 1/60, Manual Focus (it makes it easier to get consistent quality) and a White Balance that looks similar to what you see in real life.
I would have to agree that there’s a lot of potential here that seems to be untapped by the wider community.
Don’t forget you can “NTSCfy” your output using my Custom Blargg NTSC Video Filter Presets. I’m particularly proud of the latest 3rd Generation ones.
Wow, you also implemented HDR in RA? You are “clearly a person of lordly caliber”, as someone wrote on the Reddit forum. Honestly, I would like to send you a little donation if you would accept it, because I am having so much joy with the shader. If you have PayPal, you could PM me.
I will try it on my phone, but I only have a Samsung Galaxy S10e with a 1080p resolution, which may not be able to reproduce the Aperture Grille mask accurately.
I tried your shader on my 1080p notebook display too, and all I get is blank pixels between the lit pixels, without any RGB phosphors.
My TV has an HDR setting called “Lebhaft” in German, which translates to “Vivid” (probably the dynamic or bright mode you mention). It’s the brightest of all the HDR presets. But as I am happy with the brightness in game mode and I want latency as low as possible, I think I am fine.
Yes, I would also like to see more in-depth discussion of the shader here.
I only use my Samsung Galaxy S10e phone for taking pictures, and I think it doesn’t have the same capabilities as a dedicated camera. But for posting some comparison screenshots I think it’s fine.
Yes, and I also hope more people buy HDR screens and use 100%-strength masks in combination with HDR, as this can be a very good substitute for CRT monitors.
In reality, a bright OLED TV with accurate mask simulation and HDR can even look better than a CRT TV, as the contrast is better (many CRTs have only a greyish glass layer, to combat the low brightness caused by the mask inside the tube eating up light).
Sony’s black Trinitrons are the only ones I know of with black levels good enough for my taste.
Besides that, CRTs have geometry issues, which we don’t have with OLED TVs.
Only the motion problem has to be managed, but with enough brightness and BFI that can be solved too.
I tried some Blargg video filters within RetroArch, but they don’t do anything when using the Megatron shader. I can activate them, but the picture looks the same as without them.
I wasn’t suggesting nor expecting that you might actually want to play using that TV setting; I was just sharing my experience.
I think that has to do with the particular core or game - possibly something about the pixel format being output, so not all cores support them.
Many do though, for example Beetle PSX HW, Snes9x, Genesis Plus GX, Beetle PCE/Fast/SGX and PicoDrive, to name a few.
For the ones which don’t support them, you can prepend the NTSC Adaptive Shader or use one of the Sony Megatron Color Video Presets which already have the NTSC Adaptive Shader integrated.
@GPDP1 has provided @guest.r’s more up to date version a few posts above.
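For anyone who wants to try the prepending route by hand, the rough shape of such a preset is sketched below. Treat it only as a sketch: the pass count, file paths and the filter_/scale_ lines all depend on your copy of the slang-shaders repo, so copy the real lines from ntsc-adaptive.slangp and from whichever Megatron preset you use rather than trusting my paths from memory.

```
# Sketch only - verify paths and per-pass settings against your shaders_slang install
shaders = 4          # 2 NTSC passes + however many passes your Megatron preset uses

# NTSC Adaptive passes go first (copy their filter_linearN/scale_typeN/scaleN
# lines from ntsc/ntsc-adaptive.slangp and renumber them)
shader0 = shaders_slang/ntsc/shaders/ntsc-adaptive/ntsc-pass1.slang
shader1 = shaders_slang/ntsc/shaders/ntsc-adaptive/ntsc-pass2.slang

# ...then the Megatron passes, copied from its own .slangp and renumbered,
# e.g. shader2 = shaders_slang/hdr/shaders/crt-sony-megatron.slang
```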
Thanks, with the Snes9x core it works! And it looks good so far. The composite and S-Video versions can help hide the limitations of low-res games pretty well, especially on a 55-inch OLED, which normally is way too big for 240p content.
The 4:3 diagonal is approximately between 40 and 45 inches, which translates to a huge CRT. Normally I would play 240p games on a CRT with a diagonal of at most 25 to 29 inches, so games don’t look too crappy. This is why I am thinking about getting a second 27- or 32-inch 4K monitor. But the affordable monitors are all LCD and not OLED, which at the moment stops me from buying one. Maybe in the future I will get one, when they get cheaper.
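For anyone wondering where that number comes from, it’s just geometry, assuming the 4:3 image is pillarboxed to the full height of the 16:9 panel (and using the fact that a 4:3 rectangle’s diagonal is 5/3 of its height):

$$h = 55'' \times \tfrac{9}{\sqrt{16^2+9^2}} \approx 27'', \qquad d_{4:3} = \tfrac{5}{3}\,h \approx 45''$$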
Oh no, no, your feedback is reward enough, but thank you for the offer! I’m actually going to start a YouTube channel shortly (next few weeks), going through the rendering pipeline of modern games (D3D11 and 12) and profiling them (with a special tool I’ve written), explaining what part is doing what, which parts are fast and slow, and why. Essentially: have you ever wondered what actually happens behind the scenes when you choose ‘high’ or ‘low’ on a graphics option in a game? Hopefully I can shed some light on that for you. So be my first YT subscriber - it’s going to be a bumpy ride.
More likely I would just replace ntsc-adaptive with it, since it’s just that with some more stuff added on top (and ntsc-adaptive is itself just maister’s NTSC shader variants smooshed together, with automated switching of its behavior based on the image dimensions).
I had been backporting guest’s additions into ntsc-adaptive and modifying his presets to point there (as is standard practice in the repo, to avoid unnecessary code duplication), but it was making it harder to update guest’s shaders, so I just stopped bothering with it.
I just swapped out the old NTSC Adaptive for the updated Guest-Advanced-NTSC version and came up with a new, mind-blowing (at least to me) Sony Megatron Color Video Monitor preset!
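For anyone wanting to reproduce this, the change amounts to pointing the NTSC passes of the combined preset at guest’s files instead of ntsc-adaptive’s (same idea as the sketch earlier in the thread). The paths below are from memory and may differ in your checkout, so copy the real ones from @GPDP1’s standalone preset and adjust the pass count to match:

```
# Guest-advanced NTSC front end in place of the ntsc-adaptive passes
# (keep the filter_/scale_ lines from guest's own preset for these passes)
shader0 = shaders_slang/crt/shaders/guest/advanced/ntsc/ntsc-pass1.slang
shader1 = shaders_slang/crt/shaders/guest/advanced/ntsc/ntsc-pass2.slang
# ...Megatron passes follow, renumbered as before
```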
It really seems better in every way than the old NTSC Adaptive, while maintaining similar functionality and usability.
Too bad sharing a regular screen shot isn’t going to convey how awesome this preset looks in person.
There must be a way to capture proper screenshots using RetroArch in HDR mode.
I might have to set up my camera and tripod and practice a bit, because I really want others to be able to understand how awesome this simple combination is.
@MajorPainTheCactus, all that I would like to see is probably an easy way for users to integrate the new Guest NTSC stuff into your Megatron signal chain, because that’s a part of the CRT emulation process in my opinion, plus Shadow (Dot) and Slot Mask implementations that would work correctly with my LG OLED TV, in a similar manner to how they work when I use CRT-Guest-Advanced and the Mega Bezel Reflection Shader. We know that LG OLED TVs can do all three mask types, but what we might need is just a special implementation to get things drawn and aligned correctly.
This thing (shader) is fantastic and game-changing. I have no reason not to use it: I already paid for the HDR, right? So why compromise when the tech is available now?
It’s just too ironic that it’s as difficult to show off as a real CRT, so many of the uninitiated might simply shrug their shoulders and bypass it.
Please, I deserve no credit. I’m flattered, but all I did was delete passes and change a couple of things in the preset file so it all linked up without all the other passes in guest’s shader chain. The real hard work was, of course, done by Themaister, guest, and everyone else who worked on NTSC emulation before them.
Is it possible to also include the Filtering Section in a similar preset?
That Horizontal Filter Range, together with the Horizontal Filter Sigma, Subtractive Sharpness and Subtractive Sharpness Shape, can really get back some of what’s lost when using lower resolution scales for blending purposes. Conversely, it can also allow the use of high resolution scales while taking a little off the edge, if the Filtering Section is used more for softening than sharpening.
I had a similar thought a few weeks ago, actually. I briefly attempted to include it, but I had some trouble getting it to work. I could give it another attempt, I suppose. I wonder if the HD preset’s filtering implementation would be better for a standalone NTSC preset, though, as that can also filter vertically.