The main difference I’m seeing is that the phosphors in the real CRT photo are much better saturated (particularly in the highlights) and don’t have any of that between-phosphor glow that you see in the shader.
Disabling glow and bloom would help close the gap. These lighting effects essentially blur the image in different ways to try to make it brighter, and are only really necessary with an SDR display.
I’ll have to see an actual example of the OLED subpixels looking like CRT phosphors before I’m ready to conclude that we’ve solved this problem. I think the spacing between active subpixels is going to be too much to be overcome by the display’s natural bloom, and you’ll always have that very obvious grid. Hopefully, I’m wrong about this.
Come on… it’s normal… it’s because you’re using a SCART cable!!!
When people on this forum talk about NTSC, I’m pretty sure it’s not over a SCART cable… it’s more likely composite… (the whole video signal in the yellow RCA cable…), isn’t it?
This kind of halo and color bleed happens because the three colors blend together in the signal cable…
With SCART, RGB is kept separate, so the video signal is clean.
It’s like on the PSX… over composite the result is really bad… but over SCART it’s clear. I think there’s some confusion between NTSC and PAL on this subject… PAL has more lines but runs at 50 Hz (before PAL60 appeared with the Dreamcast), while NTSC has fewer lines (so a little less resolution) but runs at 60 Hz… The colorimetry also differs between PAL and NTSC…
Now people can correct me… if I’ve made any big mistakes.
Ah, my bad, I misinterpreted then. I thought he was recommending Mask Bloom for all cases.
Sure, but that’s not what I’m going for. I have no practical way of constantly raising/lowering my TV’s brightness, so I want to keep it at a normal level. I know I need bloom to have enough brightness with a CRT mask. That’s not the problem.
From my experience, keeping mclip at 0.5 and Bloom at 0.5 looks the same as mclip at 0 and Bloom at 1. Or at least very similar. So I still don’t know the purpose of mclip. I imagine it might interfere with other parameters, though.
Thank you for the preset! I’ll try it as soon as I can.
Not sure how many nits my TV is set to output, but it’s probably lower than 400. It’s calibrated with a colorimeter, though, and I verified that its gamma tracking is spot-on.
Please post the NTSC version too. It’s the one I’ll be using.
I recommend you use the Mupen64 core. Parallel is kind of deprecated, and Mupen64 absorbed its features, such as the ParaLLEl RSP and RDP.
Most Nintendo 64 games output 240p, but some are 480i. I’ve tested both 240p and 480i N64 games with this shader and they work perfectly with guest’s default parameters. I use Mupen64 with ParallelRDP and ParallelRSP. Hit me up if you need help setting it up!
That depends on whether you’re using 2 or 3 swapchain images and how your driver handles them. Typically, with 2 swapchain images, the CPU and GPU work sequentially to reduce input lag, while with 3, the CPU prepares a frame in advance while the GPU renders the previous frame in parallel. In the former case, both CPU and GPU contribute to the time it takes to render a frame, but in the latter case, only the slower of the two is the limiting factor.
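The relationship above can be sketched as a back-of-the-envelope frame-time model. This is just an illustration of the sequential vs. pipelined cases described, not how RetroArch actually schedules work; the function name and millisecond figures are made up for the example:

```python
def frame_time_ms(cpu_ms: float, gpu_ms: float, swapchain_images: int) -> float:
    """Rough frame-time estimate (illustrative only).

    With 2 swapchain images, CPU and GPU work back-to-back on the same
    frame, so their times add up. With 3, the CPU prepares the next frame
    while the GPU renders the previous one, so the slower stage dominates.
    """
    if swapchain_images <= 2:
        return cpu_ms + gpu_ms      # sequential: times accumulate
    return max(cpu_ms, gpu_ms)      # pipelined: slower stage is the bottleneck


# e.g. a hypothetical 5 ms CPU pass and 8 ms GPU pass:
sequential = frame_time_ms(5.0, 8.0, swapchain_images=2)  # 13 ms per frame
pipelined = frame_time_ms(5.0, 8.0, swapchain_images=3)   # 8 ms per frame
```

Under this model, moving a heavy NTSC pass to the GPU only costs you frame time if it makes the GPU the slower stage.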
Now, it depends a lot on what GPU you’re using, but I would say any modern GPU will always be faster at processing the NTSC calculation than your CPU.
Your CPU will always seem underutilized with emulators because they’re mostly single-threaded. Even multi-threaded ones typically have one heavy main thread and light auxiliary threads. You can check this easily by enabling fast-forward and checking the CPU usage. For example, Mesen at fast-forward (producing frames as fast as possible) uses only 7% of my CPU. That’s because it uses 1 or 2 cores and cannot make use of the remaining resources.
And don’t forget that the Blargg processing will only happen sequentially, after the core’s frame is ready, so it will not be more efficient to run it on the CPU.
Now, this obviously depends on the hardware you’re running it on, and how heavy the HSM Mega Bezel Reflection Shader is. I’m talking about a modern CPU with a modern GPU. And ofc, I might be wrong, but I think that’s how things work on RetroArch. But please feel free to correct me if I’m wrong!
Actually, my (now possibly debunked) performance theory was kinda low on my priority list as to why I use Blargg NTSC. I mainly use it because I think it’s cool. Lol
I’ve always wished I could tweak it and make it work well with certain cores, for example Mednafen/Beetle. When I learned through a post by @Juandiegous that I could, I eventually went to town on it and was really satisfied with the results and the simplicity of the parameters. The only thing I’m not 100% sure about is some of the parameter ranges, but I trial-and-error’d my way until I got the results I have right now.
I just tested your preset and here’s the problem I mentioned earlier showing itself again:
Here’s the preset you just posted:
And here’s your base NTSC preset:
As you can see, Mario has a very ugly glow around him on the slotmask preset that is not an NTSC artifact, because it is not present in your base NTSC preset. This glow is even more distracting in motion than in the screenshot.
Another problem I found is that you have a bit of clipping on the highlights:
I can still see it in the base ntsc preset. It just manifests itself a bit differently.
Here is an example of crt-geom-ntsc:
Clearly something ntsc shaders produce in this specific case.
But you can try GTU, crtsim, mame-ntsc…
The highlights aren’t actually ‘clipped’; the contrast is just manifesting a bit differently. Bloom applies a smoothing effect which, in conjunction with NTSC smoothing, washes out the transitions.
(NTSC smoothing is quite aggressive; it can dissolve dithering, transparency effects, etc.)
The main deal is that the implementation is probably not going to change.
But you can reduce the Substractive sharpness parameter in the filtering options to mitigate the effect.
Increasing NTSC resolution scaling helps here also.
Edit: at least with my display the color bars (R,G,B) are clipped even without any shader:
Might need to go back through and check everything again once it’s corrected. I wonder if this explains why your presets always look washed out for me.
People!! All your work on the ugly NTSC presets is very good… but when you’re finished playing around, it will be time to work on the “ultimate CRT GUEST very light, clear, neat shader” (just joking)
I know, I’m just sharing a point of view. I just wanted help making the HD version look like my CRT TV. That’s all. I like how the NTSC one looks too…
I agree with you… in my case I don’t care about the NTSC version because, to me, the result isn’t very good, except maybe for dithering and transparency… and I don’t have any nostalgia for playing games in those conditions.
In my case, if we can find a preset for 1080p that looks as close as possible to a CRT over SCART, that would be perfect for me.
You’re right! I can see it a bit indeed. And I’ve tried your slotmask preset on the non-NTSC version and there is no artifact at all.
It’s a shame that it manifests so strongly with the gamma correction and brightness boost controls. I fiddled with Substractive sharpness, but it doesn’t help at all. Increasing NTSC resolution would ruin dedithering, so I don’t use it.
With no shaders, the R, G and B bars aren’t clipped at all on any of my monitors and TVs. That would point to a problem in RetroArch or in the core. But I see you’ve already found the problem in your monitor settings, so all is good.
Anyway, thanks for the help! I’ll try and arrive at some middle ground where the halo is not as pronounced.
This ‘haloing’ seems to manifest itself quite strongly on blueish backgrounds. NES games also automatically trigger 3-phase ntsc filtering, which isn’t able to dissolve dithering/transparency effects.
I guess it’s a good idea to create a separate preset for games which use 3-phase or 2-phase ntsc filtering.
(this depends on horizontal input resolution: up to 300px: 3-phase, greater than 300px: 2-phase. But it can be also set with a parameter.)
I like @sonkun’s approach, where he created quite many very nice presets.
I thought about that, but I kinda want to simulate playing all my games on the same “TV”, as that’s pretty much what happened back then. And since the 2/3-phase thing is already automatic, I don’t have to worry about that.
I recommend this too. I was lucky enough to buy NEC monitors that use a probe to calibrate themselves. Sometimes I take it for granted.
You can do this but you are using an emulator so you would also have to take into account the differences in video output from the various consoles and systems.
This can also be simulated by the use of shader parameters.
If you think of the shaders as a tool to only simulate the CRT TV side of things then you’re going to miss half the plot. The importance of this can be seen when comparing Genesis RF and Composite Video output to SNES, NES & Turbo Duo. The characteristics are infamously different.
Create the perfect preset for SNES, then try playing a NES game like Ninja Gaiden using those same settings and see for yourself: Ryu is just going to look all wrong.
Make a sweet NTSC de-dithering preset for Sega Genesis and you can enjoy colours that rival any other 16-bit console. Use that same de-dithering preset on a high res system like the Turbo Duo and say goodbye to any idea of legible text. Meanwhile at the same settings you can read the text clearly in any Sega Genesis game.
Okay, I’ve given away a little too much of my secret sauce. I hope you’ll understand where I’m coming from.
From the way you sound, I have a feeling Sony Megatron Color Video Monitor might be a really good fit for you, where accuracy and matching CRT qualities down to the phosphor are of paramount importance.
Have you given that a try?
If you want to learn more about what I have said above you can head over to my thread and take a read through my first post and look at some of my videos.