What @RealNC is talking about goes beyond what is already there, including what can be done using simple presets.
His method preserves game settings even if the Core Preset changes. Normally, if you adjust cropping (for example, in the HSM Mega Bezel Reflection Shader) and save a Game Preset, those crop settings persist as long as you don’t change the Core Preset. If you wanted to switch the Core Preset for a game that already has a Game Preset saved, you would have to save a new Game Preset, and all of your crop settings would be lost unless you manually set them again before saving it.
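For context, a saved Game Preset is just a small text file that points at a base preset with a `#reference` line and then lists the parameters you changed. A rough sketch of the idea, using a placeholder path and placeholder crop parameter names rather than the exact ones Mega Bezel writes:

```
#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__3__STD.slangp"
# placeholder crop parameter names -- copy the exact ones from your own saved preset
HSM_CROP_PERCENT_TOP = "2.000000"
HSM_CROP_PERCENT_BOTTOM = "2.000000"
```

When you switch the Core Preset, the new Game Preset gets saved against the new reference, so overrides that only lived in the old file have to be redone, which is exactly the loss described above.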
Oh, sure. I was specifically talking about 2-phase NTSC and whether or not it’s possible for a CRT to decode an un-dithered waterfall from a 2-phase NTSC signal, as implied by @RealNC.
Ah yes, you’re right. I see what you mean now. My bad!
The artifacting also differs between 3-phase and 2-phase (at default settings). 3-phase merges fields by default, leaving mostly static color artifacts.
2-phase temporal blending is quite neutral, but there is a lot of color play in moving scenes and screenshots, and luma artifacts from sharpening are more pronounced. The default settings also involve a lot of fringing, which blends temporally but introduces spiky artifacts in moving scenes.
If we return to NTSC anti-ringing, here is a nice example:
normal:
with anti-ringing:
Some advice on how to set up phase-related presets:
With 2-phase, NTSC adaptive sharpness works very well, since the ability to resolve dithering patterns isn’t affected.
With 3-phase there are more options, including NTSC resolution scaling together with the aforementioned sharpening.
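As a rough illustration only (parameter names vary between shader versions, so treat these as placeholders and copy the real names from the shader’s parameter menu), a 3-phase override block along those lines might look like this:

```
# placeholder names -- use the exact parameters your shader version exposes
ntsc_phase = "3.0"
ntsc_scale = "1.25"
```

A 2-phase variant would set the phase parameter to 2 and lean on the sharpening controls instead, per the advice above.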
Just some explanations until documentation is ready.
I’m not an expert. Whenever a CRT produces a sharp image through composite or RF with fewer artifacts and less blending, people usually mention “comb filters”. Example:
I use my presets in S-Video mode because, in general, I don’t like NTSC temporal artifacts. I do like the other NTSC color blending artifacts, though.
Ah, I see the difference in your example there. Great!
But I don’t mind the ringing in the normal picture that much. It’s very subtle and looks accurate to how NTSC works on a real CRT.
The problem I had with my presets was that increasing Gamma Correct or Bright Boost Dark Pixels exaggerated the ringing much beyond that, to the point of it being very distracting, and this new setting doesn’t work in that scenario.
Here’s a pretty bad example with some Gamma Correction and Bright Boost Dark Pixels with NTSC anti-ringing disabled:
And now with NTSC anti-ringing enabled:
There’s practically no change, so this problem is somewhere else in the way Gamma Correct is processed. But you already explained it a few posts back, so maybe it’s just intentional.
Regardless, from what I can see in your example, it’s a great new setting for those who want to slightly reduce the NTSC ringing. Thank you very much!
Hmm. I watched the raw video a few times, but it’s a bit hard to see. It looks to me like the vertical bars of the waterfall are perfectly blended, as in NTSC-Adaptive.
In NTSC-Adaptive, as soon as you raise the NTSC resolution scaling parameter by a single notch, the vertical bars in the waterfall start to be visible. And correct me if I’m wrong, but I was under the impression that any NTSC resolution scaling value above 1.0 is beyond spec, and inaccurate to real CRTs and consoles.
Now, I’m not saying it’s not possible. And I might be wrong about NTSC resolution. I’m just saying that in that particular video the columns seem perfectly blended to me. But it would be cool if there were a video or photo of a more close-up shot. Link me up if you find any!
What can cause some ringing, or rather wider color artifacts, is applying gamma linearization to a picture that went through YIQ conversion and was modified before being converted back to RGB.
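To make that order of operations concrete, here is a minimal C sketch of the pipeline: RGB to YIQ, a stand-in modification in YIQ space, back to RGB, and only then a gamma step. The FCC matrices are standard; the modification amounts and the 2.4 gamma are illustrative assumptions, not the shader’s actual values.

```c
#include <math.h>
#include <stdio.h>

/* Standard FCC RGB <-> YIQ matrices. */
static void rgb_to_yiq(const float in[3], float out[3]) {
    out[0] = 0.299f*in[0] + 0.587f*in[1] + 0.114f*in[2];
    out[1] = 0.596f*in[0] - 0.274f*in[1] - 0.322f*in[2];
    out[2] = 0.211f*in[0] - 0.523f*in[1] + 0.312f*in[2];
}
static void yiq_to_rgb(const float in[3], float out[3]) {
    out[0] = in[0] + 0.956f*in[1] + 0.621f*in[2];
    out[1] = in[0] - 0.272f*in[1] - 0.647f*in[2];
    out[2] = in[0] - 1.106f*in[1] + 1.703f*in[2];
}
static float clamp01(float x) { return fminf(fmaxf(x, 0.0f), 1.0f); }

int main(void) {
    float rgb[3] = { 0.95f, 0.20f, 0.15f };   /* a saturated pixel near an edge */
    float yiq[3], back[3];

    rgb_to_yiq(rgb, yiq);

    /* Stand-in for what an NTSC pass might do in YIQ space: chroma gets
       attenuated/bled and luma picks up a small sharpening overshoot.
       (Illustrative amounts only.) */
    yiq[1] *= 0.80f;
    yiq[2] *= 0.80f;
    yiq[0] += 0.05f;

    yiq_to_rgb(yiq, back);

    /* Converting back can land outside [0,1] (hence the clamp). Either way,
       the gamma-linearization step now runs on an already-shifted colour,
       which is the situation described above where the wider colour
       artifacts show up. */
    for (int c = 0; c < 3; c++) {
        float enc = clamp01(back[c]);
        printf("ch %d: original %.3f  modified %.3f  linearized %.3f\n",
               c, rgb[c], enc, powf(enc, 2.4f));
    }
    return 0;
}
```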
I was using this picture where it was really visible when trying to deal with that:
Could be, yeah.
There are vertical water-ripple patterns even on a fully blended waterfall, though. Other games that use solid blocks of dithered colors, such as Aladdin, would make it easier to see. More close-up shots would be nice, too. But you might be correct; it’s a bit hard to tell.
Also, maybe increased NTSC resolution scaling is within spec, because I’ve seen shots online where the waterfall is not dedithered on S-Video-modded Mega Drives, while S-Video mode in NTSC-Adaptive dedithers it completely. I’m trying to investigate that, but I found no documentation for the NTSC shader, and the code itself isn’t commented.
Looking good! Preset please!
I wish I could benefit from those low-brightness presets.
Could that be what’s happening with Gamma Correction and Bright Boost on this shader?
It’s more like blue is mixing with red and green, creating a brighter, less saturated, and more luma-intensive situation.
It’s quite NTSC-specific and can be avoided with some tweaking.
The newest versions are now in the official libretro slang-shaders repository. Older presets and versions are not included there, but you can still download them from the first post of the thread.
Big thanks to @hunterk for helping with this.
Also thanks to all participating/watching in this thread.
There are still some things on the TODO list… dunno 100% about the schedule.
We can’t use these videos as a reference, IMO, because the camera is adding quite a bit of blur.
Yes, almost all TVs in the 80s had some kind of analogue comb filter, and in the late 90s you started to see digital comb filters, which basically made composite indistinguishable from S-Video.
The NTSC shaders are emulating a raw NTSC signal, while most TVs had some kind of filter.
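For anyone wondering what a comb filter actually does: the NTSC colour subcarrier flips phase 180 degrees from one scanline to the next, so adding two adjacent lines cancels chroma and keeps luma, while subtracting them does the opposite. Here is a minimal two-line comb sketch in C; the names and the WIDTH constant are made up for illustration, not taken from any shader in this thread.

```c
#define WIDTH 640   /* made-up sample count per scanline */

/* Minimal 1-line (2-tap) comb filter sketch. Because the NTSC colour
   subcarrier flips phase 180 degrees from one scanline to the next,
   summing two adjacent lines cancels chroma and keeps luma, while
   differencing them keeps chroma and cancels luma. */
void comb_separate(const float line_above[WIDTH],
                   const float line_current[WIDTH],
                   float luma[WIDTH], float chroma[WIDTH])
{
    for (int x = 0; x < WIDTH; x++) {
        luma[x]   = 0.5f * (line_current[x] + line_above[x]); /* chroma cancels */
        chroma[x] = 0.5f * (line_current[x] - line_above[x]); /* luma cancels   */
    }
}

/* The cancellation only holds where the picture content is similar on both
   lines; where it is not, a 2-line comb produces dot crawl and hanging dots,
   which is why later sets moved to 3-line and 3D (frame-based) combs. */
```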
In any case, what we really need is an in-focus still shot of the waterfall. I don’t think there’s a single CRT that doesn’t blend the waterfall completely via composite.