Ok. I’ll take a look at everything later, I don’t even know what’s the next move I’m gonna make at the moment. Either stay with standard or go back to hd, I’ll see after I update my RetroArch and your shaders.
I want to have your baby!
Looks like I chose the right time to return to these forums.
Meant to ask the other day but now that I’ve started working on new settings I should ask now instead of later. With this change of Slot Mask Size settings to Slot Mask Thickness which number should be used for 1440p and 4k? I know 1 is good enough for 1080p.
I believe you’re still going to want to use thickness 1 even at 4k if you want to maintain realistic dimensions. Of course, anything goes if you’re going for an artistic interpretation, but if you want realistic dimensions then a thickness of 2 would require that each triad be 6 pixels tall. So 16x vertical scale for a mere 2 triads per scanline (which is very grainy, like a 9" TV or something, I dunno).
If realism for most slotmask displays is important, then it’s thickness 1.0 all the way up to 4k, with 4k giving the most realistic proportions. But there are artistic interpretations and special cases, like the Commodore 1084: a thickness of 2.0 might work even at 1080p, maybe not with full-strength slot masks, but somewhat mitigated.
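To put rough numbers on the thickness discussion above, here’s a quick back-of-the-envelope sketch. It assumes a slot-mask triad is 3 × thickness pixels tall (as the “thickness of 2 would require that each triad be 6 pixels tall” example implies); this is my own arithmetic, not taken from the actual shader source:

```python
# ASSUMPTION: triad height = 3 * thickness (thickness 2 -> 6-pixel triads,
# as described above); illustrative only, not the shader's real logic.
def triad_rows(screen_height, thickness):
    triad_height = 3 * thickness
    return screen_height // triad_height

for name, h in [("1080p", 1080), ("1440p", 1440), ("4k", 2160)]:
    print(name, "thickness 1:", triad_rows(h, 1), "rows;",
          "thickness 2:", triad_rows(h, 2), "rows")
# 4k at thickness 2 gives the same 360 triad rows as 1080p at thickness 1,
# which is why thickness 1 keeps the most realistic proportions even at 4k.
```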
A slotmask width of 4 really only works well with 4-pixel masks (like RGBX, or Red-Yellow-Cyan-Blue). I don’t know if it would look good (probably not), but an additional 4-pixel mask that just darkens every 4th pixel may also be a good option (so, mask 8 but darkening every 4th instead of every 3rd pixel).
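Just to visualize the “darken every 4th instead of every 3rd pixel” idea, a tiny sketch (the pattern strings and the helper are mine, for illustration only, not the shader’s actual mask code):

```python
# 1-D mask cadence over 12 horizontal pixels: "1" = full, "d" = darkened.
# A 3-pixel cadence darkens every 3rd pixel; the variant suggested above
# keeps a 4-pixel cadence (e.g. RGBX) and darkens every 4th pixel instead.
def mask(width, period):
    # darken the last pixel of each `period`-pixel group
    return ["d" if (x % period) == period - 1 else "1" for x in range(width)]

print("".join(mask(12, 3)))  # 11d11d11d11d
print("".join(mask(12, 4)))  # 111d111d111d
```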
I see. So Slot Mask Thickness set to 1 for all resolution types (1080p, 1440p and 4k) like nesguy said. Ok got it, time to proceed and continue putting together my next update, when I get home later that is lol. Thank you to both of you.
I wonder what the GPU requirements for your shader will be at higher resolutions (8k), adding mask filtering and using NTSC passes on top of that?
The “accurate” emulation of masks comes at a price. Is it really necessary to focus so much on masks, knowing that as soon as there is motion everything becomes blurry? Or are there other aspects/effects you would like to implement that you think are more important?
In summary, what is your opinion on the future of CRT shaders?
In a perfect world, good CRT emulation would require good and adequate display technology. I’m cheering for displays without subpixels, where independent pixels could have any color.
Resolution is another issue, and I’m more fond of 6k displays as the next step than 8k. The reasons are simple: the required processing power would be quite a bit lower, and they could stay within reasonable pricing and physical size margins while still providing the ‘extra’ over 4k. Not to mention fewer signal bandwidth problems and higher refresh rates.
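The pixel math behind the “processing power would be quite a bit lower” point, assuming 16:9 panels and taking 6k as 5760x3240 (actual “6k” resolutions vary by vendor, so that size is an assumption):

```python
# Pixel counts relative to 4k (assumed 16:9 resolutions).
pixels = {
    "4k": 3840 * 2160,
    "6k": 5760 * 3240,   # assumption: one common 16:9 "6k" size
    "8k": 7680 * 4320,
}
for name, count in pixels.items():
    print(f"{name}: {count / pixels['4k']:.2f}x the pixels of 4k")
# 6k -> 2.25x, 8k -> 4.00x: per-pixel shader cost scales accordingly.
```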
Last but not least are good color reproduction, low lag and, most importantly, superior response times, which would eliminate the need for BFI.
With such superior displays, existing shaders would have no trouble looking much better and more authentic.
What could also be important is the evolution of low-level signal and connection-type emulation, though I’m quite happy with the efforts made so far.
8k is still years away, but I don’t like the idea of emulating a 14″ display on an 80″ home cinema TV and selling it as the best solution for emulation lol. Processing requirements would be high; I guess an RTX 2060 could manage vanilla guest-advanced, but it looks bad even for incoming Zen 4 APUs.
Hey, thank you for your detailed answer, yup I agree.
Hope you’re seeing the effects of various forms of filtering on the shape of the square dots, which is what I was alluding to in our recent conversation about how to make shadow mask “phosphors” appear round instead of like squares without requiring additional resolution. @Nesguy
Insufficient display resolution is a permanent issue here. You can even do some testing by halving your display’s desktop resolution and trying to run some mask setups.
Hello @guest and greetings to all the community. I need some advice from you: if you wanted to use an antialiasing shader before your shader, which one would you recommend based on your experience? Also, for this kind of idea, is it better to use the HD version or the Advanced one? Thank you very much for helping.
FXAA is a great console AA shader; maybe you would want SMAA, without the CRT effect, for PC games.
With higher-resolution content, the later HD versions are very nice; you can use the ‘HD’ one without much thought.
Thanks for the advice. I tried applying FXAA and then your shader, I tried both HD and Advanced, and it gives me this result:
Is there a different, more correct way to apply it? Do I need to change any settings in your shader?
You really don’t need an AA shader with standard-resolution content. It’s also smearing the image in a noticeable way.
It gives the best results with 3D games beyond 400p.
If you want sprite AA, best use xBR(Z) or ScaleFX…
Thank you very much for your advice, very kind as always