I’d be very appreciative if you ever decide to do that. As someone who’s gone back and forth between 480p and 720p in PCSX2, without that native stuff for comparison, I feel like I can see arguments for both being more accurate, but obviously it’s all assumptions I’m making by feel.
On the one hand, 720p sometimes looks a little too pristine, to where I’m like ‘Surely this game at 480i/p on native hardware doesn’t look this clean?’, but on the other, running at 480p, some games make me think ‘This image looks too rough in a way that doesn’t feel right’. It varies from game to game; I’ve played some fighting games at 480p and thought ‘This seems fine’, but then I play something like Burnout 3 and it can be kinda hard to make out oncoming traffic properly because everything’s so low-res looking.
Obviously it varies from display to display and signal to signal, but I know a lot of people say a CRT noticeably helps a lot of 480 games with an almost natural, built-in anti-aliasing, which I don’t feel like shaders provide much of at that resolution. For text and stuff, yes, but not really the 3D assets. But if you boost the internal resolution a bit while keeping the shader set up as it would be for 480p content, brute-forcing more pixels obviously means less aliasing, so I can see how, in theory, that might more closely mirror how the real hardware looks, even if the two resolutions aren’t the same.
I wouldn’t even NEED it to be a slot mask you use with the shader; the mask won’t really affect that, so if an aperture grille or something’s more convenient then no worries. All the test would be is having that set up, going back and forth between 480 and 720, then looking over at the real deal and seeing which is the nearer match.
Also, damn it, I keep forgetting to post these as replies and having to delete my posts.