Thanks for the reply. Peak is set to 580 per RTINGS. I can set paper white to 1000 and it still doesn’t make much of a difference.
So when the mask is “minimal”, that is, with the output resolution turned down to 1080p and the CRT TV Resolution set to 800 or 1000 lines, the brightness is almost excessive, so HDR is being enabled correctly.
I will take some pictures and repost.
To be clear, I am using the latest version of the preset and the recommended underlying shader (both Megatron and Megatron NX were tried).
You would probably need to have your refresh rate set to 60Hz and not 120Hz.
You have to use “Computer” picture display mode.
You need to enable “Full UHD Color”.
You also have to be using HDMI Input 1, 2, 3 or 4, not Input 5.
Also, make sure your HDMI cable is rated for HDMI 2.1 Ultra High Speed 48Gbps to be safe.
You can use all of the presets like this. HDR is not a requirement.
This is a bit vague, so some photos might help me to understand better. Do note that your refresh rate needs to be 60Hz and you need to enable Full UHD Color, as well as be using the “Computer” Picture Mode, for everything to work as it should.
You also need to enable HDR in RetroArch Video Settings as well as in the Shader Parameters of the Preset.
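For reference, the RetroArch side of that can also be checked directly in retroarch.cfg. This is just a rough sketch from memory, so the key names and the numbers below are placeholders that may differ on your build, and the Megatron’s own SDR/HDR Shader Parameter still has to be switched separately in the preset itself:

```
# Hypothetical retroarch.cfg excerpt - key names/values may vary by RetroArch version
video_hdr_enable = "true"                   # Settings > Video > HDR > Enable HDR
video_hdr_max_nits = "580.000000"           # your display's measured peak brightness
video_hdr_paper_white_nits = "200.000000"   # adjust to taste
```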
Your conclusion is probably a bit premature. Call it a theory or hypothesis instead at this point, while there’s still a lot more experimentation, testing and variables to be factored in. What I’d say is that you’re getting closer to the root of the issue and you’ve made a lot of progress so far.
I wouldn’t worry too much about this. These Masks are precise down to the RGB subpixel; that’s why any chroma compression destroys them. Rotated Masks/Shadow Masks and two pixel Masks like the ones used for the higher TVLs wouldn’t suffer as much from this.
On most TVs but not all. Some TVs can get just as bright or almost as bright in SDR mode compared to HDR mode.
I think with enough patience and understanding you can get it to work with your current TV.
No problem, I want as many users to be able to enjoy these things as possible.
Can I get some information on how pixel density affects the shaders? I am considering a 1440p 24" monitor as I love the clarity and small screen size for web/regular gaming, but for retro gaming I am wondering how the high PPI will affect shaders. Will there be any difference from a more standard 1440p 27"? Will the scaling/scanline size/etc. be affected? This is pretty niche, so if there is a difference I assume there are no presets or guides for it.
Pixel density contributes to the CRT Shader experience but you still need raw resolution. 4K really is where it’s at now but 1440p can suffice if you’re not using an OLED display because there still remain some subpixel alignment issues with respect to 1440p and OLED which have been more or less solved at 4K.
So beware of 1440p OLED for accurate CRT Emulation.
PPI shouldn’t affect scaling, but all features, including scanlines, would naturally be denser as a result of the denser display.
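As a quick illustration (just simple arithmetic, assuming an integer-scaled 240-line source): 1440 / 240 = 6, so a 240p game gets the same 6x integer scale on any 1440p panel, whether it’s 24" or 27". What changes with PPI is only the physical size of each scanline, since the smaller panel packs the same 1440 rows into less glass.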
With a 27" 4K or even a 32" 4K you should be in for a real treat, especially if you get a very bright HDR compatible display.
1080p and 1440p can also work but I’m biased towards 4K because that’s what I have the most experience with.
Thanks, I understand 4K is optimal, but my PC cannot run 4K for standard gaming and I can’t afford a second expensive display just for retro. I’m going from my current 1080p 24" to the same size screen but in 1440p. If I understand correctly, you mean the scanlines would be denser, or appear thinner, on screen than on a standard 1440p 27", but the scaling of each line per integer-scaled game pixel would be the same? And does HDR really improve the visuals of the filter? I was under the assumption it was just about the brightness. Is that mostly important just for 4K displays?
Hi, just now getting my feet wet with these more complex CRT shaders and I believe I’ve found some good settings using the NX 4K HDR pack. However, I’d also like to experiment with some different bezel overlays. Unfortunately I discovered that the NX shader is applied to any bezels I activate through the “on-screen overlay” menu in RetroArch, messing with both the color and the brightness of the bezel image. Lowering the opacity of the overlay kind of helps, but it still doesn’t look great.
I’ve found several recommendations that involve injecting, for example, Duimon’s bezels into one of the Mega Bezel presets, but I can’t get that method to work with the NX presets. Am I just using the wrong syntax? Or is there a better way of doing this? Duimon’s bezels look pretty slick but I’d be fine using a simpler PNG bezel instead if it’s just not possible to combine them with the NX pack.
(For the record, I don’t think my PC is powerful enough to run the Mega Bezel shaders instead. Testing it made my framerate slow to a crawl, probably because I don’t have a discrete GPU.)
Hello, I recently came across your shaders from a video on YouTube, but I’m a bit confused by the multiple versions. I have a Mini LED TCL TV with high brightness and I want to make use of your HDR presets but also use the reflective bezels. Which packs exactly do I need to download to make use of them?
This makes everything you would like to do all the more difficult.
Please list the specs of the PC.
Yes, this is unfortunate. Those bezels you’re trying were probably never designed with HDR in mind, so that might be your problem there. I just uploaded a new pack with some HDR Ready Mega Bezel presets, but alas, your computer might not be able to handle them.
Maybe try my CyberLab Megatron NX W420M Shader Preset Pack and keep everything in SDR in RetroArch then maybe use Windows 11 AutoHDR to apply HDR properly to everything in order to boost the brightness.
Post some photos of the issue. It helps.
Maybe you can use Duimon’s preset with the Sony Megatron Base Preset in Mega Bezel Reflection Shader.
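In case the syntax is the sticking point, these injected presets are normally just a small wrapper .slangp that #references a base preset and then overrides parameters or texture paths. A rough, hypothetical sketch (the file paths and the BackgroundImage texture name below are illustrative from memory and depend on which Mega Bezel graphics layer you’re replacing, so point them at the real files in your setup):

```
# my-megatron-with-bezel.slangp - hypothetical wrapper preset
# Reference a Mega Bezel base preset (ideally one of its Megatron variants)
#reference "shaders_slang/bezel/Mega_Bezel/Presets/SOME_MEGATRON_BASE_PRESET.slangp"

# Override the bezel/background graphic with the PNG you want to use
BackgroundImage = "graphics/my_bezel.png"
```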
By the way, to my understanding, Shaders aren’t applied to regular Overlays. Maybe you’re referring to the effect of HDR when you enable it.
I have a 65 inch TCL C935. Thanks for those HDR presets, I will try them in a bit. Do I need a specific version of the Mega Bezel Reflection Shader or can I use the latest version?
Sorry, I’m probably asking stupid questions, but it says to use the latest matching HSM Mega Bezel shader version and I can’t figure out how to check which version of those shaders your presets use. I’m currently using the latest version and it seems to be working fine though?
Nah, it’s okay. It’s not stupid, but I do sense a lack of passion, determination and will from some. Not everyone is the same, but it saddens me when I feel like people don’t even try or don’t try hard enough.
I strongly suggest you take some time to read through the first post and at least browse the thread if you want to make the most of the potential of these preset packs as there is a slight learning curve.
Okay
Well what exactly do you define as working? Are you sure that what you’re seeing is even remotely close to how those presets are supposed to look? If you do then I guess ignorance is bliss and you obviously can’t miss what you’ve never seen.
Micro-rant aside (it’s not to be taken personally), feel free to ask questions after reading through the first post and the admittedly outdated readme.txt files, and I’ll be happy to answer as many questions as I can, as all I would like is for others to experience the same joy, awe and nostalgia that I experience.
I’m using SDR, I haven’t tested HDR yet.
But I was too lazy to edit the HDR preset.
But it looks perfect in SDR too.
I’m afraid of leaving it in HDR because of burn-in.
But right now I’m testing on the Sony Wega.
Isn’t that considered cheating? You should do some comparisons with the Shaders set to off.
By the way what graphics card are you using? Also, your Mega Bezel version looks old. How come you haven’t updated to v1.14 yet?
That is a valid concern, especially if you’re not mixing your content. @hunterk created a shader which shifts the image down and up by 1 scanline so that there wouldn’t be uneven wear caused by the scanline pattern on OLED TVs.
It doesn’t seem to work properly with Sony Megatron Color Video Monitor yet, but it would be a nice feature to have built in and to continue being worked on.
In the meantime, enjoy your redundant shader tests…lol. Maybe you can try turning off the Mask and Scanlines. I can understand using shaders on a CRT Monitor but not a standard definition consumer TV because the TV and the Shaders are doing a lot of the same things.
Hello Cyber. I don’t have anything important to say other than to thank you for your hard work. I just discovered this 2 days ago and spent like 2 hours looking into the documentation and settings.
Can’t wait to get back home on Sunday to try it. Cheers!