I haven't changed this parameter and have kept the original 200.00. If there is an ideal proportional relationship between these two parameters, I'd like to know.
Dude, you obviously completely missed all of the Sony Megatron Color Video setup instructions. Tip: They’re in the Shader Parameters. They’re also in the first post, I think. They should also be in the GitHub Repo.
I also have some setup instructions in the readme file of my preset pack.
Anyway, once you set your display’s Peak Luminance Value, you then need to set your Paper White Luminance to whatever looks best to you.
There is no magic proportion. Just use your eyes. You can use your favourite games, a plain white background and/or the 240p Test Suite to assist with getting this right, since this setting has a significant effect on the overall brightness of the Shader on your display.
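If it helps to see where these values live, here is a minimal sketch of how they show up as overrides in a saved .slangp preset. I'm assuming the parameter names hcrt_max_nits and hcrt_paper_white_nits here; check your own saved preset for the exact names your version of the shader uses:

# assumed parameter names; verify against your own saved preset
hcrt_max_nits = "700.000000"
hcrt_paper_white_nits = "200.000000"

The first should match your display's measured peak luminance; the second is the one you tune by eye.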
@RetroCrisis, we really need a Sony Megatron Color Video Monitor video dedicated to the 3 most important setup steps after installing the Shader.
- Turning HDR On in RetroArch if using Vulkan, or On in both RetroArch and Windows if using D3D. (A rough retroarch.cfg sketch follows this list.)
- Locating the Peak Luminance value according to RTINGS, other display reviews, or the display's spec sheet.
- Setting the Paper White value to whatever simply looks best to the user on their particular display, possibly using their favorite games, a white screen and/or the 240p Test Suite to assist.
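For reference, here is roughly what the first step corresponds to in retroarch.cfg, assuming these config keys haven't changed in your RetroArch build; toggling the same options in the menu (under Settings > Video, in the HDR section) does the same thing:

# assumed retroarch.cfg keys; verify against your own config
video_driver = "vulkan"
video_hdr_enable = "true"

With D3D11/D3D12, the Windows HDR toggle needs to be on as well.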
I have a strong feeling that a significant portion of users might be failing on the first few steps and missing out on this awesome Shader.
Edit:
Step 0. Update RetroArch and all Slang Shaders via the Online Updater. With a warning to manually reapply Shaders which may be newer than the RetroArch Online Updater provides.
This is probably affected by the display's tonemapping, and possibly by the max luminance value in the EDID/Windows as well. I will have to look into this more at some point.
For now, I would suggest trying to get things looking good with Peak Luminance at 560, and falling back to 1000 if you can't get a good result.
Interesting. I’m guessing this is another Windows 10 vs 11 thing.
Yep, and different people are more sensitive to different things.
Well, I made some changes based on your emphatic comment. This actually improves things a little. Thanks Cyber!
Sometimes I want to understand things a little more technically. Sorry!
I ended up not touching what is strictly optional (user taste). I was under the impression that one parameter compensated for the other, so I thought I could just control the Peak Luminance parameter. I was naive and not curious enough.
And yes, I believe there are several people failing to experience Megatron's potential, judging by how often I read comments of disappointment with the results obtained here.
This might need a bit more prominence.
Maybe even a heads-up about what users will see and have to deal with once inside the Shader Parameters?
Hey guys!
I don’t think I’m going to post any new discoveries, but I noticed a few things in my tests with Megatron. I would like to better understand what is happening here. Recommendations are very welcome.
I notice there is a correlation with these three parameters outlined in green:
I can set the Display's Resolution to 4K on a 1080p monitor and get results similar to what the 1080p setting gives me, if I also choose a higher TVL. Is this behavior expected?
But one thing caught my attention the most. Should I or should I not use Integer Scale in the RetroArch settings (Settings/Video/Scaling/Integer Scale)?
In this test I used a 1080p monitor. I also tested the 4:3, Full and Core Provided aspect ratios.
It seems to me that, by activating Integer Scale, I get better uniformity, with darker horizontal lines separating the pixels. I'm not sure which of the two results is what I should expect from the shader.
I can’t take screenshots in HDR mode. But in these images I took with my cell phone camera, you might notice the difference in the white logo. Please compare these two images:
Integer OFF:
Integer ON:
I’m sorry for the bad images. But in any case, if you want to reproduce the scenario, I used these parameters:
Display's Resolution: 1080p
Screen Type: aperture grille
Resolution: 800 TVL
Preset: crt-sony-megatron-viewsonic-a90f+-hdr
You can also try other combinations, but it seems like the horizontal lines will always be there, in most cases.
Expected behavior. The selected TVL will be (approximately) correct if Display’s Resolution matches your actual resolution. Lowering the Display’s Resolution setting doubles the approximate TVL, increasing the Display’s Resolution setting halves it.
You should use Integer Scaling as your default. There are exceptions where you will want to use custom scaling values, but they are rare unless bsnes is your SupFam core of choice.
Those are scanlines. You could get rid of them by editing a saved preset to read
scale_type_yX = "source"
scale_yX = "9.000000"
under the shaderX = "shaders_slang/hdr/shaders/crt-sony-megatron-source-pass.slang" line, for 4K (use scale_yX = "4.000000" for 1080p), but I would sincerely question why you are using a CRT shader at that point.
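To make that concrete, here is a rough sketch of how that block might look in a saved 4K preset. The pass number 3 is just a placeholder; use whichever index the source pass actually has in your own saved preset:

# "3" is a placeholder pass index for illustration only
shader3 = "shaders_slang/hdr/shaders/crt-sony-megatron-source-pass.slang"
scale_type_y3 = "source"
scale_y3 = "9.000000"

At 1080p you would use scale_y3 = "4.000000" instead.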
Also, I would strongly recommend you try prepending an NTSC shader for at least gen 1-4 console games. For gen 5+ console games (PS1/Saturn/N64/etc.), that clean RGB/monitor look has more of a place, but before that an NTSC shader generally gives the image a softer, more "finished" look in my opinion, though it can look kind of blurry rather than soft at 1080p compared to 4K.
Those lines in the first shot may indeed be because of the non-integer scaling, though it looks quite severe. The last time I noted this, Major Pain said he had trouble seeing it with low resolution content like NES. Here are two crops from shots I took with integer scaling on and off, using the Viewsonic SDR preset at 1080p and resized 2x; the lines are not quite even in the second example.
The downside of integer scaling may be that you get larger borders; this depends on the core/content. You can try Integer Scale plus the Integer Overscale option; in most cases it should not cut off too much for consoles like the NES.
There should be no difference between the 600/800/1000 TVL settings at 1080p. This is because the TVL is calculated from the pixel size of the mask, and the minimum is 2 pixels. Therefore the best you can get is 1080/2 = 540, which is the actual number. If you choose 300 TVL it will use a 4-pixel mask = 270 TVL. To get a 3-pixel mask, choose the 4K resolution and 800 TVL settings.
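To put rough numbers on it (effective TVL ≈ screen's vertical pixel count ÷ mask width in pixels):

1080p screen: 2 px mask ≈ 540 TVL, 3 px ≈ 360 TVL, 4 px ≈ 270 TVL
4K screen: 2 px mask ≈ 1080 TVL, 3 px ≈ 720 TVL, 4 px ≈ 540 TVL

These are approximations derived from the calculation above, not measured values.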
Thanks to @Azurfel and @Jamirus for their contributions.
I don't know if I understood this instruction correctly, sorry. I saved a preset with the modifications I made and added these parameters, and nothing changed. Is this what you meant?
Again, I don't think I understand. Shouldn't I be using a CRT-type shader?
Regarding this, I have to confess: I never liked the blurry look of NTSC shaders. I don't understand why many people like to use them, other than for the nostalgia factor, because even back in the day they didn't look good to me. Unless I just don't know how to use them correctly. If you could send me a ready-made preset to test, it might also help me understand what you are trying to show me.
This solves it. Great!
Thank you very much!
None of my presets use any of Sony Megatron Color Video Monitor’s “code” but I do have presets I have customized for use with Sony Megatron Color Video Monitor.
Is there any reason why HDR would work properly in Windows using an Intel IGP but refuse to even show up as an option in RetroArch?
I’m posting this here for some more visibility. Maybe someone here can test and confirm if Sony Megatron Color Video Monitor works on their Intel IGP in HDR mode.
I'm using a small mini-PC with Intel UHD graphics and an LG monitor that does support HDR, and I'm able to enable HDR in the Windows display preferences, but I'm not seeing HDR available in the Video settings within RetroArch. With the Nvidia cards I've used in the past, it was always present and I don't recall having to do anything special to enable it. I've tried the Vulkan and D3D drivers in RetroArch but it didn't make any difference. Does HDR only work with Radeon/Nvidia chipsets/d…
RetroArch doesn't take screenshots in HDR mode? I get black screenshots!
Is there a way to apply a curved screen effect with these shaders?
I wanna know something: every single config has given me a pink tint over the whole image with Megatron. I wanna know why.
That would actually be fantastic. I tried using other shaders as well, but it ended up messing up the overall look.
RetroArch doesn't take screenshots in HDR mode?
Maybe you can try this method instead:
Here are the first HDR Screenshots of CyberLab Megatron 4K HDR Game SNES S-Video Smooth.slangp in action! https://mega.nz/file/FExXyKbQ#HBKrAf5FTLU8fng-bjOIXjMU6rlYm9etJu28RKEmAsw https://mega.nz/file/wZhQxIZS#FSJ4CNdmLSflMfMoCmhAcjf2q9w6FA1WUhVY8gIb5HM I used nVIDIA Shadow Play to take these but there are other methods available.
For anyone having trouble using LG OLED panels, specifically in regards to brightness, I figured out the optimal settings to get a very bright and vivid picture even with black frame insertion on.
Windows 11: Turn on HDR and Auto HDR. I also like to use the SDR-to-HDR brightness slider and crank it to the max, but this is not necessary; I just like using Windows in HDR. Make sure to calibrate using the Windows HDR Calibration app.
RetroArch: Driver: Vulkan (VERY important: if you have it set to GL, HDR won't work. D3D12 also works, but not with all cores; for example, you can't use GLideN64 with D3D12). Video: Integer Scale on. I also like to use Integer Overscale, as this makes the picture bigger without sacrificing much of the image. And obviously HDR On.
Shader settings (I like the Megatron default shader): Paper White: 800 (by default this is set to 200; this was what had the biggest effect on brightness for me). Peak brightness: 800. I have an LG C1; if you have a newer panel you can probably set both of those numbers to 1000 or higher, based on the MaxCLL for your panel, which you can find on multiple websites such as RTINGS. Under the CRT settings I bring saturation, brightness, and contrast down to 0 (if not already done so by default). Sub-pixel layout: WRGB. Gamma: adjust to taste (I like 2.4). You can also play around with the TVL and beam size settings.
That fixed the brightness for me, and actually the picture is very bright even in a decently lit room.
LG OLED settings (especially if you're interested in better motion clarity): For black frame insertion you can either use RetroArch's internal BFI under the Latency settings, or you can use OLED Motion Pro. It doesn't really matter which picture mode you're in as long as you have Peak Brightness set to maximum. I leave Brightness (black level) at 50, and I set the colour temperature to Warm50. Dynamic Tone Mapping: On (in most cases this is brighter; sometimes HGiG looks better, so you can play around with this and see what you like better).
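If you prefer RetroArch's internal BFI and want it set in a config file, this is roughly the retroarch.cfg entry; whether it takes a true/false value or a black-frame count depends on your RetroArch version, so treat this as a sketch and verify against your own config:

# older builds treat this as true/false, newer builds as a black-frame count
video_black_frame_insertion = "1"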
Hope this helps anyone else! This shader is so good, especially with BFI, that I find I really don't have to use my RGB-modded Trinitrons any longer; especially from afar, I really can't tell a difference/they are really close.
If anyone figures out how to append a curved CRT effect to the shader, let me know.