I was struggling to get a PC source port of Zelda: Ocarina of Time (Ship of Harkinian) working with this shader via ReShade. It uses DX11, but in the end the AutoHDR add-on caused the game to crash on startup. I managed to get HDR functioning via Special K and the shader seems to be playing along decently. Are there any known issues with this kind of implementation that I should be aware of?
So Megatron/ReShade AutoHDR and Special K aren’t (inherently) doing the same thing with HDR.
At least with more modern games, Special K is recovering and presenting the native/true HDR information that a game was rendered with internally before it was converted into SDR for the final presentation.
Megatron and ReShade AutoHDR are instead converting the SDR image directly into an HDR space and then cranking the brightness to 11 to offset the brightness lost as a result of using a full strength phosphor mask.
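In very rough terms, that conversion amounts to something like the sketch below (a toy illustration of the general idea only, not either tool's actual code; the paper-white and mask-loss numbers are made-up placeholders):

```python
import numpy as np

def sdr_to_hdr_scrgb(srgb, paper_white_nits=200.0, mask_loss=0.5):
    """Toy sketch of the SDR->HDR boost approach.

    srgb: float array in [0, 1], the sRGB-encoded SDR frame.
    paper_white_nits: target brightness for SDR "white" (placeholder value).
    mask_loss: fraction of light a full-strength phosphor mask blocks
               (placeholder; the real figure depends on the mask and TVL).
    """
    # Decode sRGB to linear light (piecewise sRGB EOTF).
    linear = np.where(srgb <= 0.04045,
                      srgb / 12.92,
                      ((srgb + 0.055) / 1.055) ** 2.4)
    # In scRGB, 1.0 = 80 nits, so scale "white" up to paper white, then
    # boost further to offset the light lost to the phosphor mask.
    return linear * (paper_white_nits / 80.0) / (1.0 - mask_loss)
```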
But I suspect you can set up Special K to do what Megatron is doing, given how ludicrously versatile it is? And that may even be what it does by default for games that aren’t internally rendered in HDR? (Assuming more recent versions of Harkinian aren’t rendering in HDR internally? I haven’t looked at what has been added for a bit now.)
Actually, stopping to think on it, is Win11 AutoHDR even fit for purpose?
Like, it’s a machine learning “AI” based black box attempting to mimic what a native HDR presentation of the material in question would look like, rather notoriously trained on the presumption that all games utilize the same realist “AAA” art style (or at least trained with those sorts of games in mind as its primary use case).
Compared to AutoHDR, one could (probably) get more accurate results by directly displaying the SDR image in HDR, using the “HDR/SDR brightness balance” slider as our paper white (with registry edits for higher values), tho that would also clamp the colors to Rec.709, and the piecewise sRGB decode would still have to be compensated for somehow.
I know that the current color profile based solutions wouldn’t work for correcting the piecewise sRGB decode, as they also affect the final HDR image.
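For anyone unfamiliar with the piecewise decode problem: the sRGB curve and the pure 2.2 power curve a real CRT approximated disagree most in the shadows, so 2.2-mastered content decoded as piecewise sRGB comes out with lifted blacks. A quick illustration:

```python
def srgb_piecewise_decode(v):
    """The piecewise sRGB EOTF (what an sRGB decode applies)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma_22_decode(v):
    """Pure power-law 2.2 decode (closer to a real CRT's response)."""
    return v ** 2.2

# The curves diverge most in the shadows, e.g. at a 10% signal:
print(srgb_piecewise_decode(0.1))  # ~0.0100
print(gamma_22_decode(0.1))        # ~0.0063 -- noticeably darker
```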
If Special K can be set up to use HDR solely to boost the brightness of the original SDR image, that may actually be the best course. I’ll have to look into this some more…
Blimey, is it really using machine learning? I would’ve thought it’d just use a calibration screen on startup and be done with it. I mean, yes, you might want to get a more artistic look, but then it doesn’t know the intention, especially for any new games. Sounds a little over-engineered to me, like bored developers trying to jump on the A.I. bandwagon. I suppose there isn’t a good solution, so why not. 🤷 (Except for the GPU cost and expense of electricity.)
How does Special K know where to inject (the HDR tonemapper) and what to skip/replace? Lots of games wrap/embed tonemapping into the final upscaler, for instance.
Yep, part of Microsoft’s desperate search for a proper nail to go with their fancy new AI hammer.
In fairness to them, AutoHDR does by default enforce a whitelist of games it has been (hypothetically) tuned for. It’s a portion of the community that has decided to force it more generally.
I’m not sure right off. Probably best to go straight to the source and search/ask on the Special K Discord. I suspect they would be pleased to have you around in general, given that both Megatron and your “AutoHDR” ReShade plugin come up for discussion on occasion, and they take a certain pride in being a one-stop shop for HDR discussion. (Tho personally I still have very mixed feelings at best about Discord overtaking forums for these sorts of discussions.)
Yes, I have a big problem with this as well. It’s all down the memory hole these days. To be fair, Discord is to IRC what Reddit is to forums, but Reddit is becoming harder to search now, too.
Is it currently possible to have Slot Mask and Shadow (Dot) Mask presets that look like this in Sony Megatron Color Video Monitor (not including the overlay/bezel/background)?
These all look immaculate on my WOLED TV in terms of CRT Mask structure.
I would really like to be able to do these using Sony Megatron Color Video Monitor.
Set TVL to 600, all three Scanline Min settings to 1.00, and all three Scanline Max settings to either 1.00 or 1.50.
You can also try Mins at 1.50 with the Maxs at either 1.50 or 2.00, but that will probably be a bit more diffuse and bloomy than you are looking for.
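If it helps to picture what Min/Max are doing: conceptually (a simplified sketch of the general technique, not Megatron’s actual code), the beam widens from Min toward Max as the scanline gets brighter, which is why higher Max values read as more diffuse and bloomy:

```python
def beam_width(luminance, width_min=1.0, width_max=1.5):
    """Sketch: scanline beam width grows with brightness.

    luminance: 0..1 brightness of the scanline at this point.
    Returns a width in (sub)pixels interpolated between Min and Max.
    """
    return width_min + (width_max - width_min) * luminance

# With Min = Max the width is constant; with Max = 2.0 the brightest
# parts of the image bloom out to twice the width of the dark areas.
```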
They all look like the masks that the Sony Megatron uses, to be honest. What exactly are you missing? Can you make a close-up shot of the particular thing and describe what you’re missing or can’t achieve? Just zoom in on your phone and take screenshots; that’s all I need. What @Azurfel said should get you the shape of the scanlines.
Yes I should. When I did the ReShade plugin I actually got a message from the SK author about updates to the plugin. All were good suggestions, if only I had the time. I do have a plan to improve upon all of this in my current project: you will be able to create bespoke HDR setups using the game’s internal linear-space lighting and have that play nicely with the game’s post-processing. But it’s not going to be automated/a one-stop shop for all games unless there’s some secret sauce I’m missing. Let’s see where I get to, though.
Greetings and Happy New Year to you and all.
There’s a long-standing issue when Deconvergence is turned on that affects the scanlines negatively: the slot ends up looking a bit crooked, sitting about one subpixel higher across the red phosphors than across the green and blue ones. In screenshots/close-ups, I can see a pulse-wave-like shape to the scanlines when Deconvergence is on. With Deconvergence off, the scanlines look straight again.
I mentioned this here before the focus shifted to the colourspace and transform improvements.
300TVL RWBG Slot Mask - Deconvergence Off
300TVL RWBG Slot Mask - Deconvergence On
Please view/analyse the images in the crosslinked posts as well.
What I’ve observed is that the Slot Mask example from my Mega Bezel Preset which uses CRT-Guest-Advanced, has a height of 5 subpixels.
The one in the example that @kokoko3k posted appears to have a height of 6 subpixels.
While the one in my Sony Megatron Color Video Monitor 300TVL example has a height of 7 subpixels.
I would like to be able to customize at least the exact height of the Slot Mask in a granular fashion, possibly down to the subpixel level.
I can understand using fixed/rigid widths for simplicity and as you’d notice in all 3 examples the width seems identical.
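To illustrate the kind of granular control I mean, here is a toy sketch (hypothetical parameters, nothing from the actual shader) of a slot mask tile parameterized by its height in subpixels:

```python
import numpy as np

def slot_mask_tile(height=6, slot_width=3, gap_rows=1):
    """Toy sketch: one RGB slot mask tile, `height` subpixels tall.

    Two slots side by side, staggered vertically by half the tile
    height, with `gap_rows` dark rows forming the horizontal gaps.
    """
    tile = np.zeros((height, slot_width * 2, 3))
    phosphors = np.eye(3)  # unit vectors for the R, G, B channels
    for x in range(slot_width * 2):
        tile[:, x] = phosphors[x % 3]  # repeating R, G, B columns
    # Carve the horizontal gaps: left slot's gap at the top, right
    # slot's gap halfway down (the classic staggered brick pattern).
    tile[:gap_rows, :slot_width] = 0.0
    half = height // 2
    tile[half:half + gap_rows, slot_width:] = 0.0
    return tile
```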
I don’t want to bite off more than I can chew at the moment, but what I would say about the Shadow (Dot) Mask is that at lower TVLs it produces some strange-looking pattern artifacts. What might help is a way to rotate/stagger the subpixel colours to assist with alignment on different display types.
At higher/finer TVLs it isn’t as much of a problem because the dots are so fine that they look more like a subtle texture than RGB phosphor triads in a triangular arrangement.
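Something like this is what I mean by rotating/staggering the subpixel colours (a hypothetical sketch, not an existing option):

```python
def dot_mask_phosphor(x, y, triad_height=3, rotate_per_row=1):
    """Toy sketch: pick the phosphor (0=R, 1=G, 2=B) for subpixel (x, y).

    rotate_per_row shifts the R/G/B order on each triad row, which both
    staggers the triads into a triangular arrangement and lets the
    pattern be re-phased against the display's own subpixel layout.
    """
    row = y // triad_height
    return (x + row * rotate_per_row) % 3
```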
I wouldn’t mind getting at least the Slot Mask to look flawless on all display types first, before attempting to tackle the Shadow (Dot) Mask.
I have no issue with the display of Slot Mask on my WOLED TV using 3 different widths/TVLs and various heights I’ve tried when using CRT-Guest-Advanced via Mega Bezel Reflection Shader.
I’ve tried RBG, RBGW and RRBBGGW and they all look great.
Happy New Year to you too @Cyber! Yes, there are different height slot masks you can get with Sony Megatron. I believe the 5-pixel-high one can be selected with 4K and 600TVL, I think! Try different resolutions and TVLs to go through all the available slot masks.
Yes, this is how deconvergence works though: it will be a fraction of a subpixel to multiple subpixels out. I suspect what is really the issue is that the falloff is too steep, in combination with too low a resolution and/or too coarse a slot mask, i.e. the horizontal line is too fat due to it being one pixel in size.
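For anyone following along, deconvergence just means sampling each colour channel at a slightly offset position, so the red scanline really is drawn a bit away from the green and blue ones. A simplified sketch of the general idea (not the shader’s actual code):

```python
import numpy as np

def apply_deconvergence(scanlines, red_off=1, green_off=0, blue_off=0):
    """Shift each colour channel vertically by its own offset.

    scanlines: (height, width, 3) image of straight scanlines.
    Offsets are in subpixels; with a steep scanline falloff and a
    coarse mask, even a one-subpixel red offset reads as a stair-step.
    """
    out = np.empty_like(scanlines)
    for c, off in enumerate((red_off, green_off, blue_off)):
        # np.roll snaps to whole subpixels; a real shader would
        # interpolate fractional offsets instead.
        out[:, :, c] = np.roll(scanlines[:, :, c], off, axis=0)
    return out
```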
Has anyone put this head to head yet with some of the CRT filters on the retrotink 4k?
Why put them head to head with RetroTINK 4K when they’ve already been put head to head with real CRTs?
Thanks, I’ll try them. What I wanted was to be able to get 4K, ~300 TVL and 5 or 6 subpixels high at the same time though.
I’ve since edited my post with zoomed in pics showing the crooked scanlines a bit more clearly.
Ohhh, okay, I think I can see what you are talking about now:
That?
If so, is that actually incorrect behavior? Because it does make physical sense when you consider how the technology worked. But I also honestly can’t recall ever seeing a real slot mask display with both clearly defined scanlines and such substantial deconvergence in the first place.
Edit: That does appear to be an accurate behavior. From this post:
I wouldn’t consider these scanline gaps to be crooked.
It doesn’t look like that in other shaders I’ve used though. No matter how much or how little deconvergence is used.
I’ve left that on the back burner for a long time, but I don’t think it matters how much deconvergence is present in the preset. Once it’s on, I saw that pattern. That can’t be right. If it is, it certainly does not look right.
Maybe the scanline gaps need to be darker, like in the example you posted, for it to begin to look like proper deconvergence, but it just looks like a bug to me.
I think you might need to go back to the zoomed out image and take a look at these scanlines and triads for a bit. Something clearly isn’t right to me.
That doesn’t look anything close to the example image you posted or what is expected.
I understand that when it comes to Special K, this is probably a rabbit hole that no one necessarily wants to jump down when there are already a multitude of other more significant and impactful things to look into, but I thought I’d provide a couple of photos for reference: a before and after of running the application with no shaders or HDR, then with Special K HDR + the Sony Megatron shader.
I was planning on supplying a RetroArch comparison using the same scene, but it seems the source port deviates enough in how it presents its scenes that I wasn’t sure how helpful it would be.
The pictures I took of Z64 with the shaders on definitely look a bit darker than they appear in real life, just to make sure that’s known.
[Imgur album: before/after photos]
Edit: For another comparison (not sure if it would be informative or not), here are the same scenes without the Sony Megatron shader but with Special K HDR10 injection enabled: https://imgur.com/a/2heLWDL
Here are the SK settings I found provided decent colour, though I didn’t mess around too much. When set like this it seems to look quite similar to the AutoHDR add-on, including the intense oversaturation when the Sony Megatron shader isn’t active, as tested in other games that don’t crash on me.