I just measured the primary colors with my i1 Pro 2 spectrophotometer, and there are pretty big differences between Windows 11 Auto HDR and the built-in HDR tonemapping of the Megatron shader within RetroArch.
To compare, I first used W11 Auto HDR in conjunction with the Megatron ReShade shader (neutral preset) with the color system set to r709 in the shader menu. My TV's color temperature is set to Warm 1, which, measured without a shader, is still a lot cooler than 6500K (Warm 2 also measures slightly above 7000K).
And this is what I measure with W11 Auto HDR:
With the RetroArch Megatron shader (default HDR preset) using its own HDR tonemapping, again with the color system set to r709 and the same TV color temperature, I get this:
This matches exactly what I saw with my eyes before measuring: there is too much yellow in the green with the Megatron HDR tonemapping.
I also noticed that the whitepoint is shifted too far off the black body curve towards magenta. The whitepoint with Windows 11 Auto HDR stays much closer to the black body curve and should also be more accurate, as my TV is set to Warm 1, which should land close to 9000K.
The whitepoint with W11 Auto HDR measures around 9800K and with Megatron HDR around 6600K. I think 6600K is too warm for my LG OLED with its color temperature set to Warm 1.
Sure, Windows 11 Auto HDR oversaturates colors beyond Rec. 709, which is also not accurate. But in summary, the Auto HDR tonemapping just looks better and more coherent to me: the green is purer without shifting to yellow, and the whitepoint sits much closer to the color temperature set by the TV (when measured without the shader) and to the black body curve. The magenta shift of the Megatron HDR tonemapping gives the picture a pinkish tint, which I noticed right from the beginning of using the Megatron shader with its own HDR tonemapping.
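For anyone who wants to sanity-check readings like these: below is a minimal Python sketch (my own addition, not anything from the shader or the i1 software) that estimates CCT and the signed distance from the black body curve (Duv) from a measured CIE xy whitepoint. It uses McCamy's approximation for CCT and Krystek's polynomial fit for the Planckian locus; the xy values in it are placeholders, not my measurements. A negative Duv means the whitepoint sits below the locus, which reads as exactly the pinkish/magenta tint described above.

```python
import math

def cct_mccamy(x: float, y: float) -> float:
    """McCamy's approximation of correlated color temperature (CCT).
    Nominally accurate for roughly 2800-6500 K; readings near 9000-10000 K
    (like a Warm 1 / Auto HDR whitepoint) carry a bit more error."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

def xy_to_uv(x: float, y: float) -> tuple[float, float]:
    """CIE 1931 xy -> CIE 1960 uv, the space in which Duv is defined."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 6.0 * y / d

def planck_uv(t: float) -> tuple[float, float]:
    """Krystek (1985) polynomial fit of the Planckian (black body) locus
    in CIE 1960 uv, valid for about 1000-15000 K."""
    t2 = t * t
    u = (0.860117757 + 1.54118254e-4 * t + 1.28641212e-7 * t2) / \
        (1.0 + 8.42420235e-4 * t + 7.08145163e-7 * t2)
    v = (0.317398726 + 4.22806245e-5 * t + 4.20481691e-8 * t2) / \
        (1.0 - 2.89741816e-5 * t + 1.61456053e-7 * t2)
    return u, v

def duv(x: float, y: float) -> float:
    """Signed distance from the black body curve. Negative = below the
    locus (pinkish/magenta tint), positive = above (greenish tint).
    Approximate: measures against the locus point at the McCamy CCT
    rather than the true closest point."""
    u, v = xy_to_uv(x, y)
    ut, vt = planck_uv(cct_mccamy(x, y))
    dist = math.hypot(u - ut, v - vt)
    return dist if v >= vt else -dist

# Placeholder whitepoint (NOT my actual measurement): a D65-like x with
# y pulled slightly down, i.e. shifted toward magenta.
x, y = 0.3127, 0.3190
print(f"CCT ~ {cct_mccamy(x, y):.0f} K, Duv ~ {duv(x, y):+.4f}")
```

With this placeholder point it prints a CCT near 6600K with a small negative Duv, i.e. a whitepoint that measures "warm" and sits below the black body curve, which is the same combination I'm describing for the Megatron whitepoint.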
I would love to see MajorPainTheCactus compare Windows 11 Auto HDR with the Megatron HDR tonemapping as well, if time allows, and give us some feedback on what he thinks about it.