@Mickevincent Thanks! I did a bunch of tests and I don’t think I can improve on it. With a brighter display you can use higher values for the beam shape high parameter, maybe up to 150.00. Completely eliminating bloom on white pixels requires 200.00 or 250.00, I think. That would probably require an OLED without automatic brightness limiting (likely involving some kind of dangerous firmware hack), or maybe a QLED with really good local dimming to keep the black level low when the backlight is cranked up.
Bright boost can’t go higher than 1.30, and should probably come down to 1.25 or 1.20 (or lower!) on higher-contrast and brighter displays.
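In case it’s useful, this is roughly how I’ve been pinning those two values at the bottom of a copy of the .slangp instead of re-entering them in the menu every time. The parameter names here are just stand-ins for whatever identifiers the shader actually exposes, so check them against the preset’s own parameter list before copying:

```
# Appended to a copy of the shader's .slangp preset.
# "beam_shape_high" and "bright_boost" are placeholder names;
# use the identifiers from the shader's own parameter list.

# Brighter displays can tolerate a higher beam shape value, maybe up to 150.00:
beam_shape_high = "150.00"

# Bright boost caps at 1.30; come down to 1.25/1.20 or lower on
# higher-contrast and brighter displays:
bright_boost = "1.20"
```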
I’m still trying to figure out some default settings. I assume most people are not going to have a custom LUT for RetroArch. @Dogway, what are the best grade settings to use if people aren’t going to color manage RetroArch / do any LUTing? I assume we want:
CRT gamma: still use the same recommendations for region
gamma type: sRGB
phosphor type: none
signal type: sRGB
display color space: sRGB
white point: not sure. I can go as high as 7000K before there’s any clipping, with hard clipping around 7200K. But 6500K is the spec for sRGB, so maybe we should just use that?
In short, if we’re not going to color manage RA, should we just set everything to the sRGB spec? Or will it still be more accurate to use the region-specific settings?
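Whatever we land on, I’d plan to ship the no-LUT defaults as plain parameter overrides in the preset so nobody has to dig through the menus. A sketch of the sRGB-everything answer, with made-up parameter names and enum values (grade’s real identifiers and encodings would need to be checked against grade.slang):

```
# Hypothetical no-LUT baseline. Names and enum encodings below are
# placeholders, not grade's actual identifiers.

# CRT gamma stays region-specific; 2.40 is just an example value:
crt_gamma = "2.40"

# Gamma type, phosphor, signal type, and display color space set to sRGB / none:
gamma_type = "1"
phosphor_type = "0"
signal_type = "1"
display_color_space = "0"

# White point at D65, per the sRGB spec:
white_point = "6500.0"
```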
Also, when we make a LUT, I assume we want separate LUTs for the PAL and NTSC color spaces? Or do we want to target a wider display gamut and make a single LUT for that gamut? Would choosing that display gamut in our graphics card settings be a faster but less accurate way of accomplishing the same thing?
Is restoring permissions something I need to do, or does that happen on a timer or something…? I didn’t quite get what you meant by that (talking about GitHub).