Some CRT shaders have implemented CRT phosphor decay emulation. However, at 60Hz the decay can only be updated every 16.7ms. That interval is too coarse, so the emulated decay ghosts much more than a good CRT does. Has anyone tried making use of 120Hz displays to provide finer phosphor decay emulation? That is, the core still runs at 60FPS, but the shader updates the image at 120FPS. Shader effects like this could then be more fine-grained.
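To illustrate the idea, here's a rough sketch (not any real shader's code) of why the update interval matters. It assumes a simple exponential decay with a made-up time constant `TAU_MS`; real phosphors decay on roughly this millisecond scale, which is exactly why one sample per 16.7ms frame can't capture it:

```python
import math

TAU_MS = 1.0  # assumed decay time constant in ms; illustrative, not a real phosphor spec

def brightness(t_ms, tau_ms=TAU_MS):
    """Relative brightness t_ms after the beam excites the phosphor."""
    return math.exp(-t_ms / tau_ms)

def decay_samples(frame_ms, steps):
    """Brightness values a shader can apply when it updates `steps` times per 60FPS frame."""
    return [brightness(i * frame_ms / steps) for i in range(steps)]

# At 60Hz output, the shader gets one chance per 16.7ms frame: no decay is
# visible within the frame, only from one frame to the next.
print(decay_samples(16.7, 1))
# At 120Hz output with the core still at 60FPS, it gets an extra sample at
# 8.3ms, so an intermediate (much dimmer) decay state can actually be shown.
print(decay_samples(16.7, 2))
```

With the extra 120Hz sample the shader can display the half-decayed state instead of jumping straight to the next frame, which is the "finer-grained" part of the idea.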
Maybe other shader effects could also benefit from this, although the only one that comes to mind right now is motion blur.
For this to be possible, though, RA would have to implement an option that multiplies the core's FPS by an integer value: 2x, 3x, etc., up to whatever the display supports. Right now, 240Hz is the highest refresh rate you can find in computer monitors. I guess VRR (FreeSync/G-Sync) would be needed to guarantee there are no stutters, though.
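The multiplier selection itself would be trivial. As a hypothetical sketch (this is not an existing RA option, and `fps_multiplier` is a made-up name):

```python
def fps_multiplier(core_fps: float, display_hz: float) -> int:
    """Largest integer n such that core_fps * n fits within the display refresh rate."""
    return max(1, int(display_hz // core_fps))

# A 60FPS core on a 240Hz monitor could run the shader pass 4x per core frame.
print(fps_multiplier(60, 240))  # 4
# On a 144Hz monitor only 2x fits, and 60 * 2 = 120 != 144 -- which is why
# VRR would be needed to avoid stutter on displays that aren't exact multiples.
print(fps_multiplier(60, 144))  # 2
```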
RA already has something similar: black frame insertion, which doubles the output FPS. Maybe that could be extended to implement a core FPS multiplier?
Just a shower thought I had this morning.