I thought this was a glitch specific to ShaderGlass, not something that affects other RetroArch versions and/or GPU drivers too. The fix is to hard-code all the for-loops to end at some maximum number of iterations so that the compiler can unroll them cleanly. I’ve gone ahead and done that now. I’ve also added another fix to make the comb filter work in ShaderGlass, since it looks like ShaderGlass requires all passes to reference each other in sequential order, without history or feedback.
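For reference, here’s roughly the shape of that change (the names `taps`, `fir_coeff`, and `fetch_sample` are placeholders for illustration, not the actual code):

```glsl
// Before: the loop bound was a runtime value, which some compilers
// refuse to unroll:
//   for (int i = 0; i < taps; i++) { ... }

// After: a hard-coded maximum bound with an early break, so the
// compiler can unroll cleanly. MAX_TAPS is a placeholder constant.
const int MAX_TAPS = 32;
for (int i = 0; i < MAX_TAPS; i++)
{
    if (i >= taps)
        break;
    sum += fir_coeff(i) * fetch_sample(i);
}
```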
I had also forgotten to linearize before correcting the colors for HDR. That’s also fixed in this version.
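In other words, the correction matrix was being applied to gamma-encoded values. A minimal sketch of the fix, assuming sRGB input (`hdr_matrix` here stands in for whatever matrix the pass actually uses):

```glsl
// Decode sRGB gamma to linear light first...
vec3 srgb_to_linear(vec3 c)
{
    return mix(c / 12.92,
               pow((c + 0.055) / 1.055, vec3(2.4)),
               step(0.04045, c));
}

// ...then apply the color correction in linear space.
vec3 correct_for_hdr(vec3 srgb, mat3 hdr_matrix)
{
    return hdr_matrix * srgb_to_linear(srgb);
}
```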
Let me know if this version works for you now. The previous one loads fine for me in RetroArch on Ubuntu, but whenever I feel like trying stuff with ShaderGlass on Windows, I have to make ad hoc fixes like this to get it to compile. Hopefully these same fixes work on your end.
No, I meant any normal CRT shader that doesn’t include NTSC signal emulation, of course.
Why don’t you try it without the CRT shader and comb filter first and see how it looks? After that, you can try adding the CRT shader (aka scanlines/mask shader) in between the NTSC Colors shader and the HDR (decoder?) shader, as in the preset sketch below.
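In preset terms, the order would end up something like this (the filenames are made up for illustration):

```
shaders = 4
shader0 = "ntsc-signal.slang"
shader1 = "ntsc-colors.slang"
shader2 = "crt-scanlines-mask.slang"
shader3 = "hdr-decode.slang"
```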
Even using that exact same preset, I can’t get it to look that bad on my laptop. Maybe it’s another one of those OS-dependent/driver-dependent problems, which are harder to discover and fix.
Or maybe you have some other setting on your PC or monitor that is correcting sRGB color into Display P3 color. I think that’s more likely to be the cause of the problem.
I’m leaning toward it being a driver/OS thing; I’ll see if I can isolate the problem later. I’m using an Nvidia RTX 3090 Ti with the latest driver, on Windows 11.
I tried disabling “automatic color management” in Windows, but that didn’t solve it.
One part of the new CRT emulation code was modifying a mat3 as an array of 3 vec3s in column-major order. I’ve changed it to modify each individual float separately instead. I’m not 100% expecting this to work, but it’s worth a try.
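To illustrate the kind of change (variable names are placeholders):

```glsl
// Before: assigning whole columns of the mat3 at once
// (mat3 is indexed by column in GLSL), which seems to
// misbehave on some drivers:
//   m[1] = vec3(a, b, c);

// After: writing each float of the column separately.
m[1][0] = a;
m[1][1] = b;
m[1][2] = c;
```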
Edit:
I thought I might clear this up. What I’m doing right now is still a compromise. What this prepend/append setup does is pass the CRT’s native RGB, with a D65 white balance, into the CRT shader, and then correct into the target HDR (more precisely, wide-gamut) color space in the final appended pass. The results depend on which CRT shader you pick, but in general this means we’re favoring more accurate individual RGB phosphor sizes, at some expense in scanline thickness and glow levels.
This brings me back to my “hunch” up here. For context, this earlier post was about simulating an effect where overdriven electron guns cause colors to bleed to the right.
This is a paper that I came across while aimlessly searching the web. I don’t remember what I was trying to do or how exactly I found it.
My interpretation here is that a pure D65 white doesn’t have all 3 phosphors at the same size, and that the maximum scanline thickness is different for each primary. I don’t know for certain whether current CRT shaders account for this, but I’m inclined to assume they all skip it. They might be accounting for individual RGB luminance weights somehow, or they might not be; I’ll need to check the code. This seriously matters for how I’ll go about combining my video signal shaders with other CRT shaders, or how I’d go about making my own CRT shader if it comes to that.
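To make the idea concrete, here’s a hypothetical sketch of what a per-primary spot size could look like in a scanline function. The width ratios are made-up placeholders, not numbers from the paper:

```glsl
// Gaussian scanline profile with a separate beam width per primary.
// base_width, spot_gamma, and the per-channel ratios are all
// hypothetical parameters for illustration.
vec3 scanline_weight(vec3 rgb, float dist, float base_width, float spot_gamma)
{
    const vec3 max_width = vec3(1.00, 0.85, 0.90); // made-up R/G/B ratios
    vec3 width = max(base_width * max_width * pow(rgb, vec3(spot_gamma)),
                     vec3(1e-4)); // avoid divide-by-zero on black
    return exp(-(dist * dist) / (2.0 * width * width));
}
```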
These different current ratios also determine how bright each primary can get before bleeding to the right. That’s the “hunch” I had earlier. The problem is that I don’t have this data for any of these CRTs. The best guess I have right now is the numbers from this paper, which don’t have any citation.
“Ref 7” took me forever to find, and when I did find it on some obscure Italian file-sharing website, it turned out to be mostly useless, except for confirming that greens with a higher x coordinate have higher brightness for the same amount of current. Here’s the link, if anyone is curious: https://it.annas-archive.org/md5/697568291741cabaf722948f4c80da25
There are some aspects of CRTs where things just fall together and still work despite apparent nonlinearities like this.
It’s true that the spot size ends up different for different colors; macro photographs verify it. But then somehow it all comes together: the colors still converge nicely and have a fairly linear response. People can adjust the color temperature and even the spot size, and it somehow doesn’t result in convergence issues or gamma imbalance.
My concern here isn’t about convergence or gamma imbalance. It’s about scanline thickness, the expanding size of individual phosphors, and glow/afterglow, all of which are effects that CRT shaders emulate deliberately. I want to make sure that my signal and color effects still make sense when combined with a full CRT effect.
(Another thing is color bleeding, when the contrast and color settings are set higher than the CRT can handle, but that’s more of an “eye candy” effect that doesn’t need to be accurate and isn’t going to be used by anyone for more than maybe 10 minutes.)
I’m not sure. You’re getting into complicated territory where things can no longer be described cleanly with math. Empirically, we would expect to see red fringing around pure white scanlines. If we can’t see that, it doesn’t mean the effect doesn’t exist, but it could be too insignificant to have a visible effect, or it could be compensated for by some other mechanism. Chromatic aberration makes this difficult to measure.
Also, I remember it was randomly moving (changing size), and as said in the comments of the Reddit link, it seems to happen more in red, less in blue, and much less in green.
That’s kind of the “easy” part. The hard part is understanding how the bleed actually works at a hardware level. My current best guess is that the bleed to the right is just whatever amount of voltage/current went over the maximum limit.
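As a conceptual sketch of that guess (not from any actual shader; `limit` and `decay` are made-up parameters):

```glsl
// Clip the signal at a maximum drive level and spill the excess
// into the samples to the right, decaying as it travels.
const int LINE_SAMPLES = 256; // placeholder line length
float line_buf[LINE_SAMPLES]; // one scanline of signal levels

void apply_bleed(float limit, float decay)
{
    float carry = 0.0;
    for (int x = 0; x < LINE_SAMPLES; x++)
    {
        float s = line_buf[x] + carry;      // signal plus spilled charge
        float excess = max(s - limit, 0.0); // whatever went over the limit
        line_buf[x] = s - excess;           // clamp this sample
        carry = excess * decay;             // bleed the overflow rightward
    }
}
```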
I haven’t read the comments, but I’m inclined to believe that this is caused by even slight amounts of RF noise.
About the images you’ve shared: they make it hard to tell the difference between chroma lowpass/bandpass filtering and actual bright red/blue bleed. Both affect red and blue more than green. The former is normal and is already being partly emulated by FIR filters (“partly” because I believe a “complete” implementation would be closer to the inductor/capacitor filters on real hardware, which delayed things to the right). As for the red/blue high-voltage/current bleed effect, that’s caused by an entirely different kind of electrical component, and it only happens if contrast is set insanely high (often due to wear, aging, or the end user not knowing any better). These images don’t look applicable.
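To illustrate the distinction: a symmetric FIR kernel blurs chroma both ways, while a first-order RC-style filter (closer to the inductor/capacitor behavior of real hardware) only smears it to the right. A conceptual sketch, with `k` as a made-up cutoff coefficient:

```glsl
// One-pole IIR lowpass run left-to-right across a scanline:
// the output lags the input, so detail smears rightward only.
const int LINE_SAMPLES = 256; // placeholder line length
float chroma[LINE_SAMPLES];   // demodulated chroma samples

void rc_lowpass(float k)
{
    float y = 0.0;
    for (int x = 0; x < LINE_SAMPLES; x++)
    {
        y += k * (chroma[x] - y); // first-order RC response
        chroma[x] = y;
    }
}
```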
Here are some links I just copied from an older post of mine.
About the RF noise: I just meant that slight RF noise would cause the bleed to randomly change size. RF noise itself doesn’t cause the bleed; it just causes it to become shaky.