Lottes CRT shader: GLSL vs Cg

Hi, first post here. I’m selling my beloved Sony PVM and trying RetroArch, but I’m finding it rather inconsistent in stability, performance, and behaviour, and very consistent in frustration.

I’m working through most of the issues by searching the forum, but one question I’d like to ask is: what’s the deal with GLSL vs Cg shaders? All the Cg shaders run nicely at 4K except Lottes, which is the one I’d like to use (Royale is decent too, of course). The Cg Lottes runs very slowly, while the GLSL version runs superbly and looks identical; however, the halation edition of the GLSL version doesn’t actually look any different from the regular one. The Cg halation adds the extra bloom as expected, but that performance…

Is there any way to get the GLSL halation version to add the extra bloom? If not, why have the shader there in the first place?

And why do people prefer Cg when the performance is so much worse? Maybe I’ve answered that question myself with this halation business. Is it down to video cards? I’m using an NVIDIA GTX 1060 mini-ITX edition with 6 GB, if that helps.

Thanks for reading

I just pulled down the GLSL shaders and don’t see any crt-lottes-halation.glslp preset at all, just the regular version.

For reference, the GLSL shaders used to be (mostly) machine-converted from Cg using a script. The script isn’t perfect, so some auto-converted shaders behave strangely or not at all, and they often perform worse because the script outputs spaghetti code.

If you have an older batch of GLSL shaders, you may have an autoconverted version that acts/performs differently from the up-to-date Cg version.
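
In any case, a .glslp preset is just a plain-text file listing the passes to run, so you can open it and see exactly what it does. A rough sketch of the format from memory (the paths and pass names below are placeholders, not the actual shipped files):

```
# Roughly what a single-pass preset looks like (paths here are illustrative):
shaders = 1
shader0 = shaders/crt-lottes.glsl
filter_linear0 = false
scale_type0 = viewport

# A halation variant would typically chain extra blur passes into the final
# CRT pass, along these lines (these pass names are made up):
# shaders = 3
# shader0 = shaders/blur_horizontal.glsl
# shader1 = shaders/blur_vertical.glsl
# shader2 = shaders/crt-lottes-halation.glsl
```

If a “halation” preset only lists the single standard pass, it will render exactly the same as the regular one, which would explain why the GLSL halation doesn’t look any different for you.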

That’s odd; it’s in the CRT folder for me. I opened the lottes_halation .glslp file, and it has the same text as the regular lottes preset.

I’m finding the dot pitch too fine with the Lottes shader at 4K anyway. Is it tied to the resolution? I really want to run RetroArch at native 4K to avoid the input lag from the TV’s scaler, but I prefer the look of Lottes at 1440p for some reason.

I may concentrate my efforts on CRT-Royale instead, as that seems to have more options; the TV-out version actually looks more like my Sony PVM over RGB than the default Royale settings do.

Yes, the mask effects are tied to the actual screen output, so a higher-resolution screen means a finer dot pitch on the mask.
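
To illustrate the idea, here’s a very simplified sketch of how an aperture-grille mask is typically generated (this is not the actual Lottes code, and the names are generic placeholders): the mask is indexed by the physical output pixel via gl_FragCoord, so a 4K screen fits twice as many mask columns as a 1080p one.

```glsl
// Simplified aperture-grille sketch (NOT the actual Lottes shader):
// the mask repeats every few *output* pixels, so a higher-resolution
// display automatically gets a finer dot pitch.
uniform sampler2D sourceTexture;   // placeholder name for the game frame
varying vec2 vTexCoord;            // placeholder texture coordinate

vec3 apertureMask(float x)
{
    float phase = mod(x, 3.0);      // cycle R, G, B every 3 output pixels
    if (phase < 1.0) return vec3(1.0, 0.4, 0.4);
    if (phase < 2.0) return vec3(0.4, 1.0, 0.4);
    return vec3(0.4, 0.4, 1.0);
}

void main()
{
    vec3 color = texture2D(sourceTexture, vTexCoord).rgb;
    // gl_FragCoord is measured in screen pixels, so the mask pitch follows
    // the display resolution, not the emulated console resolution.
    gl_FragColor = vec4(color * apertureMask(gl_FragCoord.x), 1.0);
}
```

That’s why the pitch looks finer at 4K than at 1440p on the same size screen.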

Latency on 4K TVs is often identical whether they’re fed 4K or 1080p. You might check whether your set has been reviewed by RTINGS, which tests latency on all inputs: http://www.rtings.com/tv/reviews/vizio/d-series-4k-2016 << I know because I’ve been thinking of buying one of these :slight_smile:

If you want to reproduce the look of your PVM, I suggest checking out EasyMode’s “crt-aperture”, which looks really similar to my PVMs.

CRT-Aperture is a really nice shader; I’m using it on my TV and it looks great.
Is this a new shader? I don’t remember seeing it until recently.

Yeah, it’s relatively new. Looks like ~3-4 months old.

OK, good, so it’s not just me being senile and missing it then :slight_smile: It really is a relatively recent addition.

Thanks, it seems my TV has no extra lag at lower resolutions. I’m also quite enjoying the crt-aperture shader.