Global Color Correction Settings

Hello all,

I’ve just got a quick question. I’m not sure if this is already in Retroarch and I just haven’t found the setting, or, if not, whether it’s something that could be implemented or whether it’s impossible due to the way Retroarch works.

This probably sounds a bit weird, but I’ve been trying to recreate the washed-out look of raw footage captured from old analog consoles’ RF/composite output, similar to the way the game footage looks in old episodes of Game Center CX. As an example, here’s a screenshot of the show on the left and my current emulator setup on the right: https://i.imgur.com/qWX94uK.jpg

I’m using Blargg’s NTSC filter for cores that support it, or Themaister’s shader for those which don’t, and not cropping any overscan.

I’ve noticed, however, that the footage on the show usually has more washed-out colours. For the NES/Famicom I’ve made myself a custom palette, with RGB values I got by altering the lightness of a palette generator screenshot, but that only works for the one system.
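In case anyone wants to try the same thing, a small script along these lines could do the lightness adjustment in one go instead of editing colours by hand. It’s only a rough sketch: the 64-entry raw-RGB .pal layout, the file names and the 0.15 lift are assumptions on my part, so check them against whatever palette format your core actually expects.

```python
# Rough sketch: lighten every entry of a 64-colour NES .pal file
# (192 bytes of raw R,G,B triplets). File names and the lift amount
# are placeholders; adjust to taste.
import colorsys

LIFT = 0.15  # how far to push lightness toward white (0.0 = no change)

with open("default.pal", "rb") as f:
    data = bytearray(f.read())

for i in range(0, len(data), 3):
    r, g, b = (data[i] / 255.0, data[i + 1] / 255.0, data[i + 2] / 255.0)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    l = l + (1.0 - l) * LIFT                  # raise lightness, keep hue/saturation
    r, g, b = colorsys.hls_to_rgb(h, l, s)
    data[i:i + 3] = bytes(round(c * 255) for c in (r, g, b))

with open("washed_out.pal", "wb") as f:
    f.write(data)
```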

I was wondering if there’s a way to turn up the lightness/brightness/luminosity (not sure of the correct terminology) in Retroarch, so that all cores have the same slightly washed-out look (without changing my monitor settings)?

Would appreciate any help or advice, thanks.

Yes there are a couple of ways to do that.

You can use the image-adjustment shader in the ‘misc’ subdirectory to adjust all sorts of things like that, either on its own or in combination with the ntsc shaders.

The other option is to use the reshade/LUT shader, which can apply color profiles using custom LUTs created in Photoshop/GIMP/whatever.
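If you’d rather script the LUT than eyeball it in an image editor, something like the sketch below spits out a neutral LUT with a mild brightness lift already baked in. The 16x16x16 strip layout (a 256x16 PNG) is an assumption, so compare it against the neutral LUT image that ships with the shader before relying on it.

```python
# Rough sketch: build a 16x16x16 identity LUT laid out as a 256x16 strip,
# with a flat brightness lift baked in. The layout is an assumption; match
# it to the neutral LUT that comes with the shader you're using.
from PIL import Image

SIZE = 16    # LUT resolution per channel (assumption)
LIFT = 0.10  # additive lift toward white, clamped at 1.0

img = Image.new("RGB", (SIZE * SIZE, SIZE))
px = img.load()

for b in range(SIZE):            # one SIZE x SIZE tile per blue slice
    for g in range(SIZE):
        for r in range(SIZE):
            rgb = (min(c / (SIZE - 1) + LIFT, 1.0) for c in (r, g, b))
            px[b * SIZE + r, g] = tuple(round(c * 255) for c in rgb)

img.save("washed_out_lut.png")
```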

Many thanks for the tip, I’ll look into that image-adjustment shader. Would it matter whether I placed it before or after the ntsc shader, in terms of the order of shader passes? Sorry if that’s a dumb question; I know enough about shaders to fiddle around with them, but I’m not too clued up on how they actually work.

Usually, I would suggest putting it before, but in this case, I think you might be better off putting it after.
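The gist is that the adjustment clamps values, so it interacts differently with the ntsc pass depending on which side of it it sits. Here’s a toy illustration of the general idea (nothing to do with the actual shader code), using a simple box blur to stand in for the ntsc pass:

```python
# Toy illustration of why pass order matters: a clamped brightness lift
# applied before vs. after a blur (standing in for the ntsc pass) gives
# different results once bright values start clipping.
row = [0.0, 0.95, 0.95, 0.0, 0.0]   # one scanline of pixel intensities

def lift(vals, boost=0.10):
    return [min(v + boost, 1.0) for v in vals]

def blur(vals):
    # naive 3-tap box blur with edge clamping
    out = []
    for i in range(len(vals)):
        window = vals[max(i - 1, 0):i + 2]
        out.append(sum(window) / len(window))
    return out

print(blur(lift(row)))   # lift first: bright pixels clip before they get smeared
print(lift(blur(row)))   # blur first: the lift is applied to the softened image
```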

Just had a quick mess around with it, and I’ve found that setting brightness boost up two notches to 0.10 seems to do the trick. I’ve gone years without knowing about something as simple as that image-adjustment shader, thanks for your help.
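If I’m reading it right, the boost is just a flat offset added before clamping, which would explain why it reads as washed out: the blacks get lifted to grey while the highlights barely move. A toy example of what I mean (this is just my understanding, I haven’t dug through the shader source):

```python
# Toy example: a flat additive boost lifts dark values far more (relatively)
# than bright ones, so the blacks turn grey and the image looks washed out.
boost = 0.10
for value in (0.0, 0.25, 0.5, 0.75, 1.0):        # original pixel intensity
    print(f"{value:.2f} -> {min(value + boost, 1.0):.2f}")
# 0.00 -> 0.10   black is lifted to grey
# 1.00 -> 1.00   white is already at the ceiling
```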

I also think washing out the colors looks better for the older consoles, but not too much… I remember when I switched to an RGB cable back in the ’90s on an analog PAL TV, the colors were intense; the maxed-out reds especially became almost painful to look at! I think when they created those games they exaggerated the colors on purpose to compensate for the loss of color with ‘sub-optimal’ cables and TVs, room lighting conditions, etc. So while the RGB cable definitely looked good, it was kind of too good :sunglasses:

Just thought I’d throw in a recommendation for the crt-lottes-fast shader, which looks more like a decent PAL TV or maybe a VGA monitor, but with slightly washed-out colors. It runs at full speed on my Celeron N3160, so I’d recommend trying it out on slower computers! It doesn’t wash out the colors as much as your example pictures (those looked like pretty sub-par quality to me), but it does wash them out a little and I like the look of it, plus it runs perfectly on my Chromebook.

So for the crt-lottes-fast shader I just bumped up the curvature and edge round-off effect to mimic a small 15" TV (those were usually pretty convex), but otherwise kept the default settings (again, the colors are slightly washed out by default with this shader). I’m running this on a 14" widescreen 1080p display. Here are a few screenshots:

While I’m on the subject, and rather than making a new thread, I was wondering if anybody might know whether there’s a way I can get Retroarch to mimic the look of a 240p console output scaled up for a 720p or 1080p display, rather than the clean way Retroarch usually scales things to your device’s native resolution?

I’m not exactly going for something that looks pretty here; I actually want to make it look as authentic (and therefore quite bad) as possible. I get that these are some weird requests, it’s just something I’m tinkering with to see if it can be done.

You’d need to post some photos of what you mean, but I think it’s going to vary from upscaler to upscaler. That is, various TVs bungle the process in different ways. Some deinterlace the signal erroneously, which looks like crap, then use whatever weird upscaling algo they use for 480p stuff on the resulting image.
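If you just want to preview that kind of chain before chasing it in shaders, you can fake a lazy TV scaler offline by crushing a screenshot down to 240 lines and letting a plain bilinear stretch do the rest. This is only a rough sketch of the idea, with made-up resolutions and file names, not a model of any particular TV:

```python
# Rough sketch of a "cheap upscaler" chain: drop a screenshot to 240 lines,
# then stretch it to 1080p with plain bilinear. Real TVs vary wildly; the
# resolutions and file names here are placeholders.
from PIL import Image

src = Image.open("screenshot.png").convert("RGB")

# Pretend the console output was 320x240 (an assumption; systems differ).
low = src.resize((320, 240), Image.NEAREST)

# A lazy scaler: straight bilinear to 1920x1080, ignoring aspect correction.
fake_tv = low.resize((1920, 1080), Image.BILINEAR)
fake_tv.save("fake_upscale.png")
```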