Identity LUTs

I couldn’t make LUT.glsl output unmodified colors except when using big identity LUTs.

I just read that the Sega Genesis has a 9-bit RGB palette (512 colors). I generated an identity LUT containing exactly that many colors: when applied to a Sega Genesis screenshot, it did not preserve the original colors!
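(For reference, the obvious linear expansion for a 3-3-3 identity LUT is

$$v_8 = \operatorname{round}\!\left(\frac{255 \cdot i}{7}\right), \quad i \in \{0, 1, \dots, 7\},$$

i.e. the per-channel levels 0, 36, 73, 109, 146, 182, 219, 255, giving 8³ = 512 colors in total. The real Mega Drive DAC output levels may well differ from this linear ramp.)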

According to this Wikipedia article, the Sega Genesis’ palette is this: It’s a different set of colors, and the channels are swapped. I edited “LUT.glsl” to match and tried it, but it also didn’t preserve colors!

For reference, here is the desired output (unmodified colors):

I don’t know what the entries should be, or maybe there’s something wrong with the shader?

I tried with the SNES. According to Wikipedia, the SNES has a 15-bit RGB palette (32,768 colors), so I used “32.png”, which is bundled with the glsl shaders and contains exactly that many colors (32 levels per channel: 32 × 32 × 32 = 32,768). The result was the same: LUT.glsl produced altered colors.

So why did colors change and how can it be fixed?

I’ll attach the shader mod and the LUTs: Modified LUT.glsl and Sega Genesis identity LUTs

Good catch. The shader needed to sample in linear gamma to avoid faintly screwing with the colors. I just pushed a fix for it, so hopefully that will get you fixed up on the SNES one, at least.

Not sure on the Genesis one, though. I’ll have to take a closer look at it, probably tomorrow.
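For reference, “sampling in linear gamma” means converting out of display gamma before the lookup and back afterward, roughly like this (a sketch of the idea, not the actual commit — the 2.2 exponent and the sampleLUT() helper name are placeholders):

```glsl
// Sketch only: linearize, look up, convert back.
vec3 toLinear(vec3 c) { return pow(c, vec3(2.2)); }        // placeholder gamma
vec3 toGamma(vec3 c)  { return pow(c, vec3(1.0 / 2.2)); }

vec3 sampleLUT(vec3 c);  // the shader's existing LUT lookup (hypothetical name)

vec3 applyLUT(vec3 imgColor)
{
    vec3 lin    = toLinear(imgColor);  // sample in linear gamma
    vec3 graded = sampleLUT(lin);      // LUT lookup happens on linear values
    return toGamma(graded);            // back to display gamma
}
```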

Thank you.

But banding got more uneven, and some blues disappeared as well. Take a look: Input Gamma: 2.2 (your mod).

Input Gamma: 2.4

Input Gamma: 2.5

I kept bumping gamma step by step from 1.0 and nothing worked.

Take a closer look:

Input Gamma: 2.2 (your mod).

Edit:

I replaced this line:

```glsl
float blue = floor( imgColor.b * (LUT_Size - 1.0) ) / LUT_Size;
```

with this one:

```glsl
float blue = floor( imgColor.b * (LUT_Size - 1.0) + 0.5 ) / LUT_Size;
```

and removed the linearizing step. Now it looks like this:

That was SNES. I’ll take a look at Sega Genesis and report back later.
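For completeness, the red and green lines presumably get the same treatment (only the blue one is quoted above); assuming they mirror it, the trio would look like this:

```glsl
// Assumed: red and green mirror the quoted blue line.
// floor(x + 0.5) rounds to the nearest LUT cell instead of
// always rounding down.
float red   = floor( imgColor.r * (LUT_Size - 1.0) + 0.5 ) / LUT_Size;
float green = floor( imgColor.g * (LUT_Size - 1.0) + 0.5 ) / LUT_Size;
float blue  = floor( imgColor.b * (LUT_Size - 1.0) + 0.5 ) / LUT_Size;
```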

Edit 2:

Sega Genesis looks like this now: Greys are more neutral now, especially the light ones, but it’s still uneven and missing blues.

Alright, try it now. I think I got it fixed up.

I tested your last changes, but they produced this:

The changes to LUT.glsl in my last post made everything I’ve tested so far work fine, except for the Sega Genesis. I tested it on GBA (same LUT as SNES) and on Master System with this 6-bit palette: Those worked too. All this makes me believe that I don’t have the right Sega Genesis palette. Further reading on Wikipedia led me to the following quote:

“The Mega Drive/Genesis used a 9-bit RGB palette (512 colors, 1536 including shadow and highlight mode)…”

I’ll have to find out what colors that “shadow and highlight” mode outputs.

Now, about those changes: they were a random guess. I don’t actually understand some of the things in that shader, but I’m afraid to ask.

Anyway, I’ll report back when I have something new. Thank you!

well shit. I wonder if there’s some GPU-specific shenanigans going on, as it looked fine on my Radeon GPU but is missing colors on my HD3000. I’ll keep investigating, as well.

Can you try this one? https://pastebin.com/z613kb6i

It looks like there are some undefined values, and each GPU vendor handles them differently (AMD apparently does the nicest thing, Intel gives the worst result, and Nvidia is only slightly better), so I had to put in a stupid check to confine them to safe values.
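In code terms, that check is basically a clamp; something along these lines (a sketch, not the exact patch — LUT, lutCoord, and the size uniforms are placeholder names):

```glsl
// Sketch: keep the LUT coordinate at least half a texel away from the
// texture edges so filtering can never pull in texels outside the LUT
// (the values GPUs apparently handle differently).
vec2 texel     = 1.0 / vec2(LUT_Width, LUT_Height);       // placeholder uniforms
vec2 safeCoord = clamp(lutCoord, 0.5 * texel, 1.0 - 0.5 * texel);
vec3 outColor  = texture2D(LUT, safeCoord).rgb;
```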

Screenshots are in PNG form; could lossy compression be causing invalid color entries?

Is this an attempt to use hardware-accelerated rendering for SNES and Genesis?

not hardware-accelerated rendering, just tonemapping, hue/saturation adjustment, etc. via LUT instead of fiddling with numbers.

Good thought on the lossy compression, but it looks that way in-game, too, so I think the loss of precision is coming from the shader itself. The additional lines I added (and which screwed up the red and green channels…) are what keep the precision up.

Like the GBA color saturation shader that emulates the LCD palette of the original GBA?

yeah, exactly :)

With this shader, you can take the accompanying LUT(s) into Photoshop, append one to a screenshot (say, of a GBA game), and then do whatever you need to do to make the screenshot look right. Then, crop off the screenshot, leaving just the palette LUT, and when you load up the shader, it will apply those same Photoshop operations to the game image.
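For anyone curious how that lookup actually works: these LUTs are typically a horizontal strip of LUT_Size tiles, each LUT_Size × LUT_Size pixels, where blue picks the tile and red/green pick the pixel inside it. Here’s a sketch of the indexing under that assumed layout (the real LUT.glsl may differ in the details):

```glsl
// Sketch of a strip-LUT lookup, assuming LUT_Size tiles of
// LUT_Size x LUT_Size pixels laid out side by side.
vec3 sampleStripLUT(sampler2D lut, float LUT_Size, vec3 imgColor)
{
    // nearest LUT cell per channel
    float r = floor(imgColor.r * (LUT_Size - 1.0) + 0.5);
    float g = floor(imgColor.g * (LUT_Size - 1.0) + 0.5);
    float b = floor(imgColor.b * (LUT_Size - 1.0) + 0.5);

    // sample at texel centers inside the strip
    float x = (b * LUT_Size + r + 0.5) / (LUT_Size * LUT_Size);
    float y = (g + 0.5) / LUT_Size;

    return texture2D(lut, vec2(x, y)).rgb;
}
```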

@hunterk: I tested your last mod, and this is what I got (Nvidia hardware): I’ll try another Nvidia card as soon as I can.

@guicrith: The PNG LUTs bundled with the glsl shaders were in fact compressed, but I tried them uncompressed and there was no difference. And yes, like the GBA color saturation shader.

I don’t know if this helps, but I made a tiny version of the “Color Bars” test pattern from the 240p suite as a PNG file, so that it only contains unique colors. Then I loaded it in RetroArch and took a snapshot of it with the stock shader, and another one with your latest version of LUT.glsl. When I diff’d their color entries, this is what I got: https://pastebin.com/7PjCuFqV Here are the screenshots compared: (stock.png) (hunterk-mod.png) (The PNG loaded with Imageviewer)

I also want to add that I tried generating the 15-bit RGB LUT with floor’d values, ceil’d values, and rounded values (same as you did with your LUTs, I suppose), and there were always differences. I’m letting you know just in case this is helpful.

I’ll be a bit busy for a while. I’ll test later. Thanks again, and see you soon.

Update:

I was taking a closer look at the color values in the “Color Bars” test pattern from the SNES 240p suite, and I noticed they follow a “weird” sequence: 5-bit-per-channel color values (15-bit RGB)
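For what it’s worth, there are two common ways to expand a 5-bit channel value to 8 bits, and they produce slightly different sequences (I can’t say which one the 240p suite uses):

$$v_8 = \operatorname{round}\!\left(\frac{255 \cdot v_5}{31}\right) \qquad \text{vs.} \qquad v_8 = (v_5 \ll 3)\ |\ (v_5 \gg 2)$$

Pure scaling gives 0, 8, 16, 25, 33, 41, …, 255, while bit replication gives 0, 8, 16, 24, 33, 41, …, 255. A LUT generated with one convention and a source image using the other would differ in exactly this kind of slightly uneven way.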

Next, I created a LUT that matches that “weird” sequence of values and tested it with two LUT.glsl versions, with the following results:

The most recently modified one showed these differences: stock.glsl vs. new-LUT.glsl (using 32-SNES.png LUT)

and this old LUT.glsl mod produced identical output (expected behaviour).

So, the differences I noticed after this mod were actually in the LUT.
