PlainOldPants's Shader Presets

That looks completely wrong. This is what it looks like on my computer:

I have no idea what could be causing the output to look so terrible on your end. For now, all I can suggest are generic things, like resetting RetroArch’s settings to defaults, or updating your GPU drivers, but I don’t expect any of that to help. It’ll probably only waste your time.

Maybe try loading the built-in shaders_slang/ntsc/ntsc-256px-composite shader and appending my ntsc-colors-crt shader. If you do that, you get Maister's SNES NTSC signal and my NTSC colors.

2 Likes

I had HDR enabled like an imbecile

2 Likes

@beans I took a look at the SNES schematic and it looks like the chroma is definitely lowpass filtered. It looks to be around 2.6 MHz, wider than standard but possibly enough to have a visible impact.

I’m not seeing any luma trap though. There is filtering, but it looks like it’s outside the range of the video signal, i.e. DC and EMI rejection filters, not a notch filter centered on the colorburst. I could be missing something though. I’ll look at the Genesis one when I get a chance.
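
For what it's worth, here's a generic sketch (not taken from the SNES schematic or from any existing shader) of how a roughly 2.6 MHz chroma low-pass could be modeled on a composite-rate signal. It assumes sampling at 4x the NTSC subcarrier; the cutoff and tap count are just illustrative.

```c
/*
 * Generic sketch, NOT taken from the SNES schematic or any existing shader:
 * model a ~2.6 MHz chroma low-pass as a windowed-sinc FIR filter applied to
 * chroma samples taken at 4x the NTSC subcarrier (~14.318 MHz). The cutoff
 * and tap count are illustrative assumptions.
 */
#include <math.h>
#include <stddef.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define SAMPLE_RATE_HZ (4.0 * 315.0e6 / 88.0) /* 4 x fsc, ~14.318 MHz */
#define CUTOFF_HZ      2.6e6                  /* estimate read off the schematic */
#define NUM_TAPS       15                     /* odd, so the filter is symmetric */

/* Build a unity-gain low-pass kernel: ideal sinc truncated by a Hann window. */
static void design_lowpass(double taps[NUM_TAPS])
{
    const double fc = CUTOFF_HZ / SAMPLE_RATE_HZ; /* normalized cutoff */
    double sum = 0.0;
    for (int i = 0; i < NUM_TAPS; i++) {
        int m = i - NUM_TAPS / 2;
        double s = (m == 0) ? 2.0 * fc : sin(2.0 * M_PI * fc * m) / (M_PI * m);
        double w = 0.5 - 0.5 * cos(2.0 * M_PI * i / (NUM_TAPS - 1));
        taps[i] = s * w;
        sum += taps[i];
    }
    for (int i = 0; i < NUM_TAPS; i++)
        taps[i] /= sum; /* normalize so DC passes unchanged */
}

/* Convolve one scanline of chroma samples with the kernel (edges clamped). */
static void filter_chroma_line(const double *in, double *out, size_t n,
                               const double taps[NUM_TAPS])
{
    for (size_t x = 0; x < n; x++) {
        double acc = 0.0;
        for (int i = 0; i < NUM_TAPS; i++) {
            long j = (long)x + i - NUM_TAPS / 2;
            if (j < 0) j = 0;
            if (j >= (long)n) j = (long)n - 1;
            acc += taps[i] * in[j];
        }
        out[x] = acc;
    }
}
```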

1 Like

Can I do the same with your NES stuff too?

I’m eagerly awaiting HDR support for this (no rush!) so I can give it the “100% mask” treatment :smiley:

1 Like

Speaking of HDR, I need to upgrade to either the LG G5 or a Samsung QD-OLED. The monitor I have is mediocre, but it's 4K, so I roll with it.

1 Like

Just make sure whatever you buy next says “VESA DisplayHDR 1000” or whatever somewhere on the box and you should be good :slight_smile:

Does that apply to TVs as well? I don’t think I want to buy a monitor.

Yep! There are a variety of options available, now. Though, depending on your priorities, you may want True Black HDR 600. It’s a conundrum. If you want to get really crazy with masks and beam simulator, maybe opt for an outdoor display, 3,000 nits :smiley: I don’t even want to know what black looks like on that thing.

Man, CRT Emulation is something special alright. I can’t wait until it’s good enough that we can play on a potato display or something in that area.

On NES, you should use my -nes-unfinished preset, and set the emulator’s palette to Raw. Don’t use ntsc-256px-composite on NES; that should be for SNES only. There is no way to get close to the right artifacts on NES except by making use of the Raw palette, which only a few shader presets use currently.

What is the format of the raw palette? How are you supposed to take the values and map them to YC space?

Don’t make the mistake of getting a QD-OLED display for CRT Emulation.

The G5 would definitely be the better bet, but do note that it uses a different subpixel layout from all previous WOLED TVs. So it may not be compatible with current CRT Shader Mask Layouts. However, based on my understanding of these things, adding support for this new layout should be a relatively simple process.

I suspect all that would need to be done is to swap the corresponding entries in the existing RRBBGGX mask to match whichever subpixels LG Display moved around.

If you get a G4 or G3 instead you wouldn’t have to worry about that.

Do also note that the additional subpixel makes the LG WOLED display a 4-subpixel-per-pixel display, and since only 3 subpixels per pixel can be active at a time, this precludes WOLED displays from properly supporting 3 Subpixel RGB/RBG Mask Layouts.

Maybe a 4 Subpixel RGBX mask might work, or maybe a 3 Subpixel RBG mask layout might work, if a shader developer figured out how to map the mask layout to the RWBG display subpixels in such a way that the black subpixel in the Mask Layout lands where the white subpixel is currently located.

That layout might end up being XBGR, with the XBG coming from the same pixel and the R coming from the adjacent pixel. Just a theory I have, waiting for someone qualified to correct it.

In other words, what I’m saying is that WOLED isn’t fully compatible with a wide range of subpixel mask layouts, mostly mid to higher TVLs.

4K QD-LED and miniLED displays with VA/HVA/WHVA/IPS/ADS Pro panels are compatible with almost everything in that regard, though, and they have the brightness required to power through a full-strength mask and scanlines, and also BFI, without getting burned in.
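
To make the mask discussion above a bit more concrete, here's a generic illustration of how CRT shaders typically express a subpixel mask: a short repeating pattern of per-channel weights indexed by screen x. The specific patterns below are hypothetical, not verified layouts for any real panel or shader.

```c
/*
 * Generic illustration only -- not taken from guest-advanced, CRT-Royale, or
 * any real shader, and the patterns are NOT verified layouts for any panel.
 * A subpixel mask is a short repeating pattern of per-channel weights indexed
 * by screen x; adapting it to a display with a different physical subpixel
 * order is a matter of permuting (or padding) the entries.
 */
typedef struct { float r, g, b; } mask_weight_t;

/* Hypothetical 3-wide RGB mask: each screen pixel lights one channel. */
static const mask_weight_t mask_rgb3[3] = {
    { 1, 0, 0 }, { 0, 1, 0 }, { 0, 0, 1 }
};

/* Hypothetical 4-wide "RGBX" mask: the fourth slot stays dark, e.g. so the
 * unused entry could be parked on a WOLED panel's white subpixel. */
static const mask_weight_t mask_rgbx4[4] = {
    { 1, 0, 0 }, { 0, 1, 0 }, { 0, 0, 1 }, { 0, 0, 0 }
};

/* Weight applied at a given screen x for a pattern of the given width. */
static mask_weight_t mask_at(const mask_weight_t *mask, int width, int x)
{
    return mask[x % width];
}
```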

2 Likes

The raw palette is in the NES’s internal hue-level-emphasis format, in that order, normalized from 0 to 255. The NES PPU writes composite video directly by switching between just a few fixed voltage levels, without any concept of Y or C, so in order to convert that into YC space, you have to simulate the NES PPU’s composite video and decode it.
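
As a small sketch, unpacking those fields from a Raw-palette color might look like this, assuming the channel order and 0-255 normalization described above. The exact scale factors are my assumption and should be checked against the emulator or shader source.

```c
/*
 * Sketch of unpacking a Raw-palette color back into the PPU's fields,
 * assuming R = hue, G = level, B = emphasis, each normalized to 0..255 as
 * described above. The exact scale factors are my assumption -- check them
 * against the emulator or shader source before relying on them.
 */
typedef struct { int hue, level, emphasis; } nes_pixel_t;

static nes_pixel_t unpack_raw_palette(unsigned char r, unsigned char g, unsigned char b)
{
    nes_pixel_t p;
    p.hue      = (r * 15 + 127) / 255; /* 0..15 */
    p.level    = (g *  3 + 127) / 255; /* 0..3  */
    p.emphasis = (b *  7 + 127) / 255; /* 0..7  */
    return p;
}
```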

This article goes over the whole process with some sample code. https://www.nesdev.org/wiki/NTSC_video

For me, the easiest way to understand has been to read source code. These two programs generate only the colors, as opposed to a whole NTSC-filtered image. https://github.com/ChthonVII/gamutthingy https://github.com/Gumball2415/pally

Maybe a text summary would be better, though.

The level is an integer from 0 to 3. This selects a pair of voltages. The NES directly outputs a square wave that alternates between those two voltages. That affects both Y and the saturation of C.

The hue is an integer from 0 to 15. Hues 0 and 13 are grays, which use only one of the two voltages. Hues 1 through 12 output the square wave with one of 12 different phases, giving 12 different hue angles to pick from. If you pick hue 14 or 15, the result is always the same as picking level 1 and hue 13 together, even if level has been set to something other than 1.

Emphasis is 3 bits: One for red, one for green, and one for blue. Each emphasis bit corresponds to a specific twelfth of the chroma phase. If the emphasis bit is set to 1, that twelfth of the phase is attenuated down to a different, lower preset voltage. In effect, this causes some reduction of red, green, or blue. Emphasis is skipped if you set hue to 14 or 15.

The NES’s output isn’t shaped perfectly, however. For each increase in “level”, you get some shift in hue. Hues 2, 6, and 10 (if memory serves) have a hue shift too, as well as a slight increase in Y. The amount of hue or Y shift varies significantly between different revisions of the NES PPU.
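
For reference, here is a condensed sketch in the spirit of the nesdev wiki's sample code, not an exact copy of it. The voltage table and the exact phase windows for the emphasis bits are approximate and should be taken from the wiki; the per-revision hue and Y shifts just described are ignored.

```c
/*
 * Condensed sketch in the spirit of the nesdev wiki's reference code
 * (https://www.nesdev.org/wiki/NTSC_video) -- not an exact copy. The voltage
 * table and the emphasis phase windows are approximate / from memory, so take
 * the real numbers from the wiki. The per-revision hue and Y shifts described
 * above are ignored here.
 */
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Output levels relative to sync, indexed by "level" 0..3 (approximate). */
static const float low_lv[4]  = { 0.350f, 0.518f, 0.962f, 1.550f };
static const float high_lv[4] = { 1.094f, 1.506f, 1.962f, 1.962f };
static const float blank = 0.518f, white = 1.962f, atten = 0.746f;

/* 1 if subcarrier phase ph (0..11) falls in the "high" half-cycle of hue h. */
static int in_phase(int hue, int ph) { return (hue + ph) % 12 < 6; }

/* One composite sample for a hue/level/emphasis color at phase ph. */
static float nes_signal(int hue, int level, int emphasis, int ph)
{
    if (hue >= 14) { level = 1; emphasis = 0; } /* hues 14/15: level 1, no emphasis */
    float lo = low_lv[level], hi = high_lv[level];
    if (hue == 0)  lo = hi; /* hue 0 is a constant high level */
    if (hue >= 13) hi = lo; /* hues 13..15 are a constant low level */
    float s = in_phase(hue, ph) ? hi : lo;
    /* Each emphasis bit attenuates one third of the chroma cycle; the phase
     * windows used here (12, 4, 8) are my recollection of the wiki code. */
    if (((emphasis & 1) && in_phase(12, ph)) ||
        ((emphasis & 2) && in_phase( 4, ph)) ||
        ((emphasis & 4) && in_phase( 8, ph)))
        s *= atten;
    return (s - blank) / (white - blank); /* normalize: blank = 0, white = 1 */
}

/* Decode one full chroma cycle (12 samples) into Y and U/V-style components:
 * Y is the mean, chroma comes from correlating against the subcarrier. */
static void nes_color_to_yuv(int hue, int level, int emphasis,
                             float *y, float *u, float *v)
{
    *y = *u = *v = 0.0f;
    for (int ph = 0; ph < 12; ph++) {
        float s = nes_signal(hue, level, emphasis, ph);
        float a = 2.0f * (float)M_PI * (float)ph / 12.0f;
        *y += s / 12.0f;
        *u += s * cosf(a) / 6.0f; /* demodulation scale factors are illustrative */
        *v += s * sinf(a) / 6.0f;
    }
}
```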

Here’s something that those programs and that article don’t address, though I haven’t ruled out the possibility that my NES just isn’t working properly. On my own NES, choosing between RF or composite also has a significant effect. The NES is much sharper over composite than over RF, with composite having a grainy, sandpapery look, and RF having a more flat, blurry look. Connecting the NES’s composite into a VCR to convert into RF allows you to get the sharp, grainy look on an RF-only TV set. I need to check this again, but I believe I’ve noticed my NES having more saturated colors when connected over RF instead of composite. Don’t forget that the original Famicom in Japan had only RF (though it could be modded for composite), while the NES in the rest of the world had composite, so simulating both is necessary to reflect different developers’ intents.

Edit: I forgot to mention, in the past, I have also video-captured my NES’s palette, but this never turned out great. Decoding the NES’s colors into RGB results in some values going below 0 or above 255, and once those are clamped to fit into a normal palette, you can no longer convert them back to YUV/YIQ. That is why using FirebrandX’s Composite Direct palette (or any normal NES palette at all) with a composite video emulation shader does not work. In an attempt to fix this, I did my own NES video capture, with the capture’s black level increased and white level decreased, to keep all values between 0 and 255 without clamping, and in my shaders, I would perfectly undo that change. I have tried this several times, both on Ubuntu and on Windows, and the capture has had problems every single time, so I won’t link it anywhere. Therefore, for now, we’re stuck with emulated palettes, not video capture.
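
For illustration, that kind of reversible range compression is just a linear squeeze and its exact inverse. The lo/hi values here are placeholders, not the ones actually used in the capture.

```c
/*
 * Illustration of the reversible range compression described above: squeeze
 * the capture into a narrower band so out-of-range excursions survive without
 * clamping, then undo it exactly in the shader. The lo/hi values are
 * placeholders, not the ones actually used.
 */
static float compress_at_capture(float v, float lo, float hi)
{
    return lo + v * (hi - lo); /* raised black level, lowered white level */
}

static float expand_in_shader(float v, float lo, float hi)
{
    return (v - lo) / (hi - lo); /* exact inverse; can go below 0 or above 1 */
}
```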

1 Like

I have had two different Sony LCD TVs with HDR capabilities and I have basically zero complaints about them.

@PlainOldPants If you take one of Mesen’s normal palettes, quantize the values according to the measured values, map that to YC, and reconstruct the dot pattern, are you not just doing the same thing effectively? It sounds like you need to use a lookup table and reconstruct the dot pattern either way. Does the raw palette have a bit indicating the missing dot? Can it tell me about the Battletoads exception? But I suppose the raw palette could be more convenient.

I remember trying one of the raw palette shaders and thought it was ugly. But I was a RetroArch noob and maybe was just using it wrong LOL.

IMO we shouldn’t be using these goofy kinds of constructions. They’re hacks. The cores should be able to send a block of metadata to the shaders each frame. It would solve a lot of problems and limitations.
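
Purely as an illustration of the idea (none of this exists in RetroArch today), such a per-frame metadata block might look something like this:

```c
/*
 * Purely illustrative -- none of these fields exist in RetroArch today. This
 * is just the kind of per-frame metadata block a core could hand to shaders
 * directly, instead of smuggling information through special palettes.
 */
#include <stdint.h>

typedef struct {
    uint32_t system_id;       /* which console produced the frame */
    uint32_t frame_counter;   /* lets shaders track NTSC phase across frames */
    uint8_t  composite_phase; /* subcarrier phase at the first pixel */
    uint8_t  odd_frame;       /* e.g. for the NES PPU's skipped-dot behavior */
    uint8_t  raw_palette;     /* whether the image is index data, not RGB */
    uint8_t  reserved;
} frame_metadata_t;
```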

2 Likes

Which shaders can I append to your shader to make it look even better?

Pretty much any CRT shader works - guest-advanced, CRT-Royale, etc.

2 Likes

Thanks, Nesguy. I wanted to smooth out his shader with like a CRT shader but wanted to make sure.

@Cyber Thanks for the recommendation. I thought all you needed was the Samsung QD-OLED, since I’m hearing WOLEDs are at their peak and aren’t going to get better.

That’s based on the testing and opinions of others who have no clue what we’re doing here with CRT Shaders, which have different performance requirements compared to general usage.

If WOLED were at its peak, why are the LG G5 and Panasonic Z95B two of the best and most accurate TVs money can buy?

Why are they the brightest OLED TVs ever tested? Why are they brighter than last year’s models? Why do they have wider colour gamuts than last year’s models?

Why is there a roadmap for continued development of the technology?

Out of the two main competing OLED technologies, why are they the only one that has proper black levels in a bright room? You did know about the blacks getting raised and turning brown on QD-OLED displays when there is light in the room, didn’t you?

You have to be able to sift through marketing spin and analyze the individual numbers and characteristics for yourself to determine which TV or display is the best for you. Which may not necessarily be the winner of any annual shootout.

That’s why many end up disappointed. QD-OLED simply isn’t good at these things we do here at all.

OLED on the whole is not universally better or the best. OLED is the best in some aspects of CRT Emulation, while good miniLED is the best in many other areas where OLED struggles to compete.

So it’s up to the user or potential purchaser to acknowledge the strengths and limitations and go with the one they think would give them the best experience, or the one whose compromises they are willing to live with.

You will not find a single QD-OLED display in this list:

1 Like

OLEDs have two things going for them: low black levels and low response time. And people hear about that and must think they’re like CRTs because CRTs had those things, too.

But really I think the most important metric for our shaders is brightness, followed by color linearity (because how can you mimic another display if your own display is inconsistent?). Black levels are nice, but not really as important for 2D games. Contrast is more important than absolute black level, I think. Low input lag is a ‘nice to have’; once it’s below a certain value it’s good enough, especially if you have a 120 Hz+ display.

2 Likes