PlainOldPants's Shader Presets

On NES, you should use my -nes-unfinished preset, and set the emulator’s palette to Raw. Don’t use ntsc-256px-composite on NES; that should be for SNES only. There is no way to get close to the right artifacts on NES except by making use of the Raw palette, which only a few shader presets use currently.

What is the format of the raw palette? How are you supposed to take the values and map them to YC space?

Don’t make the mistake of getting a QD-OLED display for CRT Emulation.

The G5 would definitely be the better bet but do note that it uses a different subpixel layout from all other previous WOLED TVs. So it may not be compatible with current CRT Shader Mask Layouts. However, based on my understanding of these things, adding support for this new layout should be a relatively simple process.

I suspect that all that would need to be done is to swap around, in the existing RRBBGGX mask, whatever subpixels LG Display swapped around.

If you get a G4 or G3 instead you wouldn’t have to worry about that.

Do also note that the additional subpixel makes the LG WOLED display a 4-subpixel-per-pixel display, and since only 3 subpixels per pixel can be active at a time, this precludes the WOLED displays from properly supporting 3-subpixel RGB/RBG Mask Layouts.

Maybe a 4-subpixel RGBX mask might work, or maybe a 3-subpixel RBG mask layout might work if a shader developer figured out how to map the subpixel layout to the RWBG Display subpixels in such a way that the black subpixel in the Mask Layout maps to where the white subpixel is currently located.

That layout might end up being XBGR with the XBG being from the same pixel and the R being from the adjacent pixel. Just a theory I have waiting for someone qualified to correct it.

In other words, what I’m saying is that WOLED isn’t fully compatible with a wide range of subpixel mask layouts. Mostly mid to higher TVLs.

4K QD-LED and miniLED displays with VA/HVA/WHVA/IPS/ADS Pro panels are compatible with almost everything in that regard, though, and have the brightness required to power through the full-strength mask and scanlines, and BFI as well, without getting burned in.

2 Likes

The raw palette is in the NES’s internal hue-level-emphasis format, in that order, normalized from 0 to 255. The NES PPU writes composite video directly by switching between just a few fixed voltage levels, without any concept of Y or C, so in order to convert that into YC space, you have to simulate the NES PPU’s composite video and decode it.
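If the three fields really are just linearly normalized into the three channels (the exact scaling is up to the emulator, so treat this as an assumption and check the emulator's source), unpacking a Raw-palette entry is only a few lines:

```python
def unpack_raw_palette_color(r, g, b):
    """Recover (hue, level, emphasis) from a Raw-palette RGB triple.

    Assumes the emulator stores hue (0-15), level (0-3), and emphasis (0-7)
    linearly normalized into the R, G, and B channels respectively --
    this scaling is an assumption, not a documented format.
    """
    hue = round(r / 255 * 15)
    level = round(g / 255 * 3)
    emphasis = round(b / 255 * 7)
    return hue, level, emphasis
```

A shader would do the same arithmetic on the normalized 0.0-1.0 texel values instead of 0-255 integers.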

This article goes over the whole process with some sample code. https://www.nesdev.org/wiki/NTSC_video

For me, the easiest way to understand has been to read source code. These two programs generate only the colors, as opposed to a whole NTSC-filtered image. https://github.com/ChthonVII/gamutthingy https://github.com/Gumball2415/pally

Maybe a text summary would be better, though.

The level is an integer from 0 to 3. This selects a pair of voltages. The NES directly outputs a square wave that alternates between those two voltages. That affects both Y and the saturation of C.

The hue is an integer from 0 to 15. Hues 0 and 13 are grays, which use only one of the two voltages. Hues 1 through 12 output the square wave with one of 12 different phases, giving 12 different hue angles to pick from. If you pick hue 14 or 15, the result is always the same as picking level 1 and hue 13 together, even if level has been set to something other than 1.

Emphasis is 3 bits: One for red, one for green, and one for blue. Each emphasis bit corresponds to a specific twelfth of the chroma phase. If the emphasis bit is set to 1, that twelfth of the phase is attenuated down to a different, lower preset voltage. In effect, this causes some reduction of red, green, or blue. Emphasis is skipped if you set hue to 14 or 15.

The NES’s output isn’t shaped perfectly, however. For each increase in “level”, you get some shift in hue. Hues 2, 6, and 10 (if memory serves) have a hue shift too, as well as a slight increase in Y. The amount of hue or Y shift varies significantly between different revisions of the NES PPU.
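The idealized square-wave generation described above can be sketched in a few lines, ported from the C-style pseudocode on the nesdev NTSC video page linked earlier (the voltage numbers are the terminated 2C02 measurements from that page; the per-level hue shifts just mentioned are not modeled):

```python
def ntsc_signal(pixel, phase):
    """Idealized 2C02 composite output for one sample.

    pixel: 9-bit NES color "eee ll cccc"; phase: sample index 0-11.
    Based on the pseudocode on the nesdev wiki NTSC video page.
    """
    # Terminated voltage levels, relative to sync, from the nesdev wiki.
    levels = (0.350, 0.518, 0.962, 1.550,   # signal low, levels 0-3
              1.094, 1.506, 1.962, 1.962)   # signal high, levels 0-3
    attenuation = 0.746

    color = pixel & 0x0F
    level = (pixel >> 4) & 3
    emphasis = (pixel >> 6) & 7
    if color > 13:
        level = 1                  # hues 14-15 force level 1

    low, high = levels[level], levels[4 + level]
    if color == 0:
        low = high                 # hue 0: constant high level (gray)
    if color > 12:
        high = low                 # hues 13-15: constant low level

    def in_color_phase(c):
        return (c + phase) % 12 < 6

    signal = high if in_color_phase(color) else low

    # Each emphasis bit attenuates its third of the chroma phase;
    # emphasis is skipped for hues 14-15, as noted above.
    if color <= 13 and ((emphasis & 1 and in_color_phase(0)) or
                        (emphasis & 2 and in_color_phase(4)) or
                        (emphasis & 4 and in_color_phase(8))):
        signal *= attenuation
    return signal
```

Sampling this for all 12 phases of one pixel gives the square wave; note that hues 14 and 15 come out identical to level 1, hue 13, exactly as described.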

Here’s something that those programs and that article don’t address, though I haven’t ruled out the possibility that my NES just isn’t working properly. On my own NES, choosing between RF or composite also has a significant effect. The NES is much sharper over composite than over RF, with composite having a grainy, sandpapery look, and RF having a more flat, blurry look. Connecting the NES’s composite into a VCR to convert into RF allows you to get the sharp, grainy look on an RF-only TV set. I need to check this again, but I believe I’ve noticed my NES having more saturated colors when connected over RF instead of composite. Don’t forget that the original Famicom in Japan had only RF (though it could be modded for composite), while the NES in the rest of the world had composite, so simulating both is necessary to reflect different developers’ intents.

Edit: I forgot to mention, in the past, I have also video-captured my NES’s palette, but this never turned out great. Decoding the NES’s colors into RGB results in some values becoming less than 0 or more than 255, which makes it impossible to convert them back to YUV/YIQ. That is why using FirebrandX’s Composite Direct palette (or any normal NES palette at all) with a composite video emulation shader does not work. In an attempt to fix this, I did my own NES video capture, with the capture’s black level increased and white level decreased, to keep all values between 0 and 255 without clamping, and in my shaders, I would perfectly undo that change. I have tried this several times, both on Ubuntu and on Windows, and the capture has had problems every single time, so I won’t link it anywhere. Therefore, for now, we’re stuck with emulated palettes, not video capture.
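The level-shifting trick can be sketched like this, with made-up range extremes (the real extremes depend on the decoder and palette, so VMIN/VMAX here are placeholders):

```python
# Hypothetical extremes for NES colors decoded to RGB without clamping;
# the real range depends on the decoder, so these numbers are placeholders.
VMIN, VMAX = -60.0, 315.0

def compress(v):
    """Map a decoded value into 0-255 for capture/storage, no clamping needed."""
    return (v - VMIN) * 255.0 / (VMAX - VMIN)

def expand(s):
    """Exactly undo compress() inside the shader."""
    return s * (VMAX - VMIN) / 255.0 + VMIN
```

As long as VMIN/VMAX bracket the true extremes, every decoded value survives the 0-255 capture range and the shader recovers it losslessly.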

1 Like

I have had two different Sony LCD TVs with HDR capabilities and I have basically zero complaints about them.

@PlainOldPants If you take one of Mesen’s normal palettes, quantize the values according to the measured values, map that to YC, and reconstruct the dot pattern, are you not just doing the same thing effectively? It sounds like you need to use a lookup table and reconstruct the dot pattern either way. Does the raw palette have a bit indicating the missing dot? Can it tell me about the Battletoads exception? But I suppose the raw palette could be more convenient.

I remember trying one of the raw palette shaders and thought it was ugly. But I was a RetroArch noob and maybe was just using it wrong LOL.

IMO we shouldn’t be using these goofy kinds of constructions. They’re hacks. The cores should be able to send a block of metadata to the shaders. Each frame. It would solve a lot of problems and limitations.

2 Likes

Which shaders can I append to your shader to make it look even better?

Pretty much any CRT shader works - guest-advanced, CRT-Royale, etc.

2 Likes

Thanks, Nesguy. I wanted to smooth out his shader with like a CRT shader but wanted to make sure.

@Cyber Thanks for the recommendation. I thought all you needed was the Samsung QD-OLED, since I’m hearing WOLEDs are at their peak and aren’t going to get better.

That’s based on testing and opinions of others who have no clue what we’re doing here with CRT Shaders which have different performance requirements compared to general usage.

If WOLED was at its peak, why are the LG G5 and Panasonic Z95B 2 of the best and most accurate TVs money can buy?

Why are they the brightest OLED TVs ever tested? Why are they brighter than last year’s models? Why do they have wider colour gamuts than last year’s models?

Why is there a roadmap for continued development of the technology?

Out of the two main competing OLED technologies, why are they the only one that has proper black levels in a bright room? You did know about the blacks getting raised and turning brown on QD-OLED displays when there is light in the room, didn’t you?

You have to be able to sift through marketing spin and analyze the individual numbers and characteristics for yourself to determine which TV or display is the best for you. Which may not necessarily be the winner of any annual shootout.

That’s why many end up disappointed. QD-OLED is not good for or good at these things we do here at all.

OLED on the whole is not universally better or the best. OLED is the best in some aspects of CRT Emulation, while good miniLED is the best in many other areas where OLED struggles to compete.

So it’s up to the user or potential purchaser to acknowledge the strengths and limitations and go with the one that they think would give them the best experience for them or the one that has compromises that they are willing to live with.

You will not find a single QD-OLED display in this list:

1 Like

OLEDs have two things going for them: low black levels and low response time. And people hear about that and must think they’re like CRTs because CRTs had those things, too.

But really I think the most important metric for our shaders is brightness, followed by color linearity (because how can you mimic another display if your own display is inconsistent?). Black levels are nice, but not really as important for 2D games. Contrast is more important than absolute black level, I think. Low input lag is a ‘nice to have’; a certain value is good enough, especially if you have a 120 Hz+ display.

2 Likes

Don’t forget the subpixels, man. That’s where OLED displays fall flat on their faces and regular LCD technology shines.

But like nobody in the mainstream realm of RetroGaming seems to be aware of the existence of these things.

2 Likes

So, like MiniLed in a decade or so should be the most optimal since they have the best of both worlds, right?

1 Like

No, that doesn’t work at all. The NES’s video signal works too differently from normal for us to be able to do that.

The first problem has to do with how the NES lacks true RGB to begin with. As I described in my previous post, decoding the NES’s video signal into RGB results in both some negative numbers and some excessively high numbers. In order to convert back to YC, you need to have those out-of-bounds RGB values intact. Those NES palettes have everything clamped between 0 and 255, which makes it impossible to get the correct YC. If you use that incorrect YC to simulate the composite signal, like with Blargg’s filters, you get less signal interference than you normally would, and the signal interference only gets reduced for colors that got clamped.

The second problem is with how the NES’s signal has a specific shape that needs to be emulated, which I also described in my last post. You already know that the NES’s signal is a square wave, but it’s not as simple as modulating C as a square and adding it to Y, since there is no Y or C (or “modulating” or “adding”) in the NES video signal. The active portion of the signal is output by switching between just 7 possible voltage values (or 14 if you count de-emphasis), with exactly 12 possible hue angles that are spaced evenly apart by 30 degrees. It’s not hard to understand why sticking to just those 7 voltage values and 12 evenly-spaced hue angles is important for getting convincing signal artifacts, as this does impact how different colors (including grays) of the NES palette are going to interfere with each other when decoding. A less obvious problem is in games that quickly cycle the colors, such as Ninja Gaiden 1 (when getting a game over) or to some extent the Game Genie title text, where real hardware makes it easy to see the chroma artifacts moving at a consistent speed as the colors are rotated, which happens because the 12 hue angles are spaced perfectly 30 degrees apart. I don’t see a good way to recreate these artifacts convincingly without the raw palette.
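The clamping loss in the first problem is easy to demonstrate with a standard FCC YIQ matrix (the exact matrix behind any given palette may differ, and the example color is just illustrative):

```python
import numpy as np

# Approximate FCC NTSC YIQ -> RGB matrix and its inverse.
YIQ_TO_RGB = np.array([[1.0,  0.956,  0.621],
                       [1.0, -0.272, -0.647],
                       [1.0, -1.106,  1.703]])
RGB_TO_YIQ = np.linalg.inv(YIQ_TO_RGB)

yiq = np.array([0.5, -0.5, 0.0])   # an illustrative saturated color
rgb = YIQ_TO_RGB @ yiq             # blue channel lands above 1.0
clamped = np.clip(rgb, 0.0, 1.0)   # what a normal 0-255 palette stores
back = RGB_TO_YIQ @ clamped        # no longer the original chroma
```

Once the clamp has happened, `back` cannot recover the original YIQ, which is exactly why a normal palette plus a composite shader underestimates the interference for those colors.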

You don’t need a lookup table for this. GTU-famicom does use a LUT for speed, but this isn’t necessary at all.

Unfortunately, I’m fairly sure this is exactly the only information we don’t get from using the raw palette. It wouldn’t hurt to try to put in an issue or pull request for this feature, but that doesn’t exactly help to solve the bigger problem that we need some way to get more information from the cores.

Looking at them right now, I agree that they’re not that great. All five of them (mine included) are old and each have their own unique problems.

I don’t know what you mean by this “missing dot”. Following the documentation at https://www.nesdev.org/wiki/NTSC_video , the raw palette gives enough information to reconstruct the NES’s video signal in its ideal form, excluding impedance-related distortions like the so-called “row skew”, and excluding Battletoads and Battletoads + Double Dragon’s three-frame phase cycle. From there, we can get convincing graphics by digitally decoding that signal and applying a simple color correction for row-skew. The next step, which isn’t in any shaders to my knowledge, would be to make solid enough approximations of the filtering circuits found in the console and in some known CRTs, instead of using these simple symmetrical FIR filters that are based on standards and eyeballing.

I agree. With the current system, there are so many different workarounds that we’re having to do, when it could all be solved by getting some basic metadata. Even just knowing what core or console is being played would help.

My old NES raw palette shader, which I was just trying a few minutes ago, absolutely reeks of these workarounds, even just from looking at the preset’s file name which states the specific emulator that it’s compatible with. At the time, only Mesen output the de-emphasis portion of the Raw palette correctly, while FCEUmm had a glitch with its raw palette. Then, in that long list of settings, there’s a manual switch for the Battletoads games, along with a bunch of settings meant for only other consoles, which control things like how to “detect” different screen resolutions and cropped overscan (which can only be guessed, not known decisively), plus a switch to convert Genesis Plus GX’s colors into BlastEm’s colors. What a mess. All of this information should have been provided by the core instead. The punch line is when I did create an issue in FCEUmm regarding its raw palette: When the raw palette did finally get fixed, it was done in a way that broke some (if not all) custom palettes for that emulator.

Edit:

Something else that’s related is the complex directory structures and long lists of .slangp files that we have in large packs like with sonkun, CyberLab, Mega Bezel, etc. (Not the fault of the authors, but of the system.) It’s starting to look like the entire shader system needs yet another overhaul, but is it really worth the time and effort?

2 Likes

It’s hard to say where things will be in a decade. I’m hoping that LG Display is going to retire the white subpixel as they improve efficiency with RGB Tandem OLED and reemploy MLA, despite the cost issues of implementing it.

That might be the ultimate for CRT Emulation if it materializes in a few years.

Right now folks are sleeping on the TCL QM851G, QM9K and QM8K for CRT Shaders. I have a TCL QM751G and it totally rocks with CRT Shaders but all of those I’ve listed there have way more brightness and dimming zones and supposedly better black levels and contrast. Not to mention the 8K and 9K have higher precision Backlight technology, wide viewing angle tech, faster Backlight response and R-G-B Subpixel layout!

Then there are the Bravia 9 and 7 which seem to do more with less zones and peak brightness in the home cinema, sports and TV show sphere.

So things are already at a really nice point technology wise.

Next year we’ll see what happens when RGB miniLED takes the stage.

The worst aspects of current miniLED technology for me, when it comes to CRT emulation, are the blooming that comes with off-angle viewing and dark-room performance, and the generally poor off-axis viewing that results in colour saturation and gamma shifts.

OLED is amazing for CRT Emulation in a dark room, once you can live with the fact that it can’t handle all the RGB Masks and TVLs well all the way down to the subpixel level.

Scanline gaps do cause uneven wear over time though.

The missing dot is from the scanline with one less PPU dot every other frame. But you can’t actually know which frame is odd or even without guessing, right?

Let me put it this way: the NES video output can be represented as a 4096x240 monochrome image, right? If we take that and know the relative time t for each subpixel, we should be able to treat it as the composite signal directly and determine the right phase to demodulate at each subpixel. The raw palette gives us the information needed to reconstruct the monochrome image, but doesn’t give us the info to get the time at that subpixel. We still need to guesstimate that, right? Is there any room bitwise in the raw palette to squeeze in more metadata? Like you would only need one bit somewhere to flag the current field, another to flag whether the missing dot is present in the field, etc.

There actually is, since the colors in the NES’s raw format are only 9 bits each, while the raw palette expands this into 24-bit color. Staying compatible with existing shader presets might be slightly tricky, but it’s doable, and it should be simple to modify those shaders too, since there are only five presets total.
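One way those spare bits could be used, sketched as a purely hypothetical layout (this is not any emulator's actual format, just an illustration of the headroom):

```python
def pack_extended_raw(hue, level, emphasis, odd_field, missing_dot):
    """Hypothetical extended Raw-palette encoding.

    Keeps hue/level/emphasis in the high bits of R/G/B and tucks two
    timing flags into otherwise-unused low bits of the blue channel.
    Illustration only -- not any emulator's real format.
    """
    r = (hue & 0x0F) << 4
    g = (level & 0x03) << 6
    b = (emphasis & 0x07) << 5 | (odd_field & 1) << 1 | (missing_dot & 1)
    return r, g, b

def unpack_extended_raw(r, g, b):
    """Inverse of pack_extended_raw()."""
    return (r >> 4, g >> 6, b >> 5, (b >> 1) & 1, b & 1)
```

A shader reading this layout would get the field and missing-dot flags for free on every pixel, without any extra channel from the core.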

Is the raw palette an actual palette like the other palettes? Or is it a different mode entirely? Like, does the emulator actually read off the current PPU state to write out the output color value?

It is just a color palette. Shaders currently have to make up the rest of the information from thin air, like with mine which has a manual switch for a 2-frame or 3-frame phase cycle, or the other shaders which only support a 2-frame cycle.

1 Like

I remain unconvinced that this is the case for at least LG panels from 2020 on, so long as the Screen Move/pixel shift mitigations remain enabled, and the refresh cycles are allowed to run every 4-6ish hours of use.

Well not everyone who has a WOLED TV has a 2020+ LG Display OLED panel eh?

I can only speak from my experience with my 2016 LG E6P which had Pixel Shift enabled and I never unplugged my TV so it was able to run all its panel refresh cycles on schedule.

The thing is once I noticed that there was an issue, Clear Panel Noise made no difference whatsoever.

Not sure if it was due to the fine pitch between the more worn and less worn areas.

Plus there’s no way to predict how every user is going to use their TV.