Don’t forget the subpixels, man. That’s where OLED displays fall flat on their faces and regular LCD technology shines.
But like nobody in the mainstream realm of RetroGaming seems to be aware of the existence of these things.
So, like, MiniLED in a decade or so should be the optimal choice, since it has the best of both worlds, right?
No, that doesn’t work at all. The NES’s video signal works too differently from normal for us to be able to do that.
The first problem has to do with how the NES lacks true RGB to begin with. As I described in my previous post, decoding the NES’s video signal into RGB results in both some negative numbers and some excessively high numbers. In order to convert back to YC, you need to have those out-of-bounds RGB values intact. Those NES palettes have everything clamped between 0 and 255, which makes it impossible to get the correct YC. If you use that incorrect YC to simulate the composite signal, like with Blargg’s filters, you get less signal interference than you normally would, and the signal interference only gets reduced for colors that got clamped.
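To make the clamping problem concrete, here’s a tiny sketch (not from any existing shader, and the RGB triplet is made up) showing that once an out-of-range color has been clamped, you can no longer recover the YC values the real signal implies; it uses the standard FCC YIQ weights purely for illustration:

```python
# Minimal sketch: clamping an out-of-range RGB value changes the YIQ you get back.
# The triplet below is hypothetical; the matrix is the standard FCC RGB->YIQ one.
def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

raw = (-0.12, 0.45, 1.31)                            # decoded NES color, outside 0..1
clamped = tuple(min(max(c, 0.0), 1.0) for c in raw)  # what a 0..255 palette stores

print(rgb_to_yiq(*raw))      # the YC the real signal corresponds to
print(rgb_to_yiq(*clamped))  # the YC reconstructed from the clamped palette: different
```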
The second problem is with how the NES’s signal has a specific shape that needs to be emulated, which I also described in my last post. You already know that the NES’s signal is a square wave, but it’s not as simple as modulating C as a square and adding it to Y, since there is no Y or C (or “modulating” or “adding”) in the NES video signal. The active portion of the signal is output by switching between just 7 possible voltage values (or 14 if you count de-emphasis), with exactly 12 possible hue angles that are spaced evenly apart by 30 degrees. It’s not hard to understand why sticking to just those 7 voltage values and 12 evenly-spaced hue angles is important for getting convincing signal artifacts, as this does impact how different colors (including grays) of the NES palette are going to interfere with each other when decoding. A less obvious problem is in games that quickly cycle the colors, such as Ninja Gaiden 1 (when getting a game over) or to some extent the Game Genie title text, where real hardware makes it easy to see the chroma artifacts moving at a consistent speed as the colors are rotated, which happens because the 12 hue angles are spaced perfectly 30 degrees apart. I don’t see a good way to recreate these artifacts convincingly without the raw palette.
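For reference, here’s a rough Python transcription of the per-sample signal generator described on the nesdev wiki’s NTSC video page (a sketch for illustration, not code from any of the shaders mentioned here); the voltage constants are the “terminated” measurements listed there:

```python
# Sketch of the NES per-sample composite output, following
# https://www.nesdev.org/wiki/NTSC_video (terminated voltage measurements).
LOW  = [0.228, 0.312, 0.552, 0.880]   # signal-low voltage per luma level 0..3
HIGH = [0.616, 0.840, 1.100, 1.100]   # signal-high voltage per luma level 0..3
ATTENUATION = 0.746                   # applied when a de-emphasis bit is active

def in_color_phase(color, phase):
    # 12-step phase counter: each hue is high for 6 of 12 steps,
    # which is what spaces the 12 hues exactly 30 degrees apart.
    return (color + phase) % 12 < 6

def ntsc_signal(pixel, phase):
    color    = pixel & 0x0F           # cccc: hue 0..15
    level    = (pixel >> 4) & 0x03    # ll:   luma 0..3
    emphasis = (pixel >> 6) & 0x07    # eee:  de-emphasis bits

    if color > 13:
        level = 1                     # $xE/$xF force level 1
    low, high = LOW[level], HIGH[level]
    if color == 0:
        low = high                    # $x0 stays at the high level
    if color > 12:
        high = low                    # $xD..$xF stay at the low level

    signal = high if in_color_phase(color, phase) else low

    # De-emphasis attenuates the signal during the matching phases.
    if ((emphasis & 1) and in_color_phase(0xC, phase)) or \
       ((emphasis & 2) and in_color_phase(0x4, phase)) or \
       ((emphasis & 4) and in_color_phase(0x8, phase)):
        signal *= ATTENUATION
    return signal
```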
You don’t need a lookup table for this. GTU-famicom does use a LUT for speed, but this isn’t necessary at all.
Unfortunately, I’m fairly sure that’s exactly the one piece of information we don’t get from the raw palette. It wouldn’t hurt to file an issue or pull request for this feature, but that doesn’t exactly solve the bigger problem that we need some way to get more information from the cores.
Looking at them right now, I agree that they’re not that great. All five of them (mine included) are old, and each has its own problems.
I don’t know what you mean by this “missing dot”. Following the documentation at https://www.nesdev.org/wiki/NTSC_video , the raw palette gives enough information to reconstruct the NES’s video signal in its ideal form, excluding impedance-related distortions like the so-called “row skew”, and excluding Battletoads and Battletoads + Double Dragon’s three-frame phase cycle. From there, we can get convincing graphics by digitally decoding that signal and applying a simple color correction for row-skew. The next step, which isn’t in any shaders to my knowledge, would be to make solid enough approximations of the filtering circuits found in the console and in some known CRTs, instead of using these simple symmetrical FIR filters that are based on standards and eyeballing.
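As a rough illustration of that “digital decoding” step (a sketch only; the one-cycle boxcar below is exactly the kind of simple symmetrical filter I’m saying should eventually be replaced with console/CRT-accurate filtering):

```python
import math

SAMPLES_PER_CYCLE = 12  # the NES's native grid: 12 signal samples per chroma cycle

def decode_line(samples, line_phase):
    """Demodulate one scanline of composite samples into (Y, I, Q) per sample."""
    n = len(samples)
    out = []
    for x in range(n):
        y = i = q = 0.0
        # One-chroma-cycle boxcar low-pass, standing in for a proper filter.
        for k in range(-SAMPLES_PER_CYCLE // 2, SAMPLES_PER_CYCLE // 2):
            s = samples[min(max(x + k, 0), n - 1)]
            angle = 2.0 * math.pi * ((x + k + line_phase) % SAMPLES_PER_CYCLE) / SAMPLES_PER_CYCLE
            y += s
            i += s * math.cos(angle)   # demodulate chroma against the subcarrier
            q += s * math.sin(angle)
        out.append((y / SAMPLES_PER_CYCLE,
                    2.0 * i / SAMPLES_PER_CYCLE,
                    2.0 * q / SAMPLES_PER_CYCLE))
    return out
```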
I agree. With the current system, there are so many different workarounds that we’re having to do, when it could all be solved by getting some basic metadata. Even just knowing what core or console is being played would help.
My old NES raw palette shader, which I was just trying a few minutes ago, absolutely reeks of these workarounds, starting with the preset’s file name, which states the specific emulator it’s compatible with. At the time, only Mesen output the de-emphasis portion of the raw palette correctly, while FCEUmm had a glitch in its raw palette. Then, in that long list of settings, there’s a manual switch for the Battletoads games, along with a bunch of settings meant only for other consoles, which control things like how to “detect” different screen resolutions and cropped overscan (which can only be guessed, not known decisively), plus a switch to convert Genesis Plus GX’s colors into BlastEm’s colors. What a mess. All of this information should have been provided by the core instead. The punch line: I did create an issue in FCEUmm about its raw palette, and when it finally got fixed, the fix broke some (if not all) custom palettes for that emulator.
Edit:
Something else that’s related is the complex directory structures and long lists of .slangp files that we have in large packs like with sonkun, CyberLab, Mega Bezel, etc. (Not the fault of the authors, but of the system.) It’s starting to look like the entire shader system needs yet another overhaul, but is it really worth the time and effort?
It’s hard to say where things will be in a decade. I’m hoping that LG Display is going to retire the white subpixel as they improve efficiency with RGB Tandem OLED and reemploy MLA, despite the cost issues of implementing it.
That might be the ultimate for CRT Emulation if it materializes in a few years.
Right now folks are sleeping on the TCL QM851G, QM9K and QM8K for CRT Shaders. I have a TCL QM751G and it totally rocks with CRT Shaders, but all of the ones I’ve listed have way more brightness and dimming zones, and supposedly better black levels and contrast. Not to mention the QM8K and QM9K have higher-precision backlight technology, wide viewing angle tech, faster backlight response and an R-G-B subpixel layout!
Then there are the Bravia 9 and 7, which seem to do more with fewer zones and less peak brightness in the home cinema, sports and TV show sphere.
So things are already at a really nice point technology wise.
Next year we’ll see what happens when RGB miniLED takes the stage.
The worst aspects of current miniLED technology for me, when it comes to CRT emulation, are the blooming that shows up with off-angle viewing and in dark-room use, and the generally poor off-axis viewing that results in colour saturation and gamma shifts.
OLED is amazing for CRT Emulation in a dark room, as long as you can live with the fact that it can’t handle all the RGB Masks and TVLs well, all the way down to the subpixel level.
Scanline gaps do cause uneven wear over time though.
The missing dot is from the scanline with one fewer PPU dot every other frame. But you can’t actually know which frame is odd or even without guessing, right?
Let me put it this way: the NES video output can be represented as a 4096x240 monochrome image, right? If we take that and know the relative time t for each subpixel, we should be able to treat it as the composite signal directly and determine the right phase to demodulate at each subpixel. The raw palette gives us the information needed to reconstruct the monochrome image, but it doesn’t give us the time at each subpixel; we still need to guesstimate that, right? Is there any room, bit-wise, in the raw palette to squeeze in more metadata? Like, you would only need one bit somewhere to flag the current field, another to flag whether the missing dot is present in that field, etc.
There actually is, since the colors in the NES’s raw format are only 9 bits each, while the raw palette expands this into 24-bit color. Staying compatible with existing shader presets might be slightly tricky, but it’s doable, and it should be simple to modify those shaders anyway, since there are only five presets total.
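Just to show how much room there is, here’s a purely hypothetical packing (not the existing raw-palette encoding, and not a proposal for any particular core) that fits the 9-bit color plus a couple of per-frame flags into a 24-bit value:

```python
# Hypothetical packing only -- the real raw palette encodes its 9 bits differently.
def pack(nes_color9, odd_field, dot_skipped):
    # nes_color9: 6-bit palette index + 3 emphasis bits
    return (nes_color9 & 0x1FF) | (int(odd_field) << 9) | (int(dot_skipped) << 10)

def unpack(value24):
    return value24 & 0x1FF, bool(value24 & 0x200), bool(value24 & 0x400)

packed = pack(0x1A4, odd_field=True, dot_skipped=False)
assert unpack(packed) == (0x1A4, True, False)
```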
Is the raw palette an actual palette like the other palettes? Or is it a different mode entirely? Like, does the emulator actually read off the current PPU state to write out the output color value?
It is just a color palette. Shaders currently have to make up the rest of the information from thin air, like with mine which has a manual switch for a 2-frame or 3-frame phase cycle, or the other shaders which only support a 2-frame cycle.
I remain unconvinced that this is the case for at least LG panels from 2020 on, so long as the Screen Move/pixel shift mitigations remain enabled, and the refresh cycles are allowed to run every 4-6ish hours of use.
Well, not everyone who has a WOLED TV has a 2020+ LG Display OLED panel, eh?
I can only speak from my experience with my 2016 LG E6P, which had Pixel Shift enabled; I never unplugged my TV, so it was able to run all its panel refresh cycles on schedule.
The thing is, once I noticed that there was an issue, Clear Panel Noise made no difference whatsoever.
Not sure if it was due to the fine pitch between the more worn and less worn areas.
Plus there’s no way to predict how every user is going to use their TV.
“NES Color Decoder” looks very similar to Composite Direct FBX. I think NES Color Decoder + Raw looks a bit better.
I came to the same conclusion: the NES needs its own composite video shader.
I’ve been doing a crude workaround where I just eyeball settings between an RGB and a composite video preset (guest-advanced and guest-advanced-ntsc) until everything looks equal, and then apply the NES color decoder. Not ideal, obviously.
This translates to around +20% Saturation and +20% NTSC Saturation using guest-advanced-ntsc.