How crappy of a TV do you need for composite video artifacts?

I mean the color encoding itself is pretty straightforward imho, as NTSC has a color standard, as does PAL, if I’m not mistaken.

It’s when you get into how different hardware actually handled that signal that things get murky imo.

How different consoles output that signal, how different CRTs pull it in, and how different CRTs actually process it before the electron gun fires.
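For reference, the NTSC color standard mentioned above encodes RGB into YIQ before modulating chroma onto the subcarrier. A minimal sketch using the commonly quoted coefficients (sources differ slightly in the last decimal places, so treat the exact digits as approximate):

```python
# RGB -> YIQ using the commonly quoted NTSC coefficients.
# (Different references round the I/Q rows slightly differently.)

def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    i = 0.596 * r - 0.274 * g - 0.322 * b   # in-phase chroma
    q = 0.211 * r - 0.523 * g + 0.312 * b   # quadrature chroma
    return y, i, q

# Pure white carries no chroma: rgb_to_yiq(1, 1, 1) -> (1.0, 0.0, 0.0)
```

The murky part, as noted, isn’t this matrix; it’s everything each console and TV does around it.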


So when you say dithering, what exactly are you talking about with regard to a CRT shader? I know what dithering is, btw; just trying to understand what part you think it plays.

I assume he’s referring to the Genesis primarily, which made heavy use of dithering for both “adding” colors and simulating transparencies, over “dirtier” signals.


Yes, but that’s in the original artwork, though. Or is there some piece of hardware in the Mega Drive’s output circuitry doing things on top of the relatively standard NTSC/PAL conversion?

In a very general sense, I would expect a CRT shader with options for RGB and composite input to display dithering differently. There is such a thing as RGB dithering, obviously, but I would expect a shader to be flexible enough to degrade the composite signal in such a way as to allow the blending of pixels, for instance. Maybe this isn’t quite as basic as I think, lol. To me this is like, “this is a good reason you should actually use composite for this content”.
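A minimal sketch of why composite tends to blend dithered pixels (my own illustration, not taken from any particular shader): limited bandwidth in the signal path acts roughly like a horizontal low-pass filter, so an alternating bright/dark column pattern averages toward a mid-tone. The kernel weights here are arbitrary, just enough to show the effect:

```python
# Horizontal low-pass as a stand-in for limited composite bandwidth.
# An alternating [1, 0, 1, 0, ...] dither pattern averages toward 0.5
# once adjacent samples are blended together.

def lowpass(line, kernel=(0.25, 0.5, 0.25)):
    """Blend each sample with its neighbors (edges clamped)."""
    n = len(line)
    out = []
    for i in range(n):
        left = line[max(i - 1, 0)]
        right = line[min(i + 1, n - 1)]
        out.append(kernel[0] * left + kernel[1] * line[i] + kernel[2] * right)
    return out

dithered = [1.0, 0.0] * 8        # alternating full/zero intensity columns
blended = lowpass(dithered)      # interior samples land at exactly 0.5
```

Over clean RGB the same pattern would stay as distinct columns, which is the difference in dithering behavior between the two input options.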


I had proposed separate “modules” for composite output for the different systems, to account for the differences on the output side that enabled the dithering tricks. These effects weren’t due to the TV’s composite input characteristics alone.

I’ve kinda been doing this using custom settings for the Blargg NTSC filter to recreate the output side, while my raw shader preset represents the TV’s input side.


Right, yes, OK, so you’re talking about the dithering in the original artwork that gets ‘noised’ during transmission to the CRT, which is what my first thought was when you mentioned dithering.

Essentially this should happen when using ntsc-adaptive.slangp, no? I haven’t tested the Mega Drive again yet. I’ll give it a spin!

Sorry, I can’t comment on the actual capabilities of ntsc-adaptive; it’s obviously more flexible than the old 256px NTSC etc. shaders that came first (great). My personal project in the near future is to explore the capabilities of MAME-HLSL a bit for the purpose of colour artifacting (i.e., turning something monochrome into colour on a specific platform).

I will, however, leave another link concerning dithering: a video that explains PlayStation dithering in detail, mentioning, for example, how it’s an actual hardware feature:


The N64 did hardware-level blending. At least I think it was hardware.


At 8:40 there’s also talk about the N64; quite a number of different examples are shown, actually. It’s great. (I’m not associated with the channel in any way :laughing:)


N64 and PS1 both have hardware-level dithering, though the PS1’s is just a basic pattern overlay.
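That “basic pattern overlay” can be sketched as ordered dithering: the GPU adds a small offset from a repeating 4×4 table, then truncates each 8-bit channel down to 5 bits. The table below is the one commonly cited in the psx-spx docs; treat the exact values as an assumption here:

```python
# PS1-style ordered dithering (4x4 table per the commonly cited psx-spx
# docs; treat the exact values as an assumption). The GPU adds the offset,
# then truncates each 8-bit channel to 5 bits for the 15-bit framebuffer.

DITHER = [
    [-4,  0, -3,  1],
    [ 2, -2,  3, -1],
    [-3,  1, -4,  0],
    [ 3, -1,  2, -2],
]

def dither_channel(value, x, y):
    """Apply the positional dither offset, clamp, and truncate 8-bit -> 5-bit."""
    v = max(0, min(255, value + DITHER[y % 4][x % 4]))
    return v >> 3  # keep the top 5 bits

# A flat 8-bit gray of 98 alternates between 5-bit 11 and 12 across a row,
# hiding the banding that plain truncation would produce.
row = [dither_channel(98, x, 0) for x in range(4)]  # -> [11, 12, 11, 12]
```

Which is exactly the kind of thing a core emulates, rather than a CRT shader.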

Re: NTSC simulation in general, you can isolate the NTSC passes from MAME-HLSL, and while I don’t think they generally look as good as ntsc-adaptive (which is just the ntsc-256/320-px shaders mooshed together with some logic to choose which one to use), they do expose a lot more signal values for tweaking. See dannyld’s mega drive rainbow thread for more details.
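To make “signal values” a bit more concrete, here’s a toy one-scanline composite encode (my own simplification; the sampling ratio and names are illustrative assumptions, not MAME-HLSL’s actual parameters). Luma and modulated chroma share one waveform, which is why the decoder’s filter settings matter so much:

```python
import math

# Toy one-scanline NTSC composite encode: luma plus chroma modulated onto
# the color subcarrier. SAMPLES_PER_CYCLE is an assumed sampling ratio
# relative to the ~3.58 MHz subcarrier, not a value from any real shader.

SAMPLES_PER_CYCLE = 4

def encode(y, i, q):
    """y, i, q: per-sample lists of equal length. Returns the composite waveform."""
    out = []
    for n in range(len(y)):
        phase = 2 * math.pi * n / SAMPLES_PER_CYCLE
        out.append(y[n] + i[n] * math.cos(phase) + q[n] * math.sin(phase))
    return out

# Crosstalk in a nutshell: a notch/comb filter has to split this sum back
# apart, so sharp luma detail near the subcarrier frequency decodes as
# color (rainbowing), and chroma leaks into luma as dot crawl.
```

The extra values those MAME-HLSL passes expose are essentially knobs on this modulation and on the filters that undo it.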


I like the idea of different modules for different systems. The TurboGrafx-16, for example, was known for having a very clean composite output, while the Genesis’s was terrible.

To do this we would need to get all the systems to one person so they can be connected to the same display, in order to eliminate the display as a variable. Or just get the raw captures from different people, but that seems harder?


So most of the above, as far as I can tell, is stuff that should be emulated by whatever core we’re using at the time; it falls outside the domain of a CRT shader. Certainly PS1 and N64 dithering fall into that camp. What we’re really interested in is the black box that is the output circuitry. I’m not terribly certain a whole lot of progress has been made in this area of emulation. There was a video on the SNES output circuitry I watched a couple of years back that talked about it; I’ll try and find it.


It was actually an article I was remembering, and it sounds like they actually did emulate the things I was talking about.


NTSC aka Never The Same Color :smiley:


It’d be really great if we knew what the PPUs were doing. Is there a big difference in gamma between the SNES and NES? There are no settings that look right for both systems; it seems like a 0.20–0.30 difference in output gamma is required. I think I recall reading somewhere that the SNES had a very saturated output, so it kinda makes sense that a higher output gamma would be needed, if true.
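To show what a ~0.25 output-gamma difference means in practice, here’s a quick sketch (the 2.2 and 2.45 values are just example settings, not measured numbers for either console):

```python
# CRT-style output gamma: normalized signal raised to the display gamma.
# The two gamma values below are arbitrary examples ~0.25 apart, chosen
# only to illustrate the size of the mid-tone shift being discussed.

def apply_gamma(signal, gamma):
    """Map a 0..1 signal through a power-law display gamma."""
    return signal ** gamma

mid = 0.5
a = apply_gamma(mid, 2.2)    # ~0.218
b = apply_gamma(mid, 2.45)   # ~0.183
```

The higher value darkens mid-tones noticeably, which also reads as richer, more saturated color, consistent with the idea that a punchier SNES output would want a higher setting.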


FYI, one of the NES presets uses 1.80 gamma. I wonder if, other than using a PowerPC Macintosh, there is any reason for this :thinking:

\presets\nes-color-decoder+colorimetry+pixellate

I feel like we haven’t paid enough attention to NES color palettes on this forum. Then again it is a can of worms!


Is this PAL or NTSC?

This is from a PAL Wii. Note, though, that I didn’t turn the sharpness control all the way down for these photos; that would make a difference for composite, though not to the extent that it would suddenly mimic Genesis/Mega Drive output.

Wasn’t aware of this until I stumbled across it recently: at least some Sony WEGAs have adjustment settings for their comb and notch filters in service mode.

I also read that on some sets you can even disable interlacing, though probably not on the WEGAs.