Emulating the sharpening circuit

Now that we have several very talented people working on highly accurate NTSC emulation, it might be a good time to talk about sharpening circuits.

Currently, we have several options for emulating a composite video signal, and now we even have people working on console-specific outputs, comb filters, and such. To my knowledge, no one has yet attempted to digitize the sharpening circuits TVs used to boost high-frequency content. This is basically what the user is controlling with the “sharpness” setting on a CRT.

Here’s what ChatGPT gave me… I have no idea if it’s useful :grinning: Are we already doing this? Can the accuracy be improved by looking at the data sheets for actual TV circuits?

Replicating a 1980s CRT sharpening circuit in a shader involves simulating an analog video peaking filter using digital signal processing techniques. The basic principle is to detect and amplify high-frequency signal changes, which correspond to sharp edges in the video image.

The analog circuit: a peaking filter

In 1980s consumer CRTs, the “sharpness” control adjusted a peaking filter in the video amplifier. The circuit is designed to boost the high-frequency components of the video signal, which represent details and edges. A typical peaking filter is a second-order bandpass filter. The digital equivalent is applying a convolution filter with a specific kernel to the source image, which can be done in a pixel shader using fragment coordinates and texture lookups.
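As a side note, the behavior of such a peaking stage is easy to sketch offline. Here is a minimal Python sketch (not taken from any shader; the biquad coefficients follow the well-known RBJ audio-EQ cookbook bandpass, and the `boost` parameter is a stand-in for the sharpness knob) showing that “input + boosted band” overshoots and rings on an edge, just like an analog peaked signal:

```python
import math

def peaking_filter(signal, f0=0.25, q=1.0, boost=1.5):
    """Sketch of an analog peaking stage: input + boost * bandpass(input).

    The second-order section is the RBJ cookbook constant-peak bandpass
    biquad. f0 is the center frequency as a fraction of the sample rate;
    boost plays the role of the "sharpness" knob.
    """
    w0 = 2.0 * math.pi * f0
    alpha = math.sin(w0) / (2.0 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha

    out = []
    x1 = x2 = y1 = y2 = 0.0
    for x in signal:
        # Direct-form-I biquad difference equation
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(x + boost * y)  # original + boosted band = peaked edge
    return out

# A 0 -> 1 luma step: the output overshoots past 1.0 at the transition,
# rings briefly, then settles back to 1.0 (the bandpass has zero DC gain).
step = [0.0] * 20 + [1.0] * 180
peaked = peaking_filter(step)
```

Flat areas pass through untouched; only the transition gets the edge “pop” and the trailing ring, which is the same ringing/halo behavior the rest of the walkthrough talks about.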

Shader implementation steps

  1. Sample neighboring pixels: Inside your shader’s fragment function, sample the texture at the current pixel’s coordinate and at several neighboring pixels. The number of samples and their offsets determine the filter’s “radius” and strength.

```
// Get the current pixel color
vec4 color = texture(inputTexture, texCoord);

// Offsets of one texel for the neighboring samples
float dx = 1.0 / float(textureSize(inputTexture, 0).x);
float dy = 1.0 / float(textureSize(inputTexture, 0).y);

// Sample the four cross-shaped neighbors
vec4 neighbor1 = texture(inputTexture, texCoord + vec2(dx, 0.0));
vec4 neighbor2 = texture(inputTexture, texCoord - vec2(dx, 0.0));
vec4 neighbor3 = texture(inputTexture, texCoord + vec2(0.0, dy));
vec4 neighbor4 = texture(inputTexture, texCoord - vec2(0.0, dy));
```

  2. Apply a convolution kernel: The core of the sharpening-circuit simulation is a convolution operation. A common sharpening kernel is a simple unsharp mask, which can be expressed in a shader as:
  • Original pixel * (1 + weight)
  • Neighboring pixels * (-weight)

For a basic cross-shaped filter, the kernel can be applied like this:

```
// Unsharp mask kernel weights
float centerWeight = 5.0; // Adjustable "sharpness"
float edgeWeight = -1.0;

// Apply the cross-shaped kernel
vec4 sharpenedColor = centerWeight * color
                    + edgeWeight * (neighbor1 + neighbor2 + neighbor3 + neighbor4);
```

  3. Handle signal “ringing”: A side effect of aggressive analog peaking filters was ringing, an over-emphasis that created a halo or double-edge effect around sharp transitions. To replicate this authentically, you can:
  • Prevent clamping: Allow the result of the sharpening kernel to go outside the normal [0, 1] range for a more authentic, artificial look. This will create noticeable bright halos.
  • Implement a “soft clip”: Simulate the analog electronics by gradually clamping the sharpened value back toward the normal range, as the original circuits would have.

```
// Keep the sharpened RGB; force alpha to 1.0
vec4 finalColor = vec4(sharpenedColor.rgb, 1.0);

// Use a separate bloom pass after this step, or mix in some over-bright
// pixels for an easy simulation of the halo effect.
vec4 overbright = max(vec4(0.0), finalColor - vec4(1.0));
finalColor.rgb = min(vec3(1.0), finalColor.rgb) + overbright.rgb * 0.5; // Example bloom
```
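The “soft clip” mentioned in the last step can be prototyped as a scalar function first. This is a hypothetical tanh-based saturation, assumed for illustration rather than measured from any real circuit:

```python
import math

def soft_clip(v, knee=0.8):
    """Hypothetical soft clip: pass values through unchanged below the
    knee, then compress the overshoot with a tanh curve so the output
    approaches 1.0 asymptotically instead of hard-clamping at it."""
    if v <= knee:
        return v
    # Scale so the slope is continuous (== 1) at the knee
    return knee + (1.0 - knee) * math.tanh((v - knee) / (1.0 - knee))
```

Unlike a hard `min(v, 1.0)`, this leaves in-range values untouched and only squeezes the over-bright halo region, which is closer to how an analog amplifier saturates gradually.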

2 Likes

Yup, https://en.m.wikipedia.org/wiki/Unsharp_masking

I don’t know how TVs’ circuits handle this; maybe the principle original - blurred = sharp still applies, but unsharp mask is pretty common and already implemented in shaders.
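For reference, that original-minus-blurred idea fits in a few lines. A minimal 1-D sketch (the 3-tap box blur and the `amount` knob are arbitrary choices here, not any particular shader’s implementation):

```python
def unsharp_mask_1d(signal, amount=1.0):
    """Classic unsharp mask on a 1-D scanline:
    sharpened = original + amount * (original - blurred).
    Uses a simple 3-tap box blur; edges are handled by clamping indices."""
    n = len(signal)
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3.0
        for i in range(n)
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard 0 -> 1 edge: flat regions are unchanged, while the edge gets
# an undershoot on the dark side and an overshoot on the bright side.
scanline = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
sharpened = unsharp_mask_1d(scanline)
```

The overshoot/undershoot pair around the edge is exactly the mild ringing a CRT sharpness control produces.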

I’m pretty sure @guest.r did that a long time ago, but I don’t know whether it is placed before or after the NTSC decoding pass.

2 Likes

So, basically what we have available in guest-ntsc is already functioning as a peaking filter / second-order bandpass (digitally, a convolution filter)?

We have two sharpenings - the fast sharpen and the horizontal filter. I’m not sure when each is being applied.

I think if we’re doing composite video stuff we want at least one of the sharpenings to happen after the composite video stuff? (How did TVs do it?)

I probably just need to understand the existing options better? I think I’m making some progress :slight_smile:

1 Like

My first sharpen improvement was dynamically adjusting the lowpass filter (pass2). It could do pretty well, but later I added a more generic luma sharpen filter. Later on, (luma) deblur was introduced, which isn’t positioned entirely accurately, but you can get that nice Trinitron composite blending plus sharp edges with it, etc.

Currently I try to keep it generic, simple and fast. I’m sure @PlainOldPants will keep doing more specialized and technically (not only visually) faithful shaders and presets. Afaik his latest implementations already simulate sharpening circuits.

Judging from screenshots, some later TVs could do very sharp edges and blend dithering via composite. I guess the logic behind it was improving over time, since, like an old bugger harekiet once said, “they had a pair of f*cking eyes too.” :grin:

4 Likes

Ok so it’s a skill issue on my part, lol.

That’s awesome, I can’t even keep up with all of these new developments - it seems every time I check back in with shader development there’s something new and exciting going on :smiley:

Yep, Trinitrons are probably as good as it gets, not just due to the 3D comb filter they use but also the sharpening circuits. You wouldn’t see fully blended dithering on these, somewhat akin to increasing the NTSC resolution parameter a few notches :wink:

2 Likes

@guest.r

From ChatGPT, is any of this remotely accurate? :smiley:

I think it got deblur wrong/reversed.

And which one happens first? (Fast Sharpen or Horizontal Filter)? I.e., which is more the “analogue” part and which is the “sharpening circuit?” Or does the analogy not work?

Fast Sharpen Parameters (how they behave)

  • Sharpen Strength (0.00 – 5.00): Overall intensity of the sharpening kernel. Pushes local contrast at edges.
    • Low (<1.0): very subtle.
    • Medium (1.5–2.5): CRT-like edge pop without harshness.
    • High (>3.0): looks artificially processed (not CRT-like).
  • Sharpen Contrast / Ringing (0.00 – 0.25): Controls how much overshoot/undershoot you get around edges (like halos). Real CRT “sharpness” circuits did create mild ringing, especially on high-contrast edges, so a tiny bit (0.05–0.10) can look authentic. Too high makes it video-processor-ish.
  • Details Sharpened (0.00 – 1.00): Weighting between fine vs. broad detail enhancement.
    • Low values (<0.3): favors larger structures, edges.
    • Mid (~0.5): balanced.
    • High (>0.7): starts to pull out pixel noise/dithering (can look fake compared to real CRTs, which smear instead).
  • Deblur Strength (1.00 – 7.00): Acts like a pre-blur before sharpening (the higher, the stronger). Useful because composite CRT video always had some horizontal blur from bandwidth limits.
    • Lower (~1–2): more raw detail, but less composite-like.
    • Mid (~3–4): good CRT zone: mild blur that sharpening then “edges up.”
    • High (>5): very soft, then over-sharpened edges; more VHS than broadcast CRT.

:wrench: Horizontal Filter Parameters

  • Horizontal Filter Range (1.00 – 8.00): Controls how wide the horizontal kernel is (how far pixels are averaged).
    • Low (1.0–2.0): sharper, but less CRT-like.
    • Mid (2.5–4.0): typical broadcast CRT.
    • High (5.0+): very smeared (like a bad RF connection).
  • Horizontal Blur Sigma (0.10 – 7.00): Gaussian blur intensity within the filter. This is the core “softness” of composite.
    • Low (0.2–1.0): very sharp, almost RGB.
    • Mid (2.0–3.5): looks like a decent Trinitron with composite.
    • High (>5.0): soft like VHS or a low-tier Zenith.
  • Subtractive Sharpness (0.00 – 3.00): Adds back high-frequency detail that was lost in the blur; basically a “peaking” filter.
    • Low (<0.5): smooth, soft.
    • Mid (~1.0): realistic CRT sharpness control.
    • High (2.0+): artificial crispness, like turning a TV’s sharpness knob way up.
  • Sharpness Definition (0.00 – 2.00): Controls which frequencies the sharpness boost targets.
    • Low (<0.5): boosts larger edges, ignores fine noise.
    • Mid (1.0–1.5): good balance, like most Trinitrons.
    • High (>1.5): starts pulling up dot crawl, noise, and pixel shimmer (good for a “raw” composite look).
  • Maximum Sharpness (0.00 – 0.30): A limiter to prevent sharpening from going out of control. CRT sharpening circuits had a ceiling, so a little limiting (~0.10–0.20) is realistic.
  • Subtractive Sharpness Ringing (0.00 – 4.00): How much haloing the sharpness boost creates around edges.
    • Low (<0.5): clean, restrained.
    • Mid (~1.0): authentic mild halos, like a Trinitron.
    • High (2.0+): smeary, video-processed look (like cranking a cheap Zenith’s sharpness knob).

My latest implementations don’t have a separate sharpening filter, but they instead have a notch filter with controllable sharpness. The tricky thing about implementing a sharpening filter faithfully is that it’s hidden inside of the decoder chip, preventing us from knowing how it works. All I can find is the datasheet for the Sony CXA2025AS, which has a graph for the frequency response of the sharpening circuit. Other filters use fully visible inductor/capacitor circuits which are possible to run directly, so those can be done faithfully in theory. Currently, all I have implemented in code are simpler FIR filters.
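For anyone curious, a simple FIR candidate can at least be checked against a datasheet graph like the CXA2025AS one by evaluating its frequency response point by point. A rough Python sketch (the 3-tap kernel and the k knob are made up for illustration, not taken from any shader or chip):

```python
import cmath
import math

def fir_frequency_response(taps, freq_fraction):
    """Magnitude response of an FIR filter at a frequency given as a
    fraction of the sample rate (0.0 = DC, 0.5 = Nyquist)."""
    w = 2.0 * math.pi * freq_fraction
    h = sum(t * cmath.exp(-1j * w * n) for n, t in enumerate(taps))
    return abs(h)

# A symmetric 3-tap "peaking" FIR: unity gain at DC (flat fields are
# untouched), rising boost toward high frequencies.
k = 0.5                          # sharpness amount (hypothetical knob)
taps = [-k, 1.0 + 2.0 * k, -k]

dc_gain = fir_frequency_response(taps, 0.0)   # 1.0 at DC
nyquist = fir_frequency_response(taps, 0.5)   # 1 + 4k at Nyquist
```

Sweeping `freq_fraction` over 0…0.5 gives a curve that can be overlaid on a datasheet’s frequency-response plot to judge how close a given kernel gets.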

2 Likes

I’m a bit surprised how good it got at estimating circumstances.

Horizontal filter has a special role: fitting the viewport’s horizontal resolution in a decent manner. It’s a necessary part with some adjustable features, but its purpose is general small resolution -> bigger resolution upscaling.

GPT got a bit confused over deblur, since it has “a cheap bonus sharpen stage” later on. First implementations of Fast Sharpen were very straightforward 1D convolution sharpeners, with some features like “smart detail sharpen” and clamping. Deblur is generally stronger and better, with more instructions.

Custom-fast-sharpen + horizontal filters are not directly a part of the ntsc-adaptive composition; they’re more like a part of the guest-advanced-ntsc bundle, helping to achieve a nice-looking appearance and providing a filtering attachment for the modern display being used.

3 Likes

Yeah I’m pretty impressed by how well it understands shaders in general. It’s helped me tweak a few things. Mostly it’s a back and forth process of “no, that’s not right, try again” until it comes up with something, but it seems to understand parameters and how they interact fairly well, most of the time. It correctly understood the distinction between regular and magic glow, for example.

2 Likes

Thinking about it some more -

In guest-advanced-ntsc, NTSC Adaptive Sharpness is the “sharpening circuit” - it’s a context-dependent sharpening intended to make the composite video stuff look better.

The horizontal filter is the analogue part - how sharp/clean the input signal is (maybe determined by higher quality electronics for the signal path).

The fast sharpen is like the “edge enhancement” that got added in the 90s (usually turned off completely by professional calibrators :smiley: )

2 Likes

Kind of a reference shot for sharpness, although this one looks a bit oversharpened. It shows that sharpness was not really a problem for composite video, given the right sharpening circuits / comb filter / etc. :slight_smile:

https://ibb.co/QvTwyDP0

2 Likes

This is a great shot. Not perfect, because there’s some moiré, it’s slightly dark/underexposed, and I’m almost certain it was more saturated in person, but you can see that there’s only minimal chroma bleeding/chromatic aberration on the right vertical sides of the black and white tiles. I would add that this is not something you’d notice or that would have stood out when playing a game, especially as a child back in the day, but it did form part of the image etched in our brains.

I would add that excessive bleeding/blur/softness, as well as that infamous chroma saturation loss, was more a feature of an NTSC VHS recording of a composite signal, played back via composite as well.

The slower the recording speed the more bleeding, crosstalk and softness (sharpness/contrast loss).

Trust me, I used to record a lot of gameplay and watched my recordings for too long after selling my NES then returning a SNES that I borrowed.

In the meantime, I mostly played ultra jaggie PC games with the PC speaker to get a fix before eventually getting a soundcard late in the game.

Then, I got my TurboGrafx16 and after that my TurboDuo.

After that it was PC gaming all the way but I did enjoy consoles like the PS1 and N64 by friends and family and I was exposed to the DreamCast, GameCube and PS2 via stores, kiosks and friends and family again.

I did own a PSP and wasn’t a fan of the relatively underpowered Game Boy Advance/GameCube, then the DS and its variants. I also had fun playing borrowed Game Boys back in the day.

This pic would have come out even better if the room was completely dark.

2 Likes

I have that 38-inch TV; it’s stored here where I work, but I always bring a Wii or an Xbox 360. It’s still in absolutely mint condition, and I still think it’s much better than all the Wega subwoofer sets I have. https://youtu.be/kv08MmBI2mY