The Scanline Classic Shader

This is the official support thread for my Scanline Classic shader. You can find the latest version, along with instructions, here:

Right now only slang is supported, but I plan to port a GLSL version for standalone emulators and a BGFX version for MAME. There is also only one preset. I plan to make presets for a variety of systems and use cases.


  • Color correction and transformations: the shader supports three color modes. The first two are monochrome modes: you can specify either one or two color primaries, which determine the tone of the grayscale. This can be used to simulate a variety of monochrome monitors. The third is a full color mode. Any three-primary color space can be simulated by providing the appropriate chromaticity values and weights. If you need help determining the values for a particular look, feel free to ask.

  • Phosphor decay simulation: the shader simulates phosphor decay in XYZ space, which has the advantage of allowing some very cool effects when used with the two-primary monochrome mode. Other shaders do their decay in RGB space, so these effects wouldn’t be possible.

  • Scanline and interlacing simulation: flexible scanlines can be added. The scanlines are always solid, but the ratio of active scanline width to blank scanline width can be adjusted. Line-doubling can also be simulated by setting ON_PIXELS to 2 and OFF_PIXELS to 0. Interlacing is supported by setting an appropriate HI_RES_THRES value, which allows automatic switching between progressive and interlaced modes for systems like the PlayStation.

  • Geometry simulation: there are settings for pincushioning and barreling the image, and various tweaks make a variety of effects possible. These effects account for the display aspect ratio, something many shaders ignore. The overall output can be scaled to a desired picture fill.

  • Shadow mask simulation: Several of the usual shadow mask patterns are available.
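To illustrate how the ON_PIXELS/OFF_PIXELS ratio shapes the scanline pattern, here is a minimal sketch in Python. This is not the shader code itself; the function name and the simple modulo pattern are my own assumptions about one way the behavior described above can be modeled:

```python
def scanline_mask(y, on_pixels=3, off_pixels=1):
    """Return 1.0 for output rows inside the lit part of a scanline,
    0.0 for rows in the blank gap.

    The pattern repeats every (on_pixels + off_pixels) output rows, so
    ON_PIXELS=2, OFF_PIXELS=0 yields a solid picture (line doubling),
    while ON_PIXELS=1, OFF_PIXELS=1 gives classic alternating scanlines.
    """
    period = on_pixels + off_pixels
    return 1.0 if (y % period) < on_pixels else 0.0

# A 1:1 on/off ratio lights every other output row:
pattern = [scanline_mask(y, 1, 1) for y in range(4)]  # [1.0, 0.0, 1.0, 0.0]
```

In the real shader this mask would multiply the pixel color per output row; the point here is just how the two parameters define the duty cycle.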

Here are screenshots showing what this shader can do:


Updates on what I’ve been working on

I want a shader that can mimic the functionality of Multi-Scan CRTs. It’s a myth that CRTs have no native resolution or can handle any display mode thrown at them; otherwise we would never have needed circuitry to handle NTSC signals on PAL televisions! While CRTs are a lot more flexible than LCDs in how they present a signal, they are in many ways far less flexible and scalable than LCDs, because they usually only operate at a fixed horizontal frequency. For decades that was about 15 kHz, but that started to change in the 16-bit computing era.

One of the pioneers of faster-scanning displays was Atari, who used ‘medium res’ monitors in some of their games. The idea behind medium res is simple: take the line count of a PAL monitor and combine it with the refresh rate of an NTSC monitor. Voila, you have a monitor that can display more lines without the flicker.

If you see a computer system or arcade game that runs at around 384 lines, you know it’s using a medium res display. The 68k Macs and IBM’s EGA monitor used these resolutions.

Multitasking started to become more important in the mid-80s. 240- and 384-line displays were adequate for single-task workflows, but felt cramped in multi-tasking environments. The easiest way to experience this is to emulate something like a Mac SE/30 and explore the file system with the Finder. Your screen will fill up with windows very quickly. 480-line displays were sorely needed. The Japanese had already solved this problem with 480i modes (because their language requires more pixel space to render clearly), but this was improved upon by doubling the 15 kHz scan rate to 31 kHz, and thus 480p monitors came about.

But what if we want to display a 15 kHz signal on a 31 kHz monitor? Now the issue of driving multiple signal types with one CRT really needed to be solved. It couldn’t simply be ignored, because we needed backwards compatibility with old software designed for ~240-line environments. Video cards also supported more colors at the lower resolutions, which was critical for games. The Japanese (again) had already solved this problem: when a 31 kHz monitor encounters a 15 kHz signal, it doubles the lines and displays the signal with the vertical resolution essentially upscaled with nearest-neighbor (NN) scaling. But these monitors were quite expensive. IBM came up with the solution of doing the line-doubling in the video card, so any 31 kHz monitor could be used with ‘15 kHz’ signals.
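The line-doubling itself is conceptually trivial. Treating a frame as just a list of rows, a sketch of the nearest-neighbor upscale might look like this (the function name is mine, purely for illustration):

```python
def line_double(frame):
    """Repeat every source row once, turning a ~240-line (15 kHz) frame
    into a ~480-line (31 kHz) frame via nearest-neighbor scaling."""
    return [row for row in frame for _ in range(2)]

frame = ["row0", "row1", "row2"]
# line_double(frame) -> ["row0", "row0", "row1", "row1", "row2", "row2"]
```

Whether this happens in the monitor (the Japanese approach) or in the video card (IBM’s approach), the resulting picture is the same.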

Finally, in the 90s, Multi-Scan capable monitors started to become more common. A CRT monitor would be advertised as supporting a maximum resolution, and would have built-in circuitry to handle most of the common modes. The CRT could theoretically handle a much larger variety of modes, as long as they fell within its horizontal frequency range. Over time, the minimum frequency increased until it settled at 31 kHz, the VGA frequency, because VGA compatibility was so critical.

Why does any of this matter for shaders? It doesn’t matter at all for consoles (at least not until we get into HD consoles, at which point CRTs were so rare it’s perhaps not meaningful to have CRT shaders for them). It only matters for computer emulation, and only if we care about the artifacts of a multi-scan monitor. There are three primary complications with multi-scanning: 1. scanlines become more visible at lower resolutions; 2. the overall brightness of the screen diminishes at lower resolutions (a consequence of 1); and 3. the geometry and positioning of the picture change depending on the resolution chosen.

1 and 2 are easy enough to implement. We just have to decide what the ideal resolution of our simulated monitor is; at that resolution there would be no visible scanlines. When the emulator switches to a mode with a lower resolution, we add scanlines, and adding the scanlines automatically covers 2. 3 is complicated, and every monitor would handle it differently. It also feeds back into 1, because a monitor wouldn’t necessarily cover the same picture area at a lower resolution, so scanlines might not scale linearly in that respect. Another complication is aspect ratio: most monitors would just spit everything out in whatever aspect ratio they were designed for, so a 4:3 monitor would distort a 5:4 resolution and vice versa. Late professional monitors (very expensive) could actually handle different aspect ratios and display them correctly.

But 3 is really a non-issue when you consider that by the time Multi-Scan monitors were common, geometry controls on monitors were also common, and the user would usually adjust the picture to fill the screen to a comfortable size (typically leaving a bit of space to avoid too much curvature). So it is really just 1 that I’m going to focus on.
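To make the idea concrete, here is one way the mode-dependent scanline logic could be sketched. This is my own simplification, not the shader’s actual implementation, and the names are hypothetical:

```python
def scanline_fraction(content_lines, native_lines=768):
    """Fraction of each scanline period left dark, for a simulated
    monitor whose ideal mode (native_lines) shows no visible scanlines.

    At the native mode the beam lines abut and the fraction is 0; a
    240-line mode on a 768-line monitor leaves most of each period
    unlit, which also accounts for the brightness drop (complication 2).
    """
    rows_per_line = native_lines / content_lines
    # One lit beam line per source line; the rest of the period is dark.
    return max(0.0, 1.0 - 1.0 / rows_per_line)
```

With these assumed numbers, a 240-line mode would leave roughly two thirds of each period dark, while a 480-line mode would leave half: exactly the "more scanlines at lower resolutions" behavior described above.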


FYI, the Multi-Scan feature has been added and the shader is now at version 4.1. At this point I feel like the shader has all the features it needs, but I’ll play with it for a few months to see if I can think of anything else to add.

As a bonus here’s a screenshot of DOS in 80x25 mode on a simulated 1024x768 monitor:


Hey there anikom15,

I read your comment about contrast simulation of a CRT in this other thread ( ). That’s definitely an interesting topic. Since you seem to value gamma quality, I thought you might find the following notes on contrast simulation useful.

For simulating the analog CRT “brightness” (black level in analog terms) and “contrast” (brightness in analog terms) of old monitors and TVs, implementing BT.1886 gamma control seems to do exactly what you want regarding analog contrast, without faking it. Note that Appendix 1 and 2 of the official spec document are what you would want to look into, not the vanilla implementation.

I’ve written a bit about it in the past in another thread, and separately Dogway has recently implemented it in his grade - update 2023.slang shader.

My post on it is here: SpectraCal’s Director of Software Development, Joel Barsotti, argues that the BT.1886 gamma function (EOTF) matches legacy CRT technology more accurately than previous power-law based functions. The argument is that it properly takes into account the legacy “contrast” and “brightness” controls on CRTs.

SpectraCal FAQ on BT.1886 here: BT.1886 – 10 Questions, 10 Answers

The relevant Rec. ITU-R BT.1886 document is here: Recommendation ITU-R BT.1886. Be sure to read Appendix 1 and Appendix 2, as those are what you would want to play around with.
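For reference, the base BT.1886 EOTF can be sketched in a few lines (a simplification, using the parameterization from the recommendation’s main body; the appendices build the more detailed brightness/contrast behavior on top of this). Lw and Lb are the display’s white and black luminance in cd/m²:

```python
def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    """Rec. ITU-R BT.1886 EOTF: map a normalized video signal v in [0, 1]
    to screen luminance in cd/m^2, given white (lw) and black (lb) levels.

    Raising lb lifts the black level, which is roughly what the analog
    'brightness' control did on a CRT; the gain term a plays the role of
    'contrast'."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma          # gain
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))  # black lift
    return a * max(v + b, 0.0) ** gamma

# With lb = 0 this reduces to a pure 2.4 power law:
# bt1886_eotf(1.0) -> 100.0
```

The key property is that a nonzero Lb changes the shape of the curve near black rather than just offsetting it, which is why it models analog contrast without faking it.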

Not that you need it, but if you want a quick look at the effect of implementing these analog controls through the BT.1886 EOTF, you could try Dogway’s GRADE shader. Here’s his recent implementation forum post: I updated Grade with what I’ve learned in the last 2 years. And his GitHub with his latest RC6 release of the Grade shader: grade - update 2023 RC6.slang


FYI, a bunch of monochrome presets are now available in the repo.

@rafan, thanks for pointing this out. I’m not sure if I want to add this complexity at this point, but I think it will end up being something that gets added to the advanced shader later down the road.


Well, porting this to MAME is on hold because I can’t seem to compile bgfx-tools, and I haven’t been able to get in touch with the maintainer. I probably didn’t set up my environment correctly.

I think I will add another parameter, which will break compatibility again: a MIN_SCAN_RATE parameter to finely control the line doubler seems justified.

An update! But nothing new to try, just plans for the future. Life moves on, and unfortunately I have a full-time job and a lifestyle to keep up. But I digress.

After almost a year of using this shader, I am very happy with it. I never got it ported to MAME’s BGFX driver, but I did create a single pass version that can work as a standalone GLSL shader. I’ll provide that as an update sometime soon. This can be used in many emulators that Retroarch doesn’t support, like 86Box.

The big new thing I’m going to work on this year, though, is support for colorspaces other than sRGB! My TV supports 10- and 12-bit input, and I just got a video card that can drive it. My TV can also accept input signals for DCI, Adobe RGB, and BT.2020 (we’ll see how much coverage of each color space the TV actually has). Here’s a quick overview of what all of that means:

Adobe RGB is the oldest and was originally used by Photoshop to provide a bigger colorspace for photographers to edit their work in. Adobe RGB is mainly supported in expensive professional monitors. It has largely been supplanted by other spaces.

DCI is the color space used by digital cinema equipment. It is famously used by Apple with some slight variations in their iPhone (and maybe some Mac OS X support?). This could be an important space to support for Retroarch on mobile.

BT.2020 is the largest and most recent of the color spaces. It’s really meant to cover as much of our visible color spectrum as possible in order to display content made in either sRGB, DCI, or something else. This will be the one I work on first as this is the current standard for UHD and HDR content.
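Since the shader already takes chromaticity values for its color mode, supporting a new target space mostly comes down to deriving its RGB-to-XYZ matrix from the published primaries and white point. Here is a pure-Python sketch of that standard derivation (the helper names are mine; the shader’s actual internals may differ):

```python
def rgb_to_xyz_matrix(r, g, b, white):
    """Build a 3x3 RGB->XYZ matrix from the (x, y) chromaticities of the
    primaries and white point: scale each primary's XYZ column so that
    RGB = (1, 1, 1) maps to the white point with Y = 1."""
    def xyz(c):  # chromaticity (x, y) -> XYZ column with Y = 1
        x, y = c
        return [x / y, 1.0, (1.0 - x - y) / y]

    cols = [xyz(r), xyz(g), xyz(b)]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    w = xyz(white)

    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

    # Solve m @ s = w for the per-primary scales s via Cramer's rule.
    d = det(m)
    s = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = w[i]
        s.append(det(mj) / d)
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

# Sanity check: sRGB primaries with D65 white reproduce the familiar
# matrix (first row ~ [0.4124, 0.3576, 0.1805]).
srgb = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                         (0.3127, 0.3290))
```

Feeding in the BT.2020 or DCI primaries instead gives the matrix for that space, and converting between two spaces is then just one matrix times the inverse of the other.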

10- and 12-bit color is also important. sRGB uses 8-bit color, so if you try to use a larger gamut with just 8 bits you’ll get banding. How visible that is is a matter of debate, but 10-bit and 12-bit color provide enough dynamic range to present any of the above color spaces without banding. My display and video card both support these modes, and I will test both bit depths.
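The banding argument comes down to step size: stretching a wider gamut over the same number of code values makes each step cover more perceptual ground. A quick sketch of the raw numbers (a simplification that ignores the transfer function):

```python
def code_values(bits):
    """Number of distinct code values per channel at a given bit depth."""
    return 2 ** bits

def step_size(bits):
    """Smallest normalized step between adjacent code values."""
    return 1.0 / (code_values(bits) - 1)

# 10-bit has 4x the code values of 8-bit, so a wider gamut is spread
# over proportionally finer steps:
# code_values(8) -> 256, code_values(10) -> 1024
```

So each extra pair of bits quarters the step size, which is why 10- and 12-bit modes can carry BT.2020-sized gamuts without visible banding.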

My TV also supports automatically switching colorspaces depending on the content provided, but I’m not sure how that will work with Windows and Retroarch. For now, I will have to manually set the colorspace through the shader and on the TV so they match. The big question is whether Retroarch will support a 10-bit or 12-bit framebuffer. If it doesn’t, I’ll have to modify it to work. This might be tricky to test!