Interlacing shader w/ PAL PSX games


I can’t overstate how much I adore the interlacing shader. It’s the cornerstone of every shader preset I set up. I ran into a problem with it, though.

Whenever I boot up a PAL PSX game, the scanlines present unevenly. This doesn’t occur with any NTSC title, and also doesn’t pop up on the Saturn or Genesis PAL games I tried to boot.

Integer scaling is on and the aspect ratio is set to Core Provided (I scanned through the others; no aspect ratio fixed it anyway), but I can’t seem to figure this one out, and it’s hurting my head lol, so I thought I’d ask for some help.


Are they uneven all the time or just during the BIOS sequence?


I’m afraid I can’t help, because I never ever run PAL games in PAL mode. The few European exclusives (yes, I’m looking at you, Terranigma. Also you, Wip3out SE) I play are NTSC patched.

But I wonder, what’s so interesting about interlacing?


I took Mega Man X3 all the way into gameplay and it was still having the issue with the lines being uneven, so it’s not just the BIOS screen from what I can tell.

The view-space is noticeably smaller for these PAL games as well. Almost letterboxed. I assumed that was just one of the weird PAL video quirks…

These shots are at 1080p, integer scaling on, core provided aspect ratio.

Mega Man X3 (PAL)

Tomb Raider 3 (NTSC)


What does patching to NTSC change, exactly? Does it just change the Hz or does it also change the display area? Wondering if maybe I should do that.

The Interlacing shader is awesome though because (this instance aside) it gives a perfect scanline effect at 320x240 resolution, and also interlaced scanlines at 640x480 - just like the old CRTs would do.

It works great on my actual CRT monitor with CRTSwitchRes too - which I was very happy about because I wasn’t playing PSX games at 480’p’ back in the day lol.


It doesn’t rescale the game or anything complicated like that. An NTSC patch basically fools the game into believing it’s running on an NTSC system, so it gets displayed at glorious 240p/60Hz. Almost every PAL game released before the Dreamcast (as well as many PS2 titles) had ugly, content-squashing horizontal black bars (PAL being 576i/288p) and ran slower. I also feel that they had more input lag. Being European, I knew nothing about that as a child, but a Japanese friend from high school taught me about NTSC during the PS1 era… I haven’t played anything PAL50 in about 20 years, and never will again. Shaders are designed with 240p in mind; that’s why you get the scaling issues.
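That 240-vs-288-line difference also accounts for the shrunken, letterboxed PAL picture on a 1080p screen with integer scaling. A quick back-of-the-envelope check (a sketch only; the floor-division rule is my assumption about how integer scaling picks its factor, not a quote of RetroArch’s code):

```python
# Assumed integer-scaling rule: use the largest whole multiple of the
# content height that still fits on the screen.
def integer_scale(screen_h, content_h):
    factor = screen_h // content_h        # largest whole multiple that fits
    return factor, content_h * factor     # (scale factor, scaled height in px)

# NTSC content: ~240 visible lines
print(integer_scale(1080, 240))  # (4, 960) -> fills most of a 1080p screen
# PAL content: ~288 visible lines
print(integer_scale(1080, 288))  # (3, 864) -> a noticeably smaller picture
```

So the PAL image drops a whole scale step (3x instead of 4x), and any shader expecting 240 source lines is now sampling 288 of them, which would explain uneven scanlines.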

Just get USA ROMs, and patch those must-have European exclusives (Ufouria, The Firemen, Terranigma, Wip3out SE and Keio’s Flying Squadron 2 are a few notorious examples).

Maybe I’m missing something, but I don’t get this. At 240p (320x240, progressive) you should get good scanlines anyway (they’re actually the result of running 240p content on 480i screens), so I don’t see why you would want to interlace that beautiful progressive signal. And 480i is natively interlaced, so why would you want to interlace already-interlaced content? A de-interlacer would actually be a much more useful shader. For example, most PS2 games are plagued by combing artifacts when played on a progressive display, and are often very blurry when deinterlaced by PCSX2. It’s a tough nut to crack, that one.
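For anyone unfamiliar with those combing artifacts: they appear when the two fields of an interlaced frame were captured at different moments and then simply woven together. A toy illustration (the one-pixel-per-field motion and the even/odd field assignment are assumptions for the example):

```python
# Two snapshots of a moving object, one field-time apart.
frame_t  = ["X...", "X...", "X...", "X..."]  # object at column 0
frame_t1 = [".X..", ".X..", ".X..", ".X.."]  # object moved to column 1

# "Weave" deinterlacing: even lines from one field, odd lines from the other.
woven = [frame_t[y] if y % 2 == 0 else frame_t1[y] for y in range(4)]
for line in woven:
    print(line)
# The object's edge now zigzags between columns 0 and 1 line by line:
# the classic comb pattern you see on a progressive display.
```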


So yeah, those black bars - does patching to NTSC remove those? It really seems like that’s what’s killing my setup for these games. It’s just weird 'cause it’s only the PSX PAL games, and not PAL on any other system (that I’ve seen, anyway), that cause the issue.

As for the point of using the shader - I think I left out some context. Particularly that I’m talking about internal resolution - and which displays I’m using. It kinda works like this:

LCD TV - Shader Off - 1080p Resolution

  • Internal 240p = No native scanlines (typical stock image [bad])
  • Internal 480i = No native interlacing (shows as 480p)

CRT PC Monitor - Shader Off - CRTSwitchRes On

  • Internal 240p = Native scanlines (100% perfect)
  • Internal 480i = No native interlacing (shows as 480p)

LCD TV - Shader On - 1080p Resolution

  • Internal 240p = Proper scanline effect (‘look’ of perfect scanlines)
  • Internal 480i = Proper interlacing effect (‘looks’ exactly like 480i)

CRT PC Monitor - Shader On - CRTSwitchRes On

  • Internal 240p = Native scanlines (100% perfect)
  • Internal 480i = Proper interlacing effect (‘looks’ exactly like 480i)

The really nice thing is that at 240p via SwitchRes on the CRT, the shader doesn’t show at all, so it’s like it’s not even on during those times, but when it switches to ‘internal’ 480i content, you get the interlacing effect automatically. Basically making it behave like a CRT TV would in a real hardware setup. I’m sure there are some minor inconsistencies that I’m not aware of between the shader effects and ‘real’ 240p & 480i - but for my money - it looks as perfect as I could ever ask for.

What’s more, with these effects in place I was able to create shaders that (to me) really mimic what different cable connections look like. Here are some examples so you can see what I mean.


No Shader




I’m sure it’s pretty obvious that these aren’t a perfect match to the connections I labeled them as, it’s just sort of what I was able to throw together on top of interlacing.glsl - and for my purposes I think it looks very nice.

I’m sure someone better with shaders and more familiar with the ins/outs of these signals could do a much more ‘accurate’ job.

Anyway, I hope that all makes more sense. lol

EDIT: Replaced the Rayman 2 shots with Zelda LttP ones - I’m an idiot and wasn’t thinking that Rayman 2 is 640x480 hahaha.


The output of beetle-psx when displaying 480i content is essentially already deinterlaced using the “weave” method. That is, all fields are shown and held until they are refreshed. This leads to combing artifacts in motion.

We have deinterlacing shaders that will take that woven output and replace it with bobbed/line-doubled progressive output, but we also have the interlacing shader, which replaces it with the original interlaced look. That is, with black lines over half of the fields, alternating which are shown on each frame.

If you output to a 31 kHz CRT, the interlacing shader looks identical to the original 480i input, but with a 480p signal. On 240p content, it just draws black lines over half of the normal line-doubled image.
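The core idea of that alternating-blank-line effect can be sketched in a few lines (my reading of the description above, not the actual interlacing.glsl source; the exact parity convention is an assumption):

```python
# Which output lines are visible on a given frame: the blacked-out parity
# flips with the frame counter, mimicking alternating 480i fields.
def line_visible(y, frame_count):
    """Return True if output line y is drawn this frame, False if blacked."""
    return (y + frame_count) % 2 == 0

# Frame 0 shows even lines; frame 1 shows odd lines.
print([line_visible(y, 0) for y in range(4)])  # [True, False, True, False]
print([line_visible(y, 1) for y in range(4)])  # [False, True, False, True]
```

On 240p content the same mask simply becomes a static scanline overlay, since the line-doubled image repeats every source line anyway.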


Alright, so if I understood correctly, the interlacing shader can be used as a crt shader of sorts?

I would still recommend playing the USA rom, but yes, if for some reason it has to be PAL, an NTSC patch will remove the black bars : )


Sort of. It reproduces one effect that was characteristic of certain signals on a CRT. It doesn’t do any of the other CRT stuff, though, like beam dynamics or phosphor masks.

I wrote it specifically for the use case of people outputting low-res 480p signals to CRT monitors to make a fake/lookalike 240p, since it’s much, much easier to do that than to actually produce 240p signals from a modern GPU.


Can’t tell you how much I appreciate this shader hunterk <3 and all the work you do, honestly.

@Squalo - thanks for the clarification. I’ll take a look into patching those few games. My general collection priority is always NTSC-U > PAL > NTSC-J so the only time I have PAL is if it was never sold in the states. Luckily that’s a very small fraction of gaming haha


And even ‘luckier’, most (if not all) of that small fraction can be NTSC’d :smiley: