The Holy Grail: What is the right way to use a CRT television?

I have an arcade cabinet with a 16-year-old 27" Sony Trinitron CRT television (NTSC) attached over S-Video to my Lakka PC. Initially I couldn’t get Lakka to sync with the CRT and had to modify extlinux.conf to include the parameter “nouveau.tv_norm=NTSC-M video=TV-1:640x480@60i”. That post and the full thread are here:

http://libretro.com/forums/showthread.php?t=5899&page=2&p=39949&viewfull=1#post39949

My goal is to as closely as possible replicate the way the original consoles/machines sent video output to an NTSC CRT television. So first some questions:

  1. The original machines had various odd resolutions and had to run games on either PAL or NTSC CRT televisions, which support both different resolutions and refresh rates. How did the consoles (or the games themselves) accomplish this, or were all these odd resolutions in fact valid PAL or NTSC modes?

  2. How would, for example, the PAL and NTSC versions of “The Legend of Zelda” differ, or would only the PAL and NTSC consoles differ by up/down-scaling the game to comply with a valid resolution? Or perhaps both the consoles and the games were PAL/NTSC-specific?

  3. Did NTSC consoles all output 640x480, upscaling their content, or did they output their games’ native resolutions as valid NTSC modes, with the CRT switching to the lower resolutions?

  4. Was the phosphor resolution of most CRT televisions at the time of classic consoles 640x480?

  5. Is it possible, either with nouveau or the binary NVIDIA driver, to have Lakka dynamically switch to and output each console/game’s original NTSC mode, thus replicating it faithfully on an NTSC CRT?

I believe what’s happening now on my arcade cabinet is that my Lakka PC is outputting a fixed 640x480@60i mode and simply upscaling the game’s native resolution using either integer or non-integer scaling. I also have to force a 3:2 aspect ratio to get the games to fill the screen with no black borders (all the other ARs distort the game and/or leave large borders), but even so there is still improper overscan compared to how the original consoles displayed the games on a CRT (I know this because I’m 40 years old).

I could be wrong, and hope to have this answered here, but I believe the beauty of pixel art designed for a CRT television requires that each row of pixels be drawn by a single pass of the electron gun. Upscaling a game’s native resolution from, say, 320x240 to 640x480 is an inaccurate representation because there are then two electron gun passes for each row of pixels instead of one. In other words, it starts to look like running an emulator on a CRT computer monitor, which has a significantly denser phosphor resolution than a CRT television.

I probably only half-way know what I’m talking about, but my current setup, despite my best efforts, is not representative of the “real thing,” and if anyone could give me insight into how to accomplish this it would help not just me but anyone else trying to create a historically/technically accurate old-school experience. I think it’s important, because some shitty scanline overlay or shader is a disservice to those too young to remember what it was like to be there.

  1. All consoles output standards-compliant NTSC/PAL signals, though they typically used the “240p”/non-interlaced/double-strike mode, which repeats the same field position every pass instead of alternating between two fields (the arithmetic is sketched after this list).

  2. The consoles for different regions didn’t do any up/down-scaling, but they encoded the image for their specific region. This means a lot of PAL games have black borders on the top and bottom. Sometimes games got proper PAL ports, but often they were just repackaged and ran more slowly (i.e., 60 Hz logic slowed down to PAL’s 50 Hz).

  3. They put out a variety of resolutions but all fit within the 480i NTSC spec. Sometimes that means true 480i, sometimes ~240-line non-interlaced. CRTs didn’t do any processing on the signal because they’re remarkably simple devices. They just take in a signal that gets fed to an electron gun, and that electron beam gets steered by magnets. Nothing else goes on.

  4. It varies quite a bit. You can find specs on CRTs that include “dot pitch” (the distance between adjacent phosphor dots, usually in millimeters) and “lines” (how many lines of resolution the phosphors can render), and these variables determined how sharp/blurry the image would be (for example, 800-line PVMs are sharper than 600-line models; late-era CRT computer monitors could be 1000+ lines and super-sharp even on low-res signals).

  5. The 480i mode that you’re using is pretty close, and it’s exactly correct for some games (Sonic 2’s multiplayer mode and R.P.M. Racing, to name a couple). To get the other games exactly correct, you would need to change to a 240p modeline and then use integer scaling. I use an “ultrawide” modeline (1920x240) on my SD CRTs, and this lets me run any non-interlaced game at native res. If you’re using a display that shows the overscan area, you’ll see the black letterbox borders grow/shrink depending on the game/console, but on a TV they will mostly fall into the overscan area as intended.
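To put numbers on the double-strike trick mentioned in #1 (standard NTSC figures):

    Broadcast NTSC:  525 lines/frame, sent as 2 interlaced fields of 262.5 lines
                     15734 Hz line rate / 262.5 lines = 59.94 fields/sec (29.97 full frames/sec)
    Console "240p":  262 whole lines per field, no half-line offset, so every field
                     lands on the same scanlines -> 15734 / 262 = ~60.05 progressive frames/sec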

The options you want to set in RetroArch are point filtering and integer scaling. If your TV could handle 480p, I would recommend running at that resolution and then applying the interlacing.cg shader, which draws black lines across half of a line-doubled (i.e., *x480) image. This looks extremely close to a “240p” non-interlaced image but properly handles 480i content, as well (the aforementioned 16-bit games, along with the many PS1/N64 games that used 480i for menus, high-res, etc.), unlike using a 240p modeline.
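For reference, those options map to retroarch.cfg keys like the following; this is just a sketch, and the shader path is an example for wherever your shaders actually live:

    video_smooth = "false"           # point/nearest filtering (bilinear off)
    video_scale_integer = "true"     # integer scaling
    video_shader_enable = "true"
    video_shader = "/path/to/shaders/interlacing.cg"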

In your case, since you’re using an actual TV connected via S-Video, your options are to stick with your current 480i setup or try to get a 240p modeline working (which would break any interlaced games/screens). When I was using a TV-out card to a regular TV, I was never able to get my PC to output anything less than 800x600 hardware-downrezzed to 480i, which looked a little blurrier than an actual console hooked up over S-Video and interlaced everything (regular TVs were so blurry that it was hard to differentiate interlaced and non-interlaced content anyway, so it’s not too big of a deal). If you can get a display with RGB hookups, you can get a better/sharper picture, though that’s not going to look like what you remember from 40 yrs ago, either.

Arcade games are harder, since they didn’t need to align so closely with the NTSC spec and took more liberties with resolution and refresh rate. I can run most horizontal games on my cabinet using a 1920x240 modeline, but RetroArch speeds them up to 60 Hz (you can get around the speedup using some options, but I left it for simplicity). If you want to play a bunch of arcade games, I would recommend looking into GroovyMAME, which is designed from the ground up to show the native res and native refresh.

@hunterk, I hope someone’s paying you well for whatever you’re doing when not writing excellent responses to questions in these forums; thanks for some great information. :slight_smile:

I’m in the process now of flailing with the settings like a madman, so I’ll let everyone know if I figure anything out or have more questions that could lead to the path of success. So far it’s apparently possible for me to do a 320x240 mode when the Lakka logo displays, but not so much in-menu or in-game, which appears to use something higher-res, possibly 640x480. Nouveau also appears to default to a 720x480 mode (NTSC DV, I believe) when not otherwise specified in extlinux.conf. This may be a characteristic of the CRT TV I’m using.

Not yet at 240p (or X11 or the NVIDIA binary driver), but I finally have results that are “good enough” with nouveau and my CRT television over S-Video. I also wrote some short scripts that can be saved in /storage (“chmod +x somescript.sh” to make executable) to allow loading/editing/saving and backing up of the read-only extlinux.conf and regular retroarch.cfg files for more convenient settings experimentation over ssh.

The three scripts and my current/working extlinux.conf and retroarch.cfg are linked below, so if anyone needs good settings for pc/lakka/nouveau/retroarch/svideo/crt/tv, these settings may work and the scripts will let you experiment with them faster. Hopefully this can help some people while I continue to attempt refinements toward the most authentic representation possible.

http://eightvirtues.com/lakka/

    backupconf.sh
    editboot.sh
    editconfig.sh
    extlinux.conf
    retroarch.cfg
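For anyone not downloading them, here’s a minimal sketch of what an editboot.sh-style helper can look like, using the remount commands discussed later in this thread (this assumes nano is available; substitute vi or whatever your build ships):

    #!/bin/sh
    # Remount the normally read-only /flash, edit the boot config, then re-lock it.
    mount -o remount,rw /flash
    nano /flash/extlinux.conf
    mount -o remount,ro /flash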

[QUOTE=kevinfishburne;40462]So far it’s apparently possible for me to do a 320x240 mode when the Lakka logo displays, but not so much in-menu or in-game, which appears to use something higher-res, possibly 640x480.[/QUOTE]

I’m also having the same issues and I’ve spent the last couple of days trying to work this out. I’ve got the Lakka splash displaying in 240p, but as soon as I get into the menu, everything is back to 480i. I’m running Lakka on an RPi3 and outputting via composite to an SD CRT, in case anyone has been able to make this work already. I’m open to any and all suggestions.

It looks like you can get 240p out of the RPi’s HDMI port: http://elinux.org/RPiconfig but not out of the composite port, where the firmware forces the interlace bit: https://www.raspberrypi.org/forums/viewtopic.php?f=41&t=24597

So, if you do 240p out of the HDMI port into an HDMI>VGA converter, then VGA to RCA/S-Video…? That’s a lot of adapters :frowning:
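If you do try the HDMI route, the custom-mode mechanism in config.txt looks something like this; the hdmi_timings numbers are purely illustrative and would need verifying against your converter and set:

    # /boot/config.txt -- 240p over HDMI (timings illustrative, verify before use)
    hdmi_group=2
    hdmi_mode=87        # "custom mode" slot, used together with hdmi_timings
    hdmi_timings=320 1 20 30 38 240 1 4 3 15 0 0 0 60 0 6400000 1
    # htotal = 320+20+30+38 = 408; 6.4 MHz / 408 = ~15.7 kHz line rate
    # vtotal = 240+4+3+15 = 262 lines -> ~60 Hz, non-interlaced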

I think filtering needs to be turned on in order to get the variable horizontal resolution to fit the render target’s horizontal resolution. SMB, for example, is 256x240, yet needs to be scaled to 320x240 or 640x480 to fit NTSC standards. Without a high-res mode or bilinear filtering, low-res modes show pixel crawl and bad scaling. There is a disconnect between the game-state image and the video output signal, although SMB looks as good as I’ve seen it in decades, as do Genesis and SNES games. I’m going to try applying a scanline shader in RetroArch to study what happens on my CRT. So far I’m terribly happy, other than the two-player arcade kit stick bug.

Yeah, the main ways of handling that are: bilinear filtering (the worst option; makes everything blurrier than necessary), exact native res (looks great but makes it hard to play different consoles) or ultrawide (CRTs don’t have a set horizontal resolution and, while they typically received signals that were low-res, they can go up as high as your video card will let you). If you do an ultrawide res and set your custom viewport width to stretch out to fill it, horizontal scaling artifacts will disappear into the display’s natural blur.

Could you elaborate a little more about ultrawide res on CRTs? I wasn’t aware of this.

There’s not much more to say about it. The highest I’m able to run mine is 1920x240 (it’s a GPU limitation, rather than a CRT limitation), while others have been able to go all the way to 3840x240, which is the best res for this, since it’s an even integer multiple of pretty much everything (NES/SNES, MS/Genesis/MD, CPS1/2/3, etc). Even for the few stragglers that aren’t even integer factors (such as Mortal Kombat), when the ultrawide resolution gets averaged across the TV’s surface, the horizontal scrolling doesn’t get all funky.
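The integer-factor arithmetic, for anyone checking (the attributions are the usual native widths of those systems):

    3840 / 256 = 15     # NES, SNES, Master System, Genesis H32
    3840 / 320 = 12     # Genesis/MD H40
    3840 / 384 = 10     # CPS1/2/3
    3840 / 400 = 9.6    # Mortal Kombat -- one of the non-integer stragglers
    1920 / 320 = 6      # 1920 still works for 320-wide sources...
    1920 / 256 = 7.5    # ...but not for 256-wide ones, hence 3840 being ideal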

You have to use custom modelines for it, since OSes don’t ship those sorts of resolutions out-of-the-box, but once you get a working modeline, everything else just works.
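As a concrete sketch, building and assigning such a mode with xrandr might look like this; the timing numbers are illustrative for roughly 15.7 kHz/60 Hz, and “TV-0” is an example output name:

    # Hand-built 1920x240 @ ~60 Hz, ~15.7 kHz horizontal (numbers illustrative)
    xrandr --newmode "1920x240_60" 38.28 1920 1984 2176 2432 240 244 247 262 -hsync -vsync
    xrandr --addmode TV-0 "1920x240_60"
    xrandr --output TV-0 --mode "1920x240_60"

The horizontal sample count barely matters to the set; what an SD CRT cares about is the ~15.7 kHz line rate (38.28 MHz / 2432 total pixels) and the ~60 Hz refresh (15.74 kHz / 262 total lines).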

[QUOTE=hunterk;40608]You have to use custom modelines for it, since OSes don’t ship those sorts of resolutions out-of-the-box, but once you get a working modeline, everything else just works.[/QUOTE]

Perfect, will try later.

Which requires X11 and an xorg.conf, I’m assuming? Lakka doesn’t have X11 currently, although I’d like to figure out how to install it until it’s supported again in future builds.

I use KMS + a custom EDID. However, I’m not sure you can do that with Lakka’s read-only filesystem. I can try to talk to kivutar and see if there’s a solution for getting something like that in there.
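For the curious, the KMS side of that is just kernel parameters. A sketch, assuming the EDID blob is installed as /lib/firmware/edid/crt-240p.bin (a hypothetical filename) and the connector is TV-1 as in the extlinux.conf earlier in this thread:

    # Additions to the extlinux.conf APPEND line (filename hypothetical):
    drm_kms_helper.edid_firmware=TV-1:edid/crt-240p.bin video=TV-1:1920x240@60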

I can have my way with Lakka’s “read only” filesystem by remounting it over ssh (see link to my other thread at beginning of this one):

Remount OS as read/write.

mount -o remount,rw /flash

Remount OS as read-only.

mount -o remount,ro /flash

There’s no apt, though, and since it uses OpenELEC, which I know nothing about, I’m not sure how to implement your solution or install X11 and the binary NVIDIA driver. If kivutar could help, that would kick ass. Unless of course it involves using wget to download 1000 source trees and compiling them all manually. :slight_smile:

On that note, Lakka really should include the option (perhaps during installation) of installing X11 and possibly the binary NVIDIA drivers. If using an ultrawide resolution and setting modelines is a potentially critical part of getting the games to display correctly (a core function of the entire project), requiring a couple hundred megabytes of additional storage should not even be a consideration. Storage capacity is not a problem these days, even on the humble Raspberry Pi. Other than burning Lakka to a chip there is no justification for lacking this functionality due to storage requirements.

Well, running on a CRT is pretty niche, so making the installation more complex to benefit those few users may not be the best idea. The main goal of Lakka is to be a simple, turnkey solution, so people with special use-cases can just use a regular ol’ Linux distro and compile/install RetroArch themselves.

Anyway, if you’re on x86_64, you should be able to use my EDID:

If you’re on 32-bit, you’ll have to compile your own but it’s fairly easy and shouldn’t require any special toolchain, etc. I’m happy to help if you get stuck.

I know it’s not on the list of projects soon to be made, but a GroovyArcade-like installation of Lakka would be a dream come true. Either way, I’ve read everything you said and am going to try RetroArch on my CRT-emu PC.

I’ve figured out how to use smaller ultrawide NTSC resolutions under Lakka with nouveau, and I think I’ve also figured out why the CRT television’s phosphor density is insufficient to represent the pixels both horizontally and vertically without obvious scaling artifacts or resorting to the bilinear filter. The nouveau video mode/buffer set with a kernel parameter and the viewport dimensions set in RetroArch’s settings are both important, and an incorrect AR in the latter can prevent Lakka from displaying the menu (it endlessly attempts to switch between two video modes).
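For reference, the viewport/AR keys in question in retroarch.cfg look like this; the values are just an example for a 1920x240 mode, and the index of the “Custom” aspect ratio varies between builds, so verify it in the menu:

    aspect_ratio_index = "22"          # "Custom" -- verify the index for your build
    custom_viewport_width = "1920"
    custom_viewport_height = "240"
    custom_viewport_x = "0"
    custom_viewport_y = "0"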

I connected a cheap DVD player to the same CRT television over S-Video and it overscanned properly, and its shitty menu had proper scanlines/resolution/phosphor illumination. The problem is the nouveau driver, which fails to output a “normal” signal over S-Video at any resolution or AR. It’s as if it has built-in overscan compensation with no way to turn it off, scaling the image before sending it to the TV out, causing artifacts and an unnecessary layer of confusion when testing. I’ve looked online for a kernel parameter to disable this and can’t find one. I’d use RetroArch and libretro on top of a distro like Mint (my workstation’s OS) so I could use the NVIDIA binary driver’s ability to properly detect and control video outputs and settings, but RetroArch alone has some crippling bugs, a couple of which I’ve already filed on GitHub, which prevent me from doing so.

I’m not sure how to proceed other than continuing with acceptable but imperfect settings. Hopefully this project (Lakka and RetroArch) will continue to progress such that it’ll be the go-to method of running classic games.

Thanks for the custom EDID info. I’ve made some progress since I installed Mint 17.3 and RetroArch from the testing PPA (overwriting the Lakka installation) to test xorg.conf settings with the binary NVIDIA driver. I still have the Lakka scripts and some in-progress/example config files here:

http://eightvirtues.com/lakka/

Under Mint it’s insufficient to set the screen resolution using nvidia-settings (it crashes when applying the settings, at least with this old driver for an old card with S-Video output) or by editing /etc/X11/xorg.conf, because after login the xorg.conf resolution is overruled by a Mint/Cinnamon utility: Mint’s mdm uses the config file “monitors.xml” to set the desktop resolution after login. To override mdm’s desktop resolution settings, create the file “monitors.xml” in ~/.config/ using mine as a reference (see above link; mdm’s GUI tool is bad, so don’t use it except to create a skeleton file). I’m running at 640x480 now, but experimenting with other modes.
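A minimal monitors.xml sketch for reference; the output name, vendor/product/serial, and values here are placeholders, and the safest route is to let mdm generate the skeleton and then edit it:

    <monitors version="1">
      <configuration>
        <clone>no</clone>
        <output name="TV-0">
          <vendor>???</vendor>
          <product>0x0000</product>
          <serial>0x00000000</serial>
          <width>640</width>
          <height>480</height>
          <rate>60</rate>
          <x>0</x>
          <y>0</y>
          <rotation>normal</rotation>
          <primary>yes</primary>
        </output>
      </configuration>
    </monitors>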

To disable overscan compensation (or control its levels), edit (or create using nvidia-settings) the file ~/.nvidia-settings-rc (see example in above link) and set a TVOverScan value between 0 and 25. The userspace nvidia-settings daemon must be running to enforce these settings during login, which can be toggled via the “Startup Applications” GUI in Mint. A TVOverScan value of 0 results in a smaller desktop on-screen (black borders) and a value of 25 results in significant overscanning, like proper old hardware. A value of 20 may be a sensible default…still experimenting.
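The corresponding lines in ~/.nvidia-settings-rc look something like this; the bracketed display qualifier is an assumption on my part, so check what nvidia-settings itself writes when you save from the GUI:

    # ~/.nvidia-settings-rc (the [TV-0] qualifier is illustrative)
    0/TVOverScan[TV-0]=20

You can also assign it for the current session from a shell with nvidia-settings --assign, e.g. nvidia-settings --assign="TVOverScan=20", which is handy for experimenting before committing a value to the rc file.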

I also created this test image (640x480) to see how the signal displayed relative to the CRT’s phosphors up close: http://eightvirtues.com/lakka/NTSC,%20640x480.png Something interesting I noticed was that the orientation of the projected image on the Z axis was off by about 0.5 degrees. I think it may be a common anomaly of CRT televisions for the orientation/vector of the electron gun to be slightly off. It could be electromagnetic interference, but nothing near the CRT is close or asymmetrical enough to cause it. I’m considering trying to manually adjust the electron gun’s mounting point to compensate, if that’s possible without destroying it.

Have you tried using xrandr? I imagine you’d have an easier time of things than messing with your xorg.conf. It’s also nice in that it only lasts as long as that session, so it’s easier to correct bad options than ssh-ing in to modify text files, etc. Whenever you get settings you like, you can add them to your .bashrc or whatever.

For your z-axis problem, do you mean the image is rotated? If so, that’s fixed by adjusting the yoke while it’s running, which is pretty easy to fuck up and ruin your set, so proceed with caution.

I’ve deleted my xorg.conf and have been experimenting with xrandr. I’m using it like this, but it’s not really working so far:

cvt 320 240 60
# 320x240 59.52 Hz (CVT 0.08M3) hsync: 15.00 kHz; pclk: 6.00 MHz
Modeline "320x240_60.00"    6.00  320 336 360 400  240 243 247 252 -hsync +vsync
xrandr --newmode "320x240_60.00"    6.00  320 336 360 400  240 243 247 252 -hsync +vsync
xrandr --addmode TV-0 "320x240_60.00"
X Error of failed request:  BadMatch (invalid parameter attributes)
Major opcode of failed request:  140 (RANDR)
Minor opcode of failed request:  18 (RRAddOutputMode)
Serial number of failed request:  29
Current serial number in output stream:  30

I’ve also tried this with 1280x240 and 1920x240 and get the same result. Am I doing it wrong, or are these modes simply not supported by the TV encoder on my video card? I’m using the binary NVIDIA drivers. Even if I could only get 1024x240, 800x240, or 640x240, I could eliminate horizontal crawl and vertical blurriness by enabling bilinear filtering; since the height would be 240 pixels, the filter wouldn’t affect the image vertically, only blur the pixels horizontally.

Yeah, it’s off by about 0.5 degrees counter-clockwise. I’ll probably hire a local TV repair tech to calibrate the CRT, as I’d probably end up destroying it. :slight_smile: