Back in the CRT club! How to make 240p work with RA?


#1

Scored a decent PC CRT! The picture looks good with brightness/contrast at the default settings, and I’m pretty sure it was only rarely used, as the previous owner was in a nursing home. It’s a 17” Compaq 7600, “flat” screen, but you can clearly see the classic CRT curvature underneath the glass surface (or is that pincushion distortion? Hard to tell). The manufacture date is 2007(!), so it’s among the last CRTs ever produced.

It’s been so long since I’ve done this that I’ve forgotten how to set up 240p with RetroArch. Using Custom Resolution Utility (CRU), I’ve created a 3840x240 @ 120 Hz resolution. With black frame insertion, this should look the same as 60 Hz.

First, what changes do I need to make to the RA config to make the monitor switch resolutions to 3840x240 @ 120 Hz when RA is launched?

Has anyone had any success doing this with Intel integrated graphics? It’s giving me some major headaches. The drivers appear to be broken: I can create a custom resolution within the Intel control panel, and when I go to “remove custom resolution,” it lists the custom resolution, so it’s clearly adding something. However, when I open the Intel graphics display driver properties and click “list modes,” the custom resolution isn’t listed. In fact, only 60 Hz resolutions are listed, and it doesn’t even include common VGA resolutions like 640x480. I’m on Windows 10 and have reinstalled the latest drivers. Any ideas? I want my 240p!


#2

eyyy, congrats! Looks like a nice one.

Once the modeline is created, it should just be a matter of setting the video_fullscreen_x/y settings in your retroarch.cfg and using exclusive fullscreen.


#3

Is this correct? Anything else I need to do?

video_fullscreen = "true"

video_windowed_fullscreen = "false"

what about these?

video_window_x = "0"

video_window_y = "0"

Edit: okay, now when RA launches, the monitor just switches off and comes back on when I exit RA, whereas before nothing happened. Any idea what this means?


#4

Hello NESGUY

I’m also looking for some info about 240p @ 120 Hz (with a Sony Trinitron Multiscan 20" CRT).

"video_window_x" and "video_window_y" are useful when you set "video_windowed_fullscreen" to "true".

In your case, you need to set "video_fullscreen_x = 3840" and "video_fullscreen_y = 240".
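Putting the pieces together, a minimal retroarch.cfg sketch for this setup might look like the following (the 3840x240 values match the CRU mode discussed above; whether black frame insertion takes a true/false or a numeric value depends on your RetroArch version, so treat that line as an assumption):

```
# Exclusive fullscreen so RA can actually switch video modes
video_fullscreen = "true"
video_windowed_fullscreen = "false"

# Request the custom 3840x240 @ 120 Hz mode created in CRU
video_fullscreen_x = "3840"
video_fullscreen_y = "240"

# Match the 120 Hz refresh and insert a black frame per 60 Hz frame
video_refresh_rate = "120.0"
video_black_frame_insertion = "true"
```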


#5

If it switches off, that usually means the modeline is incompatible. Does it work if you switch to it on your desktop?


#6

@hunterk

The mode isn’t even listed when I go to display properties -> list modes. If I open CRU, the mode is listed. If I add the custom resolution using the Intel control panel, the mode appears in the list within the control panel, but it still doesn’t appear as a selectable mode under display properties. Intel just sucks. I’ve been reading some forums, and people have had problems with Intel custom resolutions going back five years; apparently they still haven’t fixed this.

At this point, I’m wondering if there’s a workaround that doesn’t involve buying a dedicated GPU.

This is what I get, currently:


#7

You could try DTD Calculator if you haven’t yet; it’s supposed to be an alternative tool specifically for Intel.


#8

You can also try using 480-line resolutions with the interlacing shader. The scanlines lose a tiny bit of roundness at the edges, but it’s otherwise identical to 240p (and it properly handles interlaced content, which 240-line modelines won’t) and GPUs are usually much more friendly toward those modelines.


#9

Yeah, I’m thinking what you suggested might be the best solution, since I was eventually able to get common VGA resolutions to work.

I’m thinking 640x480 with the Pixellate shader to take care of scaling artifacts on the X axis, plus the interlacing shader for scanlines, and one of the shaders with adjustable Quilez scaling (image-adjustment?) to restore the slight rounding of the scanline edges.
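As a sketch, a .slangp preset chaining those two shaders could look like this (the paths follow the common slang-shaders repo layout but may differ in your install, so double-check them in the shader browser):

```
shaders = "2"

# Pixellate first: handles non-integer horizontal scaling cleanly
shader0 = "shaders_slang/interpolation/shaders/pixellate.slang"
filter_linear0 = "false"

# Interlacing last: draws scanlines on 240p content and
# passes 480i content through interlaced
shader1 = "shaders_slang/misc/shaders/interlacing.slang"
filter_linear1 = "false"
scale_type1 = "viewport"
```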

I’m curious, though: is there a difference in scanline beam width variability with 240p compared to 480p + the interlacing shader?

It’s pretty annoying that I can’t get 240p to work with Intel integrated graphics, though. There’s no reason the hardware can’t handle it. This is basically a deal-breaker as far as buying any future Intel products.

Is there maybe some kind of Linux wizardry that will force 240p despite Intel’s drivers being (apparently) broken?


#10

If there’s any beam width difference, it’s very, very slight.

I would probably still try to use an ultrawide resolution if you can. That will really help with the horizontal scaling artifacts. Also check out tvout-tweaks for any horizontal scaling/blurring needs.

Intel doesn’t have any issues in Linux. You just use xrandr and set whichever modeline you want.
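For instance, the xrandr workflow looks roughly like this (a sketch: the output name VGA-1 and the timing numbers are placeholders — generate real timings with cvt/gtf or a modeline calculator, and make sure they’re within your monitor’s sync limits):

```shell
# List outputs to find the CRT's connector name (e.g. VGA-1)
xrandr --query

# Define a new mode from a modeline; these numbers are placeholder
# timings for ~3840x240 @ 120 Hz (~31.5 kHz hsync), not verified ones
xrandr --newmode "3840x240_120" 152.00 3840 3944 4328 4816 240 241 244 263 -hsync +vsync

# Attach the mode to the CRT's output and switch to it
xrandr --addmode VGA-1 "3840x240_120"
xrandr --output VGA-1 --mode "3840x240_120"
```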


#11

Keep in mind most consoles before PSX use 224 active lines, which means you need to add something like 2560x448 for those.

The beam width is gonna be diminished because of the loss of brightness, but the variance is gonna be pretty much the same, because 31 kHz CRT monitors are very flat by design. If you want to take some life-threatening risks, you can access the knobs inside your CRT and play with those for wider beam width variance, but you could die if you touch anything you shouldn’t by mistake.

Another thing: if you’re bothered by the excessive sharpness of your display, there are some shaders worth trying that can somewhat soften the image without looking too fake or blurry. For example, XBR applied only horizontally while keeping native vertical resolution can look pretty nice.


#12

Okay, so now (somehow) I can get the custom resolution to appear in the Intel control panel:

However, when I click “apply,” the monitor immediately shuts off/goes to sleep.

This time, I don’t have any custom resolutions added via the Intel control panel. Here is the custom res that I’ve set up in CRU, just to make sure I didn’t screw anything up there:

here’s what it looks like when I go to advanced display settings -> display adapter properties -> monitor tab:

If I go to display settings -> display adapter properties -> adapter tab -> list all modes:

Only 60 Hz resolutions are listed, and I don’t see the 3840x240 @ 120 Hz resolution. :neutral_face:

Edit: FWIW, I’m encountering the same issue when I try to add other custom resolutions. I’m looking into opening an issue with Intel support ATM.


#13

This looks like it might actually work, but it also looks pretty scary. The link to the wiki article explaining what this is doing leads to a 404, and the link to the AVS Forum thread with the instructions is also dead. :neutral_face: I really don’t know what any of this stuff means, and I don’t want to break anything.

I think it can be inferred from the existence of this utility that Intel Integrated Graphics is a pain in the ass when it comes to adding custom resolutions and making them selectable within the display control panel.

I may have to go back to using my LCD display until I can get a new computer. Bummer.


#14

I haven’t bothered to use the sole Intel GPU in my household for this myself, but I may eventually, once I move things around to get my own CRT setup near my desktop.

In the past, I’ve usually used WinModelines for custom resolutions, but only with Radeon cards. http://www.geocities.ws/podernixie/htpc/modeline-en.html

The site says it has support for Intel and Win 10 so you could try it. What’s up with “Digital Display” in the control panel, shouldn’t it say it’s an analog one when a VGA connection is established?


#15

In the past, I’ve usually used WinModelines for custom resolutions, but only with Radeon cards. http://www.geocities.ws/podernixie/htpc/modeline-en.html

The site says it has support for Intel and Win 10 so you could try it.

Thanks! I’ll give that a shot.

What’s up with “Digital Display” in the control panel, shouldn’t it say it’s an analog one when a VGA connection is established?

Yeah, that does seem really weird, and is another reason why I think Intel’s drivers are just broken. It’s not being recognized as anything other than a generic PNP digital display.


#16

Well, I’m dumb. Looks like the hardware I’m using just won’t support 3840x240 @ 120 Hz or any of these other custom modes. From the spec sheet for the NUC6CAYS:

-HDMI 2.0 through a MegaChips MCDP2800-BCT DisplayPort 1.2a to HDMI 2.0 Level Shifter/Protocol Converter (LSPCON)

-VGA graphics through an ITE IT6516BFN DisplayPort to VGA bridge

So it looks like it’s just converting DisplayPort to VGA internally. Lame. :frowning: Looks like this CRT is going in the closet until I can get some better hardware.

On the other hand, this exercise has taught me that the LCD can match or exceed the CRT in every aspect of picture quality except for motion blur, black level and viewing angles, given the right shader setup.

CRT advantages:

-NO motion blur

-viewing angles

-black level (technically, the LCD has better black levels, but not after cranking up the backlight to compensate for scanlines)

Edit: also, no input lag!

LCD advantages:

-contrast ratio

-peak brightness

-perfect screen geometry

-better picture uniformity

-ease of use: size, weight and driver/hardware compatibility.


#17

I guess there are more advantages for CRTs. One that you missed is that they have no fixed pixel resolution and no input lag. I’d add that they don’t look sterile, and I could go on, but I’ll leave it at that :slight_smile:


#18

Nice video, although I was surprised they didn’t discuss motion blur. The input lag is definitely a major advantage, although with RA’s run-ahead feature it’s possible to get even less input lag than the real thing on a CRT; it requires some fiddling, though, and has a high computing cost. Hard GPU Sync can get you within a frame or less of lag if your display is fast enough.
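For reference, these are the relevant retroarch.cfg switches (a sketch: how many run-ahead frames you can use without glitches depends on the specific game’s internal lag, so the "1" here is just a starting point):

```
# Run-ahead: emulate ahead and roll back to hide the game's internal lag
run_ahead_enabled = "true"
run_ahead_frames = "1"
run_ahead_secondary_instance = "true"

# Hard GPU sync: make the CPU wait on the GPU each frame (GL driver)
video_hard_sync = "true"
video_hard_sync_frames = "0"
```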

As far as the sterility goes, shaders can help a lot, although it’s easy to overdo it. I think I’ve managed to match the static image quality of the CRT, and maybe even improve on it a bit. Actually, the PC CRT and the LCD are very similar in terms of sharpness, and both benefit from a very slight amount of blur IMO. In motion, of course, it’s a completely different story: the CRT wins hands down.


#19

Another great thing about CRTs: no matter what kind of whack resolution is being output to the monitor, I can always just use the CRT’s vertical and horizontal size adjustments to make the image fill the screen. It’s a little annoying to resize the image manually, but it’s a decent workaround until I can get a computer with proper VGA output.


#20

It’s actually pretty amazing how sharp this CRT display is; with the interlacing shader applied there’s virtually no difference in sharpness when displaying 240p compared to the 1080p LCD with scanlines.

Still, the image looks better on the CRT. I’ve been speculating that this is due to either the CRT flicker and/or the way phosphors emit light compared to the LCD’s backlight + shutters. This results in an image that is inherently easier on the eyes and/or better at reducing the harshness of the high-frequency content in pixel art. It also goes without saying that the motion clarity is incredible. This display also shows a lot of halation due to the thick, flat glass over the main screen surface, which is cool.

I’ll try to get some decent side-by-side shots up at some point.