Using a CRT monitor with RetroArch

I’m using an ultrawide res setup. Is it normal if most of the background in the first screenshot and the sky in the second one are invisible? Lowering the contrast brightens black colors too much IMO.

I would recommend finding some test card images, loading them up in RetroArch, and then playing with your monitor’s brightness/contrast settings until you get something that’s properly dark without losing detail.

The problem is that they’re already maxed out. Is there any shader for increasing color intensity? image-adjustment.glsl’s R/G/B Channel options hardly help in the first case.

EDIT: Sorry if I’m being dumb here and this is just a hardware limitation.

You shouldn’t just max both; that will do exactly what you’re describing :stuck_out_tongue:

You want your contrast and your brightness both somewhere near the middle, where blacks are black, whites are white, and there are many gray steps in between. When contrast is maxed, things are either blown out or black-crushed.

That might be because my monitor seems to be a bit too dim to display artificial scanlines, but I can’t see that background at any combination of settings.

If you disable the scanlines, are you able to see it?

Try downloading these images, loading them in RetroArch, and then calibrating:

Goddammit. That was simply because I enabled TV color levels. Nevermind, thanks a lot for the help.

np! I’m glad it was something simple and not a hardware problem.

I am using an M781mm CRT (I also have an E771p and a couple of E772p lying around if they suit the purpose better somehow) and I’m interested in going the 240p route as OP claimed to have done successfully earlier in this thread. I just can’t figure out how to force a 3840x240@120 output. Like OP, my card (ATI R7 2xx) states a maximum refresh rate of 85Hz. Can anyone tell me how it’s done? :smiley:

In Windows, people usually use CRT Emudriver to create custom modelines. In Linux, you can use an online modeline calculator and then pass the resulting modeline to xrandr.

I much prefer the Linux route.
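Very roughly, the xrandr side looks like this. The timing numbers below are purely illustrative and `VGA-0` is a placeholder; generate a real modeline with a calculator that knows your monitor’s horizontal/vertical limits, and check your actual output name with `xrandr --query`.

```
# Illustrative only: replace the mode numbers with output from a modeline
# calculator matched to your monitor, and VGA-0 with your real output name.
xrandr --newmode "3840x240_120" 147.89  3840 3888 4216 4704  240 242 245 262  -hsync -vsync
xrandr --addmode VGA-0 "3840x240_120"
xrandr --output VGA-0 --mode "3840x240_120"
```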

Thanks! It looks like it’s working, but I’m noticing lines on objects that move vertically in all the games I’ve tried in this mode (NES, 2600, etc.).

Secondly, you mentioned on the first page of this thread that RetroArch automatically switches to the modeline if it’s available, but that has never been the case for me. Is there something I could be missing?

If you set the video_fullscreen_x/y settings to match the modeline, RetroArch should switch to that mode when it goes into exclusive fullscreen.
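In retroarch.cfg terms that’s roughly this, assuming the 3840x240 mode you created (exclusive fullscreen also needs windowed fullscreen turned off):

```
video_fullscreen = "true"
video_windowed_fullscreen = "false"   # exclusive fullscreen
video_fullscreen_x = "3840"
video_fullscreen_y = "240"
```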

Did you enable black frame insertion (BFI) or change the swap interval in Settings > Video? Also, do you have any overscan cropping enabled? Check both the global setting in Settings > Video and whether the core has any core options for it.
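For reference, I believe those menu options map to these retroarch.cfg keys (the core-level cropping lives under the core options instead):

```
video_black_frame_insertion = "false"   # BFI toggle (a black-frame count in newer builds)
video_swap_interval = "1"               # 1 = present on every vblank
video_crop_overscan = "false"           # the global crop setting under Settings > Video
```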

~~Yeah that never worked for me. And come to think of it I used to have the same problem with hyperspin/RL back when I used them, which is why I don’t think this is retroarch’s fault. At the end I always had to manually switch the crt to the 3840x480 modeline, else the interlace shader goes out of whack.~~ Wait… “video_fullscreen_x/y”? ooooooh whoops. :stuck_out_tongue:

Aye aye it turns out cropping was the issue all along. Thanks for that!

I’ve upgraded my monitor to an LG Flatron F700B, and it has scanlines of its own at 3840x480. Should I start using 3840x240 to get a reasonable scanline width, or is there another solution?

I recommend using the interlacing shader at 2x with a 480p resolution. It will look almost indistinguishable from 240p but will handle interlaced content gracefully.
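If it helps, a single-pass preset for it looks roughly like this; I’m assuming the shader lives at shaders_glsl/misc/interlacing.glsl, but the path can differ between installs:

```
# interlacing-2x.glslp (sketch) -- adjust the shader path to your install
shaders = "1"
shader0 = "shaders_glsl/misc/interlacing.glsl"
filter_linear0 = "false"
scale_type0 = "source"
scale0 = "2.0"
```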

I already use this. The problem, it seems, is that the monitor’s native scanlines make the shader’s scanlines look really thick.

EDIT: Sorry I didn’t make myself clear; I didn’t see the ambiguity there.

ohhh, I see what you mean now. Well, the shader includes a brightness percentage parameter that’s usually set to 0%. You could try increasing it a bit to see if it blends with the monitor’s own scanlines any better.
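If you’d rather bake that into the preset than set it from the menu, a parameter override looks roughly like this (I think the parameter’s internal name is `percent`, but double-check it under Shaders > Shader Parameters):

```
# appended to the interlacing preset; "percent" is my guess at the internal name
parameters = "percent"
percent = "0.25"
```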

I don’t see how this can help much; they’re still wide as hell. I guess I’ll have to use 240p from now on.

Update: Turns out 240p scanlines are pretty big as well. Some of the newer monitor models suck for this use, it seems.

> The problem is that they’re already maxed out.

> I’ve upgraded my monitor to an LG Flatron F700B, and it has scanlines of its own at 3840x480.

That’s not surprising, since that monitor has a dot pitch of 0.24 mm. That means the individual raster lines are going to be more defined than on a monitor with a larger dot pitch.

My own PC CRT monitor has a dot pitch of 0.27 mm on a 17" screen, which is enough for the raster lines at 480p to be discernible (but not prominent).

It probably helps to increase the screen luminance to make the raster lines bloom a bit more at low resolutions. I use my CRT with all color channels maxed out so the screen is as bright as possible, which nicely compensates for the brightness loss from inserting black lines.