Correct brightness/contrast setting for CRTs and scanline variability

[QUOTE=hunterk;29608]I think you’re making a good choice.

Unless you get like a super-old monitor, resolution isn’t going to matter.

If you feed it the native res (either through custom modelines or using something like groovymame), it will have natural scanlines. That shot of Super Mario World on mine is native res, nearest neighbor. No other filtering.

There are a number of shaders that can be helpful even at native res. Aliaspider’s tvout-tweaks lets you set a horizontal bandwidth parameter to blend things like pseudo-transparency. Since running at “240p” breaks on high-res content (such as games that switch into interlaced modes), some people like to run at 480p instead and use my interlacing shader, which looks almost identical (the scanlines are just a teeny bit more rounded at 240p) but avoids the high-res issue. Maister’s NTSC shader can also be good on low-res games. Someone (either Monroe88 or GPDP, IIRC) made a bunch of shader presets in the ‘cgp’ subdir that are good for use with CRTs.
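For instance, here’s roughly what a single-pass .cgp preset for tvout-tweaks might look like (the path and the parameter name are from memory, so check them against your shader directory and the shader source):

```
shaders = "1"
shader0 = "cg/tvout-tweaks.cg"
filter_linear0 = "false"
# expose a shader parameter and override its default:
parameters = "TVOUT_RESOLUTION"
TVOUT_RESOLUTION = "256.0"
```

You can also adjust any exposed parameters live from the shader parameters menu instead of editing the preset.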

Regardless, the picture will be extremely sharp and the gaps between the scanlines will be pitch-black and well-defined. If you want some blurring, you can always add it in with shaders.

EDIT: here are a couple more shots from my monitor running at games’ native res: http://4.bp.blogspot.com/-8_ibEH6-_OU/UzeM22N0TzI/AAAAAAAAB9w/YjezR13wJAU/s1600/IMAG0083.jpg[/QUOTE]

Thanks for the info.

Those pics look really great, exactly what I’m going for, although those colors look a tad washed out to me :stuck_out_tongue: Can’t really judge these things by photos anyway. I’m assuming that in person there’s even less bloom/scanline variability?

I could pick up a used Dell monitor for $30 right now, or I could pay $60 and have a NIB Compaq monitor shipped to me. Both are the same resolution and were made at the end of the CRT era. Edit: NM, the Compaq is “new out of box,” lol. Got the Dell, an E773C 17". Looks pretty sweet!

[QUOTE=hunterk;29608]

If you feed it the native res (either through custom modelines or using something like groovymame), it will have natural scanlines. That shot of Super Mario World on mine is native res, nearest neighbor. No other filtering.[/QUOTE]

Can I just set RA to a custom resolution that’s whatever-x-240 and then set my monitor’s resolution to 640x480 in the graphics driver? That will just make the window tiny, won’t it?

RetroArch will try to change the resolution if a modeline already exists for what it’s trying to switch to, but you won’t have 240p modelines unless you create them, either manually or with a utility. You can set it to 640x480, which will be available by default, but that will be a line-doubled image (i.e., no gaps between lines). You can add the interlacing shader to that (which draws solid black lines) to get an image very similar to native 240p, but you’ll probably want to set up a superwide horizontal res, like 1920 or 3840, that’s an even multiple of the most common game resolutions and wide enough to hide fractional-scaling artifacts from anything that isn’t an even multiple.
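As a rough sketch, the relevant retroarch.cfg settings would look something like this (the shader path is illustrative, and the exact index of the “Custom” aspect ratio varies between versions):

```
video_fullscreen = "true"
video_fullscreen_x = "3840"        # superwide mode you created beforehand
video_fullscreen_y = "480"
video_smooth = "false"             # nearest neighbor, keeps scanlines crisp
aspect_ratio_index = "22"          # "Custom" (exact index varies by version)
custom_viewport_width = "3840"     # stretch the image across the full width
custom_viewport_height = "480"
video_shader_enable = "true"
video_shader = "shaders/interlacing.cgp"   # illustrative path
```

All of these can also be set from the menus instead of editing the file.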

Use something like this

It’s better to output a “superwide” mode to the CRT than to use standard 640x480 because many games used horizontal resolutions that don’t scale evenly into 640, and games that can change resolutions at any time will often produce non-integer scaling artifacts. 3840 was chosen because it’s a common multiple of several typical horizontal resolutions (3840 = 15 × 256 = 12 × 320 = 10 × 384 = 6 × 640), and any non-integer scaling that remains is unnoticeable given the number of horizontal pixels relative to the CRT’s dot pitch. Configure the aspect ratio to stretch the game’s output to that width and it will appear normal on the CRT.

Whatever video mode you choose, make sure it actually exists as a valid mode in your OS/drivers, or RA can’t use it. Tools like Custom Resolution Utility or the Nvidia Control Panel can create custom modes like that.
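To demystify the term: a “modeline” is just the full list of timings that defines a video mode. On Linux you can add one with xrandr. Here’s a sketch, derived by scaling the standard 640x480@60 timings horizontally by 6 (illustrative only; the ~151 MHz pixel clock may be more than some monitors or cables will tolerate, and the output name is a placeholder):

```
# canonical VGA 640x480@60 timings, for reference:
#   "640x480"   25.175   640  656  752  800   480 490 492 525  -hsync -vsync
# same vertical timings with every horizontal value multiplied by 6:
xrandr --newmode "3840x480" 151.05  3840 3936 4512 4800  480 490 492 525 -hsync -vsync
xrandr --addmode VGA-1 "3840x480"
```

On Windows, CRU and the Nvidia Control Panel expose these same timing fields in a dialog and calculate the refresh rate for you from the pixel clock and totals.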

[QUOTE=Monroe88;29641]Use something like this

It’s better to output a “superwide” mode to the CRT than to use standard 640x480 because many games used horizontal resolutions that don’t scale evenly into 640, and games that can change resolutions at any time will often produce non-integer scaling artifacts. 3840 was chosen because it’s a common multiple of several typical horizontal resolutions (3840 = 15 × 256 = 12 × 320 = 10 × 384 = 6 × 640), and any non-integer scaling that remains is unnoticeable given the number of horizontal pixels relative to the CRT’s dot pitch. Configure the aspect ratio to stretch the game’s output to that width and it will appear normal on the CRT.

Whatever video mode you choose, make sure it actually exists as a valid mode in your OS/drivers, or RA can’t use it. Tools like Custom Resolution Utility or the Nvidia Control Panel can create custom modes like that.[/QUOTE]

EDIT: started a new thread on this, since the original topic ran its course.

Thanks for the info! So, just to be clear: I probably need to use a special utility to add a 3840x240 mode? What should the Hz be set to, or will the utility calculate that? This is the first time I’ve even heard the term “modeline.” Once I’ve added the custom mode to my graphics card, I can then set RA to output at that resolution, and it will result in natural scanlines? Sounds pretty painless; what am I missing?