Using a CRT monitor with RetroArch

re-posting this from another thread because it’s a separate topic:

[QUOTE=Monroe88;29641]Use something like this

It’s better to output a “superwide” mode to the CRT than standard 640x480, because many games used resolutions that aren’t multiples of that, and games that change resolution on the fly will often produce non-integer scaling artifacts. 3840 was chosen because it’s a common multiple of several horizontal resolutions commonly used by games, and any non-integer scaling is unnoticeable given the number of horizontal pixels relative to the CRT’s dot pitch. Configure the aspect ratio to stretch the game’s output to that width and it will appear normal on the CRT.

Whatever video mode you choose, be sure it actually exists as a valid video mode in your OS/drivers or RA can’t use it. There are tools like Custom Resolution Utility or Nvidia Control Panel to create custom modes like that.[/QUOTE]
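
To see the “common multiple” idea in numbers, here’s a quick sketch (the widths are just typical console resolutions, not an exhaustive list):

for w in 256 320 384 512 640; do
  echo "$w wide -> 3840/$w = $((3840 / w)), remainder $((3840 % w))"
done
# everything divides evenly except 512 (3840/512 = 7.5), and at this pixel
# density a half-pixel error is invisible on the tube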

Thanks for the info! So, just to be clear: I probably need to use a special utility to add a 3840x240 mode? What should the Hz be set to? Or will the utility calculate that? This is the first time I’ve even heard the term “modeline.” Once I’ve added the custom mode to my graphics card, I can then set RA to output at that resolution, and it will result in natural scanlines? Sounds pretty painless; what am I missing?

I think PowerStrip can do it in Windows; xrandr will do it in Linux. Yeah, just set it to 60 Hz. The modeline utility probably won’t land exactly on 60, but whatever, it’ll be close.

That’s it. Not too tough.
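
On Linux, a modeline calculator like cvt (it ships with Xorg) will do the math for you. This is just a sketch, and the numbers it spits out may need hand-tweaking before a 15 kHz display will accept them:

cvt 3840 240 60   # prints a modeline you can feed straight to xrandr --newmode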

So, yesterday I acquired a massive 36" 4:3 Wega (weight: 235 lbs). The problem is that it’s one of the HD models that automatically line-doubles 240p content. In other words, no scanlines.

Do you think there might be a way to force scanlines through the service menu somehow?

If not, could I hook up a PC to it through the DVI port and force a 240p resolution through the graphics card, as previously described? Or will the TV still upscale it?

If nothing else works then I’ll just output at 480p and add scanlines with the interlacing shader, but it would be nice to get the TV to produce its own scanlines.

I really want to get this to work because it’s in great shape for such a huge CRT, plus it came with the matching stand, which is sweet. I plan on turning this into the console equivalent of a MAME cab, with emulators for everything up to PS1 and N64, and original controllers for each system.

Ah, yeah, those TVs are actually bad for 240p content because, as you’ve found, they line-double/deinterlace it, which looks not-so-great and adds processing latency. Try it at 480p with the shader.

Man, I thought I had really hit the jackpot with this TV :frowning: Should have done more research I guess.

Hopefully there won’t be any input lag at 480p, right? Do you know if these TVs have a native resolution?

Why on earth would TV manufacturers add this “feature”? All it does is add noise/artifacts.

I picked up an old VGA computer monitor yesterday. So, I’ve been trying to get this modeline stuff figured out, and of course it’s proving to be way over my head. I’ve tried Custom Resolution Utility, CRT Emudriver, Soft15khz, and Modeline. I can’t get anything to work right.

AMD and Windows 7 are making this a real bitch. I know for a fact this monitor is capable of 240p, but Windows 7 and AMD want to act like they know what’s best for the user and won’t let you do something as simple as add a custom resolution.

Can someone hold my hand through this, Sesame Street style? After spending the better part of a day on this, I’ve given up on trying to solve it on my own. I’m just looking for the simplest way to add a 240p superwide resolution using Windows 7 and an AMD card. I’m guessing this would take less than 5 minutes if I were using Linux.

CRTs don’t really have a “native res” the way we generally think of it, but if it can show 480p content, it’s probably 31 kHz just like a PC monitor (480 visible lines out of ~525 total at 60 Hz works out to roughly 31.5 kHz horizontal).

I don’t have a lot of experience doing this sort of thing in Windows, unfortunately. I’m not 100% sure, but I think some/most of those utilities will balk if your display reports being unable to show the desired resolution, so make sure your CRT is the only connected display before you run them.

In Linux, you would just do this:

# create the new mode; you may need to replace the sync flags with +CSync
# or flip the +/- signs for your display:
xrandr --newmode "240p" 31.96 1920 1952 2072 2104 240 245 248 253 -HSync -VSync

# add the mode to your active display; find the output name with 'xrandr -q':
xrandr --addmode DVI-0 240p

# this activates the new resolution:
xrandr --output DVI-0 --mode 240p

# and this gets you back to a normal resolution:
xrandr --output DVI-0 --mode 1024x768

This would have been such a breeze with Linux. I went with Windows 7 because of the low input lag, but what a nightmare. As it stands, I’ve tried every utility available and spent basically two whole days on this. Nothing works. Either I have the wrong connection (VGA instead of DVI), the wrong OS (Windows 7 instead of XP), the wrong graphics card, etc. I just can’t believe that there isn’t a way to add a custom resolution with Windows 7 using an AMD card, or that anyone would have to go to so much trouble to do so. It’s ridiculous. This has been a regular feature of Nvidia cards for years.

I’ve come up with a temporary fix until I can find someone who knows how to do this on Windows 7: I just use the interlacing shader and then stretch the picture using my display’s H size, V size, H pos and V pos controls until it’s centered and fills the screen. Luckily, these controls are conveniently located right on the front of the display. It’s pretty dumb, but it works well enough for now.

I was under the impression that AMD cards were preferred for this sort of thing. Looks like CRT Emudriver only goes up to the Radeon 4xxx series, though.

Have you gone through these steps: http://wiki.arcadecontrols.com/wiki/Custom_display_modes_(Windows)_-_Powerstrip

SUCCESS!

Turns out that I didn’t need PowerStrip (or maybe I did; I used it, but it might have been redundant). I wound up just adding a 3840x480 resolution using Custom Resolution Utility: I selected “CRT timing” and entered 60 Hz as the refresh rate. Then I followed Monroe88’s config, and now RA automatically switches to the desired resolution :slight_smile: I think I didn’t have the refresh rates correct before.

The only filtering I’ve added is your interlacing shader. It’s not “true 240p,” but it looks fantastic and it’s nearly identical. This is superior to any LCD+shader combo I’ve seen :slight_smile: Makes me want to pick up a couple more Dell monitors.

It looks like true 240p is going to require a bit more work. I’m satisfied enough with 480p + interlacing that I’m not sure it’s worth the additional effort.
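
For anyone following along on Linux, that “CRT timing” 3840x480 mode works out to the standard VESA 640x480 @ 60 Hz timing with every horizontal value multiplied by 6. A sketch (the sync polarities may need flipping, and DVI-0 is a placeholder for your actual output name):

xrandr --newmode "3840x480" 151.05 3840 3936 4512 4800 480 490 492 525 -HSync -VSync
xrandr --addmode DVI-0 3840x480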

Awesome :smiley:

Yeah, I agree that “true 240p” isn’t going to be worth it. You would need automatic switching to handle interlaced games, which RA doesn’t do, so any games (or parts of games) that use interlacing for 480-line res would break.

Welcome to the CRT club!

Since I can’t leave well enough alone, I finally figured out how to do true 240p :slight_smile:

My graphics card was falsely reporting a maximum refresh rate of 85 Hz. Once I set it to the correct value of 160 Hz, I was able to get RA to switch to a resolution of 3840x240 @ 120 Hz. Works perfectly with black frame insertion. I’ll post the steps I followed later.
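
The relevant retroarch.cfg entries look something like this (a sketch; double-check the option names against your RA version):

video_fullscreen = "true"
video_fullscreen_x = "3840"
video_fullscreen_y = "240"
video_refresh_rate = "120.0"
video_black_frame_insertion = "true"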

Unfortunately, though, I hit another snag when attempting to adjust the color. I think the bulb in this CRT is bad because I am seeing really severe black crush: greys are getting crushed to black even with brightness and contrast set to 100%. Windows calibration and AMD’s calibration didn’t improve this. Getting all colors to display requires increasing contrast and brightness to the point where the colors are severely washed out. So I have to go with either a washed-out picture or one that is far too dark.

Can I conclude that the bulb is bad? I think I’m going to order a second monitor of a similar model so that I can compare the two, but I just plugged my LCD back in and it’s pretty shocking how much better the brightness, contrast and colors are on the LCD. It doesn’t look good for this CRT…

This CRT stuff is proving to be a bit of work, but I don’t want to give up because I know the end result will be worth it.

Hmm. Probably nothing wrong with the “bulb,” which would actually be the electron gun itself (CRTs are basically just an electron gun and some magnets to move the beam of electrons around). Does it do that with black frame insertion disabled? I think there’s another option you can enable that will let it frame-dupe instead. Try setting maximum runspeed to 1.0x.
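
In retroarch.cfg terms, that last suggestion would be (assuming the option name matches your build):

# caps "maximum run speed" at realtime
fastforward_ratio = "1.0"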

As far as I can tell, RA is working perfectly.

It’s not just RA that’s too dark; the desktop is also unacceptably dark. I’ve cranked brightness and contrast up to 100%, but blacks are still turning into big blobs without any detail.

I tried adjusting gamma through both AMD and Windows calibration, but if I make the picture bright enough that greys are visible, the color becomes waaay too washed out. If I adjust the picture so that the colors aren’t washed out, all shades of grey merge together into a black blob.

I’m at a loss, thinking it might be a problem with Windows 7 drivers or maybe a bad circuit in the monitor. Any suggestions?

So, does it look like that at a normal res, like 1024x768 @ 60 Hz? If so, there may indeed be something up with the monitor. Are you doing adjustments with the hardware knobs? It could be that the flyback transformer’s screen pot is turned too low, though I don’t know if that’s even adjustable on a PC monitor like that. You would have to open the case to get to it either way.

Update to my ongoing CRT quest:

I decided to order another monitor of a similar model so I can compare the two.

I think the deal with this monitor is that the white point is set to 9300K. I recall learning that many monitors are calibrated to a 9300K temperature rather than 6500K. While this monitor has an option for 6500K, there’s no denying that it looks unacceptably dim (even for a calibrated image, which looks dim to most people). At 6500K pretty much all shades of grey are crushed to black, and white has a distinctly yellow tint to it. Furthermore, it’s impossible to get all shades of blue to display at 6500K, even with brightness and contrast at 100%.

Additionally, I think there might be a problem with a voltage regulator, or it’s possible that the tube just has a lot of hours on it. On my graphics card, I had to set brightness to 100% and lower contrast to 60% to compensate. After this, I was able to get an accurate greyscale and color bars (using the CPS-2 color bars) by setting contrast to 100% and brightness to 70% on my monitor, and setting it to the 9300K temperature.

It seems like not all monitors look best at 6500K, even if that is technically the most accurate color temperature. A lot depends on what the white point is actually set to: if white isn’t white, then the colors and greyscale can’t be accurate regardless of color temperature.

Complicating this further are the different color standards used: SD NTSC typically used a narrower color space than what is common now (which I wrote about in the post “The Importance of Adjusting Your Display’s Settings”). This made the colors brighter and more vibrant at the expense of total range. After making the above adjustments to brightness and contrast, I fired up “Fudoh’s 240p Test Suite” and found that I only needed to increase brightness on the monitor slightly in order to get an accurate picture.

The takeaway is that a color temperature of 6500K, while technically the most accurate, is not always going to be best for every display. Color temperature should be set to the point where white is true white, without any sort of tint.

There may be a way to make 6500K look accurate by doing a white point adjustment for red, green and blue individually, which I haven’t tried yet. I don’t know a lot about this, so it will probably take a lot of trial and error. It seems like this monitor is just supposed to be set to 9300K, though.

So I got another Dell monitor to compare to the first one, and the difference is striking. The new monitor had excellent brightness, contrast, and color right out of the box. All I needed to do was select the 6500K color temperature and it made all the necessary adjustments automatically. This one doesn’t even have brightness or contrast controls, just the 9300K and 6500K color temperature presets and a user-adjustable mode.

I think we can conclude that the first monitor was simply a dud, and that a good CRT computer monitor shouldn’t take a whole lot of fiddling to look good.

The first one was a Dell E773C; the second is a Dell E773S, if you’re curious. The second one is also noticeably lighter while having the same screen size, which is nice.

The geometry on this monitor is very good; I haven’t needed to adjust it at all, just the H/V size and position.

The one flaw (there’s always something) is some color bleed on the right side of the screen, which is slightly bluer than it should be. I just spent all night testing it with a bunch of different games, and it wasn’t really noticeable unless I was specifically looking for it. It’s not much worse than the screen uniformity issues I’ve seen with some LED/LCDs.

Anyway, it looks awesome, just had to share :slight_smile:

Was wondering: do you have any suggestions for lightweight blur shaders to add to a 480p CRT config? I need to set up a 480p config for that massive hi-scan Wega I acquired. Just looking for something that will add some horizontal blur.

I think the “TVOut Signal Resolution” parameter in tvout-tweaks will do what you want/need.

I’m glad to hear your second monitor is treating you better. You’ll have to get some good pics :slight_smile:

IMO the best shader combo for CRT monitors is TVOut-Tweaks + Image-Adjustment + Interlacing. You get lots of options for adjusting horizontal blur, as well as the ability to change the color range from 0-255 to 16-235, which is what the NTSC standard used. You can also set the gamma to 2.4 to match consumer CRT TVs.

My current settings are horizontal blur set to 768 (for just the tiniest bit of pixel rounding), gamma set to 2.4, and the color level left alone at 0-255, since I don’t think crushing some blacks is worth the slightly brighter colors.
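
If you want this as a single preset, something along these lines should work (a sketch: the shader paths and the TVOUT_RESOLUTION parameter name are my guesses at the usual cg shader repo layout, so verify them against your copy; gamma and color level are set via image-adjustment’s own parameters):

shaders = "3"
shader0 = "misc/tvout-tweaks.cg"
shader1 = "misc/image-adjustment.cg"
shader2 = "misc/interlacing.cg"
parameters = "TVOUT_RESOLUTION"
TVOUT_RESOLUTION = "768.0"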

[QUOTE=Nesguy;30653]This one doesn’t even have brightness or contrast controls, just the 9300K and 6500K color temperature presets and a user-adjustable mode.[/QUOTE]

On a Dell E771P, the user-adjustable color mode lets you max out every color channel and boost brightness well past the preset 6500K and 9300K modes, which is really good for compensating for the inserted black lines in 480p mode (or the black frames in 240p @ 120 Hz mode). Those E773 monitors may work in a similar way.