Correct brightness/contrast setting for CRTs and scanline variability

EDIT: So, I found out that the SMPTE method of calibrating a CRT involves adjusting the white level/contrast so that scanlines over the whites are equal in thickness to scanlines over the darker areas. So, ideally, a display should show very little/no scanline variability, contrary to what I initially thought.

Not really sure where to post this, but I think it’s pertinent to CRT-shaders.

Can anybody with some knowledge of how CRT monitors are calibrated shed some light on the following?

I’m curious to know the relationship between brightness, contrast, and the variability of scanline width on CRT displays.

Many shots I’ve seen of CRTs show little/no variability in scanline width across the screen. This doesn’t seem to be determined by quality/resolution, either. I’ve seen PVMs that showed almost no scanline variability, and PVMs that showed a great deal of variability.

As I understand it, a “correctly calibrated” display will set the black level (brightness) to the lowest level possible at which all detail is still visible. On a CRT, this would mean setting the brightness to the point where the darkest-colored visible lines were just barely visible, and this would make them super thin on a PVM, almost 1/4th the width of the scanlines. See this shot for reference: https://i.warosu.org/data/vr/img/0007/54/1370419191800.jpg

The correct contrast setting, to my knowledge, is the highest setting at which all detail is still visible. On a CRT this would mean that the lightest-colored visible lines would bloom to the point where the scanlines were just barely visible.

So, the better a CRT is in terms of white/black levels, and the more accurately it is calibrated, the more the scanline width will vary across the screen.

Is this correct?

Not really. There are a number of factors involved that can vary by manufacturer, horizontal scan rate (15khz vs 31khz), dot pitch, etc. These are all displays I own:

-NEC XM29+ (multisync; showing 15khz content)
-Compaq 15" 1280x1024 31khz
-PVM 20M2U 15khz
-Neotec CGA 15khz arcade monitor

As you can see, the variability in scanline width varies on each one, as does the space between the scanlines. In particular, you can see that the Compaq monitor doesn’t have a lot of variability, even though that display was high-quality and from the end of the CRT era, calibrated using a professional calibration instrument.

Something I learned about scanline variability and contrast/brightness while developing the crt-hyllian shader is that flat displays (LCD/LED/plasma) can’t compensate for the loss of brightness/contrast when you vary the scanlines too much. The result is always a dark screen, bland and unplayable. So CRT shader developers only vary the scanlines by some degree, so that brightness/contrast doesn’t suffer too much. This is a limitation that needs to be addressed by display manufacturers.
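To put rough numbers on that tradeoff, here is a toy calculation (my own sketch, not taken from any actual shader) of how much average brightness a flat panel loses when a shader dims every other output line to fake scanline gaps:

```python
# Sketch only: average luminance lost when emulating scanlines by
# dimming every second output line of a 2x-scaled image.
def average_luminance(gap_strength: float) -> float:
    """gap_strength = 0.0 -> no scanlines, full brightness;
    gap_strength = 1.0 -> fully black gap lines (CRT-style)."""
    bright_line = 1.0
    gap_line = 1.0 - gap_strength
    return (bright_line + gap_line) / 2.0

for g in (0.0, 0.5, 1.0):
    print(f"gap strength {g:.1f} -> {average_luminance(g):.0%} of original brightness")
```

With fully black gaps you keep only half the light, which is why shaders that go for authentic-looking gaps end up dark unless the panel has brightness headroom to spare.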

[QUOTE=hunterk;28738]Not really. There are a number of factors involved that can vary by manufacturer, horizontal scan rate (15khz vs 31khz), dot pitch, etc. These are all displays I own:

-NEC XM29+ (multisync; showing 15khz content)
-Compaq 15" 1280x1024 31khz
-PVM 20M2U 15khz
-Neotec CGA 15khz arcade monitor

As you can see, the variability in scanline width varies on each one, as does the space between the scanlines. In particular, you can see that the Compaq monitor doesn’t have a lot of variability, even though that display was high-quality and from the end of the CRT era, calibrated using a professional calibration instrument.[/QUOTE]

The shot from the Compaq is very telling - it looks like an emulated shot with scanline and shadowmask emulation. I guess the reason for this is that some CRTs could achieve reference contrast levels without blooming? I’m actually not really clear on how CRTs were calibrated when it comes to contrast levels; I know that for digital displays the main thing you want to avoid is clipping, but I imagine there were different considerations for CRTs. Was the ideal contrast level for CRTs the highest one that avoided blooming? Or is it more correct to say that there is just no clear relationship between brightness/contrast, quality, and scanline variability?

I don’t think there’s any real, clear relationship from display to display but on a given individual display, having higher contrast can definitely lead to more variability up to a fault. When I first got my XM29+, the contrast was set waaay too high and it was causing a weird bulge at the end of bright lines. By the time I got rid of the bulges, the screen was a lot less bright and the gaps were more defined as compared with my PVM.

Calibrating CRTs isn’t wildly different from calibrating LCDs (“increase the brightness until this line is just barely visible,” etc) and doesn’t depend on how much it blooms and so on. It’s easy to forget that CRTs weren’t normally used to show doublestrike/non-interlaced content, since that’s what we use them for now. The big gaps between scanlines were much less apparent in typical, interlaced applications, and progressive content had no gaps at all (obviously). For reference, here’s a shot from that same XM29+ showing progressive content: Gran Turismo 4 on PS2 running in 480p

[QUOTE=hunterk;28745]I don’t think there’s any real, clear relationship from display to display but on a given individual display, having higher contrast can definitely lead to more variability up to a fault. When I first got my XM29+, the contrast was set waaay too high and it was causing a weird bulge at the end of bright lines. By the time I got rid of the bulges, the screen was a lot less bright and the gaps were more defined as compared with my PVM.

Calibrating CRTs isn’t wildly different from calibrating LCDs (“increase the brightness until this line is just barely visible,” etc) and doesn’t depend on how much it blooms and so on. It’s easy to forget that CRTs weren’t normally used to show doublestrike/non-interlaced content, since that’s what we use them for now. The big gaps between scanlines were much less apparent in typical, interlaced applications, and progressive content had no gaps at all (obviously). For reference, here’s a shot from that same XM29+ showing progressive content: Gran Turismo 4 on PS2 running in 480p[/QUOTE]

After looking at a LOT more shots of PVMs and BVMs in action, I’ve concluded that a brightness/contrast setting that maximizes beam dynamics is the “correct” setting for high-resolution CRTs that were capable of maxing out the contrast setting without detail loss (i.e., clipping) and which displayed bloom:

-The Sony PVM and BVM exhibited bloom
-The Sony PVM and BVM could be set to maximum contrast with no loss of detail
-Maximum contrast = maximum bloom for that set
-The correct contrast setting is the highest one at which all detail is still visible
-Therefore, the correct contrast setting for a hi-res PVM or BVM is the one that maximizes bloom for those sets
-The correct brightness setting for all TVs is the one at which the darkest colors are barely visible
-Therefore, the correct brightness setting on a PVM or BVM is the one at which the darkest-colored visible lines are as thin as possible, just a trace of a line.

-Therefore, the correct brightness/contrast setting on a high resolution PVM (800 lines) or a BVM (900 lines) will result in the greatest amount of beam width variability possible for those sets.

In other words, the higher the resolution of the display and the greater its maximum contrast level, the more the beam will vary if brightness/contrast are correctly calibrated. Ideally, you want the darkest lines to be as thin as possible and the brightest lines to be as thick as possible.
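A toy model of that idea (my own sketch, not any real shader's formula): treat each scanline as a Gaussian beam whose width scales linearly with the line's brightness, from a thin sliver for dark lines up to the full scanline pitch for white:

```python
import math

# Sketch only: luminance-dependent beam width, the "beam dynamics"
# idea described above. Widths are in units of one scanline pitch;
# the min/max values are illustrative assumptions, not measurements.
def beam_width(luma: float, min_width: float = 0.25, max_width: float = 1.0) -> float:
    """Dark lines are thin slivers; bright lines bloom to fill the pitch."""
    return min_width + (max_width - min_width) * luma

def scanline_intensity(y: float, luma: float) -> float:
    """Gaussian beam profile across one scanline, centered at y = 0."""
    sigma = beam_width(luma) / 2.0
    return luma * math.exp(-(y * y) / (2.0 * sigma * sigma))

for luma in (0.1, 0.5, 1.0):
    print(f"luma {luma:.1f} -> beam width {beam_width(luma):.2f} of line pitch")
```

This is essentially what "maximum beam variability" means here: the dark end pinned near the sliver width, the bright end blooming out to (nearly) close the gap.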

Hunter- if I’m not mistaken, your PVM is one of the 600-line models, correct? If so, that would explain the lesser degree of bloom present on your PVM compared to the higher-resolution models. The gaps between visible lines are thicker on the higher-resolution models, allowing for more possible beam width variation. Thus, the higher the resolution of a PVM, the greater the possible range of variation in beam width.

I’m not sure if my conclusions can be generalized to all high-resolution CRT displays or if they are only applicable to the BVMs and PVMs. As you mentioned, there are other factors involved such as horizontal scan rate, etc.

I’ll post some shots later from BVMs and PVMs to corroborate. :slight_smile:

Yes, mine is one of the 600-line models. However, as you can see on the shots from my XM29+ (1024x768 max res) and Compaq (1280x1024 max res), having more lines of resolution doesn’t necessarily equate to greater beam dynamics. That kind of thing wasn’t desirable while these displays were in general use (for example, if you were working on a PC, you wouldn’t want all of your small-font white text to bloom out into mush or all of your darker colors to be tiny slivers), so it’s odd to assume that the “correct” calibration would be the one that most exaggerates the undesirable effect.

In regards to small text blooming into illegible mush, I am in agreement that this would mean that contrast is set too high. However, I acknowledged this already when I wrote: “The correct contrast setting is the highest one at which all detail is still visible.” I don’t think PC monitors are different from other displays in this regard. If a TV showed small text blooming into illegible mush, or if a test pattern showed something similar, then the contrast would be too high for that set.

In regards to dark colors turning into tiny slivers, that is indeed what I see occurring in every shot I’ve seen of an 800-line PVM, BVM, or other high-resolution display - the visible lines in the darkest colors are barely visible slivers. This actually greatly increases the perceived contrast and “depth” of the image, and allows for the appearance of finer detail than is otherwise possible at the given resolution, creating the illusion of a higher-resolution image. Scanlines by themselves already do this to a degree.

So it’s not just resolution, but also maximum contrast that matters in determining the beam dynamics. I believe the PVM and BVM (certain models at least) can be set to 100% contrast with no loss of detail in the image. Brightness should then be set to the lowest level as described above. This maximizes the dynamic range of the image and the amount of visible detail. This also maximizes the variability of the beam width.

To rephrase once more: the “correct” calibration is the one which leads to the greatest degree of beam variability without causing a loss of detail in the image (when viewing any source).

Just curious, which display do you prefer for 240p content? I’m currently having a lot of fun trying to recreate the look of those different displays you posted using CRT-easymode :slight_smile:

I like them all for different things, really. Tbh, I wish I could say one of them was best for everything so I wouldn’t have to own so many of the damned things :stuck_out_tongue:

If you HAD to choose one, though?

I’m considering acquiring a high-end CRT but I’m not sure what to get. The PVMs and BVMs are a little too much in terms of cost and effort, but finding a good consumer CRT (like a Trinitron that doesn’t have atrocious geometry) has proven to be difficult. Maybe I just shouldn’t, since I’m quite satisfied with the look that CRT shaders provide.

There are definitely good reasons to have an actual CRT: low latency; bright, vibrant colors; insanely high contrast values, etc. but there are also many cons: hard to hook modern kit up to them; heavy!; expensive!; high maintenance costs; heavy!; did I mention heavy and expensive?

If you can stomach the limitations of LCD displays + shaders and are planning to emulate rather than run actual consoles, I definitely recommend avoiding the hassle of a CRT

For hooking up actual consoles, go with a PVM or comparable broadcast monitor. I actually like the shadowmask look better than the aperture grille for the most part, even though the aperture grilles make a brighter picture, so getting a trinitron isn’t that big of a deal, IMO. You don’t really see the dynamic beam stuff unless your nose is touching the glass anyway :stuck_out_tongue: Other than that, less “famous” monitors from JVC, Panasonic, etc. are every bit as good as a PVM and can often be had for lower prices. Most of these are 480i/“240p” ONLY, so no 480p Wii/GC/PS2 and no PC games. Plus, most don’t have a direct VGA port, so you have to adapt down from a PC signal, anyway, and still run the risk of breaking it by piping the wrong frequency signal through it.

If you’re planning to emulate, stick with a CRT computer monitor. You won’t break it if you accidentally run the wrong horizontal frequency through it (for example, while booting up) and you can play modern games like SF4 or Ikaruga on Steam, which is cool. OTOH, you can’t easily connect old consoles to it, other than, say Dreamcast via VGA.

The XM29+ sits somewhere in between these options, since it can go up to 1024x768 (i.e., big enough for modern games) over VGA and still has RGBHV and svideo hookups for classic 240p gaming. And, since it’s multisync, it runs it all no problem.

[QUOTE=hunterk;29110]There are definitely good reasons to have an actual CRT: low latency; bright, vibrant colors; insanely high contrast values, etc. but there are also many cons: hard to hook modern kit up to them; heavy!; expensive!; high maintenance costs; heavy!; did I mention heavy and expensive?

If you can stomach the limitations of LCD displays + shaders and are planning to emulate rather than run actual consoles, I definitely recommend avoiding the hassle of a CRT

For hooking up actual consoles, go with a PVM or comparable broadcast monitor. I actually like the shadowmask look better than the aperture grille for the most part, even though the aperture grilles make a brighter picture, so getting a trinitron isn’t that big of a deal, IMO. You don’t really see the dynamic beam stuff unless your nose is touching the glass anyway :stuck_out_tongue: Other than that, less “famous” monitors from JVC, Panasonic, etc. are every bit as good as a PVM and can often be had for lower prices. Most of these are 480i/“240p” ONLY, so no 480p Wii/GC/PS2 and no PC games. Plus, most don’t have a direct VGA port, so you have to adapt down from a PC signal, anyway, and still run the risk of breaking it by piping the wrong frequency signal through it.

If you’re planning to emulate, stick with a CRT computer monitor. You won’t break it if you accidentally run the wrong horizontal frequency through it (for example, while booting up) and you can play modern games like SF4 or Ikaruga on Steam, which is cool. OTOH, you can’t easily connect old consoles to it, other than, say Dreamcast via VGA.

The XM29+ sits somewhere in between these options, since it can go up to 1024x768 (i.e., big enough for modern games) over VGA and still has RGBHV and svideo hookups for classic 240p gaming. And, since it’s multisync, it runs it all no problem.[/QUOTE]

Awesome, thanks for the info. I may break down and get a PVM because I have a bunch of hardware gathering dust and I feel a compulsive urge to plug some cartridges into some consoles. I miss that tactile aspect of the experience for some reason.

In your experience, as far as the PVMs go, do you think it’s worth it to get one of the 800-line models, or is the difference vs. one of the 600-line models not significant enough to warrant the extra cost?

Looking back at those shots of PVMs and BVMs I’ve seen, it occurred to me that a lot of the bloom present in those shots could be added by the cameras people are using. I’ve never seen one up close in real life, so I have no idea what they actually look like. From what I’ve read, the BVM in real life does not exhibit as much bloom as is present in many of the shots I’ve seen.

I don’t have an 800-line PVM to compare with but it seems to be a matter of how much you like the gaps between the scanlines. If you want the gaps barely visible on bright patches of color, go for the 600-line ones. If you want the thick, black gaps, even on bright white patches, go for the 800-line. I guess there’s also the question of whether the 800-line ones provide any additional functionality, like 480p support. (which I don’t know, personally)

There’s also the issue of scarcity and beggars can’t usually be choosers. I’d been keeping an eye on craigslist off and on for almost 2 yrs before I lucked into my PVM and there was no indication which model it was or what condition it was in. Then again, I live in “flyover country” and these things are much rarer here than in, say, California.

I only have the XM29 (without the plus), or rather the NEC 3PG. I love that monitor in every aspect. It has a native resolution of 640x480, but it can also do 800x600 or even higher resolutions, albeit interlaced. Even pinball emus look satisfying, and the XM29 Plus should have even better image quality.

This tri-sync thing can do practically every resolution (15-25-31khz) you could dream of as an arcade fan and can easily be rotated. It even has grips for doing that :slight_smile: .

I went the SCART->VGA cable route (with a built-in sync splitter) for my PS1 (fantastic image quality) and a Component->VGA box for my Xbox. The Component->VGA box has a switch for VGA->VGA (for my PC).

I am so satisfied with this monitor that I don’t want anything else anymore. Fed by MAME or any console, from low-res (SNES, NES, Atari 2600, etc.) up to 480p (Xbox, PS2, Dreamcast, GameCube), this monitor beats everything. The very nice and convenient remote control, with screen scaling on top, makes it my favourite.

Good review: http://www.arcadeinfo.de/showthread.php?12884-NEC-Präsentationsmonitore-Multisync-XM29-XP29Plus

[QUOTE=u-man;29151]I only have the XM29 (without the plus), or rather the NEC 3PG. I love that monitor in every aspect. It has a native resolution of 640x480, but it can also do 800x600 or even higher resolutions, albeit interlaced. Even pinball emus look satisfying, and the XM29 Plus should have even better image quality.

This tri-sync thing can do practically every resolution (15-25-31khz) you could dream of as an arcade fan and can easily be rotated. It even has grips for doing that :slight_smile: .

I went the SCART->VGA cable route (with a built-in sync splitter) for my PS1 (fantastic image quality) and a Component->VGA box for my Xbox. The Component->VGA box has a switch for VGA->VGA (for my PC).

I am so satisfied with this monitor that I don’t want anything else anymore. Fed by MAME or any console, from low-res (SNES, NES, Atari 2600, etc.) up to 480p (Xbox, PS2, Dreamcast, GameCube), this monitor beats everything. The very nice and convenient remote control, with screen scaling on top, makes it my favourite.

Good review: http://www.arcadeinfo.de/showthread.php?12884-NEC-Präsentationsmonitore-Multisync-XM29-XP29Plus[/QUOTE]

The XM 29 plus looks like it’s the god-tier CRT. I’ve never seen one available though. The PVMs are far more common, I might grab one of those.

Anybody know a good price for a PVM 14M4U? It’s an 800 line model and I see one for less than $100, plus shipping I would guess $150. Seriously thinking of jumping on this one, but then I keep thinking “do I really need this?” and all the work that would be involved with modding my consoles and finding the right cables for everything. I don’t know; I’m fairly satisfied with the results that CRT-shaders bring and input lag can be almost completely eliminated with the right emulator configuration and a low-lag LCD/LED. I also don’t really like doing a bunch of work in order to play a video game. I’m torn.

That’s a reasonable price. I think 14" is a bit small, esp if you’re wanting to really see the beam/mask stuff, but I’m sure it’ll look great all the same.

ah, I’m just gonna go for it, then. I’ll probably regret my decision either way :stuck_out_tongue:

I’ll probably just sit very close to it while playing. I was hoping to use it as a computer monitor as well, but it looks like it doesn’t have VGA.

I learned recently that the SMPTE method of calibrating contrast on a CRT involves a test pattern where you adjust contrast so that the scanlines in the white areas are as white as possible without blooming. I.e., the scanlines over the white areas should be of equal thickness to scanlines over darker areas, ideally.

So, I take it all back :stuck_out_tongue: Ideally, a display will exhibit very little bloom/variability of the scanlines. I think I was misled by bad photos of CRTs, since cameras tend to add their own bloom.

Well, I wound up passing on that PVM. It was going to come out to more like $200 with shipping, which was just a bit more than I’m willing to pay.

The more I think about it, the more I think getting a good CRT computer monitor is the route to go. Although I have a lot of actual hardware, there are still a lot of games I’d like to get, and I don’t really have the money for it. Emulation suits my needs fine, particularly on a Windows 7 64-bit computer configured so that there is basically no input lag.

The difference in quality between a PVM/BVM and a good computer monitor is not really worth the difference in price, IMO. Computer monitors supported a variety of resolutions, including HD resolutions, and were usually calibrated to a very high standard. They are also much easier to find and much cheaper. You can often find good CRT computer monitors at Goodwill.

I had some questions regarding using a CRT with an emulator, though. I’m assuming that if I tell an emulator to output at a resolution of 320x240, the CRT monitor will add its own scanlines, but I have no idea if this is the case or not. Is it possible to get “natural” scanlines with a computer CRT, or do you have to use shaders? Also, is there a shader that just adds some slight horizontal blur?

Another question: would I be better off getting a computer CRT with a lower native resolution or a higher native resolution? I’m guessing the higher resolution will produce thicker scanlines but also a sharper image. I want the scanlines to be noticeable, but I don’t want a super crisp image; I’d prefer a slightly soft one.

Also, do you think it’s worth it to get a new CRT, in box? I’m tempted to do so since it will have 0 hours on it and no wear/use. I saw a couple on ebay that were around $80 with shipping.

I think you’re making a good choice.

Unless you get like a super-old monitor, resolution isn’t going to matter.

If you feed it the native res (either through custom modelines or using something like groovymame), it will have natural scanlines. That shot of Super Mario World on mine is native res, nearest neighbor. No other filtering.
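For anyone on Linux who wants to try the custom-modeline route by hand, the xrandr commands look roughly like this. Note the timing numbers and the `VGA-1` output name below are illustrative placeholders only; generate real timings for your specific monitor (e.g. with `cvt` or groovymame's switchres), since feeding a CRT a signal outside its frequency range can damage it:

```shell
# Sketch only: define and assign a low-res mode with xrandr.
# Numbers are placeholder timings (~15.7khz horizontal), NOT verified
# for any particular monitor -- compute your own before using.
xrandr --newmode "320x240_60" 6.29 320 336 360 400 240 244 247 262 -hsync -vsync
xrandr --addmode VGA-1 "320x240_60"
xrandr --output VGA-1 --mode "320x240_60"
```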

There are a number of shaders that can be helpful even at native res. Aliaspider’s tvout-tweaks lets you set a horizontal bandwidth parameter to blend things like pseudo-transparency. Since running at “240p” breaks on high-res games (such as those with interlacing), some people like to run at 480p instead and use my interlacing shader, which looks almost identical (the scanlines are just a teeny bit more rounded at 240p) but without the high-res issue. Maister’s NTSC shader can also be good on low-res games. Someone (either Monroe88 or GPDP, IIRC) made a bunch of shader presets in the ‘cgp’ subdir that are good for use with CRTs.

Regardless, the picture will be extremely sharp and the gaps between the scanlines will be pitch-black and well-defined. If you want some blurring, you can always add it in with shaders.
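If you want a feel for what such a blur does, here is a minimal sketch (my own illustration, loosely analogous in spirit to a horizontal bandwidth limit, not any shader's actual code): a 3-tap kernel run across one row of pixels softens hard edges.

```python
# Sketch only: slight horizontal blur with a small 1-D kernel.
def blur_row(row: list[float], kernel=(0.25, 0.5, 0.25)) -> list[float]:
    """Convolve one row of pixel values with a 3-tap kernel (edges clamped)."""
    n = len(row)
    out = []
    for x in range(n):
        left = row[max(x - 1, 0)]
        right = row[min(x + 1, n - 1)]
        out.append(kernel[0] * left + kernel[1] * row[x] + kernel[2] * right)
    return out

# A hard edge gets softened:
print(blur_row([0.0, 0.0, 1.0, 1.0]))  # -> [0.0, 0.25, 0.75, 1.0]
```

Widening the kernel or flattening its weights gives a softer picture, which is what the bandwidth-style parameters in the shaders mentioned above effectively control.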

EDIT: here are a couple more shots from my monitor running at games’ native res: