CRT shader debate


Macro Mario.

Direct screen capture from my iPhone, viewing the emulated image with my posted settings through Safari. The slight anti-aliasing that’s added by the phone actually makes it look closer to how it looks in person, maybe slightly softer. Be sure to crank up the backlight when viewing! :wink:


Ultimately, in this case we have two different visual styles and people with different goals and tastes. Both parties are happy with the look they have achieved; there is no right or wrong.

Just remembered, actually: I went to a retro gaming expo in the summer and they had a Sony BVM in motion, and it certainly did look sharp and clean, with very obvious scan lines. They also had loads of arcade machines from the 90s: Street Fighter 2 Turbo, Mortal Kombat and Killer Instinct. All the screens looked original or re-conditioned, and it was very obvious the graphics were softer, the whites were bloomy, the scan lines were subtle and grungy looking, and the masks were more obvious and appeared almost grid-like when viewed up close (nose touching the screen - people must have thought I was weird :face_with_monocle:).

Granted, the screens have probably never been serviced or properly calibrated, and maybe they got worse with age? The arcade screens looked more natural and acceptable to me; most likely this is down to how my brain was trained to handle and accept these visuals in my younger days. I dunno!

Don’t worry I still kicked their ass in Street Fighter 2 :muscle: “You must defeat Sheng Long to stand a chance” :fist:


There is no right or wrong when it comes to subjective preference, which is why I think it’s a lot more useful to discuss objective picture quality and things we can actually measure.

Objectively, I believe there are reasons why your kids, and most people in general who aren’t invested in this debate, would prefer my settings. I’ve literally increased the dynamic range of the image. The mask and scanlines increase the contrast around highlights, which makes them stand out more, and the lowlights are made even darker. Your kids are probably responding to the increased dynamic range and contrast of the image, which is objectively superior with these settings. Smearing the image with blurs greatly interferes with the mask, IMO, because it causes the phosphors to smear over each other, reducing the intensity of each phosphor and dulling the image.

I believe that a color bars test pattern would show clipping when displayed using some of the settings I’ve seen posted here, but if you don’t care about objective quality, lost detail, inaccurate color or burning your retinas out, have at it :smile:

FWIW, I grew up playing with a crappy RF or composite connection. I have no desire to relive that part of the experience.

I think the zoomed in shot of Mario compares favorably with the macro photos I posted, particularly when it comes to the phosphors and scanlines.


I don’t buy this dark-looking image of @Nesguy’s. It’s just not convenient for me to change the picture mode of my PC monitor. Often I play on my 40" TV, and I don’t want to change any settings there just to play some retro games either.

Also, I don’t want my eyes to bleed, like back in the days when I sat for several hours in front of my old CRT TV.

In fact, you have to adjust brightness on each of your devices individually, as all of them have different brightness ranges. So just bumping the brightness up to 100% isn’t a good solution IMO.


Pushing a button isn’t convenient?

The method I’ve described will result in peak brightness levels equal to those of a CRT, so I don’t get the comment about your eyes. You seem to be contradicting yourself: “I don’t buy this dark-looking image” and “I don’t want my eyes to bleed.” Which is it? Is it too bright or too dark?

Did you see the post where I describe how to adjust your display and shader settings for your particular display? Setting the backlight to 100 is a decent solution if you just want to get an idea of what it looks like, but if you want to fine-tune beyond that, it’s not particularly difficult or complicated to do. I’ve described exactly how to match CRT-level peak brightness on whatever display you’re using. It takes all of 10 minutes to set up.

This bears repeating:

The fundamental problem I’m trying to address is the significant loss of brightness and contrast that occurs when scanlines and RGB phosphors are applied. We can address this by:

1) adjusting our display’s settings;
2) altering color and brightness values through RetroArch, which can obviously reduce the accuracy of the image;
3) significantly reducing the mask and/or scanline strength, compromising the CRT emulation and reducing the positive benefit to 240p content.
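To put rough numbers on that loss, here’s a back-of-envelope sketch (my own simplified model, not taken from any particular shader’s source): if scanlines darken some fraction of each line and the mask only passes a fraction of the remaining light, the backlight has to be raised by the inverse of the surviving fraction to restore the original peak brightness.

```python
def backlight_multiplier(scanline_coverage: float, mask_transmission: float) -> float:
    """Estimate the backlight boost needed to restore peak brightness.

    scanline_coverage: fraction of each scanline period drawn dark
                       (0.5 = half of every line is black).
    mask_transmission: fraction of light the phosphor mask passes
                       (2/3 = one of three subpixels fully dimmed).
    Both numbers are illustrative assumptions, not measured values.
    """
    surviving = (1.0 - scanline_coverage) * mask_transmission
    return 1.0 / surviving

# Strong scanlines (50% dark) plus a mask passing 2/3 of the light
# leave only a third of the original output, so the display needs
# roughly 3x the brightness to compensate.
print(round(backlight_multiplier(0.5, 2 / 3), 2))
```

This is why option 1 only works if your display has enough headroom to spare; on a dim panel you’re forced into options 2 or 3.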


I’m trying to come up with an experiment that will work given the crappy quality of the camera I’m using. This is what I’ve come up with:

I’ll take one photo of a game running on my display with no shaders applied at all, with my display settings at their normal calibrated values.

Then I’ll take a photo of the exact same screen with my settings applied and my display’s backlight adjusted.

The two should look nearly identical, but the camera is still going to result in differences in brightness and color because it’s doing its own thing to the image.

That’s the best I can come up with. I’ll post pics after it gets dark tonight; way too much ambient light right now.


I know for a fact my preferred shader presets will be out of whack with all the calibration tests, they would probably fail every brightness, contrast and colour test!

My thinking on this is: I would be testing 240p material on a high-resolution display panel that was not designed with that kind of source in mind. It’s like running an old SD-based calibration DVD on a 4K display; yes, you might get the basics calibrated to a decent standard, but to get it perfect you would need a 4K source too.

I could be wrong, as I’m not into calibration. If I bought a brand new 4K TV to watch 4K Blu-ray movies, I would definitely make more of an effort to get it professionally calibrated for the best “objective” experience.

As well as emulating the game I want to emulate the nostalgic feeling of how I remember my arcade experiences and retro consoles. Of course I wouldn’t want the RF noise, interference and bad colour bleeding from the old days, with RetroArch shaders I can take the best bits, leave out the undesired effects and customise the shader passes to get my desired visuals.

In the end I am not desperate to get a perfect calibrated 240p output on these modern displays. I can’t speak for others however, for me personally, emulation is very, as you call it “subjective”.

The Macro Mario shot looks good though :+1:


Yeah, it’s hard for me to change brightness, as I can’t have several custom presets on my Dell U2410. My brightness level is set to 30% and contrast to 34% to match 100 cd/m2, and my color mode is set to sRGB. My eyes are absolutely comfortable with these settings. I have similar settings on all my displays (TVs, mobile phones, etc.) so I get the same look across the board.

  • I don’t need to calibrate my displays using a test picture for any games as they are already fine.
  • I can alt-tab between windows and not hurt my eyes badly because of brightness difference.
  • I don’t need to use night mode for any software.
  • I can record my retro videos and they will look OK to everybody else without editing to compensate for the lack of brightness.

That’s what I call convenience. It’s not only about how your display shows you the picture; the picture itself has to be calibrated and shouldn’t look pitch black.

I get what you are trying to achieve with your approach. I’m just not looking for a perfect replication of a CRT on LCD; I just want to use some of its good features to make the picture look good to me.


This guy gets it. Scroll down to posts 22 and 23.


Is that 100 cd/m2 peak brightness? That’s actually quite low, if so. Most recommendations I’ve seen suggest 200 - 250 cd/m2 for a room with normal to dim lighting.

This just further highlights the problem of sharing individual settings; people are doing wildly different things with their displays and different displays have different specs.


He has the same school of thought as you, those shots do technically look impressive, very precise and sharp, from what pics I’ve seen on the Internet looks close to a BVM style visuals, if that’s what he is trying to do then job well done and kudos to him. :+1:

I personally prefer the great @EasyMode’s screen shots just above his posts :sunglasses:


No. Recommended is 120 cd/m2


In the end I am not desperate to get a perfect calibrated 240p output on these modern displays. I can’t speak for others however, for me personally, emulation is very, as you call it “subjective”.

I know I look like I’m being pedantic, but there are some objective reasons why one would prefer a calibrated image, among them: reduced eye strain, increased visible detail, colors that more closely match the output of the console, and a more vibrant image. In particular, the altered colors, overdriven contrast and exaggerated bloom in a lot of these shots bother me, and would be considered flaws to be corrected by professionals who work on CRTs.

If you’re happy with what you have, though, that’s great. This all reminds me of when I calibrated my dad’s TV and he immediately complained that the resulting image was too dark :stuck_out_tongue:

I personally prefer the great @EasyMode’s screen shots just above his posts :sunglasses:

My own settings are somewhat between what Easymode has posted and what larch1991 posted; there is 0 blending between the scanlines in larch1991’s shots, which is as close to “perfect” as CRT tech could get, and closely resembles the close-up of the Sony FW900 I posted a few days ago.

The comparison shot he posted demonstrates that he matched the color values and brightness of the shot without any mask or scanlines applied. I’m doing the exact same thing.

Here’s Easymode’s opinion regarding this method; it’s interesting to see the perspective of someone who’s written shaders:

“Turning up the Scanline and Mask strength and cranking up your display’s brightness is a neat trick. I’ve played around with that in the past, and the results can look quite nice if your display has brightness to spare.”


No. Recommended is 120 cd/m2

That’s a recommendation for doing image work in a dark room, AFAIK, and even that isn’t universal. Recommendations on this vary widely. There is no single correct setting for brightness; it has to be matched to the viewing conditions. There’s no reason why we shouldn’t calibrate to a higher brightness level.

Again, this highlights the problem with sharing individual settings, as not only are people doing wildly different things with their displays, there is also no single “correct” recommendation for what level of brightness a display should be calibrated to.

“Public areas (department stores, hospitals, transit stations): high-level brightness LCDs (300 cd/m2 and above) are recommended to suit higher ambient light and viewing distances of around 100 cm. Offices: mid-level brightness LCDs (250 cd/m2 to 300 cd/m2) are recommended for normal ambient light and viewing distances of around 60 cm. Study rooms: low-level brightness LCDs (200 cd/m2 to 250 cd/m2) are recommended for lower ambient light (e.g. desk lamp) and distances of around 50 cm.”

“Since we consider 200 cd/m2 to be an ideal average for peak output, we calibrate all of our test monitors to that value. In a room with some ambient light (like an office), this brightness level provides a sharp, punchy image with maximum detail and minimum eye fatigue. It’s also the sweet spot for gamma and grayscale tracking, which we’ll look at on the next page.”

There is an ISO standard that specifies a brightness of 160 cd/m2 for critical inspections of color prints.


Yes, I edited images and videos previously and I get used to that brightness value. Something higher than that isn’t comfortable to me.


If you’re targeting 100 cd/m2 then you should be able to increase the mask strength even further compared to what I’ve done, so that’s actually a benefit if you’re trying to max out the mask/phosphor strength like I’m doing.
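As a toy illustration of that headroom argument (the numbers here are hypothetical, not anyone’s actual panel specs): the lower your calibrated target is relative to the panel’s maximum output, the stronger a mask you can fully compensate for by raising the backlight.

```python
def affordable_mask_transmission(panel_peak_nits: float, target_nits: float) -> float:
    """Lowest mask transmission that can still be fully offset by
    raising the backlight, ignoring scanline losses.

    Example: a panel capable of 300 cd/m2 calibrated to a 100 cd/m2
    target has a 3x brightness budget, so it can absorb a mask that
    passes as little as a third of the light.
    """
    budget = panel_peak_nits / target_nits
    return 1.0 / budget

print(round(affordable_mask_transmission(300, 100), 2))
```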


Good to see all these new replies!

@Nesguy - yes, with your method you might very well achieve those TMNT results. And I imagine that will make you as happy as a sandboy. But what will you do then? Play games? Haha it’s much more fun to tinker with shaders. Still, I don’t like that look, it’s too… oh wait :stuck_out_tongue:

Regarding my screen (a 2009 TN/24"/1080p Iiyama ProLite), it’s just pointless to try. It’s good for my work because it’s flat, neutral and matte, but not so hot for games and films. Like I told you, increasing the brightness to the required levels isn’t possible, and even if it could get more or less there, my color calibration would be completely ruined. I might give it a go on my Samsung 1080p 40-incher, which is not exactly state of the art either, but it’s built around a seriously nice (at least for its time) SuperPVA panel.

And by the way, I like antialiased Macro Mario much better than the raw images! (and you knew I would, I’m sure)

@hunterk - that’s pretty much exactly like the first TV I had for myself, a 14" lo-res Sharp, whose output I actually have fond memories of. I wouldn’t go for that look nowadays, after having fallen in love with (subtle) scanlines. I think @Great_Dragon digs it though.

Yeah, you and I grew up in the same kind of place haha. And we are all weirdos here. Nobody, literally not a single person in my ‘real life’ gives the slightest of fucks about the stuff we are so passionately talking about here.

I would, sir :facepunch:t3: “Attack me if you dare, I will crush you!!”

Correct. Especially because the actual results so far don’t seem… well, spectacular (also IMO, of course).

Cameras in manual mode don’t do their own thing. If the brightness and contrast in both scenes are indeed equal, just keep the same settings (aperture, shutter and ISO) and you are good to go.

In general I agree of course, but I will take your idea a step further: what I’m trying to do is take that nostalgia and elevate it to a higher level of quality (same applies to sound and controls, btw). For instance increasing the internal resolution of 3D games makes them look SO much better and detailed, while at the same time the crt shaders keep them connected with the way I experienced them 20 years ago.

And I have already talked about it, but bear this in mind: a perfectly calibrated 240p output is not necessarily sterile and xVM-ish. For 240p content, those monitors have too many lines for their own good. A perfectly calibrated mid-res arcade monitor or a consumer 90s Trinitron-like TV (my personal references) looks nothing like a perfectly calibrated xVM! Not better or worse, just different. The rest is a matter of taste.

Yup, that’s why I insist that he post camera photos. The raw captures will always look like that on a normally-configured display. Which is a shame, because after all this talk I really believe that his actual results must be very interesting.

Well like I just said to shenlong, and my point all this time: it really depends on what kind of crt you are trying to (more or less) perfectly replicate.

His halation shader, for me, is simply the best. Those blur passes (horizontal and vertical) are bliss. Before I started using it, I was trying to achieve something like that via a high-radius gaussian blur + bloom in ReShade, with reasonably good results. But the way he implemented that idea, what his method does to tonal values… genius.

Furthermore, the masks are good and diverse (the shader even automagically compensates for the lost brightness as you increase their intensity), the scanlines almost always look perfect even at non-integer scales, and it has good curvature… I could go on and on. And considering all the goodies it brings to the table (blur and bloom, for instance, tend to be very expensive, and it does both), it’s pretty fast.
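For anyone curious how that automatic compensation might work, here’s a hypothetical sketch of the idea in Python (the real shader does this per-pixel in GLSL, and I’m only guessing at the interpolation and the 2/3 figure): as mask strength rises, transmission falls, and the shader multiplies the image by the inverse to hold brightness steady.

```python
def mask_compensation_gain(mask_strength: float, full_mask_transmission: float = 2 / 3) -> float:
    """Gain that cancels the brightness lost to the mask.

    mask_strength: 0.0 (mask off) to 1.0 (full strength).
    full_mask_transmission: assumed fraction of light passed at full
    strength (2/3 is an illustrative guess, not the shader's value).
    """
    transmission = 1.0 - mask_strength * (1.0 - full_mask_transmission)
    return 1.0 / transmission

# No mask -> no gain; a full-strength mask needs a 1.5x gain here.
print(mask_compensation_gain(0.0), mask_compensation_gain(1.0))
```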

I will try to find a post in which he explains how he got there. I think @Nesguy has already spoken with him about his shaders, and I’m sure @hunterk has, too. Perhaps you two can help me with that.

Phew. Came here to say hello, ended up posting a brick…


Well, I’m not really satisfied with these photos, but here are some first attempts with my iPhone 6. I can’t get the camera to not mess with the color saturation or brightness; no matter how much I fiddle with zooming in/out or the exposure, it’s a trade-off between brightness and color saturation. You can see all kinds of weird stuff being done by the camera, including some awful moire, just like in photos of actual CRTs. I hope this at least demonstrates that my display is not as absurdly dark as some of the direct screen captures I posted might suggest - see the shot of the Sonic background in particular. In person, the image is even brighter and has better color saturation than what’s shown here. I’m not sure how much use these are in terms of proving anything, but here you go. I’ll try to get some comparison shots up tomorrow.


Yeah, that looks properly bright :slight_smile:

I can see in the scroll test shot that the camera/eye adds some faux beam dynamics (i.e., brighter pixels appear larger), which is an important part of the CRT look to me that wasn’t evident in your earlier screencaps.

Good stuff!


Definitely looks better when you see the actual TV output screen like that, I would be happy with that :+1:

Quick question if you don’t mind (I’m always keen to learn): you can see the scan lines fading out on the pure whites, and I’ve seen that happen on real hardware when the whites bloom. Is that a negative, i.e. is it because those screens are not calibrated, or is it a normal thing with CRT displays?