I don’t buy this dark-looking image of @Nesguy’s.
It’s just not convenient for me to change the picture mode of my PC monitor. Often I play on my 40" TV, and I don’t want to change any settings there just to play some retro games.
Also, I don’t want my eyes to bleed, like back in the day when I sat for several hours in front of my old CRT TV.
In fact, you have to adjust brightness on each of your devices individually, as they all have different brightness ranges. So just bumping the brightness up to 100% isn’t a good solution, IMO.
The method I’ve described will result in peak brightness levels equal to those of a CRT, so I don’t get the comment about your eyes. You seem to be contradicting yourself: “I don’t get this dark image” and “I don’t want my eyes to bleed.” Which is it? Is it too bright or too dark?
Did you see the post where I describe how to adjust your display and shader settings for your own display? Adjusting the backlight to 100 is a decent solution if you just want to get an idea of what it looks like, but if you want to fine tune it more than that, it’s not particularly difficult or complicated to do so. I’ve described exactly how to match CRT level peak brightness for whatever display you’re using. It takes all of 10 minutes to set up.
This bears repeating:
The fundamental problem I’m trying to address is the significant loss of brightness and contrast that occurs when scanlines and RGB phosphors are applied. We can address this by:
1) adjusting our display’s settings;
2) altering color and brightness values through RetroArch, which can obviously reduce the accuracy of the image; or
3) significantly reducing the mask and/or scanline strength, compromising the CRT emulation and reducing the positive benefit to 240p content.
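To put rough numbers on why that loss is so significant, here’s a back-of-the-envelope sketch (Python; the figures are illustrative assumptions, not measurements of any particular shader or display):

```python
crt_peak = 100.0            # cd/m2 -- assumed peak white of a typical consumer CRT
scanline_duty = 0.5         # assume full-strength scanlines leave ~half the lines lit
mask_transmittance = 1 / 3  # assume a full-strength RGB mask lights ~1 of every 3 subpixels

# Panel brightness needed to still hit the CRT's peak white after masking:
required_lcd_peak = crt_peak / (scanline_duty * mask_transmittance)
print(f"Panel peak needed: {required_lcd_peak:.0f} cd/m2")  # -> 600 cd/m2
```

With numbers like those, anything short of a very bright panel forces you into option 2 or 3.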
I’m trying to come up with an experiment that will work given the crappy quality of the camera I’m using. This is what I’ve come up with:
I’ll take one photo of a game running on my display with no shaders applied at all, with my display settings at their normal calibrated values.
Then I’ll take a photo of the exact same screen with my settings applied and my display’s backlight adjusted.
The two should look nearly identical, but the camera is still going to result in differences in brightness and color because it’s doing its own thing to the image.
That’s the best I can come up with. I’ll post pics after it gets dark tonight; way too much ambient light right now.
I know for a fact my preferred shader presets will be out of whack with all the calibration tests; they would probably fail every brightness, contrast and colour test!
My thoughts on this are… I would be testing 240p material on a high-resolution display panel that was not designed with that kind of source in mind. It’s like running an old SD-based calibration DVD on a 4K display: you might get the basics calibrated to a decent standard, but to get it perfect you would need a 4K source too.
I could be wrong, as I’m not into calibration. If I bought a brand new 4K TV to watch 4K Blu-ray movies, I would definitely make more of an effort to get it professionally calibrated to get the best “objective” experience.
As well as emulating the game, I want to emulate the nostalgic feeling of how I remember my arcade experiences and retro consoles. Of course I wouldn’t want the RF noise, interference and bad colour bleeding from the old days; with RetroArch shaders I can take the best bits, leave out the undesired effects and customise the shader passes to get my desired visuals.
In the end I am not desperate to get a perfectly calibrated 240p output on these modern displays. I can’t speak for others, however; for me personally, emulation is very much, as you call it, “subjective”.
yeah, it’s hard for me to change brightness as I can’t have several custom presets on my Dell U2410
My brightness level is set to 30% and contrast to 34% to match 100 cd/m2. My color mode is set to sRGB.
My eyes are absolutely comfortable with these settings. I have similar settings on all my displays (TVs, mobile phones, etc.) so I get the same look across the board.
I don’t need to calibrate my displays using a test picture for any games as they are already fine.
I can alt-tab between windows without badly hurting my eyes because of a brightness difference.
I don’t need to use night mode in any software.
I can record my retro videos and they will look OK to everybody else without editing to compensate for the lack of brightness.
That’s what I call convenience.
It’s not only about how your display shows you the picture. The picture itself has to be calibrated and shouldn’t look pitch black.
I get what you are trying to achieve with your approach. I’m just not looking for a perfect replication of a CRT on an LCD. I just want to use some of its good features to make the picture look good to me.
Is that 100 cd/m2 peak brightness? That’s actually quite low, if so. Most recommendations I’ve seen suggest 200-250 cd/m2 for a room with normal to dim lighting.
This just further highlights the problem of sharing individual settings; people are doing wildly different things with their displays and different displays have different specs.
He has the same school of thought as you. Those shots do look technically impressive, very precise and sharp; from what pics I’ve seen on the Internet, it looks close to BVM-style visuals. If that’s what he is trying to do, then job well done and kudos to him.
I personally prefer the great @EasyMode’s screen shots just above his posts
In the end I am not desperate to get a perfectly calibrated 240p output on these modern displays. I can’t speak for others, however; for me personally, emulation is very much, as you call it, “subjective”.
I know I look like I’m being pedantic, but there are some objective reasons why one would prefer a calibrated image, among them: reduced eye strain, increased visible detail, colors that more closely match the output of the console, and a more vibrant image. In particular, the altered colors, overdriven contrast and exaggerated bloom in a lot of these shots bothers me, and would be considered flaws to be corrected by professionals that work on CRTs.
If you’re happy with what you have, though, that’s great. This all reminds me of when I calibrated my dad’s TV and he immediately complained that the resulting image was too dark.
I personally prefer the great @EasyMode’s screen shots just above his posts
My own settings are somewhat between what Easymode has posted and what larch1991 posted; there is 0 blending between the scanlines in larch1991’s shots, which is as close to “perfect” as CRT tech could get, and closely resembles the close-up of the Sony FW900 I posted a few days ago.
The comparison shot he posted demonstrates that he matched the color values and brightness of the shot without any mask or scanlines applied. I’m doing the exact same thing.
Here’s Easymode’s opinion regarding this method; it’s interesting to see the perspective of someone who’s written shaders:
“Turning up the Scanline and Mask strength and cranking up your display’s brightness is a neat trick. I’ve played around with that in the past, and the results can look quite nice if your display has brightness to spare.”
That’s a recommendation for doing image work in a dark room, AFAIK, and even that isn’t universal. Recommendations on this vary widely. There is no single correct setting for brightness; it has to be matched to the viewing conditions. There’s no reason why we shouldn’t calibrate to a higher brightness level.
Again, this highlights the problem with sharing individual settings, as not only are people doing wildly different things with their displays, there is also no single “correct” recommendation for what level of brightness a display should be calibrated to.
“Public areas (department stores, hospitals, transit stations)
• High-level brightness LCDs (300 cd/m2 and above) are recommended to suit higher ambient light and viewing distances of around 100 cm.
Offices
• Mid-level brightness LCDs (250 cd/m2 to 300 cd/m2) are recommended for normal ambient light and viewing distances of around 60 cm.
Study rooms
• Low-level brightness LCDs (200 cd/m2 to 250 cd/m2) are recommended for lower ambient light (e.g. desk lamp) and distances of around 50 cm.”
“Since we consider 200 cd/m2 to be an ideal average for peak output, we calibrate all of our test monitors to that value. In a room with some ambient light (like an office), this brightness level provides a sharp, punchy image with maximum detail and minimum eye fatigue. It’s also the sweet spot for gamma and grayscale tracking, which we’ll look at on the next page.”
There is an ISO standard that specifies a brightness of 160 cd/m2 for critical inspections of color prints.
If you’re targeting 100 cd/m2 then you should be able to increase the mask strength even further compared to what I’ve done, so that’s actually a benefit if you’re trying to max out the mask/phosphor strength like I’m doing.
@Nesguy - yes, with your method you might very well achieve those TMNT results. And I imagine that will make you as happy as a sandboy. But what will you do then? Play games? Haha it’s much more fun to tinker with shaders. Still, I don’t like that look, it’s too… oh wait
Regarding my screen (a 2009 TN/24"/1080p Iiyama ProLite), it’s just pointless to try. It’s good for my work because it’s kind of flat, neutral and matte, but not so hot for games and films. Like I told you, increasing the brightness to the required levels isn’t possible, and even if I could get it more or less there, my color calibration would be completely ruined. I might give it a go on my Samsung 1080p 40-incher, which is not exactly state of the art either, but it’s built around a seriously nice (at least for its time) SuperPVA panel.
And by the way, I like antialiased Macro Mario much better than the raw images! (and you knew I would, I’m sure)
@hunterk - that’s pretty much exactly like the first TV I had for myself, a 14" lo-res Sharp, whose output I actually have fond memories of. I wouldn’t go for that look nowadays, after having fallen in love with (subtle) scanlines. I think @Great_Dragon digs it though.
Yeah, you and I grew up in the same kind of place haha. And we are all weirdos here. Nobody, literally not a single person in my ‘real life’ gives the slightest of fucks about the stuff we are so passionately talking about here.
I would, sir “Attack me if you dare, I will crush you!!”
Correct. Especially because the actual results so far don’t seem… well, spectacular (also IMO, of course).
Cameras in manual mode don’t do their own thing. If the brightness and contrast in both scenes are indeed equal, just keep the same settings (aperture, shutter and ISO) and you are good to go.
In general I agree of course, but I will take your idea a step further: what I’m trying to do is take that nostalgia and elevate it to a higher level of quality (same applies to sound and controls, btw). For instance increasing the internal resolution of 3D games makes them look SO much better and detailed, while at the same time the crt shaders keep them connected with the way I experienced them 20 years ago.
And I have already talked about it, but bear this in mind: a perfectly calibrated 240p output is not necessarily sterile and xVM-ish. Regarding 240p content, those monitors have too many lines for their own good. A perfectly calibrated mid-res arcade monitor or a consumer 90s Trinitron-like tv (my personal references) look nothing like a perfectly calibrated xVM! Not better or worse, just different. The rest is a matter of taste.
Yup, that’s why I insist that he post camera photos. The raw captures will always look like that on a normally-configured display. Which is a shame, because after all this talk I really believe that his actual results must be very interesting.
Well like I just said to shenlong, and my point all this time: it really depends on what kind of crt you are trying to (more or less) perfectly replicate.
His halation shader, for me, is simply the best. Those blur passes (horizontal/vertical) are bliss. Before I started using it, I was trying to achieve something like that via a high-radius gaussian blur + bloom in ReShade, with promising enough results. But the way in which he implemented that idea, what his method does to tonal values… genius.
Furthermore, the masks are good and diverse (the shader even automagically compensates for the lost brightness as you increase their intensity), the scanlines almost always look perfect even at non-integer scales, it has good curvature… I could go on and on. And considering all the goodies it brings to the table (blur and bloom, for instance, tend to be very expensive, and it does both), it’s pretty fast.
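For anyone wondering what that brightness compensation amounts to conceptually, here’s a minimal sketch of the idea in plain Python. To be clear, this is not EasyMode’s actual shader code, and the 2/3-coverage figure is just an assumption for a standard RGB mask:

```python
def apply_mask_with_compensation(rgb, mask, mask_strength):
    """Darken subpixels according to the mask pattern, then boost the result
    to recover the lost average brightness. rgb and mask are per-channel
    values in 0..1; mask_strength runs from 0 (off) to 1 (full strength)."""
    # Attenuate each channel toward the mask pattern.
    masked = [c * (1.0 - mask_strength * (1.0 - m)) for c, m in zip(rgb, mask)]
    # Average transmittance, assuming the mask darkens ~2/3 of each triad.
    avg_transmittance = 1.0 - mask_strength * (2.0 / 3.0)
    # Boost, clamping at 1.0 -- bright colors clip unless the panel has headroom.
    return [min(c / avg_transmittance, 1.0) for c in masked]

# Mid-grey under the "green" column of a full-strength RGB mask:
print(apply_mask_with_compensation([0.5, 0.5, 0.5], [0.0, 1.0, 0.0], 1.0))
# -> [0.0, 1.0, 0.0]: the lit subpixel has already hit the ceiling.
```

That clipping is exactly why the backlight/brightness-headroom discussion above matters.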
I will try to find a post in which he explains how he got there. I think @Nesguy has already spoken with him about his shaders, and I’m sure @hunterk has, too. Perhaps you two can help me with that.
Phew. Came here to say hello, ended up posting a brick…
Well, I’m not really satisfied with these photos, but here are some first attempts with my iPhone 6. I can’t get the camera to stop messing with the color saturation or brightness. No matter how much I fiddle with zooming in/out or the exposure, it’s a trade-off between brightness and color saturation. You can see all kinds of weird stuff being done by the camera, including some awful moire, just like you see in photos of actual CRTs. I hope this at least demonstrates that my display is not as absurdly dark as might be suggested by some of the direct screen captures I posted; see the shot of the Sonic background in particular. In person, the image is even brighter and has better color saturation than what’s shown here. I’m not sure how much use these are in terms of proving anything, but here you go. I’ll try to get some comparison shots up tomorrow.
I can see in the scroll test shot that the camera/eye adds some faux beam dynamics (i.e., brighter pixels appear larger), which is an important part of the CRT look to me that wasn’t evident in your earlier screencaps.
Definitely looks better when you see the actual TV output like that; I would be happy with that.
Quick question, if you don’t mind (I’m always keen to learn): you can see the scanlines fading out on the pure whites, and I’ve seen that happen on real hardware when the whites bloom. Is that a negative, i.e. is it because those screens are not calibrated, or is it a normal thing with CRT displays?
It’s a normal thing for CRT displays, and is influenced by the quality of the display and calibration. If you check out the shot of the Sony FW900 I posted a few days back you can see that there is hardly any bloom at all. Pro manuals for calibrating CRTs recommend adjusting contrast to reduce this effect, so that you don’t lose detail in brightly colored areas. A little bit of bloom helps the highlights “pop” more, though. So this is one of those things that comes down to preference. In general, though, you should be able to clearly make out scanlines over white on a well-calibrated, good quality CRT.
Do keep in mind that “240p”/double-strike content is going to look different on a properly calibrated CRT than 480i: the individual lines are going to be hit twice as often, resulting in increased brightness (on an individual line/phosphor basis; total image brightness is unchanged) and increased bloom compared with normal interlaced content, due to twice as much juice going into each lit phosphor per unit of time.
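If it helps, here’s the quick arithmetic behind the “twice as much juice” point (a minimal sketch with NTSC-ish numbers, blanking ignored):

```python
field_rate = 60.0                        # fields (480i) or frames (240p) per second
refresh_per_line_480i = field_rate / 2   # each line lives in one field, so it's redrawn ~30x/s
refresh_per_line_240p = field_rate       # double-strike redraws every line every frame, ~60x/s

print(refresh_per_line_240p / refresh_per_line_480i)  # -> 2.0
```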
I deliberately defocused the camera by zooming in to max while up close and then moving back a few feet, the same method used to take screenshots without scanlines for magazines back in the day. One of these shots has my shader preset applied and the backlight adjusted to 100%. The other shot is with no filtering except for gamma correction, and the backlight at the normal setting of 35%. Color saturation is crap compared to real life, and the iPhone camera keeps altering the colors in different ways, so bear that in mind.
EDIT 2: updated the photos with gamma correction. A slight adjustment to color saturation was made in the photo with shader settings applied: I changed “saturation” in the shader parameter settings from 1.00 to 1.20 (max 5.00). I checked the color bars test pattern in Fudoh’s 240p suite to ensure I wasn’t getting any clipping in any of the colors. Could probably be tweaked a bit further but I think this is good enough for the sake of this comparison.
EDIT: haha, I realized after taking this that I didn’t apply any gamma correction to the shot without filtering, so that’s another thing to keep in mind. I’ll redo this later when I have the time.
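On the color-bars clipping check in EDIT 2: if anyone wants to script it rather than eyeballing the pattern, something like this works as a rough sanity check (Python with Pillow; the filenames are made up, point it at your own screenshots of the 240p suite color bars):

```python
from PIL import Image

def distinct_levels(path):
    """Count the distinct intensity levels per channel in a screenshot.
    If a saturation boost clips a channel, adjacent color-bar steps merge
    at the top of the range and the count drops."""
    r, g, b = Image.open(path).convert("RGB").split()
    return tuple(len(set(ch.getdata())) for ch in (r, g, b))

# Hypothetical filenames: shots taken with saturation at 1.00 and at 1.20.
print(distinct_levels("color_bars_sat_100.png"))
print(distinct_levels("color_bars_sat_120.png"))
```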