CRT shader debate

comparison shot using the settings I posted above.

This shot is from the article HunterK posted above. (https://vgdensetsu.tumblr.com/post/179656817318/designing-2d-graphics-in-the-japanese-industry). This is the Famicom version of Wizardry, displayed on a CRT using composite video. Notice how the composite artifacts actually alter the pixel colors and add more detail to the image.

[image: tumblr_inline_phiwexvGTu1w169t0_500]

This next image is an emulated image using the settings I posted. Although you lose some of the detail resulting from composite artifacts, the image is much sharper, cleaner, and the colors are much better. It’s a trade-off, but one that is well worth it, IMO. It’s sort of a happy medium between composite video and raw pixels on an LCD. (As always, click once to open image, right click and select view image, then click once to zoom to full size). Display backlight will need to be increased for the second image (I have my backlight set to 100%).

1 Like

This should be in the Shaders FAQ for ppl who can’t remember exactly how a CRT looked and what to expect … 100% agree and share the same opinion.

1 Like

Technically accurate as they might be, I personally find a sharp, pixelated image with hard scanlines and dark masks difficult to look at for too long. I don’t think my brain is trained to handle all that “accuracy”! Maybe I’m a bit stupid😂

Over the years I’ve messed with many shaders, at one point I thought xBRZ was king!! :dizzy_face:

Now I’m at a stage where I want my retro games to have a nostalgic feeling on a 50-inch LCD (I still have a CRT TV, but I prefer the convenience of an LCD). When I say “nostalgic feeling”, of course I wouldn’t want to go back to the noise and appalling colour bleed of an RF connection. RetroArch lets us pick and choose what looks right for us. I find the subtle scanlines, minor softening of pixels and smoothing of colours easier on the eyes when viewed on a large flat screen; my brain handles that effect better for video games than a crisp, sharp pixelated image with dark scanlines.

Whatever floats your boat :sailboat:

4 Likes

I don’t really care to get into a pointless debate over subjective definitions of quality. You’re certainly welcome to your opinions and it’s fine that you want to share your results here.

You’ve mentioned the image being too dark several times now, but I wonder if you ever followed my recommendations regarding adjusting your display’s backlight, as that’s part of how these settings are intended to be used. Doing so results in a peak brightness that actually exceeds that of a CRT (CRT is ~175 cd/m2 while this resulted in a peak brightness of ~200 cd/m2 on my display).
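The backlight compensation described above can be sketched with some back-of-the-envelope arithmetic (a hypothetical illustration; only the ~175/200 cd/m2 figures come from the discussion): darkening part of the image with scanline gaps lowers the average luminance, and raising the peak via the backlight restores it.

```python
# Hedged sketch: scanline gaps cut average luminance; raising the backlight
# restores the average while peak brightness increases.
def avg_luminance(peak_cd_m2, lit_fraction):
    """Average luminance of a field where only `lit_fraction` of it emits light."""
    return peak_cd_m2 * lit_fraction

flat = avg_luminance(100.0, 1.0)      # raw pixels: whole screen lit
scanned = avg_luminance(200.0, 0.5)   # 50% dark scanline gaps, backlight doubled
assert flat == scanned == 100.0       # same average brightness, higher peak
```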

I’m really only interested in the definition of quality that actually gets used by video professionals. Your image resembles a poorly calibrated CRT with overblown contrast and brightness and too much bloom (scanlines should still be clearly visible over white according to how professionals calibrated CRTs).

Regarding the scanlines: as CRT tech improved, the scanlines got closer and closer to 1:1; see the image I posted of the Sony FW900 above, which is both sharper and has more prominent scanlines than either of the emulated images I posted.

I don’t get the blur at all. Even the slightest amount of blur doesn’t do anything for image quality IMO. You’re just trading one form of eye strain for another (blocky pixels for blur). Enabling quilez scaling with the zfast shader and setting blurscalex to 0 is still too blurry for me, and I grew up playing these games with a crappy RF or composite connection. When the edges of objects are blurry like that, my eyes keep straining to find the edges. It’s as exhausting as watching a 3d movie (which I dislike). It’s not doing anything to benefit the pixel art. I think one can make a case that composite artifacts benefit NES era and earlier games, but even that is questionable, since a good many people seem to prefer RGB for that as well. Even so, composite artifacting isn’t just blur; it’s more complicated than that and it isn’t accurately captured by just blurring the image.

Also, as I mentioned earlier, cranking up the backlight and viewing the image at a proper viewing distance mitigates the sharpness somewhat, because then the emulated phosphors are acting more like real phosphors, which results in the emitted light blending together at the right distance. Get close enough to a CRT and it’s rough as hell; much rougher than either of the emulated images I posted.

Another thing worth considering is that your eyes will adjust to the amount of light being emitted because your eyes dilate. If you’re used to viewing images with too much contrast and brightness, a properly calibrated image is going to appear too dark and dull. After a few days of viewing a properly calibrated image, it will no longer look dark but it will look more accurate and produce less eye strain.

Also, if you insist on blur, then you can easily adjust the blur with the shader I’m using by adjusting the “blurscalex” parameter and setting shader filter to “linear.” This shader is extremely lightweight so it should have no problem running full speed on older / low power systems.
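As a concrete illustration, a minimal `.slangp` preset along these lines might look like the following (a sketch only: the shader path and the BLURSCALEX parameter name are assumptions based on common zfast_crt distributions, not taken from this thread):

```ini
# hypothetical zfast_crt preset - adjust the path to your shader install
shaders = "1"
shader0 = "shaders_slang/crt/zfast_crt.slang"
filter_linear0 = "true"      # the "linear" shader filter mentioned above

# assumed parameter name; raise the value for more horizontal blur
parameters = "BLURSCALEX"
BLURSCALEX = "0.30"
```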

One thing noticeably absent in all of your images is anything resembling accurate RGB mask/phosphor emulation. That may not be important to you, but it’s definitely an integral part of how all CRTs worked, and IMO it’s one of the most important aspects of CRT emulation. EDIT: I see you’re using 720p, so that effectively rules out any kind of accurate RGB phosphor emulation. You just can’t make the phosphors small enough at that resolution.
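The resolution limit can be made concrete with a quick calculation (my own arithmetic, not from the post): a full RGB triad needs at least three horizontal display pixels, one each for R, G and B, so the output width caps how many distinct triads a shader can draw.

```python
# Hedged sketch: how many emulated RGB triads fit across a 4:3 viewport
# when each triad needs at least 3 display pixels (one per R/G/B stripe).
def triads_across(viewport_width_px, px_per_triad=3):
    return viewport_width_px // px_per_triad

assert triads_across(960) == 320    # 720p (4:3 width 960): quite coarse
assert triads_across(1440) == 480   # 1080p
assert triads_across(2880) == 960   # 4K: room for a much finer mask
```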

I simply wouldn’t want to experience any modern game, or a movie with such obvious, opaque black bars eating up so much of the signal

Of course I don’t play modern games or watch movies with these settings; that would be silly. This is exclusively for retro gaming.

FWIW, I’m really pushing the mask strength with these settings while trying to maintain acceptable brightness. Here’s a shot from Fudoh’s 240p test suite. I need to zoom in to really see it, but with my backlight at 100% the inner bars are just barely visible; this is what they should look like since they’re supposed to be displaying blue at 1 IRE, the lowest possible value.

[edited to add previous post for context]

@Nesguy - we already talked about it in the other thread, but here goes my opinion about your settings for whoever didn’t read it there: as much as I appreciate and admire your technical/scientific approach (and I do!), your images look as artificial to me as mine look to you. The scanlines are too prominent, the resulting image too dark, and also too sharp. I don’t like playing with such obvious artifacts between my sight and the art in the game, they do get in the way. It’s not about nostalgia and crappy TVs during childhood as you think. I simply wouldn’t want to experience any modern game, or a movie with such obvious, opaque black bars eating up so much of the signal. Same applies to retrogaming. Let’s put them side by side.

I’m not saying my image is a more accurate simulation, or better. It’s not a competition. What I’m saying is that it looks easier on my eye. More… playable. But like @shenglong always says, the beauty of retroarch is that it provides so much choice. It’s also great that we can share all this knowledge and experience in these forums. Also the devs are around as well, and I hope it’s good for them to get this kind of feedback too :slight_smile:

[end previous post]

@Nesguy - I will respond here to both posts if that’s ok. I was not talking about bloom, but sharpness. In the images that you posted on the other thread, I can see square pixels and sharp angles in the shader image. The CRT is much smoother, obviously. If you will not see it now, I can only recommend a pair of glasses.

[images: raw pixels / soft crt]

Regarding brightness, I don’t think my screen would be able to reach the necessary levels to compensate for all that blackness anyway… and honestly I don’t need nor want to crank its brightness to the max (which would also destroy color everywhere else, and I don’t use my computer just for running emulators) to achieve a bright image in old games.

As the crops I just posted (from your own images) clearly demonstrate, those settings are in fact a far cry from an actual crt monitor, and certainly not up to the air of superiority with which you flaunt them. It’s also kind of ironic that you keep talking about proper calibration, while having to raise the brightness of your LCD to extreme levels in order to achieve decent gamma/brightness.

I will be repeating myself here, but calibrated screens are essential to my job, and on my (or any) calibrated monitor that is not operating at blinding intensities, those settings will result in an overly dark image. Plus the too prominent black lines eating up the pixel art. Plus the excessive, almost raw-pixel sharpness.

That said, if that’s how you enjoy your games, good for you. It’s all a matter of taste. I have also told you already that I admire your passion, knowledge and technical approach to these matters, and it’s great having people like you around here, but I find it inelegant to speak like you are right and everyone else is wrong, especially when you certainly aren’t.

Lastly, if we really must set a reference for how old games should look, that would be a good arcade monitor, not a video/broadcast one. Arcade games, the ultimate videogame frontier at the time, were developed with the former in mind. That’s how the artists wanted us to see their creations.

6 Likes

In my humble opinion (opinions are like arse holes, everyone has one!), arcade cabinet monitors from the late 80’s and throughout the 90’s were the pinnacle for displaying video games of that era. I could not have or afford an arcade cabinet in my home back then, but now, if I want to re-create that same feeling, I would go with the visuals of an arcade monitor for coin-ops (obviously) and home consoles, rather than a high-end BVM-style CRT. Maybe I’m being unfair, as I’ve never been exposed to a BVM-style display, so I don’t know what the output of one looks like. I now have a DIY custom-built VEWLIX-style arcade cabinet with a FHD 32-inch LED display; through shaders and MAME HLSL I can re-create, re-live and enjoy the arcade-style look.

I really appreciate @Nesguy’s depth of knowledge on displays, judging by the posts, he calibrates real screens so must know his stuff, I’m just a casual retro gamer, stuck in the 90’s who likes tinkering with shaders, trying new ideas out etc… so I have no authority to question his ability.

However, reading through this thread I have noticed that, bar his outstanding, in-depth knowledge of displays, much of the feedback @Nesguy has provided appears negative towards everyone else’s screenshots. I don’t think he wants it to come across like that - it’s probably more out of frustration, as people are posting visuals that are frowned upon by video display experts and so are deemed wrong. Yes, a casual retro gamer might like the pincushion effect, the extra bloom, the blending of colours - heck, throw in some blurring too for good measure - but are they wrong, or is it a problem if they like that look? RetroArch’s shader system lets you do that: pick and choose what you like, leave out what you don’t (yes, I sound like a broken record stuck on repeat!).

I do hope it doesn’t put off others from posting their screen shots so we can see what fun things people are doing with their shaders.

Funnily enough, me and my 3 kids were doing some arcade gaming last night, I thought I’d get their feedback. I fired up classic Street Fighter 2 and loaded my preferred shader effect (light scanlines/masks, subtle blooming and softening etc…) then I loaded the shader zfast+dotmask with @Nesguy preset and all 3 of them liked the zfast+dotmask visuals.

Blah, hey what do kids know anyway!

4 Likes

I will respond here to both posts if that’s ok. I was not talking about bloom, but sharpness. In the images that you posted on the other thread, I can see square pixels and sharp angles in the shader image. The CRT is much smoother, obviously. If you will not see it now, I can only recommend a pair of glasses.

Sharpness is affected by bloom, whether it’s coming from the CRT itself or the camera.

The subpixel details you’re showing in the zoomed in image are literally impossible to recreate at 1080p, so you’re just showing the limitations of all shaders. Blur does not at all accurately capture the details in the zoomed in “N” from the CRT, as a side by side image using your settings would no doubt reveal.

Regarding brightness, I don’t think my screen would be able to reach the necessary levels to compensate for all that blackness anyway… and honestly I don’t need nor want to crank its brightness to the max (which would also destroy color everywhere else, and I don’t use my computer just for running emulators) to achieve a bright image in old games.

Actually, increasing the backlight doesn’t affect the color values, not when you follow the methods I’ve used to arrive at these settings, which involved the use of a light meter. I also use my monitor for daily computer stuff, it’s pretty easy to just press a button to switch between picture presets…

As the crops I just posted (from your own images) clearly demonstrate, those settings are in fact a far cry from an actual crt monitor, and certainly not up to the air of superiority with which you flaunt them. It’s also kind of ironic that you keep talking about proper calibration, while having to raise the brightness of your LCD to extreme levels in order to achieve decent gamma/brightness.

Let’s look at a side by side using your settings, then we can have an actually meaningful discussion. What these crops show aren’t the limitations of the settings, but of current display tech. As I just mentioned, 1080p isn’t up to the task of replicating all the subpixel details in that image, but it’s definitely doing a better job than the settings you posted, which have no trace of mask simulation at all, and which resemble a CRT with overdriven contrast and brightness. Furthermore, a close up with the crappy phone camera I used isn’t accurately capturing the glow that occurs at proper viewing distance - in fact, such comparisons are very limited in what they can actually demonstrate, and only there to give a rough idea. It has to be seen in person, period.

It is objectively not “overly dark” when these settings are used as intended. I’ve measured between 175 - 200 cd/m2 peak brightness when my backlight is at 100% and the settings are applied. When I’m not playing a game in Retroarch I press a button on the side of my monitor to switch to a different picture mode preset.

The sharpness of the pixels is somewhat mitigated by getting the emulated phosphors to glow as brightly as the real thing on a CRT, and by viewing the image at a normal viewing distance for 240p content, which is typically 1-2 feet further back than people usually sit from their monitors. Without that, you lose a lot of the glow that occurs between the surface of the screen and the human eye, and the blending that results. Without adjusting your display, you’re not using the settings as intended and won’t get a good picture or an accurate idea of what it looks like when the settings are used as intended.

Lastly, if we really must set a reference for how old games should look, that would be a good arcade monitor, not a video/broadcast one. Arcade games, the ultimate videogame frontier at the time, were developed with the former in mind. That’s how the artists wanted us to see their creations.

I think you’re inferring way too much regarding the artist’s intentions without any real solid evidence to back it up. The article Hunter posted is a pretty good reference. I think one can make a case for composite video benefiting NES era games, but the benefit gets increasingly questionable when you’re talking about anything more recent than that. Even then, it’s questionable. See the post I made above with the images from Wizardry. Sure, you lose some detail, but honestly, I can’t even tell that’s a robe in the composite version; it looks like some kind of weird fur only on the lower half of the orc’s body, maybe. It’s clearly supposed to read as a robe, which it does in the emulated image, and the colors and image clarity are obviously superior with RGB.

Can we see some test images from Fudoh’s 240p test suite using your preferred settings? I’d like to see more people using Fudoh’s test suite for adjusting black level, contrast and color; it’s a really great tool.

Lastly, here’s a shot from Hunter’s blog, for the sake of comparison. Notice how the scanlines are even more prominent compared to the images I posted. This is pretty close to what I consider “ideal” for 240p content. I’m pretty sure this is just a standard 90s PC monitor, nothing too fancy or expensive for that era.

A lot of people find the image of 31 khz CRTs to be too sterile, which I can certainly sympathize with. Hell, a lot of people find 15 khz PVMs to be too sterile, as well. I have a bunch of CRTs because I’ve found that each one is different, and each has its own quirks and charm (the joy/frustration of analog equipment lol). Personally, I love them all!

I got rid of this little 13" consumer CRT because it was a little too grungy, even for me.

My 15 khz 25" Neotec arcade monitor is much less defined than my PVMs, to the point that the gaps between very bright scanlines are scarcely there at all. It also gets a lot of halation/diffusion as a result of the super-thick glass.

I enjoy a lot of different looks and shaders (including xBR/ScaleFx) and switch among many, depending on what I’m playing, but I still get a kick out of this:

7 Likes

pretty much this, yeah.

I often think we’re just talking past each other in this thread - which is easy to do, since there are two different things being discussed here: 1) objective picture quality, and 2) subjective picture quality.

I happen to think 1 and 2 are the same thing, and that much of what people are doing with shaders is akin to motion interpolation on modern TVs or the other “enhancements” that manufacturers add to TVs to make them stand out more in the store. Sure, it might make the image “pop” more, but it isn’t accurate and results in lost detail and increased eye strain under normal viewing conditions. Many of the images in this thread, IMO, are akin to TVs set to “store mode,” with overblown contrast and brightness and too much bloom. When I criticize an image, I mean that it falls short in some respect regarding the objective picture quality. Of course it’s ultimately up to the individual whether they prefer an accurate image that reduces eye strain or an exaggerated one that causes eye strain. The “it’s too dark!” complaint is one that is all too common to display calibration professionals.

I don’t like any of the images in that comparison using Link :stuck_out_tongue:

I think the real thing missing from many of these attempts at CRT emulation is the mask/phosphor structure and phosphor glow. This is really essential to breaking up the pixels into smaller units and creating the illusion of a higher definition, IMO.

I’ve been playing around with some composite filters for NES games, and I’m kind of on the fence about it. It definitely brings out more detail, but at the cost of a less coherent image overall. It’s not clearly better or worse, IMO. I think when it comes to how 240p graphics are displayed, the most we can say is that raw pixels on an LCD are wrong. There are many approaches that are acceptable. I think Blargg’s NTSC composite filter looks very good/accurate, if one is into that sort of thing.

I think the 31 khz PC CRT image is pretty close to ideal in terms of objective picture quality, but I can see why it might look too sterile to some people; those scanlines are even more prominent than those in the emulated images I posted. I’ve come to appreciate a bit of vertical blending between the lines.

1 Like

@embe - that’s fantastic, I must give that 199X-BLOOM a go.

@Arviel - very nice, and great choice of games there too!

I think he does want to come across like that, actually. And more out of pedantry than anything else, I would add. Video experts and CRT savants don’t all agree on the sterile (good word to express my point) look, I can guarantee you that. This is not hard science we are dealing with.

On the left, a BVM. On the right, a Nanao 15khz arcade monitor. Completely different, both high quality (not so much according to Nesguy, but believe me, Nanaos are good). Needless to say, I much prefer the arcade monitor and its translucent scanlines that merge with the art in a lovely, natural way, rather than devouring half of it.

Someone is pushing his personal preference as an ideal standard of godly quality that everyone should switch to right away. If at least said preference resulted in extremely good looking games, or maybe an accurate replica of something like the BVM-Metal Slug image posted above, I would understand the attitude. However, based on the screens posted, it doesn’t seem to be the case.

Also @shenglong your kids probably read what you write here and they are trolling you my friend :stuck_out_tongue:

@Nesguy

Evidence… well arcade games came together with those screens! A pretty strong message from their makers I reckon. They were the standard.

The image you brought from hunter’s blog is good, yeah… and much brighter than your preset, man. Does your setup look like that ‘in person’? If so, I think you should start taking photos of your display with a camera instead of internal captures, so we can see how it really is with the strong backlight, instead of having to infer and imagine from abstract values like ‘200cd/m2’ :roll_eyes:

Now Ns galore haha

[images: raw pixels / soft crt / N]

Left is Nesguy’s preset, center is Nesguy’s crt, right is my settings. Nesguy’s are cellphone photos, mine is direct feed. Also, I don’t think that FFVII’s start screen is the best choice for this sort of exercise. Still, some conclusions can be drawn. I haven’t boasted about my preferences being hyper-realistic or anything like that, they are simply what I like. But lo and behold, they actually look closer to the real thing.

(Full size)

Absolutely. As important as the scanlines themselves.

2 Likes

I think he does want to come across like that, actually. And more out of pedantry than anything else, I would add. Video experts and CRT savants don’t all agree on the sterile (good word to express my point) look, I can guarantee you that. This is not hard science we are dealing with.

Seriously? Lighten up. I made my intentions clear with my reply to shenglong above. I am not being pedantic in my criticisms; I’m criticizing the objective picture quality of your images, and you’re getting all huffy and personally offended by it. Video experts would most certainly agree when contrast, black level and color accuracy are all off, because these are objective things which are measurable. Still no images from fudoh’s 240p test suite, I can see. It’s like it’s impossible to get a single word of reconciliation from you.

On the left, a BVM. On the right, a Nanao 15khz arcade monitor. Completely different, both high quality (not so much according to Nesguy, but believe me, Nanaos are good). Needless to say, I much prefer the arcade monitor and its translucent scanlines that merge with the art in a lovely, natural way, rather than devouring half of it.

I don’t like either of those images very much; the BVM is a little harsh even for me. Both of the monitors are good, yes, but in terms of objective quality, the BVM is pretty much the gold standard.

Someone is pushing his personal preference as an ideal standard of godly quality that everyone should switch to right away. If at least said preference resulted in extremely good looking games, or maybe an accurate replica of something like the BVM-Metal Slug image posted above, I would understand the attitude. However, based on the screens posted, it doesn’t seem to be the case.

A blatant mischaracterization which shows that you’re only interested in stirring shit.

from the preceding post: “I think when it comes to how 240p graphics are displayed, the most we can say is that raw pixels on an LCD are wrong. There are many approaches that are acceptable. I think Blargg’s NTSC composite filter looks very good/accurate, if one is into that sort of thing.”

Shenglong’s kids’ preferring my settings is no surprise at all. It’s as close as it gets to someone who isn’t already invested in an opinion sharing their untainted view of what is superior.

The reason my settings look good, objectively, is because I’ve increased the contrast around highlights and I’ve made the lowlights even darker, both of which contribute to increasing the dynamic range of the image. That’s what the scanlines and mask structure do to the image, and it’s one of the primary reasons why things look good on a CRT, IMO.

Evidence… well arcade games came together with those screens! A pretty strong message from their makers I reckon. They were the standard.

No they didn’t, and no it isn’t. As per the article that Hunter posted, the graphics were made on RGB monitors, the sharpest screens they had available. Sometimes the output was then tested using composite connected to a TV, but it’s unclear just how widespread a practice this was. Even more unclear is how much various characteristics of the CRT informed the pixel artist’s work. The only evidence we have is that they used something called the 0.5 dot technique, which relied on scanlines and the generally lower sharpness of TVs.

The image you brought from hunter’s blog is good, yeah… and much brighter than your preset, man. Does your setup look like that ‘in person’? If so, I think you should start taking photos of your display with a camera instead of internal captures, so we can see how it really is with the strong backlight, instead of having to infer and imagine from abstract values like ‘200cd/m2’ :roll_eyes:

It seems obvious to me that you are trolling at this point. I’ve said probably 10 times now that my screenshots must be viewed with the backlight turned up, and that this is going to vary depending on the display being used. I can absolutely guarantee that the image using my settings is actually brighter and has more contrast than the shot of the CRT from Hunter’s blog, although again, you cannot accurately judge these things from photos. As a photographer, one would think that you would understand how cameras all do their own thing to the image. So yeah, I could post some screenshots using a camera phone, but it’s going to have its own inaccuracies, which you will then use to make baseless straw man criticisms. For reference, typical CRTs had a peak brightness of around 175 cd/m2, so that’s why I targeted 200 cd/m2. It’s actually almost too bright for extended viewing, so I can’t help but continue to laugh at this baseless criticism.

That N comparison is crap for all the reasons which I already discussed. Two different cameras taking pictures under different lighting conditions and I’m not even sure how that CRT is calibrated. The CRT shot is obviously overbloomed, so the fact that your N resembles it more closely in that respect is exactly the problem I’m talking about. The scanlines shouldn’t disappear over white like that, and you can find that information in pro manuals for calibrating CRTs.

Furthermore, an extreme close up like that is highly misleading. I cannot stress enough the importance of proper viewing distance. CRT tech is an additive visual system, meaning it emits light and the light combines after leaving the surface of the screen to form the colors that you see. Get close enough to the screen and you can’t even tell what’s going on. At the proper viewing distance that N looks nowhere near as sharp as when its zoomed in like that, because I’ve gotten the emulated phosphors to glow as brightly as the real thing, which causes the emitted light to blend together and form a smoother image when viewed at the correct distance (this is in addition to the natural “pointilist” effect resulting from the visual cognitive system). Of course this is all completely lost in photos. Needless to say, getting a 1080p LCD to resemble a CRT at every possible viewing distance is an impossible task.
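The distance-blending argument can be sketched numerically (a toy model of my own, in which a crude 2-tap average stands in for the eye’s low-pass filtering at distance):

```python
# Hedged toy model: at viewing distance the eye averages neighboring lines,
# so bright scanlines and dark gaps blend toward a uniform field.
peaks_and_gaps = [200.0, 0.0] * 3            # alternating line / gap (cd/m2)
at_distance = [(peaks_and_gaps[i] + peaks_and_gaps[(i + 1) % 6]) / 2
               for i in range(6)]            # crude 2-tap "distance" blur
assert at_distance == [100.0] * 6            # blends to a uniform ~100 cd/m2
```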

I will grant that the transitions from white to black are somewhat more abrupt than what you observe on a CRT up close, but that’s the kind of detail that you need 4K or higher resolution to capture accurately, and which is greatly exaggerated by adding blurs.

So yes, your example does look closer to the CRT shot in respect to the bloom that it shows, which is precisely the problem, because that shot is overbloomed, whether from the camera or the CRT itself. When it comes to the actual mask/phosphor structure, I think it’s obvious that my settings are doing a much better job of capturing what’s going on. On an actual CRT the mask/phosphor structure should be even stronger than what’s going on in my images, but I’m already pushing the limit of what my display is capable of regarding brightness.

Now let’s see those test image patterns from Fudoh’s 240p test suite. Then we can talk about something other than opinions.

1 Like

No I don’t mind that you are criticizing my images. They are beautiful and I don’t need you to confirm that. In fact, considering how different our tastes are, I might want to change whatever you liked about any of them :stuck_out_tongue:

I just don’t like your preset, and I like even less that you are trying to sell it as ‘objectively superior’, ugly and unrealistic as I find it.

What do you want me to do with fudoh’s suite? I can’t recall any prior requests.

Impossible to get words of reconciliation? I can’t see what that has to do with fudoh, but anyway… how many times do I have to tell you that I “admire and respect your passion, knowledge and technical approach”? I think this one would be the 4th. What I don’t admire nor respect is arrogance.

Only if “objective quality” means prioritizing ugly black bars over beautiful pixel art.

Yes they did and I’m obviously not talking about how games were made, but how they were released and meant to be played by their makers, and that’s on an arcade monitor, not a video/broadcast one. Please note how I don’t use the word ‘professional’ to make a distinction between the two. They are both professional. Different professions though.

As a photographer (and videographer), I understand that a properly configured camera can be quite faithful to the real world. I will not straw-criticize anything based on how you capture your footage. I have seen hundreds of gorgeous shots taken with cameras and phones by CRT users, I actually posted some on the original thread.

Do I also have to tell you again that I would love to see your setup with my own eyes? I’m really curious about it, and a photo or a video would give me an idea. It’s possible indeed that the images that you post here don’t do the real thing justice. So please, let’s see it.

Haha yeah sure whatever. I can only really understand the images you provide, you see? Magical viewing distances, cd/m2 values and all those other abstract things that your preset requires to look alright aren’t something that my imagination can implement accurately.

For the nth (and last) time: NO. It’s not bloom. It’s s-h-a-r-p-n-e-s-s. Your preset exhibits this digital, black-line-on-raw-pixel appearance, which is of course particularly obvious up close but also perfectly noticeable in the full size images you have posted. Now you can say all you want about how a correct, magical viewing distance will make up for that, and I will keep calling that charlatanry, at least until you provide some evidence to back it up.

alright guys, you’ll just have to agree to disagree at this point, as it doesn’t look like we’re getting any new arguments/information, just squabbling.

If we can keep talking about stuff, I’m all for it, but if it’s just personal digs and back-and-forth pecking, nothing good will come of it, so I’ll have to lock the thread.

@hunterk - yeah man good idea to split :+1:t3: And sure, I can be more politically correct. Also I think this new one should actually start with an earlier post, this one

I kept it in the last thread because it was on-topic there and included pictures and settings, but it is helpful for context here, so I copied the contents into the first post.

I just don’t like your preset, and I like even less that you are trying to sell it as ‘objectively superior’, ugly and unrealistic as I find it.

You seem to not understand what I mean by “objectively superior.” I’m talking about qualities that can be measured. I’m not arguing that my opinion is superior because that would be completely fruitless. I think you’re just completely misreading the entire situation based upon this one misunderstanding.

Only if “objective quality” means prioritizing ugly black bars over beautiful pixel art.

Again, there is such a thing as objective quality and signal fidelity, and when video professionals are using these terms they are referring to the same thing, which has a single definition they can point to. We continue to talk past each other because of this.

Yes they did and I’m obviously not talking about how games were made, but how they were released and meant to be played by their makers, and that’s on an arcade monitor, not a video/broadcast one. Please note how I don’t use the word ‘professional’ to make a distinction between the two. They are both professional. Different professions though.

I’m not sure what that link is meant to prove. I think you’re also really underestimating just how sharp an arcade monitor could be in person when well maintained and well calibrated. Furthermore, (repeating myself), I think it’s very likely that developers would design the graphics to look as good as possible using the highest quality signal available on the console, because they were competing with each other over the quality of the graphics and this was a big part of the console wars. Even when it’s clear that composite artifacts influenced the design of the pixel art, it’s not obvious that composite is superior to RGB; it’s still a trade-off.

Haha yeah sure whatever. I can only really understand the images you provide, you see? Magical viewing distances, cd/m2 values and all those other things that your preset requires to look alright aren’t something that my imagination can implement accurately.

And what I have been telling you repeatedly throughout this discussion is that you can’t get an accurate idea of what one person’s display looks like by looking at screenshots on your own display, whether they are direct capture or through a camera. Photos can only give us a very rough idea of what things look like in person, and can be misleading. A photo or video does not give you an accurate idea, it just provides more fuel for baseless criticisms. My cell phone camera is simply not up to the task.

And a lot of the “great photos of CRTs” are actually very misleading for the reasons which I mentioned earlier. CRTs look vastly different depending on viewing distance because of the way they emit light. In fact, good photos of CRTs are rare, and it is very difficult to get a good photo of a CRT in terms of capturing what is actually going on when seen in person.

The ONLY way to get an accurate idea is to see the display in person, or to follow the method I used to arrive at these particular settings. Particular settings are going to vary based on the display being used. I think I mentioned earlier that my display is on the brighter end of LED-lit LCDs, but it’s by no means the brightest out there.

The reason I keep mentioning the cd/m2 value is because it is objective. It provides you what a photo cannot. A peak brightness of 175 cd/m2 is roughly equivalent to a typical CRT, and 200 cd/m2 is almost too bright in a dark room. In other words, my display is matching or exceeding the objective brightness of a CRT.

Since we’re just going in circles with this, I propose that it’s only going to be useful for us to discuss methods rather than specific examples. I’ve asked a couple times for you to provide shots using test image patterns from Fudoh’s 240p test suite, then I can see what’s going on with your settings, objectively. I’m almost certain that there is too much contrast in your images, again objectively and not as a matter of opinion. Of course, seeing those images on *my* display will only reveal if those settings are accurate on my display. I still won’t know exactly what your display looks like, especially if there are differences with how our displays are calibrated.

As far as opinions go, we’ve exhausted the subject. Everyone is well aware of what our preferences are, and there is no point in discussing them further.

Would you please also copy my reply to it then? That one with the Mario World side-by-side.

ffs, I had hoped it would move it to the proper location based on posting date, but that clearly isn’t the case. :confused:

I’ll try to find a way to fix the order, or at least copy/paste it into place somehow

Haha you can just place it anywhere you like before my first reply here.