CRT shader debate

In my humble opinion (opinions are like arseholes, everyone has one!), arcade cabinet monitors from the late ’80s and throughout the ’90s were the pinnacle for displaying video games of that era. I couldn’t afford an arcade cabinet in my home back then, but now, if I want to re-create that same feeling, I’d go with the visuals of an arcade monitor (for coin-ops obviously, but for home consoles too) rather than a high-end BVM-style CRT. Maybe I’m being unfair, as I’ve never been exposed to a BVM-style display, so I don’t know what the output of one looks like. I now have a DIY custom-built VEWLIX-style arcade cabinet with an FHD 32-inch LED display, and I can re-create, re-live and enjoy the arcade-style look through shaders and MAME HLSL.

I really appreciate @Nesguy’s depth of knowledge on displays; judging by his posts, he calibrates real screens, so he must know his stuff. I’m just a casual retro gamer stuck in the ’90s who likes tinkering with shaders, trying new ideas out etc., so I have no authority to question his ability.

However, reading through this thread I have noticed that, his outstanding in-depth knowledge of displays aside, much of the feedback @Nesguy has provided appears negative towards everyone else’s screenshots, although I don’t think he wants it to come across like that. It’s probably more out of frustration, as people are posting visuals that are frowned upon by video display experts and so are deemed wrong. Yes, a casual retro gamer might like the pincushion effect, the extra bloom, the blending of colours, heck, throw in some blurring too for good measure, but are they wrong, or is it a problem if they like that look? RetroArch’s shader system lets you do that: pick and choose what you like, leave out what you don’t (yes, I sound like a broken record stuck on repeat!).

I do hope it doesn’t put others off from posting their screenshots, so we can see what fun things people are doing with their shaders.

Funnily enough, me and my 3 kids were doing some arcade gaming last night, so I thought I’d get their feedback. I fired up classic Street Fighter 2 and loaded my preferred shader effect (light scanlines/masks, subtle blooming and softening etc…), then I loaded the zfast+dotmask shader with @Nesguy’s preset, and all 3 of them liked the zfast+dotmask visuals.

Blah, hey what do kids know anyway!


I will respond here to both posts, if that’s ok. I was not talking about bloom, but sharpness. In the images that you posted on the other thread, I can see square pixels and sharp angles in the shader image. The CRT is much smoother, obviously. If you still can’t see it, I can only recommend a pair of glasses.

Sharpness is affected by bloom, whether it’s coming from the CRT itself or the camera.

The subpixel details you’re showing in the zoomed in image are literally impossible to recreate at 1080p, so you’re just showing the limitations of all shaders. Blur does not at all accurately capture the details in the zoomed in “N” from the CRT, as a side by side image using your settings would no doubt reveal.

Regarding brightness, I don’t think my screen would be able to reach the necessary levels to compensate for all that blackness anyway… and honestly I don’t need nor want to crank its brightness to the max (which would also destroy color everywhere else, and I don’t use my computer just for running emulators) to achieve a bright image in old games.

Actually, increasing the backlight doesn’t affect the color values, not when you follow the methods I’ve used to arrive at these settings, which involved the use of a light meter. I also use my monitor for daily computer stuff, it’s pretty easy to just press a button to switch between picture presets…

As the crops I just posted (from your own images) clearly demonstrate, those settings are in fact a far cry from an actual CRT monitor, and certainly don’t live up to the air of superiority with which you flaunt them. It’s also kind of ironic that you keep talking about proper calibration while having to raise the brightness of your LCD to extreme levels in order to achieve decent gamma/brightness.

Let’s look at a side by side using your settings, then we can have an actually meaningful discussion. What these crops show aren’t the limitations of the settings, but of current display tech. As I just mentioned, 1080p isn’t up to the task of replicating all the subpixel details in that image, but it’s definitely doing a better job than the settings you posted, which have no trace of mask simulation at all, and which resemble a CRT with overdriven contrast and brightness. Furthermore, a close up with the crappy phone camera I used isn’t accurately capturing the glow that occurs at proper viewing distance - in fact, such comparisons are very limited in what they can actually demonstrate, and only there to give a rough idea. It has to be seen in person, period.

It is objectively not “overly dark” when these settings are used as intended. I’ve measured between 175 - 200 cd/m2 peak brightness when my backlight is at 100% and the settings are applied. When I’m not playing a game in Retroarch I press a button on the side of my monitor to switch to a different picture mode preset.

The sharpness of the pixels is somewhat mitigated by getting the emulated phosphors to glow as brightly as the real thing on a CRT, and by viewing the image at a normal viewing distance for 240p content, which is typically 1-2 feet further back than people usually sit from their monitors. Without that, you lose a lot of the glow that occurs between the surface of the screen and the human eye, and the blending that results. Without adjusting your display, you’re not using the settings as intended, and you won’t get a good picture or an accurate idea of how they’re meant to look.

Lastly, if we really must set a reference for how old games should look, that would be a good arcade monitor, not a video/broadcast one. Arcade games, the ultimate videogame frontier at the time, were developed with the former in mind. That’s how the artists wanted us to see their creations.

I think you’re inferring way too much regarding the artist’s intentions without any real solid evidence to back it up. The article Hunter posted is a pretty good reference. I think one can make a case for composite video benefiting NES era games, but the benefit gets increasingly questionable when you’re talking about anything more recent than that. Even then, it’s questionable. See the post I made above with the images from Wizardry. Sure, you lose some detail, but honestly, I can’t even tell that’s a robe in the composite version; it looks like some kind of weird fur only on the lower half of the orc’s body, maybe. It’s clearly supposed to read as a robe, which it does in the emulated image, and the colors and image clarity are obviously superior with RGB.

Can we see some test images from Fudoh’s 240p test suite using your preferred settings? I’d like to see more people using Fudoh’s test suite for adjusting black level, contrast and color; it’s a really great tool.

Lastly, here’s a shot from Hunter’s blog, for the sake of comparison. Notice how the scanlines are even more prominent compared to the images I posted. This is pretty close to what I consider “ideal” for 240p content. I’m pretty sure this is just a standard 90s PC monitor, nothing too fancy or expensive for that era.

A lot of people find the image of 31 khz CRTs to be too sterile, which I can certainly sympathize with. Hell, a lot of people find 15 khz PVMs to be too sterile, as well. I have a bunch of CRTs because I’ve found that each one is different, and each has its own quirks and charm (the joy/frustration of analog equipment lol). Personally, I love them all!

I got rid of this little 13" consumer CRT because it was a little too grungy, even for me.

My 15 khz 25" Neotec arcade monitor is much less defined than my PVMs, to the point that the gaps between very bright scanlines is scarcely there at all. It also gets a lot of halation/diffusion as a result of the super-thick glass.

I enjoy a lot of different looks and shaders (including xBR/ScaleFx) and switch among many, depending on what I’m playing, but I still get a kick out of this:


pretty much this, yeah.

I often think we’re just talking past each other in this thread - which is easy to do since there are two different things being discussed, here: 1) objective picture quality 2) subjective picture quality.

I happen to think 1 and 2 are the same thing, and that much of what people are doing with shaders is akin to motion interpolation on modern TVs or the other “enhancements” that manufacturers add to TVs to make them stand out more in the store. Sure, it might make the image “pop” more, but it isn’t accurate and results in lost detail and increased eye strain under normal viewing conditions. Many of the images in this thread, IMO, are akin to TVs set to “store mode,” with overblown contrast and brightness and too much bloom. When I criticize an image, I mean that it falls short in some respect regarding the objective picture quality. Of course it’s ultimately up to the individual whether they prefer an accurate image that reduces eye strain or an exaggerated one that causes eye strain. The “it’s too dark!” complaint is one that is all too common to display calibration professionals.

I don’t like any of the images in that comparison using Link :stuck_out_tongue:

I think the real thing missing from many of these attempts at CRT emulation is the mask/phosphor structure and phosphor glow. This is really essential to breaking up the pixels into smaller units and creating the illusion of a higher definition, IMO.

I’ve been playing around with some composite filters for NES games, and I’m kind of on the fence about it. It definitely brings out more detail, but at the cost of a less coherent image overall. It’s not clearly better or worse IMO. I think when it comes to how 240p graphics are displayed, the most we can say is that raw pixels on an LCD are wrong. There are many approaches that are acceptable. I think Blaarg’s NTSC composite filter looks very good/accurate, if one is into that sort of thing.

I think the 31 khz PC CRT image is pretty close to ideal in terms of objective picture quality, but I can see why it might look too sterile to some people; those scanlines are even more prominent than those in the emulated images I posted. I’ve come to appreciate a bit of vertical blending between the lines.


@embe - that’s fantastic, I must give that 199X-BLOOM a go.

@Arviel - very nice, and great choice of games there too!

I think he does want to come across like that, actually. And more out of pedantry than anything else, I would add. Video experts and CRT savants don’t all agree on the sterile (good word to express my point) look, I can guarantee you that. This is not hard science we are dealing with.

On the left, a BVM. On the right, a Nanao 15khz arcade monitor. Completely different, both high quality (not so much according to Nesguy, but believe me, Nanaos are good). Needless to say, I much prefer the arcade monitor and its translucent scanlines that merge with the art in a lovely, natural way, rather than devouring half of it.

Someone is pushing his personal preference as an ideal standard of godly quality that everyone should switch to right away. If at least said preference resulted in extremely good looking games, or maybe an accurate replica of something like the BVM-Metal Slug image posted above, I would understand the attitude. However, based on the screens posted, it doesn’t seem to be the case.

Also @shenglong your kids probably read what you write here and they are trolling you my friend :stuck_out_tongue:

@Nesguy

Evidence… well arcade games came together with those screens! A pretty strong message from their makers I reckon. They were the standard.

The image you brought from hunter’s blog is good, yeah… and much brighter than your preset, man. Does your setup look like that ‘in person’? If so, I think you should start taking photos of your display with a camera instead of internal captures, so we can see how it really is with the strong backlight, instead of having to infer and imagine from abstract values like ‘200cd/m2’ :roll_eyes:

Now Ns galore haha

(Three images: “raw pixels”, “soft crt”, “N”)

Left is Nesguy’s preset, center is Nesguy’s crt, right is my settings. Nesguy’s are cellphone photos, mine is direct feed. Also, I don’t think that FFVII’s start screen is the best choice for this sort of exercise. Still, some conclusions can be drawn. I haven’t boasted about my preferences being hyper-realistic or anything like that, they are simply what I like. But lo and behold, they actually look closer to the real thing.

(Full size)

Absolutely. As important as the scanlines themselves.


I think he does want to come across like that, actually. And more out of pedantry than anything else, I would add. Video experts and CRT savants don’t all agree on the sterile (good word to express my point) look, I can guarantee you that. This is not hard science we are dealing with.

Seriously? Lighten up. I made my intentions clear with my reply to shenglong above. I am not being pedantic in my criticisms; I’m criticizing the objective picture quality of your images, and you’re getting all huffy and personally offended by it. Video experts would most certainly agree when contrast, black level and color accuracy are all off, because these are objective things which are measurable. Still no images from fudoh’s 240p test suite, I can see. It’s like it’s impossible to get a single word of reconciliation from you.

On the left, a BVM. On the right, a Nanao 15khz arcade monitor. Completely different, both high quality (not so much according to Nesguy, but believe me, Nanaos are good). Needless to say, I much prefer the arcade monitor and its translucent scanlines that merge with the art in a lovely, natural way, rather than devouring half of it.

I don’t like either of those images very much, the BVM is a little harsh even for me. Both of the monitors are good, yes, but in terms of objective quality, the BVM is pretty much the gold standard.

Someone is pushing his personal preference as an ideal standard of godly quality that everyone should switch to right away. If at least said preference resulted in extremely good looking games, or maybe an accurate replica of something like the BVM-Metal Slug image posted above, I would understand the attitude. However, based on the screens posted, it doesn’t seem to be the case.

A blatant mischaracterization which shows that you’re only interested in stirring shit.

from the preceding post: “I think when it comes to how 240p graphics are displayed, the most we can say is that raw pixels on an LCD are wrong. There are many approaches that are acceptable. I think Blaarg’s NTSC composite filter looks very good/accurate, if one is into that sort of thing.”

Shenglong’s kids’ preferring my settings is no surprise at all. It’s as close as it gets to someone who isn’t already invested in an opinion sharing their untainted view of what is superior.

The reason my settings look good, objectively, is because I’ve increased the contrast around highlights and I’ve made the lowlights even darker, both of which contribute to increasing the dynamic range of the image. That’s what the scanlines and mask structure do to the image, and it’s one of the primary reasons why things look good on a CRT, IMO.
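That claim about scanlines and masks stretching contrast can be sketched in a few lines. To be clear, this is a toy model only, not code from any actual shader; the function and parameter names (`shaded`, `mask_dark`, `mask_light`) are invented for illustration:

```python
# Toy model of how darkening the scanline/mask gaps while boosting the
# lit phosphor lines stretches local contrast (the "dynamic range"
# effect described above). Parameter names are invented for this sketch.

def shaded(y, lit, mask_dark=0.5, mask_light=1.2):
    """y: linear brightness in 0..1; lit: True on a phosphor/scanline,
    False in the masked gap between lines. Output is clipped to 1.0."""
    return min(1.0, y * (mask_light if lit else mask_dark))

# A flat 0.6-grey field has a line-to-gap ratio of 1.0; after the toy
# scanline pass the ratio between lit line and gap grows to 2.4.
line, gap = shaded(0.6, True), shaded(0.6, False)
print(line / gap)
```

On a uniform field the lit-line to gap ratio goes from 1.0 to 2.4 under these made-up parameters, which is the kind of local contrast increase being described.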

Evidence… well arcade games came together with those screens! A pretty strong message from their makers I reckon. They were the standard.

No they didn’t, and no it isn’t. As per the article that Hunter posted, the graphics were made on RGB monitors, the sharpest screens they had available. Sometimes the output was then tested using composite connected to a TV, but it’s unclear just how widespread a practice this was. Even more unclear is how much various characteristics of the CRT informed the pixel artist’s work. The only evidence we have is that they used something called the 0.5 dot technique, which relied on scanlines and the generally lower sharpness of TVs.

The image you brought from hunter’s blog is good, yeah… and much brighter than your preset, man. Does your setup look like that ‘in person’? If so, I think you should start taking photos of your display with a camera instead of internal captures, so we can see how it really is with the strong backlight, instead of having to infer and imagine from abstract values like ‘200cd/m2’ :roll_eyes:

It seems obvious to me that you are trolling at this point. I’ve said probably 10 times now that my screenshots must be viewed with the backlight turned up, and that this is going to vary depending on the display being used. I can absolutely guarantee that the image with my settings is actually brighter and has more contrast than the shot of the CRT from Hunter’s blog, although again, you cannot accurately judge these things from photos. As a photographer, one would think that you would understand how cameras all do their own thing to the image. So yeah, I could post some screenshots using a camera phone, but it’s going to have its own inaccuracies which you will then use to make baseless straw man criticisms. For reference, typical CRTs had a peak brightness of around 175 cd/m2, so that’s why I targeted 200 cd/m2. It’s actually almost too bright for extended viewing, so I can’t help but continue to laugh at this baseless criticism.

That N comparison is crap for all the reasons which I already discussed. Two different cameras taking pictures under different lighting conditions and I’m not even sure how that CRT is calibrated. The CRT shot is obviously overbloomed, so the fact that your N resembles it more closely in that respect is exactly the problem I’m talking about. The scanlines shouldn’t disappear over white like that, and you can find that information in pro manuals for calibrating CRTs.

Furthermore, an extreme close up like that is highly misleading. I cannot stress enough the importance of proper viewing distance. CRT tech is an additive visual system, meaning it emits light and the light combines after leaving the surface of the screen to form the colors that you see. Get close enough to the screen and you can’t even tell what’s going on. At the proper viewing distance that N looks nowhere near as sharp as when it’s zoomed in like that, because I’ve gotten the emulated phosphors to glow as brightly as the real thing, which causes the emitted light to blend together and form a smoother image when viewed at the correct distance (this is in addition to the natural “pointilist” effect resulting from the visual cognitive system). Of course this is all completely lost in photos. Needless to say, getting a 1080p LCD to resemble a CRT at every possible viewing distance is an impossible task.
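The blending-at-a-distance argument can be illustrated with a toy weighted average over neighbouring scanlines. The kernel and the edge handling here are arbitrary assumptions standing in for real optics, not a model of any particular CRT:

```python
def perceived(rows, kernel=(0.25, 0.5, 0.25)):
    """Blend each scanline's emitted light with its neighbours' (edge
    lines clamp to themselves). A crude stand-in for the optical mixing
    that happens between the screen surface and the eye at a distance."""
    n = len(rows)
    return [sum(w * rows[min(max(i + k - 1, 0), n - 1)]
                for k, w in enumerate(kernel))
            for i in range(n)]

emitted = [1.0, 0.0, 1.0, 0.0, 1.0]   # bright lines with dark gaps
print(perceived(emitted))             # interior gaps fill in toward 0.5
```

Up close the gaps read as pure black; after the blend, the interior line/gap contrast collapses, which is roughly what "smoother at a proper viewing distance" means.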

I will grant that the transitions from white to black are somewhat more abrupt than what you observe on a CRT up close, but that’s the kind of detail that you need 4K or higher resolution to capture accurately, and which is greatly exaggerated by adding blurs.

So yes, your example does look closer to the CRT shot in respect to the bloom that it shows, which is precisely the problem, because that shot is overbloomed, whether from the camera or the CRT itself. When it comes to the actual mask/phosphor structure, I think it’s obvious that my settings are doing a much better job of capturing what’s going on. On an actual CRT the mask/phosphor structure should be even stronger than what’s going on in my images, but I’m already pushing the limit of what my display is capable of regarding brightness.

Now let’s see those test image patterns from Fudoh’s 240p test suite. Then we can talk about something other than opinions.


No I don’t mind that you are criticizing my images. They are beautiful and I don’t need you to confirm that. In fact, considering how different our tastes are, I might want to change whatever you liked about any of them :stuck_out_tongue:

I just don’t like your preset, and I like even less that you are trying to sell it as ‘objectively superior’, ugly and unrealistic as I find it.

What do you want me to do with fudoh’s suite? I can’t recall any prior requests.

Impossible to get words of reconciliation? I can’t see what that has to do with fudoh, but anyway… how many times do I have to tell you that I “admire and respect your passion, knowledge and technical approach”? I think this one would be the 4th. What I don’t admire nor respect is arrogance.

Only if “objective quality” means prioritizing ugly black bars over beautiful pixel art.

Yes they did and I’m obviously not talking about how games were made, but how they were released and meant to be played by their makers, and that’s on an arcade monitor, not a video/broadcast one. Please note how I don’t use the word ‘professional’ to make a distinction between the two. They are both professional. Different professions though.

As a photographer (and videographer), I understand that a properly configured camera can be quite faithful to the real world. I will not straw-criticize anything based on how you capture your footage. I have seen hundreds of gorgeous shots taken with cameras and phones by CRT users, I actually posted some on the original thread.

Do I also have to tell you again that I would love to see your setup with my own eyes? I’m really curious about it, and a photo or a video would give me an idea. It’s possible indeed that the images that you post here don’t do the real thing justice. So please, let’s see it.

Haha yeah sure whatever. I can only really understand the images you provide, you see? Magical viewing distances, cd/m2 values and all those other abstract things that your preset requires to look alright aren’t something that my imagination can implement accurately.

For the nth (and last) time: NO. It’s not bloom. It’s s-h-a-r-p-n-e-s-s. Your preset exhibits this digital, black-line-on-raw-pixel appearance, which is of course particularly obvious up close but also perfectly noticeable in the full size images you have posted. Now you can say all you want about how a correct, magical viewing distance will make up for that, and I will keep calling that charlatanry, at least until you provide some evidence to back it up.

alright guys, you’ll just have to agree to disagree at this point, as it doesn’t look like we’re getting any new arguments/information, just squabbling.

If we can keep talking about stuff, I’m all for it, but if it’s just personal digs and back-and-forth pecking, nothing good will come of it, so I’ll have to lock the thread.

@hunterk - yeah man good idea to split :+1:t3: And sure, I can be more politically correct. Also I think this new one should actually start with an earlier post, this one

I kept it in the last thread because it was on-topic there and included pictures and settings, but it is helpful for context here, so I copied the contents into the first post.

I just don’t like your preset, and I like even less that you are trying to sell it as ‘objectively superior’, ugly and unrealistic as I find it.

You seem to not understand what I mean by “objectively superior.” I’m talking about qualities that can be measured. I’m not arguing that my opinion is superior because that would be completely fruitless. I think you’re just completely misreading the entire situation based upon this one misunderstanding.

Only if “objective quality” means prioritizing ugly black bars over beautiful pixel art.

Again, there is such a thing as objective quality and signal fidelity, and when video professionals use these terms they are referring to the same thing, which has a single definition they can point to. We continue to talk past each other because of this.

Yes they did, and I’m obviously not talking about how games were made, but how they were released and meant to be played by their makers, and that’s on an arcade monitor, not a video/broadcast one. Please note how I don’t use the word ‘professional’ to make a distinction between the two. They are both professional. Different professions though.

I’m not sure what that link is meant to prove. I think you’re also really underestimating just how sharp an arcade monitor could be in person when well maintained and well calibrated. Furthermore, (repeating myself), I think it’s very likely that developers would design the graphics to look as good as possible using the highest quality signal available on the console, because they were competing with each other over the quality of the graphics and this was a big part of the console wars. Even when it’s clear that composite artifacts influenced the design of the pixel art, it’s not obviously clear that composite is superior to RGB; it’s still a trade-off.

Haha yeah sure whatever. I can only really understand the images you provide, you see? Magical viewing distances, cd/m2 values and all those other things that your preset requires to look alright aren’t something that my imagination can implement accurately.

And what I have been telling you repeatedly throughout this discussion is that you can’t get an accurate idea of what one person’s display looks like by looking at screenshots on your own display, whether they are direct capture or through a camera. Photos can only give us a very rough idea of what things look like in person, and can be misleading. A photo or video does not give you an accurate idea, it just provides more fuel for baseless criticisms. My cell phone camera is simply not up to the task. And a lot of the “great photos of CRTs” are actually very misleading for the reasons which I mentioned earlier. CRTs look vastly different depending on viewing distance because of the way they emit light. In fact, good photos of CRTs are rare, and it is very difficult to get a good photo of a CRT in terms of capturing what is actually going on when seen in person. The ONLY way to get an accurate idea is to see the display in person, or to follow the method I used to arrive at these particular settings. Particular settings are going to vary based on the display being used. I think I mentioned earlier that my display is on the brighter end of LED-lit LCDs, but it’s by no means the brightest out there.

The reason I keep mentioning the cd/m2 value is because it is objective. It provides you what a photo cannot. A peak brightness of 175 cd/m2 is roughly equivalent to a typical CRT, and 200 cd/m2 is almost too bright in a dark room. In other words, my display is matching or exceeding the objective brightness of a CRT.

Since we’re just going in circles with this, I propose that it’s only going to be useful for us to discuss methods rather than specific examples. I’ve asked a couple times for you to provide shots using test image patterns from Fudoh’s 240p test suite, then I can see what’s going on with your settings, objectively. I’m almost certain that there is too much contrast in your images, again objectively and not as a matter of opinion. Of course, seeing those images on *my* display will only reveal if those settings are accurate on my display. I still won’t know exactly what your display looks like, especially if there are differences with how our displays are calibrated.

As far as opinions go, we’ve exhausted the subject. Everyone is well aware of what our preferences are, and there is no point in discussing them further.

Would you please also copy my reply to it then? The one with the Mario World side-by-side.

ffs, I had hoped it would move it to the proper location based on posting date, but that clearly isn’t the case. :confused:

I’ll try to find a way to fix the order, or at least copy/paste it into place somehow

Haha you can just place it anywhere you like before my first reply here.

I am finding this thread very interesting. I would love for Nesguy to take a picture of a CRT running a game, and then a pic of an LCD running the same game with his shader preset. Also, I’m not too sure what Squalo should do with fudoh’s, but I would love for those tests to be shared here as well.


Here’s what I did to arrive at my settings: I calibrated my display to the sRGB standard. Not much had to be adjusted from the sRGB preset on my monitor. I adjusted the color temp somewhat, and set contrast to 80, which is the highest it goes on my display without resulting in clipping.

Next, I maxed out my backlight setting to 100% to get all the brightness I could get out of the display.

My goal is to get the mask as dark as possible while maintaining around 200 cd/m2 peak brightness.

I applied the scanlines and adjusted the parameter settings until each scanline was as close as possible to a 50% reduction in brightness per visible line (1:1).

I increased “mask light” to 2.00 to eke out even more brightness, then I gradually lowered “mask dark” one step at a time, taking measurements of the white screen using a light meter. I stopped lowering mask dark when my measurements started fluctuating between 175-200 cd/m2.

Next, I increased “mask fade” until the scanlines over pure white were no longer obviously visible at normal viewing distance, then lowered “mask fade” slightly until they were visible. This adds a bit more brightness, so I then returned to “mask dark” and lowered it a step at a time, taking measurements, until I arrived at 175-200 cd/m2 again.
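The mask-dark step above is essentially a feedback loop: lower the setting, meter the white screen, stop before dropping under 175 cd/m2. Here's a rough sketch of that logic, with an invented linear "meter" standing in for a real light-meter reading; the step granularity is also an assumption for illustration, not a real shader parameter:

```python
def calibrate_mask_dark(measure_cd_m2, steps=16, target_low=175.0):
    """Lower 'mask dark' one notch at a time (expressed in 1/steps
    increments) until the next step would push the metered white-screen
    luminance below target_low cd/m2. `measure_cd_m2` stands in for a
    physical light-meter reading at a given mask_dark value."""
    setting = steps
    while setting > 0 and measure_cd_m2((setting - 1) / steps) >= target_low:
        setting -= 1
    return setting / steps

# Fake meter for demonstration: luminance falls linearly with the setting.
result = calibrate_mask_dark(lambda m: 250.0 * m)
print(result)   # stops at 0.75, where the fake meter reads 187.5 cd/m2
```

The same loop shape applies to the mask-fade pass described next: adjust one step, measure, stop at the boundary of the target window.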

With this pattern, all the bars from D to F should still be clearly visible, which they are on my display with the settings I’m using.

Just for kicks: you want the three squares in the bottom right to appear as a solid black rectangle, with just one of the vertical rectangles slightly visible.

You should be able to see all the bars, which I do on my display.

This is the white screen pattern. On my display I’m consistently measuring between 175-200 cd/m2.

There are six vertical bars in this image which may not even be visible depending on your display. Three to the right of the greyscale pattern, and three to the left. They should all be visible, with the innermost bars just barely visible, since they are supposed to show blue at a value of 1 IRE, the lowest possible value. On my display, the innermost bars are barely visible - I actually have to zoom in quite a bit to confirm it, but the bars are there. This is normally only detectable with the right equipment.
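For context on why a 1 IRE bar is nearly invisible: assuming the pattern is encoded with studio-swing 8-bit levels (16 = reference black, 235 = reference white; an assumption about the encoding, not something stated in the thread), 1 IRE lands only a couple of code values above black:

```python
def ire_to_8bit(ire, black=16, white=235):
    """Map an IRE level (0 = reference black, 100 = reference white)
    to an 8-bit code value, assuming limited/studio-range video."""
    return round(black + ire / 100 * (white - black))

print(ire_to_8bit(1))    # 18: just two code values above black
```

On a display with a crushed black level, codes 16-18 all render identically, which is exactly what this PLUGE-style pattern is designed to expose.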

All of the above test images look good on my display.


C’mon… I didn’t misunderstand anything. Just yesterday, you wrote:

I will leave that there.

It obviously is meant to prove that arcade games very often came paired with arcade monitors. In the example provided, Capcom and the SF3 devs clearly chose (yes… Capcom, not some random guy from the internet) that screen to parade their game. Why is that so hard to accept? So BVMs, no matter how much you like them or how big of a meme they became when people started buying them at cheap prices, aren’t/shouldn’t be a standard for old games.

The rest of your post tells me that you are not willing to show your setup. Fair enough, I will not insist. Nor will I believe that it looks phenomenal and so much better than the images posted.

My monitors, all of them, are Spyder-calibrated, like I already told you. Which means that if yours are correctly profiled as well, we should see a similar image. Which is precisely why industrial standards for screen calibration exist, as you should know. Consistency among people for whom image quality is important, for whatever reason.

And man, are you persistent. I said I don’t remember having been asked to post any fudoh screens. I would be happy to oblige, but again: what is it exactly you want me to post? Pluge, color bar, gray ramp?

Please.


You really do need to chill out. I’m also going to assume that you posted before I replied with my test images and give you the benefit of the doubt. You’ve done a terrific job of cherry picking which part of my posts to reply to which leads me to believe that you are arguing in bad faith, but I’m trying to move past that so we can have a fruitful discussion.

I said that I happen to think 1 and 2 are the same thing, but that’s not the same thing as me arguing that this is objectively true, which is what you seem to think. So yeah, the whole thing is predicated on a giant misunderstanding on your part, and you just won’t let it go for some reason.

It obviously is meant to prove that arcade games very often came paired with arcade monitors. In the example provided, Capcom and the SF3 devs clearly chose (yes… Capcom, not some random guy from the internet) that screen to parade their game. Why is that so hard to accept? So BVMs, no matter how much you like them or how big of a meme they became when people started buying them at cheap prices, aren’t/shouldn’t be a standard for old games.

That’s one specific example, which again doesn’t tell us much of anything that we can generalize from. I also am not seeing the part in that link where it explains that the pixel artists themselves chose that particular screen or where they explained that the specific qualities of that particular screen informed their work or why the qualities of that particular screen are essential to properly displaying their designs. I’m just not seeing any of that in the link provided; maybe I’m missing something. There are lots of other factors that go into choosing an arcade monitor; namely, cost. We simply don’t know what they would have chosen as a display if this wasn’t a factor. I could just as easily point to the monitor used in the Toys R Us SNES display unit as “proof” that Nintendo wanted their games displayed in gloriously sharp RGB. We really can’t infer much of anything regarding the artist’s intentions from such examples.

And man, are you persistent. I said I don’t remember having been asked to post any fudoh screens. I would be happy to oblige, but again: what is it exactly you want me to post? Pluge, color bar, gray ramp?

from 23 hours ago, do a search with this text: “Can we see some test images from Fudoh’s 240p test suite using your preferred settings? I’d like to see more people using Fudoh’s test suite for adjusting black level, contrast and color; it’s a really great tool.”

If your display is calibrated to the same standard as mine, it still doesn’t mean that we’re seeing the same image on our displays, because our displays have different specs. So you can truly only judge these things in person, by seeing someone’s display or by following the method they used on your own display (which I explained above). The settings you actually arrive at following this method will most likely differ from those I’m using due to the fact that we’re using different displays. Other than buying a plane ticket to Denver, the only way you can form an accurate opinion is by following the method yourself.

They commercialized the game together with that cabinet and monitor; what does that tell you? I’m also pretty sure that whatever stick and buttons were used, Capcom meant their game to be played with them. It’s one example, yeah; how many do you need? 10? 27? And then there’s stuff like Sega’s Aero/Astro City:

AstroCity_Cabinet

Sure, that is no doubt a BVM :roll_eyes:

That request of posting Fudoh with my preferred settings wasn’t there when I read the post, I believe. Maybe you edited it afterwards, maybe I missed it. Sorry if I did.

Your Fudoh internal capture screens are completely useless: your whites can never be white under those settings, and they will appear grey on any screen that isn’t madly backlit. For the rest of us, you should post camera photos of your cranked-up screen, and clearly you don’t want to do that. Your methodology is so broken it renders the discussion moot.

So I will post Fudoh’s gray ramp (from the MegaDrive suite, which is the one I have at hand) just to prove that contrast is fine in my setup, and that my whites are actually white. And I will leave it there. I don’t feel like going on with this argument, at least for the time being.