Will CRT shaders ever match a real CRT?

It is highly disingenuous to say that, because it all depends on how much brightness headroom your display has available. We are at the point now where you can buy a 10,000-nit TV and DisplayHDR 1400 monitors.

There are also solutions like G-Sync Pulsar and the CRT Beam Simulator, which aim to solve the temporal smoothness and clarity issue even more efficiently.
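For anyone unfamiliar with how those approaches differ from plain BFI, the rolling-scan idea can be sketched in a few lines of Python. This is only an illustration of the concept, not any vendor’s actual algorithm; the row count and subframe count are made-up assumptions:

```python
# Illustrative sketch of rolling-scan BFI ("CRT beam simulator" style):
# instead of blanking the whole frame, each high-refresh subframe lights
# only a moving band of rows, mimicking a CRT's scanning beam.
# The numbers below are assumptions chosen for illustration.

def lit_rows(subframe, total_rows=240, subframes_per_frame=4):
    """Return the (start, end) row range lit during one subframe."""
    band = total_rows // subframes_per_frame   # height of the lit band
    start = subframe * band
    return start, start + band

# Over one emulated 60 Hz frame shown on a 240 Hz panel (4 subframes),
# the lit band sweeps the full frame height exactly once:
bands = [lit_rows(i) for i in range(4)]
print(bands)  # [(0, 60), (60, 120), (120, 180), (180, 240)]
```

Because every row is lit for only a fraction of the frame, each gets the same short persistence as full-frame BFI, but the light is spread across the whole frame time instead of concentrated in one flash.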

Please do some research before regurgitating false or outdated information.

Relax, I thought you were talking about the BFI option in RetroArch, not about monitors that have similar built-in functions. Obviously, if a monitor is made with BFI in mind, it will also have the means to balance out the image.

And yeah, I look forward to G-Sync Pulsar, and I’m holding off on getting a new monitor until this tech matures.

No, I was talking about all BFI.

That’s not necessarily true. BFI always darkens the image to some degree; with BFI off, the display would always have a higher peak brightness, whether it’s hardware or software BFI.

With accurate mask and scanline emulation also darkening the image, you need as much brightness from the display as possible.
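To put rough numbers on the headroom argument: BFI, mask emulation, and scanlines each multiply down the light output, so the display’s peak brightness has to cover the product of all three. A quick Python sketch, where every attenuation factor is an illustrative assumption rather than a measurement:

```python
# Rough brightness budget for BFI + mask + scanline emulation.
# Each stage passes only a fraction of the light, so the required peak
# brightness is the target divided by the product of those fractions.
# All factor values below are illustrative assumptions, not measurements.

def required_peak_nits(target_nits, bfi_duty, mask_transmittance, scanline_coverage):
    """Peak brightness needed so the perceived image still hits target_nits."""
    return target_nits / (bfi_duty * mask_transmittance * scanline_coverage)

# Example: 100-nit target, 50% BFI duty cycle, a mask passing ~1/3 of the
# light, and scanlines lighting ~60% of the screen area:
peak = required_peak_nits(100, 0.5, 1 / 3, 0.6)
print(round(peak))  # 1000
```

With those (assumed) factors you already need a display in the 1,000-nit range just to reach a modest 100-nit picture, which is why the available headroom decides whether the combination is viable.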

You don’t need to tell me to relax, that has no relevance to the topic or discussion.

So what exactly did I say that was so wrong in my original post, then? I said that exact thing (even referring to RetroArch only), and your reply was pretty strict about how wrong I was.

I already responded to that statement. I don’t think there’s a need to try to explain it any better than I did the first time. I would ask you to read it over.

You said BFI is not a solution. That’s not true.

You said it darkens the colours to the point where it’s impossible to balance them by increasing brightness/contrast, etc.

That is also not true at all.

I said it all depends on the amount of brightness headroom you have available. Why do I need to repeat myself? What about that don’t you understand and why?

You have to repeat yourself because your tone is so adamant and strict about this, making it look like I said something factually wrong and that I’m spreading false information and all that bullshit.

Yet you admitted what I said in my previous post: that BFI darkens the image. That’s a fact we both agree on. The next part, where I said “it’s impossible to balance it out,” was simply my opinion. Sure, you can have more headroom to adjust depending on the monitor/TV, but I’ve tried this on quite a few so far (I have two myself) and was never happy with the result. And you also confirmed this by saying “BFI off would always have higher peak brightness.”

So I still stand by what I said: BFI isn’t a solution (for me) because it (more or less) darkens the image.

This is getting tiring. You didn’t say it was just your opinion the first time. Your choice of words was very absolute. Facts are facts.

This is a fact.

I don’t think there’s anything more that needs to be said about this.

In the future you can qualify your statements a bit better so as to avoid confusion and possible misrepresentation.

So do have a good one, and I really hope you understand where I’m coming from; it would be a travesty if no one participating in this thread came away with a better understanding of things from my post.

This is all about sharing and growing knowledge, not about “I am right and you are wrong, so let’s battle it out” or trying to compete.

If you’re having trouble understanding the concepts then you can read some of these threads:

These pics were taken with BFI On.

Higher-end TVs have a form of BFI that is adaptive and maintains brightness very well (don’t let that scare you; it doesn’t add input lag). To me, it looks as good as true BFI.

The only issue with CRTs is that after so many years they pose a health hazard. It’s no coincidence that, e.g., with older PC monitors, the computer science teachers at schools and office employees who used those old monitors for work used to receive a hazardous-work benefit.

Yes and no. Not all CRTs were made with the same phosphor pattern, and not all CRTs have the same color purity/output. So, in my opinion, CRT shaders are strictly tied to the brand and TV model they attempt to recreate… mostly with great success.

I used to fix CRTs, and I’ve noticed that CRT shaders that recreate “color bleed” or other artifacts like blurring, softening, etc., are recreating the effects of a failing CRT: one with a yoke, electron gun, or capacitor failure, or with purity/convergence ring misalignments caused by Earth’s magnetic field or other magnetic forces plus bad shielding (like putting a CRT next to stereo speakers, or simply placing your CRT facing the wrong cardinal direction… not kidding with this one).

CRTs in good shape don’t present any of these picture artifacts; they might differ in phosphor patterns, but everything else I’ve seen is just a recreation of typical consumer-grade CRT failure. I sometimes wonder if someone will make a CRT shader with pincushion distortion or bowing… to recreate the feel of a CRT that’s been in service for thousands of hours.

That’s true for the tube itself, especially with bright red/blue bleeding to the right. However, the composite video signal always causes blurring and artifacts even if the TV is working correctly. The blurring and artifacts are different depending on the console and the TV. Aging capacitors in the TV or console can cause the signal to get blurrier over time.

Not true; composite will blur, bleed, and soften regardless of whether the CRT is failing or not. 90% of consoles in the 80s/90s used a composite signal. RGB was used mostly on 16-bit computers like the Amiga, and some shaders are even sharper than what the 1084S had to offer.
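The reason composite softens the picture on a perfectly healthy tube is that its limited horizontal bandwidth acts like a low-pass filter on every scanline. A crude moving-average sketch in Python shows the effect on a hard edge (the 3-pixel kernel width is an illustrative assumption, not a calibrated model of the signal):

```python
# Composite video's limited bandwidth behaves like a horizontal low-pass
# filter on each scanline: sharp edges smear sideways regardless of the
# tube's condition. A crude box-filter illustration (kernel width is an
# illustrative assumption):

def lowpass_row(row, width=3):
    """Box-filter one scanline of pixel values; edges are clamped."""
    out = []
    for i in range(len(row)):
        lo = max(0, i - width // 2)
        hi = min(len(row), i + width // 2 + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

sharp = [0, 0, 0, 255, 255, 255]   # hard edge, as RGB would deliver it
print(lowpass_row(sharp))          # the edge spreads over neighbouring pixels
```

An RGB connection skips that filtering, which is why the same game looks noticeably crisper over SCART RGB than over composite on the very same set.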

Depends on where you lived. In Europe the situation was a bit better with the universal SCART connection, which supported a multitude of signals, even RGB. I mainly played this way, and the image was crystal clear, aside from the TV screen itself, of course.

Is RGB SCART like component signal quality? Like, better than S-Video?

RGB is a component signal, as it divides the picture into its components. The YPbPr component video signal is just a different way of encoding the same data and thus, to my understanding, should result in about the same quality, maybe comparable to VGA, given that the cable and original signal are equivalent in quality. That’s about it; I don’t know the details in depth, but I highly doubt we would notice a difference at all.
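The “same data, different encoding” point can be shown directly: YPbPr is a reversible linear transform of RGB. A minimal sketch using the BT.601 coefficients (these are the standard SD coefficients; actual consoles and TVs add analog filtering on top of this math):

```python
# YPbPr component video is a reversible re-encoding of RGB (BT.601 form):
# Y carries luma, Pb/Pr carry colour differences. The math is lossless;
# real-world quality differences come from cables and bandwidth limits.

def rgb_to_ypbpr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    pb = (b - y) / 1.772
    pr = (r - y) / 1.402
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    b = y + 1.772 * pb
    r = y + 1.402 * pr
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# Round trip: the original RGB values come back (up to float rounding).
y, pb, pr = rgb_to_ypbpr(0.8, 0.4, 0.2)
r, g, b = ypbpr_to_rgb(y, pb, pr)
print(round(r, 6), round(g, 6), round(b, 6))  # 0.8 0.4 0.2
```

So, in principle, RGB SCART and YPbPr component carry equivalent information; in practice, the analog bandwidth of the equipment on each end decides which one looks sharper.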

Looking online, I found this interesting read https://www.avforums.com/threads/component-vs-rgb-scart-vs-svideo-vs-composite.43445/ with the conclusion (remember, this is just one person’s opinion):

Component > RGB Scart <> Svideo > composite

This is the best order on interfaces but connectors / engineering / transposing formats can certainly reverse at least the middle two Svideo and RGB Scart.

RGB is better than YPbPr component in most cases, however the difference is very small. Component supports HD, SCART does not.

A comparison of true RGB and composite

I have two CRTs running: a normal TV set via composite, and a VGA monitor in super-resolution 240p, so the VGA has scanlines and so on… I think VGA is another beast of its own; it looks really sharp and perfect, but transparent dithering effects, like in some Mega Drive games, look better on the TV-set CRT.

With a real CRT shader, who knows… I once asked here for a core that captures the video from other programs so we can use RetroArch shaders on them… and we got that too.

The thing I always say about this is: a picture of a light bulb will never be a light bulb.

However, if by “match a CRT” you mean “soften video game graphics the way a CRT does,” we’ve been able to do that for quite a while, and we’re getting better at it all the time. If you mean “match it in motion clarity,” well, we’re getting there. If you mean “both of those things at the same time,” it’ll probably be a while, but we’ll get there at some point.
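“Softening graphics the way a CRT does” in shader terms usually boils down to things like horizontal blur plus per-row scanline modulation. A toy grayscale sketch of the scanline half of that (the 0.4 strength value is an illustrative assumption, not what any real shader preset uses):

```python
# Toy scanline emulation: darken every other row of a grayscale frame,
# mimicking the dark gaps between a CRT's scanlines in 240p content.
# The 0.4 strength is an illustrative assumption.

def apply_scanlines(image, strength=0.4):
    """Darken every odd row of a 2D list of 0-255 grayscale values."""
    return [
        [px if y % 2 == 0 else px * (1 - strength) for px in row]
        for y, row in enumerate(image)
    ]

frame = [[200] * 4 for _ in range(4)]   # flat 4x4 test frame
out = apply_scanlines(frame)
print(out[0][0], out[1][0])             # 200 120.0 -> odd rows dimmed
```

Note how even this trivial version costs brightness: the dimmed rows are exactly why the headroom discussion earlier in the thread matters.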

Nevertheless, there’s some truth to the CRT purists who say stuff like “it’s just not the same,” etc. Yeah, because a CRT is more like a light bulb than it is like an LCD. If you film a CRT TV and show it on a 4K+ display, you’re still not going to look at it and think, “omg, that’s a real CRT in the room with me.”

It’s still just a picture of a CRT. And that’s okay.

Our eyes create an illusion, distorting the way we perceive the white light of a light bulb in person, just like with stars. An instant picture, or even a video, is totally different.