CRT Squabblin'

@Brainbin74 mentioned to me in a previous post in another thread that he is French and that they use SECAM there. He gave me the whole rundown of why the SECAM system is superior to all the others and why I couldn’t expect him to use, be excited about or relate to the image presented in my CyberLab Mega Bezel Death To Pixels shaders. This was partly because he had used an outdated screenshot for comparison, but it was quite an interesting and enlightening exchange. I’m going to post this piece I just read, which we can all use to supplement the immense trove of knowledge we have about TV standards. I think it’s good to learn from people who experienced the same things we did in very different ways; the final image in their heads, and the goal in their minds of what constitutes their ultimate and most suitable shader preset, might vary considerably from what we might envisage.

http://www.differencebetween.net/technology/protocols-formats/difference-between-ntsc-pal-and-secam/

3 Likes

This is another gem I just came across. It actually briefly touches on what @Nesguy was saying about the difference in appearance of the aperture grille between different sized CRTs, at the 8:45 mark. Watch the whole thing if you’re a true retro gaming fan!

5 Likes

If the motion blur of LCD or QLED doesn’t bother you, and the blooming doesn’t bother you, and not having the same depth of black as an OLED doesn’t bother you, then I would say it’s a good choice.

But a friend who has a QLED regrets his choice after seeing my OLED TV with retro games. His set is blurry in motion, nothing like a CRT. At his place I saw blooming halos around every white letter in text, and I also saw that it doesn’t have the absolute blacks of an OLED. In short? He regrets his purchase for retro gaming and PC games.

You say QLED has more contrast, and yes, that’s true, but I already find my TV too bright in HDR. It takes some getting used to. I told you that with BFI I lose 40% of the luminosity, and I wasn’t even at full contrast for RetroArch in Rec.709; I’m between 70 and 80. Rec.709 on OLED uses only the 120-nit calibration standard, and I already have as much contrast as my Sony CRT, so imagine with nearly ten times the brightness in HDR at 1000 nits, and without BFI? I would have a supercharged CRT; I would see at least four times more contrast than my CRT.

So if you’re playing in a garden in daylight? Yes, the QLED will be brighter. But the truth is that OLED is still superior for games and movies. This is my opinion, and I have a 1080p plasma, a Panasonic EZ950 OLED, and a Sony CRT TV to compare. I wouldn’t buy a QLED because of the flaws I mention above: it’s blurry, and I wouldn’t build a MAME cab with a QLED, at least not me, no thanks. :joy: This OLED is pure magic. My TV in action

OLED can’t get bright enough for full strength masks. Try adjusting the mask strength to 100% and report back.

Also, OLED uses a WRGB subpixel structure, and it’s impossible to get any masks to look right other than the black-and-white ones. The subpixel-respecting masks all need a three-color subpixel structure (RGB, GBR, GRB, etc.).
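To illustrate (a generic sketch, not code from any particular shader): a subpixel-respecting mask lights one channel per output column on a three-wide repeat, which only lines up when the panel’s subpixels really are plain R, G, B triplets:

```glsl
// Generic sketch of a 3-wide RGB stripe mask tiled in
// output-pixel space (not from any specific shader).
vec3 rgb_stripe_mask(vec2 fragcoord)
{
    int column = int(mod(fragcoord.x, 3.0)); // 0,1,2 across the screen
    if (column == 0) return vec3(1.0, 0.0, 0.0); // light the red subpixel
    if (column == 1) return vec3(0.0, 1.0, 0.0); // light the green subpixel
    return vec3(0.0, 0.0, 1.0);                  // light the blue subpixel
}

// usage inside a fragment shader:
// vec3 masked = source_color * rgb_stripe_mask(gl_FragCoord.xy);
```

On a WRGB OLED the extra white subpixel makes the hardware repeat four wide, so a three-wide mask like this drifts out of alignment with the physical subpixels.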

2 Likes

Yes, and I don’t see why RGB would be impossible to do. Just because no shader creator has taken WRGB into account yet doesn’t make it impossible: shift the mask around the white diode and the problem is solved. 4K-specific shaders don’t really exist, so ones that take WRGB into account are not a priority. Most RetroArch users have RGB LCDs; fewer have OLEDs. But a mask can be adapted very well, especially at 3840×2160.

The current masks target 720p, 1080p or 1440p; nothing is made specifically for OLED. That doesn’t mean it won’t exist in the future. As soon as shader programmers own OLEDs, WRGB-optimized shaders will start to appear, along the lines of the sketch below.
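For what it’s worth, the idea proposed here could be sketched as a four-wide mask that skips the white diode. This is purely hypothetical code, assuming a W,R,G,B subpixel order and 1:1 output-pixel mapping, not something from an existing shader:

```glsl
// Hypothetical WRGB-aware stripe mask: tile with a period of 4
// subpixel columns and leave the white subpixel dark so only the
// R, G and B columns are driven. The W,R,G,B ordering and the
// pixel-exact mapping are assumptions, not verified panel behavior.
vec3 wrgb_stripe_mask(vec2 fragcoord)
{
    int column = int(mod(fragcoord.x, 4.0)); // 0 = W, 1 = R, 2 = G, 3 = B
    if (column == 1) return vec3(1.0, 0.0, 0.0);
    if (column == 2) return vec3(0.0, 1.0, 0.0);
    if (column == 3) return vec3(0.0, 0.0, 1.0);
    return vec3(0.0); // white subpixel left unlit
}
```

In practice a shader can only output RGB values; the panel’s own processing decides when the white subpixel fires, which is part of why this may be harder than it looks.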

Except that’s not the case yet; 98% of them have LCDs. So no problem for me: I play with masks, and I don’t see the problem you mention, except when examining the structure from an inch away with a magnifying glass. From my couch? It does the job just like an LCD monitor.

And what about OLED smartphones? Do they have problems with retro gaming? With RGB masks? If so, millions of people who play with RetroArch and shaders should be noticing a problem and reporting it, right? But I don’t see people complaining about their games on OLED smartphones. And there are no QLED smartphones, are there?

PC games are RGB too; are they displayed badly on OLED? The diodes must be so fine that no mask will show a shift, unless you create a mask at 20K downscaled to 4K, where the precision of the structure would actually matter. I looked at the RGB mask in CRT-Royale, and it’s far from accurate. It’s even misaligned on its own grid: 64×64 px and 512×512 px PNGs, so we are far from the surgical precision of OLED diodes, and not suited to 4K. Take a low-resolution tileable PNG and stretch it to 4K? The mask is dirty; the R, G and B aren’t even 100% pure.

There is dirt everywhere: R should be at 100%, but with the eyedropper I measure 80% in the stained areas of the mask. So I don’t find this coherent in terms of precision. In the “what shaders can do” thread I even cleaned the CRT-Royale mask; for proof, take a look. So much for precision. Cleaning the CRT-Royale RGB mask

Give me something to test and I’ll be happy to take videos, screenshots, and photos from near and far. Do you see a problem here with guest adv rev2? Because I don’t. I compared with an LCD and I don’t see any difference, except that my LCD has grey blacks and it’s blurry.

Ristar on OLED (WRGB) with guest adv rev2. Download this little video and play it in VLC; it’s better that way, I think.

https://drive.google.com/file/d/12W4QAgCJ1PV19KELyjUWbMDSEG5AWCcy/view?usp=sharing

Thanks @Nesguy, always a pleasure.

1 Like

@Brainbin74

Okay, what mask pattern are you proposing to use for WRGB to get uniform subpixel spacing on an OLED? I don’t think this problem is as easy to solve as you seem to think. Particularly since the white subpixel is required to attain the desired contrast… it’s not an easy thing to fix.

And there’s still the brightness thing, there’s no way OLED can get bright enough for full strength RGB masks.

By all means though, please prove me wrong.

2 Likes

Where do you get this stuff? The masks can be used at any resolution with the right settings.

I thought the whole point of this is that we were striving to be as accurate as possible? That’s cool that you don’t notice the problem, enjoy your compromises. :wink:

1 Like

So interestingly enough, I hooked my laptop up to a 55-inch Samsung LCD screen, and even on that I was getting the same extreme bloom problem on reds and blues. Literally the exact same problem I was having on my plasma. What fixed it was changing LUT Colors to NTSC. I don’t know why this problem doesn’t happen on smaller screens like my laptop, or on the 32-inch LCD in my arcade cabinet either… but as soon as you blow the picture up onto larger TVs, it happens.
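In case anyone wants to make that fix stick, here’s a rough sketch of what the saved preset override might look like. Note that “LUT Colors” is just the on-screen label in the guest shader; the underlying parameter name (TNTC below) and the index of the NTSC table are assumptions and may differ between revisions, so save a preset from the menu and check the value it actually writes:

```
# Hypothetical saved-preset override for the guest shader.
# "LUT Colors" is the label in the shader parameters menu; TNTC is
# an assumed underlying parameter name, and 3.0 an assumed index
# for the NTSC table -- verify both against a preset saved from
# your own menu before relying on this.
TNTC = "3.000000"
```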

The C1 has recently dropped in price due to the G1 coming on the scene. A 65-inch is now $1,700. Granted, the G1 is supposedly the brightest OLED panel on the market, which again would be better for cranking masks up, but at $2,799? Eh, I’m not sure.

Anyway, just wanted to let anyone know in case they run into the same issue.

3 Likes

Hi, you don’t give any details about the shader that causes the problem. Do you use a slot mask? An aperture grille? Something else? Which shader do you use? There are hundreds of different shaders, and they can all handle things differently. Without the details it’s impossible to understand and help.

I too have questions about my WRGB OLED and RGB masks, but for my part I play on a Panasonic EZ950 OLED, which has a WRGB structure, with the guest adv rev2 shader and mask 7 (Trinitron aperture grille).

And even with all that information? There’s no answer, because no one really knows or has worked on it…

The CRT-Royale mask is bad; see the post where I clean it and zoom in on it. How is it possible to get optimal quality from this? 64×64 and 512×512 PNGs stretched over 4K, when the base mask itself is bad and dirty? And yet that’s what CRT-Royale does.

These patterns rely on the physical pixel structure of the display monitor, so they need to be tiled using gl_FragCoord (or texCoord.st * OutputSize.xy) so the tiling always matches up.

That’s all I know of a real, concrete formula, from what I saw on hunterk’s blog.
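To make the quoted formula concrete, a minimal sketch of that tiling might look like this; “mask” and “mask_size” are names assumed for the example, not identifiers from any specific shader:

```glsl
// Minimal sketch of mask tiling in output-pixel space, per the
// formula quoted above. "mask" and "mask_size" are assumed names
// for this example, not identifiers from a specific shader.
uniform sampler2D mask;  // small tileable mask texture, e.g. 64x64
uniform vec2 mask_size;  // mask texture dimensions in pixels

vec3 sample_mask()
{
    // gl_FragCoord counts physical output pixels, so the tiling
    // stays locked to the display no matter the game resolution.
    // (texCoord.st * OutputSize.xy would serve the same purpose.)
    vec2 uv = mod(gl_FragCoord.xy, mask_size) / mask_size;
    return texture(mask, uv).rgb;
}
```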

So good luck, my friend. If someone has the knowledge, it’s up to them to enlighten us. Be well.

1 Like

It’s the CRT guest shader, revision 7-27-21, which is why I posted in this thread. I only use slot masks; Mask 1, I believe.

3 Likes

You keep making reference to 1000 nits, but your EZ950 can only hit around 700 nits peak. There is no commercially available OLED that hits 1000 nits; even the 2021 Evo panels that LG provides don’t hit that, AND that’s not full-screen brightness, just highlights (small clumps of pixels). Full-screen brightness on your EZ950 will be about 120 nits (around the Rec.709 standard), so you don’t have a supercharged CRT, I’m afraid.

1 Like

Yes, there is a difference between a Rec.2020 movie’s 1000-nit highlights and the rest of the image. You never have 1000 nits across the whole surface, because HDR films use 1000 nits only for the brightest things: the sun, car headlights. It’s not the whole panel at 1000 nits.

And have you ever watched a movie in Rec.709 with the contrast at max 100 and the brightness at 100? Even taking your value of 700 nits, imagine multiplying a 120-nit Rec.709 movie by nearly six. Do you think you can watch that without burning your retinas? I have a CRT nearby, about 12 ft away. I turn off the lights and I can see how my Sony CRT compares to the OLED. My OLED outshines my Sony, that’s all I can tell you, my friend. Do you have an OLED? Do you have a CRT to compare it to?

I’m not comparing colorspaces, I’m stating the limitations of OLED as a technology in reference to your implication that your EZ950 can hit 1000 nits when it can’t, so I’m not sure why you brought Rec.2020 into the conversation.

Please provide a link to the article your screenshot is taken from, as I’d like to see the measurements where they got that kind of peak brightness, because I assure you that’s incorrect. The Sony BVM-X300, which costs ~$30k, can do it; your consumer OLED cannot.

Yes, I have an LG CX OLED, and no, I don’t have a CRT to compare it to. But again, I wasn’t implying that it doesn’t outshine your CRT in any way, just that your OLED isn’t hitting 1000 nits.

If we’re talking about trying to recreate what a CRT looks like (CRTs hit between 60 and 100 nits full-screen brightness), then your OLED is anywhere from 20% to 100% brighter; not insubstantial, but not earth-shattering either.
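For the record, the arithmetic behind that range, taking ~120 nits full-screen for the OLED against the 60-100 nit CRT figures:

```latex
% Assumed figures: ~120 nits full-screen OLED (about Rec.709 level)
% vs. 60-100 nits full-screen on a consumer CRT.
\frac{120}{100} = 1.2 \quad\Rightarrow\quad 20\%\ \text{brighter than a bright (100-nit) CRT}
\frac{120}{60}  = 2.0 \quad\Rightarrow\quad 100\%\ \text{brighter than a dim (60-nit) CRT}
```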

3 Likes

I don’t know what you’re trying to prove here, but fine. The Panasonic EZ950 measures 820 nits.

I said 1000, you said 700 (you were wrong), then 120 nits.

I’m off by 180 nits, and what does that change for me in the end?

Do you think my eyes are calibration probes? I can’t see the difference between 820 nits and 1000 nits, because other things come into play, such as tone mapping, color fidelity, etc. This is a well-known, serious French site, and it measured 820 nits. And nothing prevents me from forcing HDR in Windows, or forcing HDR on my TV.

I’m telling you that in Rec.709, my OLED at maximum exceeds my CRT. I’m telling you that if I activate HDR, I go to 820 nits and it sears my retinas; that’s all I know. There’s also HDR available in RetroArch now, you know? Which I don’t think I’d use, except to force the mask to 100% with a shader…

There you go my friend, have a good one.

And yours measures at? 665 cd/m²… From the same website, right? No falsification, since you like accuracy.

The BVM-X300? I checked the measurements made on it: nothing but Sony marketing blah blah. Fortunately I didn’t pay $25,000 more than you to get 155 nits more than the LG CX. LG CX at 665 nits vs. EZ950 at 820 nits = 155 nits more for mine. Sony BVM at a claimed 1000 nits (no real measurement found) vs. EZ950 at 820 nits = 180 nits more. So, pay $25,000 for 180 nits… :grin::grin:

1 Like

I was making sure people weren’t under the false impression that consumer OLEDs can hit 1000 nits, which is not true. I’ve been calibrating displays for over 10 years now, so I find it very difficult to not respond when misinformation is spread, as I’ve seen you mention that in a few different threads now.

I wasn’t trying to turn this into a contest, so I’m not sure why you felt the need to bring up the fact that the CX has a lower peak brightness, but yes you are correct.

The X300 was measured at 999 cd/m² in this video at 8:11: https://youtu.be/ESzWY0hW85Y

The $25000 difference isn’t just for 180 nits more brightness, but let’s leave it there.

1 Like

You have been calibrating for 10 years, and? Well, so have I. The only TV I haven’t calibrated is my OLED, because it’s perfect for me in SDR and in HDR, and I don’t want to buy a probe that specifically handles HDR. So if you want to talk about that, no problem, but is this how it goes in a shader thread, ordering me to provide links? I have calibrated my 1080p Panasonic plasma, my CRT, all my TVs; it amuses me.

And as proof, here’s my G20 and my nickname with the date, 2010. I was already calibrating my TV to a delta E of 2 back then, yeah!!! :wink:

Did I mention your TV? Just so you can see that the same site has measurements of both our TVs. Just to prove my honesty, that’s all.

I find it very difficult to not respond when misinformation is spread, as I’ve seen you mention that in a few different threads now.

Because I rely on my calibration probe. You want to be precise? No two panels are the same, and I didn’t measure my OLED in HDR (because I don’t have the probe for that). So I bought a TV sold as 1000 nits, which I couldn’t check myself, and for you that makes me a liar? I can find different measurements of the same TV on different sites. I bought a 1000-nit TV and I said my TV was 1000 nits. (Who is the liar here? Panasonic? Or me, who couldn’t check the real value of my panel?) So please excuse the inconvenience to the thread; I was only answering a question. Have fun and be well.

wow this thread is heating up hahaha

1 Like

Keep it up guys… :slight_smile:

1 Like

Nah, I’d rather we have moderation to keep the community as it was a few months ago. That’s right, we all want new users, but… can anyone link to the meme where Stacy lures new people in and the original people are shooed away?

7 Likes

Yes, well, sorry, what do you want me to say? Someone comes into the guest thread.

He tells me that I’m lying and spreading misinformation everywhere? He tells me he has been a calibrator for 10 years and that he is here to make sure the truth is respected? So I answer him nicely, saying “what are you trying to prove here in the guest thread?” He tells me he has been calibrating for 10 years? So I answer that I have too, and I provide evidence so that he doesn’t go around saying that I’m a liar; that’s unbearable to me, when he doesn’t even know me, doesn’t know what I do or what I’m capable of. That’s all. Have fun, be well.

3 Likes