Can you share your measurement app again please?
Are you referring to Peak and Paperwhite Luminance settings as well as Contrast?
It’s a light meter I picked up for $30 on Amazon from an alphabet-soup Chinese brand: URCERI.
Does the trick!
Yep, although ideally Contrast should be left alone (5.00) to avoid any clipping. A couple notches might be ok, up to 5.50 or so, maybe 5.6 max (with the darkest masks). Those have been my findings so far.
Ok, so now for the downside:
Cranking up the Paper White Luminance and Contrast does soft-clip the colors: the shader’s scanlines and mask keep the shades looking separate, but they all end up at the same RGB values.
The bigger problem, though, is that when we increase Contrast or Paper White we’re raising the black level, and we’ve already destroyed the black level by disabling local dimming and blasting 1000 nits.
This of course ruins the contrast ratio and has a very negative effect on perceived brightness.
So, slotmask is still “use at your own risk” - we really probably want True Black HDR1000 (or whatever it’s called), so give it another 5 years.
A quick local dimming experiment:
Enabled local dimming. Used 240p Test Suite’s white screen to set mask strengths, in conjunction with a light meter. Targeted 220 nits, equalizing the peak brightness with my regular preset. This resulted in:
maskstr = "0.850000"
mcut = "0.800000"
I also had to adjust gamma in/out to 2.4/2.4 from the usual 2.4/2.2.
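For anyone repeating this, here’s a rough sketch (Python, purely illustrative) of the interpolation I’m effectively doing by hand with the light meter. The helper and the example readings are hypothetical, and it assumes brightness falls roughly linearly with mask strength, which is only an approximation:

```python
# Hypothetical helper for the calibration above: measure 240p Test Suite's
# white screen at two mask strengths, then estimate the strength that lands
# on the target (220 nits in my case). Assumes a roughly linear relationship
# between mask strength and measured brightness.

def estimate_mask_strength(nits_mask_off, nits_mask_full, target_nits):
    span = nits_mask_off - nits_mask_full
    if span <= 0:
        raise ValueError("expected brightness to drop as mask strength rises")
    strength = (nits_mask_off - target_nits) / span
    return min(max(strength, 0.0), 1.0)   # clamp to the valid 0..1 range

# Example readings (made up): 320 nits with the mask off, 180 nits at full strength.
print(estimate_mask_strength(320, 180, 220))   # ~0.71, then fine-tune with the meter
```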
Local dimming on, reduced mask strength:
Local dimming off (full brightness), full mask strength:
You can see the horrible backlight bleed when local dimming is off. Honestly, I’m having a hard time deciding which I prefer. The true black looks so good, and is such an essential aspect of CRT performance, but it comes at the cost of lost mask detail. In practice, the lost detail/accuracy isn’t very noticeable when playing a game, so I might prefer the local dimming. IDK, it’s a tough call.
Yes, use Local Dimming. You paid for it, didn’t you? The problem is, you’re using an IPS display, which has horrible native black levels, contrast ratio and blooming performance, so it absolutely has to rely on local dimming to get anywhere near decent contrast or black levels. However, local dimming algorithms vary considerably, so there’s the potential to introduce a different set of artifacts because of that.
So there are times when my TCL QM8751G gently reminds me that it’s only a miniLED, and times it reminds me that it’s not an OLED, but those times are actually in the minority. Most of the time I’m very satisfied with what I have, and I know what other frustrations I would have to deal with, and which things I’ve gotten used to that I might miss out on, if I went back to an RRBBGGX-only WOLED.
So we’re getting closer to an R-G-B tandem with no white subpixel, hopefully with MLA sometime in the future, and that’s when the OLED proposition becomes much more palatable, at least for those with unlimited loads of cash to burn, because you can see how they’re pushing the brightest OLEDs with the best technology as ultra-premium devices.
So that’s out of the question for me, and I’ll happily stick with plebeian miniLED tech. VA is where it’s at, though, with awesome native black levels that are more than enough. I can’t turn local dimming off on my TV because the highest brightness levels are tied to Local Dimming being set to High, but if you ever experienced a TCL QM851G you probably wouldn’t want local dimming turned off either.
So even if you’re not going to purchase one, do take some time and study the TCL QM851G, QM8K and QM9K. The QM751G is just a baby QM851G with 720 zones on a 55" screen and a peak brightness of, I think, about 2,400 nits!
Next year will be the year of RGB miniLED with much higher gamuts and brightness. Let’s see how that goes.
PS, once you start using all of the features of your IPS miniLED, you’ll get used to them eventually and adjust your shader parameters to suit them best. I made the most out of an old, crappy LG IPS by trying to run it at the peak of its brightness and saturation abilities. Why? Because black levels were poor, so if I was going to get the most out of that TV, I had to focus on brightness to achieve the highest contrast.
HDR video? No problem, I reverse-engineered my own solution to watch the HDR videos I had gotten accustomed to, on my not-officially-HDR display!
By the way, after coming across some further information concerning my Cable Matters DisplayPort 1.4 to HDMI 2.1 adapter, I have managed to eke out RGB 4:4:4 8-bit at 100Hz with HDR and VRR enabled. Banding in gradients should be kept to a minimum via Maassoft’s ColorControl, which lets me enable NVIDIA’s excellent temporal dithering algorithm on Windows 11.
So this is the theoretical max refresh rate you can get out of an Nvidia 10 series GPU while maintaining RGB 4:4:4.
Anything higher requires HDMI 2.1, which my graphics card lacks, or Display Stream Compression, which it also lacks.
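For reference, here’s a quick back-of-the-envelope link-budget check (Python) of why 100Hz fits and anything much higher gets dicey. The blanking figures are rough reduced-blanking estimates, not exact timings, and the practical ceiling also depends on the adapter itself:

```python
# Rough DP 1.4 (HBR3) link budget for 4K RGB 4:4:4 8-bit.
# Payload = 4 lanes x 8.1 Gbps raw, minus 8b/10b encoding overhead.
DP14_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10       # ~25.9 Gbps usable

def video_gbps(hz, h_active=3840, v_active=2160, bpp=24,
               h_blank=80, v_blank=90):     # approximate reduced-blanking totals
    pixel_clock = (h_active + h_blank) * (v_active + v_blank) * hz
    return pixel_clock * bpp / 1e9

print(f"link payload : {DP14_PAYLOAD_GBPS:.1f} Gbps")
print(f"4K100 8-bit  : {video_gbps(100):.1f} Gbps")   # ~21 Gbps, fits comfortably
print(f"4K120 8-bit  : {video_gbps(120):.1f} Gbps")   # ~25 Gbps, essentially no headroom left
```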
Not one to be satisfied, I’ve recalibrated my HDR experience with BFI enabled and gotten some pretty interesting results. So I’ve now begun my high-framerate journey and started using BFI again!
Meanwhile… https://www.reddit.com/r/buildapcmonitors/s/NLnjeGRzMS
It was fun for about a day, then the high wore off and I started to notice all of the problems.
The problem is that entire colors get dimmed, and it happens to colors that are already dim.
I think the problem is with the mask and scanlines adding too many black pixels to dark colors, confusing the algorithm and making it think “this should be black.”
The algorithm works very well for very high contrast content - like old black background games with a few bright colors. It’s the midtones that get screwed up.
So, yeah… True Black HDR1000 can’t get here fast enough.
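To make that mechanism concrete, here’s a toy model (Python, purely illustrative, not any vendor’s actual algorithm) of a single dimming zone that sets its backlight from the average pixel level. Once a mask blacks out most of the pixels, the average collapses and the remaining lit phosphors can’t be rendered at their intended level:

```python
# Toy dimming-zone model: backlight follows the zone's average pixel value
# (an assumed heuristic), the LCD then compensates, but transmittance is
# capped at 100%, so dim midtones get crushed when the zone is mostly black.
import numpy as np

def zone_output(pixels, headroom=2.0):
    backlight = pixels.mean() * headroom                  # assumed average-driven backlight
    transmittance = np.clip(pixels / max(backlight, 1e-6), 0.0, 1.0)
    return transmittance * backlight                      # light that actually reaches the eye

dim_flat = np.full(900, 0.20)           # a dim midtone, no shader applied
masked = dim_flat.copy()
masked[np.arange(900) % 3 != 0] = 0.0   # mask/scanlines black out 2/3 of the pixels

print(zone_output(dim_flat).max())      # 0.20 -> the midtone survives
print(zone_output(masked).max())        # ~0.13 -> the lit pixels come out darker than intended
```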
I understand; not all miniLED backlight algorithms are the same, though, and more dimming zones also help with the brute forcing. The problem with IPS miniLED is its over-reliance on local dimming technology to achieve an acceptable level of contrast. Due to VA’s superior native contrast ratio, a VA miniLED might not need to be so aggressive.
Also, on my display, I can tailor what gets prioritized by choosing a different tonemapping setting, for example Detail Preferred, Balanced, Brightness Preferred and High-Dynamic.
Maybe it’s time to trade up for a miniLED with more modern features and technology?
To summarize, local dimming performance can vary considerably between brands and models across the miniLED landscape, and this variance can range from excellent (near-OLED performance) to practically unusable (just a checkbox to tick).
Maybe, but this display is no slouch. It’s basically just a rebranded version of this:
I’m not super confident that the dimming issue I’m describing can ever be completely resolved, but a less aggressive dimming and/or more dimming zones might help reduce it.
True Black HDR1000 will save us all, hopefully. I’m a bit worried that it will remain restricted to small screen devices.
By the way, have you been able to access similar firmware updates to the Cooler Master?
I’ve been meaning to do that - might give that a shot later today.
Seems the download was removed?
Thanks, should be working now 
Update 10/21:
Reduced the presets and added instructions for mask settings instead; this minimizes the number of presets, and I like keeping things simple.
A few tweaks to gamma, brightness boosts, etc
Added magic glow 0.02 - I think it helps mimic more realistic phosphor behavior
Minor tweaks to sharpness
Added a new mode, “Enhanced S-video,” featuring “smart dithering deblur.” Basically the cleanest possible image with separate Y/C.
Update 10/28:
Local dimming is back!
Local dimming really doesn’t interfere with the peak brightness at all, it’s just that it really doesn’t like certain masks. So I tested every single mask combination and narrowed it down to the ones that work.
Lately, I’ve also been asking: how much brightness is really enough? My experiments show that 200 nits is not sufficient, while 300 nits is where it starts to get too bright (you need to increase mask strength, scanlines or something to compensate). 250-300 nits is the sweet spot. 100 nits on a CRT isn’t the same as 100 nits on an LCD, and I’ve come to the conclusion that we should not be targeting 100 nits in our shader setups; 250 nits is the right target for “CRT feel.” So, my target is now a minimum of 250 nits with masks and scanlines applied. This eliminated a lot of possibilities.
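As a sanity check on those targets, here’s the simple arithmetic (Python) for how much full-screen panel brightness is needed once the mask and scanlines take their cut; the attenuation figures are illustrative placeholders, not measurements from my setup:

```python
# How bright does the panel need to be, full screen, to still land on a
# 250-nit target after the shader? light_lost is the fraction of light
# removed by mask + scanlines (illustrative values only).
TARGET_NITS = 250

def required_panel_nits(light_lost, target=TARGET_NITS):
    return target / (1.0 - light_lost)

for light_lost in (0.40, 0.55, 0.70, 0.90):
    print(f"{light_lost:.0%} lost -> ~{required_panel_nits(light_lost):.0f} nits full-screen needed")
```

With a mask that cuts 90% of the light you’d need on the order of 2,500 nits full screen, which lines up with why so many possibilities got eliminated.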
Color fringing when using any doubled-phosphor-width mask is another thing that’s been troubling me. E.g., RRGGBBX always has bad color fringing on white text. I haven’t found a good solution for this problem, so I stopped using these masks for now. EDIT: OK, this is solved by lowering sharpness. In the RGB version, reduce horizontal sharpness to 5.00. However, I can’t get RRGGBBX up to 250 nits, so I’m leaving it out of this collection for now.
I’ve also adjusted bright boosts and scanlines so that bright and dark parts of the image are adjusted equally, in order to preserve the original color range of the image.
And as always, sharpness tweaking was required - once I gave up on RRGGBBX, etc., a different strategy emerged and things got a lot easier.
HDR1000, Local Dimming Enabled:
https://mega.nz/file/tmURwAra#u5JPvkF2AvECOet_Sgvwa5JfCeBC_9DRMD1l66hJflQ
This explains why even HDR1000 is not enough for many full-strength masks. The full screen brightness of HDR1000 is 600 nits. HDR1400 is what we need, with a full screen brightness of 900 nits.
https://displayhdr.org/performance-criteria/
The “Full-screen Long-duration Test” is what matters.
True Black HDR1000 will not save us, since it only reaches 500 nits for the full-screen long duration test.
The best displays for CRT emulation are MiniLED displays with HDR1400.
This TV can do that:
953 cd/m²
This one’s right behind it:
768 cd/m²
This is the one I have:
542 cd/m²
Now, since it’s a large screen, I sometimes prefer a smaller, denser and brighter image using smaller Integer Scale Sizes, for example 8x or 6x.
At 50% Window Size Sustained Brightness: 860 cd/m².
The QM8K can do Sustained 50% Window 982 cd/m².
The QM9K can do Sustained 50% Window 1,293 cd/m².
While the QM851G can do Sustained 50% Window 1,792 cd/m² !
Remember, these TVs are huge compared to the typical 27"-32" monitors that many of us use, so sometimes it just feels better not to use the full screen when emulating old skool games, which many of us played on much tinier screens.
So I would say that the QM751G provides one of the best experiences I’ve had so far with CRT emulation. In hindsight, it’s just too bad I couldn’t afford the QM851G and was reluctant to go with its larger 65" starting size.
The QM8K and QM9K must also be commended for being brighter than the QM751G while also offering RGB subpixel layout along with Wide Angle Viewing Technology.
I could imagine getting some bargain deals on those in the not-too-distant future, though. It’s only because of my experience with the QM751G that I can easily project what an amazing experience it might be to use a QM851G for CRT emulation.
I would qualify that statement a bit by saying they’re the best for most usage scenarios, or for most aspects of CRT emulation, because OLED still destroys them in a completely dark room, especially if you’re moving around in that space.
We do know that OLED has its own limitations when it comes to subpixel layout compatibility, peak and sustained brightness, aggressive ABL and risk of burn-in, though.
There are a few TVs that may meet the minimum brightness required for HDR1400 but don’t meet some other qualification. If someone has one of those already, great. I believe the spec was introduced by VESA this year.
HDR1400 is the spec to look for.
https://displayhdr.org/certified-products/#tab-1400
OLED appears to be maxing out at 500 nits (full screen sustained)- that’s for a high end True Black HDR1000 display (currently only on small screen devices). 500 nits still has some limitations with full strength masks, since some masks are over a 90% reduction in brightness.
Per-pixel dimming (True Black) is the main advantage, but I haven’t seen a 1000+ dimming zone display yet, so I don’t know how it compares.
Smaller screens are preferable for retro gaming IMO, you get better PPI (which helps masks), better viewing distances for SDR content, etc. A 27” 16:9 monitor is roughly 20” in 4:3, which is perfect.
The problem with these types of displays, and why OLED kills them in a dark room, is less about blooming due to a lack of zones and more about that blooming being exacerbated by changes in viewing angle. Not only that: colour saturation, tint, gamma and brightness are also affected by changes in viewing angle, and then there’s the black smearing and the not-as-good motion handling and pixel response, which all contribute to a less than ideal immersive experience in areas where OLED excels. OLED is close to perfect in almost all of these areas, and it’s not only because of per-pixel dimming; it’s also because of the lack of optical distortions caused by the different layers the light has to pass through before reaching the viewer’s eyes, as well as the difference in optical distance between the light source, the front of the panel and the user’s eyes. Things like refractions etc. all contribute to the worse viewing-angle performance, and a sub-optimal viewing position leads to everything that’s bad about LCD technology looking even worse.
Yes, I agree viewing angle is a problem for MiniLED TVs. It’s less of a problem for monitors or setups where you’re expected to sit directly in front of the display. I think most MiniLED TVs should still have a wide enough viewing angle for a 3 seat sofa setup.
with a full screen brightness of 900 nits.
My searing retinas when I bring up the RetroArch menu. x.x 400 nits sears my eyes as it is.