I gained enough brightness with the recent changes that RRGGBBX+slot is back on the menu!
Measured 212 nits, comfortably above my target of 200 nits.
No clipping involved. Just adjust Paper White Luminance to 1000.
Can you share your measurement app again please?
Are you referring to Peak and Paperwhite Luminance settings as well as Contrast?
It's a light meter I picked up for $30 on Amazon from an alphabet-soup Chinese brand: URCERI.
Does the trick!
Yep, although ideally Contrast should be left alone (5.00) to avoid any clipping. A couple of notches might be OK, up to 5.50 or so, maybe 5.60 max (with the darkest masks). Those have been my findings so far.
Ok, so now for the downside:
Cranking up the Paperwhite Luminance and Contrast does soft-clip the colors: the shader's scanlines and mask keep the shades visually distinct, but the underlying RGB values are identical.
The bigger problem, though, is that when we increase Contrast or Paper White we're raising the black level, and we've already destroyed the black level by disabling local dimming and blasting 1000 nits.
This, of course, ruins the contrast ratio and has a very negative effect on perceived brightness.
So, slotmask is still "use at your own risk." What we really want is True Black HDR1000 (or whatever it's called), so give it another 5 years.
A quick local dimming experiment:
Enabled local dimming. Used 240p Test Suite's white screen to set mask strengths, in conjunction with a light meter. Targeted 220 nits, equalizing the peak brightness with my regular preset. This resulted in:
maskstr = "0.850000"
mcut = "0.800000"
I also had to adjust gamma in/out to 2.4/2.4 from the usual 2.4/2.2.
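The gamma in/out change above amounts to changing the net tone curve the shader applies. Here's a minimal sketch of the idea, assuming a simple power-law model (real shaders do more than this; `net_gamma` is a hypothetical helper for illustration only):

```python
def net_gamma(signal: float, gamma_in: float, gamma_out: float) -> float:
    """Decode a [0,1] signal with gamma_in, re-encode with gamma_out.
    The net effect is raising the signal to the power gamma_in/gamma_out."""
    linear = signal ** gamma_in
    return linear ** (1.0 / gamma_out)

mid = 0.5
print(round(net_gamma(mid, 2.4, 2.2), 3))  # 2.4/2.2 darkens midtones slightly
print(round(net_gamma(mid, 2.4, 2.4), 3))  # 2.4/2.4 is a straight pass-through
```

With local dimming on, matching gamma in and out (2.4/2.4) leaves midtones where the TV expects them, which fits the observation above.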
Local dimming on, reduced mask strength:
Local dimming off (full brightness), full mask strength:
You can see the horrible backlight bleed when local dimming is off. Honestly, I'm having a hard time deciding which I prefer. The true black looks so good, and it's such an essential aspect of CRT performance, but it comes at the cost of lost mask detail. In practice, the lost detail/accuracy isn't very noticeable when playing a game, so I might prefer the local dimming. IDK, it's a tough call.
Yes, use Local Dimming. You paid for it, didn't you? The problem is that you're using an IPS display, which has horrible native black levels, contrast ratio, and blooming performance, so it absolutely has to rely on local dimming to get anywhere near decent contrast or black levels. However, local dimming algorithms vary considerably, so there's the potential to introduce a different set of artifacts.
So there are times when my TCL QM751G gently reminds me that it's only a miniLED, and times when it reminds me that it's not an OLED, but those times are actually in the minority. Most of the time I'm very satisfied with what I have, and I know what other frustrations I would have to deal with, and which things I've gotten used to I might miss out on, if I went back to an RRGGBBX-only WOLED.
So we're getting closer to an R-G-B tandem with no white subpixel, hopefully with MLA sometime in the future. That is when the OLED proposition becomes much more palatable, at least for those with unlimited loads of cash to burn, because you can see how they're pushing the brightest OLEDs with the best technology as ultra-premium devices.
So that's out of the question for me; I'll happily stick with plebeian miniLED tech. VA is where it's at, though, with awesome native black levels that are more than enough. I can't turn local dimming off on my TV because the highest brightness levels are tied to Local Dimming being set to High, but if you ever experienced a TCL QM851G you probably wouldn't want local dimming turned off either.
So even if you're not going to purchase one, do take some time to study the TCL QM851G, QM8K, and QM9K. The QM751G is just a baby QM851G, with 720 zones on a 55" screen and a peak brightness of, I think, about 2,400 nits!
Next year will be the year of RGB miniLED, with much higher gamuts and brightness. Let's see how that goes.
PS: once you start using all of the features of your IPS miniLED, you'll get used to them eventually and adjust your shader parameters to suit them best. I made the most out of an old, crappy LG IPS by running it at the peak of its brightness and saturation abilities. Why? Because its black levels were poor, so if I was going to get the most out of that TV, I had to focus on brightness to achieve the highest contrast.
HDR video? No problem. I reverse-engineered my own solution for watching HDR videos, which I had gotten accustomed to, on my not-officially-HDR display!
By the way, after coming across some further information concerning my Cable Matters DisplayPort 1.4 to HDMI 2.1 adapter, I have managed to eke out RGB 4:4:4 8-bit at 100 Hz with HDR and VRR enabled. Banding in gradients should be kept to a minimum via Maassoft's ColorControl, which lets me enable NVIDIA's excellent temporal dithering algorithm on Windows 11.
So this is the theoretical max refresh rate you can get out of an Nvidia 10 series GPU while maintaining RGB 4:4:4.
Anything higher requires HDMI 2.1 which my graphics card lacks or Display Stream Compression which it also lacks.
Never one to be satisfied, I've recalibrated my HDR experience with BFI enabled, and I've gotten some pretty interesting results. So I've now begun my high-framerate journey and started using BFI again!
Meanwhile… https://www.reddit.com/r/buildapcmonitors/s/NLnjeGRzMS
It was fun for about a day, then the high wore off and I started to notice all of the problems.
The problem is that entire colors are dimmed, and this happens to already dim colors.
I think the problem is the mask and scanlines adding too many black pixels to dark colors, confusing the algorithm into thinking "this should be black."
The algorithm works very well for very high contrast content, like old black-background games with a few bright colors. It's the midtones that get screwed up.
So, yeah… True Black HDR1000 can't get here fast enough.
I understand; not all miniLED backlight algorithms are the same, though, and more dimming zones also help with the brute-forcing. The problem with IPS miniLED is its over-reliance on local dimming to achieve an acceptable level of contrast. Due to VA's superior native contrast ratio, a VA miniLED might not need to be so aggressive.
Also, on my display, I can tailor what gets prioritized by choosing a different tonemapping setting: for example, Detail Preferred, Balanced, Brightness Preferred, and High-Dynamic.
Maybe it's time to trade up for a miniLED with more modern features and technology?
To summarize, local dimming performance can vary considerably between brands and models across the miniLED landscape, and this variance ranges from excellent (near-OLED performance) to practically unusable (just a checkbox to tick).
Maybe, but this display is no slouch; it's basically just a rebranded version of this:
I'm not super confident that the dimming issue I'm describing can ever be completely resolved, but less aggressive dimming and/or more dimming zones might help reduce it.
True Black HDR1000 will save us all, hopefully. I'm a bit worried that it will remain restricted to small-screen devices.
By the way, have you been able to access similar firmware updates to the Cooler Master?
I've been meaning to do that; might give it a shot later today.
Seems the download was removed?
Thanks, should be working now.
Update 10/21:
Reduced presets and added instructions for mask settings; this minimizes the number of presets, and I like keeping things simple.
A few tweaks to gamma, brightness boosts, etc
Added magic glow 0.02; I think it helps mimic more realistic phosphor behavior.
Minor tweaks to sharpness
Added a new mode, "Enhanced S-video," featuring "smart dithering deblur." Basically the cleanest possible image with separate Y/C.
Update 10/28:
Local dimming is back!
Local dimming really doesn't interfere with the peak brightness at all; it just really doesn't like certain masks. So I tested every single mask combination and narrowed it down to the ones that work.
Lately, I've also been asking: how much brightness is really enough? My experiments show that 200 nits is not sufficient, and somewhere past 300 nits it starts to get too bright (you need to increase mask strength, scanlines, or something to compensate); 250-300 nits is the sweet spot. 100 nits on a CRT isn't the same as 100 nits on an LCD, and I've come to the conclusion that we should not be targeting 100 nits in our shader setups. 250 nits is the right target for "CRT feel." So my target is now a minimum of 250 nits with masks and scanlines applied. This eliminated a lot of possibilities.
Color fringing when using any doubled-phosphor-width mask is another thing that's been troubling me; e.g., RRGGBBX always has bad color fringing on white text. I haven't found a good solution for this problem, so I stopped using these masks for now. EDIT: OK, this is solved by lowering sharpness; in the RGB version, reduce horizontal sharpness to 5.00. However, I can't get RRGGBBX up to 250 nits, so I'm leaving it out of this collection for now.
I've also adjusted bright boosts and scanlines so that bright and dark parts of the image are adjusted equally, in order to preserve the original color range of the image.
And as always, sharpness tweaking was required; once I gave up on RRGGBBX, etc., a different strategy emerged and made things a lot easier.
HDR1000, Local Dimming Enabled:
https://mega.nz/file/tmURwAra#u5JPvkF2AvECOet_Sgvwa5JfCeBC_9DRMD1l66hJflQ
This explains why even HDR1000 is not enough for many full-strength masks. The full-screen brightness requirement of HDR1000 is 600 nits. HDR1400 is what we need, with a full-screen brightness of 900 nits.
https://displayhdr.org/performance-criteria/
The "Full-screen Long-duration Test" is what matters.
True Black HDR1000 will not save us, since it only reaches 500 nits in the full-screen long-duration test.
The best displays for CRT emulation are MiniLED displays with HDR1400.
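To make the comparison concrete, here's a tiny sketch using the full-screen long-duration figures quoted in this thread (the dict values are the numbers cited above from displayhdr.org; `tiers_meeting` is an illustrative helper, and the 45% mask transmission in the example is an assumption, not a measurement):

```python
# Full-screen long-duration minimum luminance (cd/m²) for the VESA
# DisplayHDR tiers discussed in this thread.
FULL_SCREEN_SUSTAINED = {
    "DisplayHDR 1000": 600,
    "DisplayHDR 1400": 900,
    "DisplayHDR True Black 1000": 500,
}

def tiers_meeting(target_nits: float) -> list[str]:
    """Tiers whose guaranteed full-screen sustained output meets a target."""
    return [t for t, nits in FULL_SCREEN_SUSTAINED.items() if nits >= target_nits]

# A mask/scanline combo passing ~45% of light, on a 250-nit target,
# needs roughly 250 / 0.45 ≈ 556 nits full-screen:
print(tiers_meeting(250 / 0.45))  # → ['DisplayHDR 1000', 'DisplayHDR 1400']
```

This is why only the upper tiers are interesting here: the tier's full-screen long-duration floor, not its peak-window number, is what survives a bright full-screen scene with a mask applied.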
This TV can do that:
953 cd/m²
This oneās right behind it:
768 cd/m²
This is the one I have:
542 cd/m²
Now, it's a large screen, so I sometimes prefer a smaller, denser, and brighter image using smaller integer scale sizes, for example 8x or 6x.
At a 50% window size, sustained brightness: 860 cd/m².
The QM8K can sustain 982 cd/m² on a 50% window.
The QM9K can sustain 1,293 cd/m² on a 50% window.
And the QM851G can sustain 1,792 cd/m² on a 50% window!
Remember, these TVs are huge compared to the typical 27"-32" monitors that many of us use, so sometimes it just feels better not to use the full screen when emulating old skool games, which many of us played on much tinier screens.
So I would say the QM751G provides one of the best experiences I've had so far with CRT emulation. In hindsight, it's just too bad I couldn't afford, and was reluctant to go with, the larger 65" starting size of the QM851G.
The QM8K and QM9K must also be commended for being brighter than the QM751G while offering an RGB subpixel layout along with Wide Angle Viewing Technology.
I could imagine getting some bargain deals on those in the not-too-distant future, though. It's only because of my experience with the QM751G that I can easily project what an amazing experience it might be to use a QM851G for CRT emulation.
I would qualify that statement a bit by saying they are the best for most usage scenarios, or for most aspects of CRT emulation, because OLED still destroys them in a completely dark room, especially if you're moving around in that space.
We do know that OLED has its own limitations when it comes to subpixel layout compatibility, peak and sustained brightness, aggressive ABL, and risk of burn-in, though.
There are a few TVs that may meet the minimum brightness required for HDR1400 but don't meet some other qualification. If someone already has one of those, great. I believe the spec was introduced by VESA this year.
HDR1400 is the spec to look for.
https://displayhdr.org/certified-products/#tab-1400
OLED appears to be maxing out at 500 nits (full-screen sustained), and that's for a high-end True Black HDR1000 display (currently only on small-screen devices). 500 nits still has some limitations with full-strength masks, since some masks cut brightness by over 90%.
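Following the 90%-reduction figure above, a quick back-of-envelope sketch (the transmission values are illustrative assumptions; real masks interact with gamma and subpixel structure, so treat this as a lower bound on required brightness):

```python
def required_panel_nits(target_nits: float, mask_transmission: float) -> float:
    """Full-screen panel brightness needed so the image still averages
    `target_nits` after the mask/scanline attenuation."""
    if not 0.0 < mask_transmission <= 1.0:
        raise ValueError("mask_transmission must be in (0, 1]")
    return target_nits / mask_transmission

print(required_panel_nits(250, 0.50))  # a mild ~50% mask → 500.0 nits
print(required_panel_nits(250, 0.10))  # a 90%-cut mask needs ~2500 nits
```

So a 500-nit True Black panel only clears a 250-nit target with masks that pass half the light or more, which matches the "use at your own risk" verdict on the strongest masks.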
Per-pixel dimming (True Black) is the main advantage, but I haven't seen a display with 1000+ dimming zones yet, so I don't know how it compares.
Smaller screens are preferable for retro gaming, IMO: you get better PPI (which helps masks), better viewing distances for SDR content, etc. A 27" 16:9 monitor shows roughly a 22" image in 4:3, which is perfect.
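For reference, the 4:3-equivalent size can be computed directly, assuming the 4:3 image is pillarboxed at full panel height (`pillarboxed_43_diagonal` is just an illustrative helper):

```python
import math

def pillarboxed_43_diagonal(diag_169: float) -> float:
    """Diagonal of a full-height 4:3 image shown on a 16:9 panel."""
    height = diag_169 * 9 / math.hypot(16, 9)   # panel height from diagonal
    return height * math.hypot(4, 3) / 3        # 4:3 diagonal at that height

print(round(pillarboxed_43_diagonal(27.0), 1))  # ≈ 22.1
```

The same formula puts a 32" 16:9 panel at roughly a 26" 4:3 image, for comparison.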
The problem with these types of displays, and why OLED kills them in a dark room, is less the blooming due to a lack of zones and more that this blooming is exacerbated by changes in viewing angle. Not only that: color saturation, tint, gamma, and brightness are also affected by changes in viewing angle, and then there's the black smearing and the not-as-good motion handling and pixel response, all of which contribute to a less-than-ideal immersion experience in areas where OLED excels. OLED is close to perfect in almost all of these areas, and not only because of per-pixel dimming; it's also because of the lack of optical distortion caused by the different layers the light has to pass through before reaching the viewer's eyes, as well as the optical distance between the light source, the front of the panel, and the user's eyes. Things like refraction, etc., all contribute to this worse viewing-angle performance, and a sub-optimal viewing position makes everything that's bad about LCD technology look even worse.
Yes, I agree viewing angle is a problem for miniLED TVs. It's less of a problem for monitors, or for setups where you're expected to sit directly in front of the display. I think most miniLED TVs should still have a wide enough viewing angle for a three-seat sofa setup.