Very good, you definitely need to do this to get the most out of these presets.
This is good operating procedure; however, for these types of presets you might have a better experience overall using the DisplayHDR Peak 1000 nits mode.
You definitely don’t want to neuter the brightness of the display with these things, and you have the ability to run at 240Hz, which is a dream if you pair these presets with the CRT-Beam-Simulator.
Redo your Windows Calibration using the Peak 1000 nits mode and make sure it looks right to your preference. Most of the calibration is objective; only the very last part, where you have to adjust the brightness/saturation slider, is subjective.
This calibration has no bearing on RetroArch in HDR mode, however. It only affects shaders and presets which don’t use HDR, and then only if HDR is switched off in RetroArch. It’s still good to have things configured properly so that you always get the advantages of HDR elsewhere; for example, your HDR screenshots will look right when you view them.
The defaults basically mean nothing.
What applies to one display doesn’t apply to others.
This is actually pretty simple if you don’t overthink it. Follow your intuition. Use your eyes. There’s no way I know of to get it completely accurate and achieve what we’re trying to use HDR inverse tonemapping for in any case.
Pick your favourite games, ones that you’re familiar with, and just set the Paper White Luminance to what’s comfortable to your eyes in terms of brightness.
If you push it too far, colours and whites might look too “hot” and harsh. That means there’s clipping, and you may be exceeding the limitations of your display.
Likewise, look at things like white text to fine-tune the Peak Brightness. Regardless of what RTINGS says, if the white text stands out to the point where it looks like a light bulb and you can’t make out the scanlines and mask, then it could be too high. Again, set that to what looks best to you and slowly tweak over time.
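If you prefer working in the config file directly, those controls (plus the extra ones mentioned later) live in retroarch.cfg. A minimal sketch with purely illustrative values; note these are RetroArch’s own HDR controls, so for Sony Megatron presets you’d set the equivalents in the Shader Parameters instead:

```
# retroarch.cfg - RetroArch HDR controls (values are illustrative starting points)
video_hdr_enable = "true"
# Peak Luminance: lower it if white text glows like a light bulb and swallows the mask
video_hdr_max_nits = "1000.0"
# Paper White Luminance: set to whatever is comfortable to your eyes in your favourite games
video_hdr_paper_white_nits = "200.0"
# The additional Contrast control that has no Shader Parameter equivalent
video_hdr_contrast = "5.0"
video_hdr_expand_gamut = "true"
```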
Retro Crisis failed to do this in his last video and the result was an unpleasant image. He seemed a bit overwhelmed by the number of choices, even though Genesis only has a handful, relatively speaking, and one can easily and quickly jump to the section they’re concerned with by using the Left and Right keys or buttons, or the L & R buttons. In addition to that, the presets with the longest, most outlandish names, the more recent ones, or the ones with hyperbole like Neo-GX, Extreme or Supreme tend to be the newer and better ones.
I’m surprised there was such a positive response to the video despite these apparent oversights, but I can’t complain too much because I’m grateful for the opportunity to share and showcase what I do.
Please report back with your improvements and post some pics as well.
Yes, no and maybe, because the RetroArch settings have an additional Contrast control which the Shader Parameters don’t have. By the way, the RetroArch HDR controls do nothing for Sony Megatron Color Video Monitor presets; they’re intended for other shaders.
If you’re completely lost you can use the grey ramp and colour bar tests to get a ballpark for the highest colour volume before clipping, and also for proper grey scale, at least on the high end. Seeing a few black bars chopped off at the low end is normal, as is living with a little clipping in the reds to get them looking actually red, at least on some displays.
I tend to focus on my favourite games for calibration though.
Remember to report back and share some pics/video clips.
For Mega Bezel, Integer Scale needs to be Off in the RetroArch Settings > Video > Scaling menu and Aspect Ratio set to Full, as per the Mega Bezel setup instructions.
The presets would have integer scaling enabled (or disabled) via the Shader Parameters.
The same applies to Uborder.
For everything else, Integer Scale should be On in the RetroArch Settings > Video > Scaling menu unless otherwise specified in the readme for the particular Shader Preset Pack.
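For reference, here’s a small sketch of the same settings as they appear in retroarch.cfg. Aspect Ratio is best set through the menu, since the numeric aspect_ratio_index value for “Full” varies between RetroArch versions:

```
# retroarch.cfg - Mega Bezel / Uborder case:
video_scale_integer = "false"
# Aspect Ratio = Full: set via Settings > Video > Scaling in the menu

# Everything else (unless the preset pack's readme says otherwise):
video_scale_integer = "true"
```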
This is why it’s important to read the entire first post and to browse the thread. Since my W420M Preset Pack, my packs have been self-contained, with the appropriate Guest Shader included.
Thank you for clarifying re. Integer Scaling. And also you’re completely right, I actually went ahead and re-read the first post while I was waiting for my comment to be approved, and I stumbled on the answer. I even tried deleting my comment but it wasn’t letting me for some reason. Maybe because I’m a new account…? I’ll try to just edit next time until I get delete privileges.
For this monitor (along with all the other QD-OLED 2024 monitors; they all have the same panel), there are issues with the Peak 1000 mode. The EOTF curve doesn’t track well past 10% APL, leading to some image scenarios where HDR 400 is brighter. See here for more details.
MSI engineers are aware of the issue and trying to solve it with firmware but until then, I’m running HDR400 mode (and I can attest it usually IS brighter than P1000, sadly).
I was going to tackle incorporating the CRT Beam Sim eventually, once I have the HDR issues sorted out. I wish I had one of those spiffy top of the line 2025 monitors that can do a 1080p 480Hz mode. Those refresh rates are finally getting closer to CRT motion clarity…
One q: playing around, why does changing the CRT Resolution also change the colours so much?
Don’t delete; allow others who visit the thread to benefit as well.
I understand what you’re saying but do remember that we’re only using HDR in this case to “unlock” any additional brightness that the display can provide. Any colour correction can be done at the Shader Parameter level.
Most, or at least a large percentage, of the screen is dark or off anyway, so the APL issues may not apply the same way in this scenario. Maybe you can test it both ways and see, but you’re going to need the extra brightness headroom in order to utilize the temporal improvements of the CRT-Beam-Simulator.
“For smaller APL from there, the True Black 400 mode (referred to from here as ‘TB400’ for ease) doesn’t get any brighter, and reaches it’s peak at around 471 nits maximum. The ‘Peak 1000’ mode (referred to from here as ‘P1000’) on the other can get brighter for the smaller APL of 5% and 1% areas measured, reaching up to 1002 nits maximum and therefore meeting the 1000 nits peak brightness spec of the panel.”
“The shape of the lines in P1000 mode is very similar, but the difference is that in SDR content the screen only reaches up to around 500 nits maximum, whereas in HDR mode it reaches twice that at 1000 nits.”
“You can see from the measurements in these real content videos and movies that the P1000 mode was able to reach higher peak luminance than the TB400 mode for lower APL scenes and for bright highlights. The Christmas light videos were a very good example of low APL % scenes where the P1000 mode reached a lot higher luminance than TB400, and where you can experience the full brightness capability of the panel.”
“Using a normal white test pattern (signal input level = 100) you can see that both modes are the same for all APL between 10% and 100%, but the P1000 mode can reach higher peak luminance for the smaller APLs. This is in keeping with the earlier measurements of HDR content in the two modes.”
“However, you can see that for the larger APL the P1000 mode is now darker, and this is caused by that poor PQ tracking for the mid grey shades (signal levels 45 – 80) for the larger APLs. For instance at 50% APL the TB400 mode has a 112 nits higher luminance while at 100% APL the TB400 mode is 100 nits higher. This equates to around an 18 – 19% difference in perceived brightness (based on XCR)”
I think I’ve read enough. You’re not going to be using many high-APL scenes when doing these CRT emulations. Almost half the screen will be black/off due to the pillarboxing on the left and right sides of the screen.
Then you have the scanline gaps keeping nearly half of the remaining pixels of the screen dark/off.
Then even the phosphor mask is going to have black lines for the slot mask and aperture grille wires, and the only time all R, G and B subpixels are going to be lit at the same time is when displaying white.
So you should be able to take full advantage of the display’s full luminance range in Peak 1000 mode.
It’s up to you if you want to leave Windows in SDR mode.
I really don’t know. I just try my best to correct whatever differences I encounter. It could be as simple as the light passing through a different filter (the mask), so there will be differences in the output.
Notice I said share some pics/videos and not screenshots.
Both of those screenshots look similar except for the TVL.
If you want me to be able to “see” what you are seeing you would have to rely on high quality photographs or video recordings of the screen.
Now that I think about it, one of the reasons why you might be experiencing colour shifts when adjusting Resolution (TVL) might be that none of the subpixel layouts available in the shader actually matches your display’s actual subpixel layout.
So at some TVLs the misalignment might just be worse than at others. Some colour and sharpness shifting is normal when switching TVL even on displays which have matching subpixel layouts available, but it might be worse still on one that doesn’t match.
By the way, the risk of burn-in/uneven wear is real. Some say newer OLED displays fare better than older ones but I’m not so sure by how much.
I’m not sure what you can do to mitigate the pixels used for the scanlines wearing out faster than the pixels used for the scanline gaps.
Then there’s the uneven wear caused by the pillar bars of 4:3 content.
Nothing lasts forever but I now consider OLED Displays to be “temporary” displays, they’re way too expensive for me to call them disposable.
This is something that you won’t want to do while using subpixel-accurate CRT shaders, as they are designed to address and map down to the subpixel level of the display to simulate, as accurately as possible, a 1:1 mapping with the emulated CRT’s RGB phosphors. So you would need to be running at native 4K, or 1080p centered with no scaling to Fullscreen or Aspect Ratio.
For users of my Mega Bezel Preset Pack who don’t wish to worry about accidentally overwriting Mega Bezel v1.14.0 every time they update their shaders:
It’s quite simple:

1. Make a copy of the Mega Bezel 1.14.0 folder and rename it “Mega_Bezel - 1.14.0”, then make a copy of the latest version and similarly append its version # to the name.
2. Open whichever copy you wish to use, Select All, then Copy.
3. Open the main “Mega_Bezel” folder, Select All, Delete, then Paste in the files you copied.
You can repeat this process with the latest version if you want to switch back to it.
Or, if you want to be cool, you could do the first step above, then use a mass-editing tool like Notepad++ with its “Find in Files” feature to search for and replace every instance of “/Mega_Bezel” in my preset pack with “/Mega_Bezel - 1.14.0”.
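If you’d rather script that Find in Files step, here’s a minimal Python sketch of the same replacement. The preset pack path is hypothetical, so point it at wherever your copy actually lives, and make a backup of the folder first:

```python
# Retarget every .slangp preset in the pack from /Mega_Bezel to /Mega_Bezel - 1.14.0.
# Sketch of the Notepad++ "Find in Files" replacement; back up the folder before running.
from pathlib import Path

PRESET_ROOT = Path("shaders/CyberLab_Preset_Pack")  # hypothetical path - adjust to your setup
OLD = "/Mega_Bezel"
NEW = "/Mega_Bezel - 1.14.0"

for preset in PRESET_ROOT.rglob("*.slangp"):
    text = preset.read_text(encoding="utf-8")
    # Skip files already retargeted, since OLD is a substring of NEW
    if OLD in text and NEW not in text:
        preset.write_text(text.replace(OLD, NEW), encoding="utf-8")
        print(f"updated: {preset}")
```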
None of my presets need to use HDR. Some are preconfigured for HDR mode but can be used in SDR by just toggling the SDR/HDR Shader Parameter.
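As a concrete illustration of that toggle, the usual slangp trick is a small wrapper preset that references the original and overrides just that one parameter. A sketch only: the file names are hypothetical, and the parameter name (hcrt_hdr in the Megatron versions I’ve seen) should be verified against your shader’s Parameters list:

```
# My_Preset_SDR.slangp - hypothetical wrapper preset
#reference "My_Preset_HDR.slangp"
# SDR/HDR Shader Parameter: 0.0 = SDR, 1.0 = HDR (check the exact name in your Parameters list)
hcrt_hdr = "0.0"
```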
So, you can try the hybrid approach using my Mega Bezel or CRT-Royale Preset Packs. I have an entire folder in my Mega Bezel Preset Pack dedicated to HDR Ready Mega Bezel presets.
You can also use my tips to make any older preset HDR Ready.
Lastly, you could just try the Megatron presets and see how they look on the HDR400 display.
The Sony Megatron Color Video Monitor system requirements recommend at least an HDR600 display but many of my presets use Paper White Luminance values close to 450 so you can try those and see. For example my Near Field Presets.
It might look okay in the dark with the Megatron presets.
Sounds great for any of my preset packs; however OLEDs, while looking amazing, will degrade over time, especially if you do a lot of CRT shader stuff. Eventually you might start to notice the effects of uneven wear in the part of the screen that corresponds to the ~4:3 gameplay area, as well as the scanlines wearing out some lines of the screen faster than the scanline gaps.
So, it’ll look and feel absolutely awesome once you get everything setup properly but it won’t last forever.
Once you can deal with that possibility, then proceed. A better alternative might be a miniLED TV.
The additional brightness headroom and subpixel alignment/layout compatibility cannot be overstated, and you don’t have to worry about one of your favourite hobbies damaging one of your favourite home entertainment purchases.
What makes you think your PC is not worth hooking up to your TV?
Have you tried running any of my Sony Megatron Color Video Monitor Presets on it?
List the specs or whatever parts you have lying around and I can give you an idea of what’s possible.
Sony Megatron Color Video Monitor doesn’t need much processing power.
Why OLED? Why a laptop? What about something like a console style or home theater style PC? Laptop CPUs and GPUs have lower performance than their desktop counterparts. In addition to that they’re often marketed using the same model numbers as much faster desktop GPUs.
My man, your answers are always super detailed and on point. Appreciate it.
You kinda hit it on the head when talking about the OLED, I would like to avoid any unnecessary extra mileage. That’s why I was looking into the OLED laptop. You brought up miniLED. It sounds like that might be the better option in your opinion?
If you were putting a setup together to get your heaviest presets to run for the 16-bit era, how would you go about it? I got some money to throw at the hobby so lmk what’s out there. I don’t have the time or technical ability to hunt down a CRT and restore it in order to run original hardware. Plus, the convenience of a modern solution is more in line with what I’m looking for.
Hey, no problem. Your questions usually deserve such detailed responses, even if I may not immediately have the time to offer the proper response when I read them.
Your follow-up is going to need a very nuanced answer as well, so sometimes I might start, then edit and add more detail while compiling the perfect reply.
Let’s start with the display.
If I’m to interpret this correctly, what you’re saying is that you won’t be playing so much or even at all on your 48" OLED but you would on your OLED laptop?
I can’t fathom playing video games on a display as tiny as a laptop’s unless I’m going mobile/away from home, and even then I’d much rather just plug in an HDMI cable and hook up to any flat screen TV I could find. If you’ve been following my preset pack development, you’d know that it wouldn’t need to be 4K, it wouldn’t need to be OLED and it wouldn’t need to be HDR.
There is no perfect display technology right now for all use cases. For movies and most types of gaming any currently available OLED technology comes real close to perfect but then there’s the burn-in risk, the limited brightness and we also need to factor in the cost.
Subpixel layouts don’t even enter the conversation unless we’re talking about CRT-emulation and to a lesser extent text rendering in a desktop environment.
What you’ll ultimately require really depends on how pedantic and meticulous you are, and on what your budget is, when it comes to this idea of accurate CRT emulation.
When I talk about accuracy, it is not that I’m trying to be the voice of accuracy or am claiming that my presets represent the pinnacle of accuracy.
I’m really referring to the techniques used in the shader/preset itself.
I consider the ones which don’t add any unnecessary subpixels or dilute the phosphor emulation for the sake of brightness to be a more accurate implementation than ones which do that.
The principles which drove the creation of the Sony Megatron Color Video Monitor are perfectly aligned with that. CRT-Guest-Advanced is also capable of doing a fantastic job, but the last time I checked, the performance requirements were a bit higher than Sony Megatron Color Video Monitor for a similar level of accuracy, and I used to spend quite a lot of time turning down or turning off settings which caused things like additional blur, haloing, dilution of the mask, and white lines over the mask that wouldn’t be there on a real CRT.
So this is not a criticism of Guest.R’s awesome and excellent work in any way; it’s just to say that Sony Megatron already strips down some of what Guest-Advanced offers the average user into something which is ready to be used by the more pedantic, phosphor-sniffing type of user.
I still use and leverage a large part of CRT-Guest-Advanced-NTSC in my Sony Megatron Color Video Monitor Presets, which are actually Sony Megatron Color Video Monitor + CRT-Guest-Advanced-NTSC + Grade + Img-mod (and for some, + SuperXBR) hybrid presets.
Overall OLED has the most beautiful presentation and WOLED can display some of the Mask Types properly, albeit in R-B-G or B-G-R orientation as opposed to a CRT’s R-G-B phosphor layout.
I say this because of the way the OLED image seemingly floats on that deep black background, as well as the glossy, glass-like finish on the screen, but the real seller of the technology, which drives the point home for me, is the near-infinite viewing angles.
I spend more time looking at video games and listening to the music than actually playing them, so in scenarios where the room is completely or almost completely dark and I’m walking around, even if I’m looking at the screen from nearly 90° off center, it looks perfect and maintains the illusion of a CRT-like experience for me.
Now that’s huge, but that’s basically where the most meaningful advantages stop for me, and there are so many disadvantages that there’s a severe price to pay for this level of accuracy and immersion.
So you buy an OLED TV for something like this, and especially if you’re going to get more and more drawn into the hobby (which might be the case if you buy shiny new hardware that allows for an even more immersive experience), chances are you’re going to be doing more and more things which are not good for the health and longevity of said OLED technology.
So if you have the resources to replace an OLED TV every few years after it’s worn or burned in, and are willing to accept and live with all the risks and consequences, then be my guest.
When I was just doing Mega Bezel presets, I hadn’t noticed much burn-in on my 2016 OLED TV, but when I finally decided to tackle the Sony Megatron Color Video Monitor, funnily enough, I couldn’t get it to work or look right at all. Apparently there was a bug which caused it to turn HDR off when using Vulkan. This was quickly worked around, the fix was incorporated into the shader, and then I was really able to see what it had to offer.
It was so vivid while being so sharp and pure that it felt amazing. However, I did wonder about running and possibly ruining my TV with such high brightness levels and that’s exactly what happened eventually.
It’s not what put the final nail in its coffin but it substantially lowered the value proposition it brought to the table.
The thing is, I didn’t even know that it was being damaged. That is because I used to sit relatively far from the screen - well, at optimal viewing distance for a TV of that size.
When I was forced to view from a nearer distance, it was then that I noticed that something was wrong.
VA miniLED for all intents and purposes matches it when viewing from within the viewing-angle sweet spot, but you can’t do the same walk around the dark room without the blooming and colour shifting starting to appear.
IPS is nice in that you get the phosphors in R-G-B (which is how most IPS panels are laid out), but the blooming and black levels, especially during off-angle viewing, are atrocious.
The actual in-game image looks great though, especially in a room with some lights on.
So miniLED seemed to offer the best compromise for me, because I knew that I wasn’t going back to OLED any time soon (and couldn’t afford to in any case), and it’s been great!
The additional brightness headroom makes games really pop and allows me to run BFI for additional smoothness and preservation of the mask details when the background is scrolling or sprites are moving.
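Since BFI came up: it’s exposed in retroarch.cfg as well. A one-line sketch; note the value semantics have changed over time (a simple on/off in older builds, the number of black frames per refresh in newer ones), so check your version’s menu:

```
# retroarch.cfg - needs a high refresh rate mode (e.g. 120Hz for 60fps content)
video_black_frame_insertion = "1"
```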
I can also use virtually any Mask and it will look as it should, unlike WOLED, where you only have a small subset of Mask and TVL combinations which will line up properly with the special RWBG subpixel layout.
QD-OLED is even worse.
For further information on displays, you can look in the Best TVs for CRT Emulation thread.
Now for the computer hardware part.
My current system uses a GeForce GTX 1070 plus an AMD Ryzen 5 5600X, and that can run anything I’ve created and thrown at it, up to Wii and Dreamcast/Naomi/Atomiswave. I tested LRPS2 once and it seemed to run well, but I don’t think I tested it with my most demanding shader presets, which would have to be my Mega Bezel MBZ__0 and MBZ__1 presets.
I don’t use those presets much, if at all, now though; I use my Sony Megatron Color Video Monitor stuff instead.
For a while I was running them on a secondary machine at 4K with up to PS1 and possibly Dreamcast/Naomi/Atomiswave.
That system has a GeForce GTX 970 and an AMD Ryzen 5 3600X.
Both systems had at least 16GB RAM.
I’ve also been able to test and experience these on a system using a Ryzen 7 5800X3D and a Radeon RX 6700 XT.
I also had a hybrid old/new system with an old Core i5 2500K and a Radeon RX 6600 that ran everything up to PS1 with no issues at 4K using my Mega Bezel presets.
Lastly, that system was upgraded to a Ryzen 5 8400F with everything else the same and it continues to handle everything flawlessly.
So that should give you a rough idea of the type of baseline you should look for.
I’ll update with what I might put together if I were building, but I need to know if you would also be playing modern PC games or would like to also emulate other systems like Wii U and PS2/PS3.
Also, will you be doing any type of recording/streaming/video capture?
Before I go into such detail: the GeForce RTX 4070 Ti Super was the minimum NVIDIA GPU I was thinking of upgrading to if I were in a position to upgrade, and not the 5070 Ti, due to the latter’s loss of PhysX support. I’m waiting to see what comes out of PhysX being open sourced.
On the AMD side, definitely the AMD Radeon RX 9070 or 9070XT.
For the CPU, for emulation you really don’t need to be that picky, but the Ryzen 5 8400F, Ryzen 5 7500F and Ryzen 5 7600 are all great starting options for emulation and even PC gaming.
If you want to do more, then you can step up in core counts and performance with the Ryzen 7 7700X or 9700X, or the Ryzen 5 9600X.
Or just go for it with a Ryzen 7 7800X3D or 9800X3D. If it’s a gaming build, or if you need extra cores for productivity, a Ryzen 9 9950X3D, 9950X or 9900X.
For RAM you want at least 32GB of DDR5-6000 CL30 in a dual-channel kit.
For storage, a 2 or 4TB NVMe PCIe 4.0 drive with DRAM cache, although there are some specific newer drives without DRAM cache that seem to perform quite well.
Prices can vary quite a bit for that type of configuration depending on the brand but I wouldn’t pay more than US$280 for the 4TB variant so that should narrow things down a bit.
I purchased a couple of Fitwok FN970s. There’s also an HP model that uses this same configuration.
The one DRAMless drive to consider would be the Silicon Power US75 or drives which use the same combination of NAND and controller chipset.
For the motherboard, almost any AMD B550 or B650 chipset based board would suffice (B550 for the older AM4 chips, B650 for the AM5 CPUs above). Just make sure it has all the peripheral features you desire, for example WiFi and USB-C.
I can’t count how many times I’ve booted up to play any game and it turns into me cycling through different shaders while the game runs in demo mode just because something about the pixel art is that interesting to look at.
This is what led me back to this forum because I had been following your updates working with 4k OLED displays and wanted to see for myself.
I sit somewhere between 3-4 feet from mine. I’m pretty sure that’s too close from what would be considered optimal viewing distance.
I’m ok with the viewing “sweet spot” being head-on. I don’t need the extreme viewing angles in my case.
MBZ__0 have been my go to since you introduced em to me a while back. Very nice to look at, albeit on a very mid display that doesn’t get too bright. Haven’t had a chance to try the Sony Megatron presets.
Negative.
Just to wrap up, I was at Best Buy a while back and saw one of those <$200 Asus laptops and decided to pick it up to see exactly what I could do with one in terms of emulation. I was able to run SNES, Gen no prob. The only shaders it could handle were some of the presets in the shaders_slang/CRT folder and it’s been good. Been rocking with it for about a year now since I spend a lot of time away from home. Now I have more free time and I want to see how things look on a much more premium display.
Size isn’t a big factor for me either. Before coming on here I was looking for a 4K OLED somewhere in the 20"s, but no dice. I don’t do any AAA gaming and the games I do play, I have the PS5 and LG when I’m home.
Appreciate you man and look forward to hearing back
Despite all of the aforementioned limitations, I still think these things look best overall on OLED. If you’ve never experienced or had an OLED TV before, it’s probably something you should experience at least once in your lifetime.
My reaction on seeing OLED for the first time was jaw dropping.
On seeing miniLED, it was more like, “this is not bad at all.”
Newer OLED TVs have heatsinks, overall better cooling systems and newer chemical compositions to resist burn-in even better than my older one. I’d still be reluctant to experiment again though, unless I had the resources to consider this a more or less disposable item purchase.
Okay, do remember that with monitors you tend to pay more per inch than with a TV, and most of them seem to be QD-OLED.
There isn’t much information available concerning how the unique subpixel arrangement of QD-OLED fares, but we do know for a fact that it won’t line up perfectly with subpixel-accurate emulated CRT masks.
Despite not being perfect either, WOLED is still better in this regard.
27" - 28" might be a good starting point if viewing from close to the screen.
Optimal viewing distance for what? For normal TV and movies viewing maybe not but for an intimate, immersive arcade like experience, maybe closer is better.
I’ve noticed that the further away you are from the screen, saturation and brightness, as well as perceived sharpness, tend to drop off.
Colours can really pop in an extraordinary way when viewing presets from up close compared to the same presets from a couple feet further away.
That’s why I created my “Near Field” presets.
You should also check out my 36" Arcade Neo-GX and Neo-GX Ultra Clear presets.
On a 55" TV, using the suggested CAR from the filenames, you can get a nice 36" CRT Arcade monitor simulation that’s a sight to behold!
Alright, so you can stick to the parts recommendations on the lower end of the price and performance spectrum and you should be just fine.
Depending on the specs of your old PC, you might be able to just grab a Graphics card and slap it in and maybe just upgrade the storage to a 2TB SATA SSD.
You can use your existing controllers or get some dedicated controllers. I use Xbox One/Series S/X controllers.
KTC and Innocn sell some decent HDR1000 miniLED monitors from US$299 for a 27". So you can also check those out.
You’d need the more expensive 120Hz+ ones for a better and more CRT-like motion experience using BFI/CRT-Beam Simulator but a 60Hz one may not necessarily be a deal breaker.
I doubt they can beat the value proposition of a TCL QM751G 55" TV at under US$480 though, so go check that out as well.
Thanks man, what good is having a hobby if you can’t share in the experience with others?
That’s def the plan even if I have to hook it up to my TV once and just see for myself. Weigh out at that point whether it’s worth the guaranteed issues that will come up later.
I have to check the prices on these. If you could go back and see the miniLED before the OLED, would you? Any input latency differences between the two?
yea, good point. How much mileage we talking bout here? A year? More/less? I understand use is a big factor. Just an hour or two a day, for example. Or whatever your experience was.
That price is very much in play. Are these a tier under OLED or am I looking at this the wrong way? I’ve seen the videos RetroCrisis has done covering your shaders and I want an experience like that. What I would consider the best possible spot to be in without OG hardware and a CRT/PVM/BVM. I’ve seen what a modest display has to offer and now I want to get closer to the ceiling when it comes to shaders/emulation.
Just to touch back on the laptop angle. I’m away from home a lot and being able to have a premium option that I can travel with is appealing. That said, given the OLED degradation and the $1500+ price tag of the laptop, would be tough to justify lighting that much money on fire.
Well they’re not guaranteed. It’s just that my only point of reference was my 2016 model OLED TV. Since then, there have been great strides in the burn-in prevention department though. I just haven’t been able to test it myself and don’t have enough data to make any pronouncements.
That TV lasted a long time by modern standards. It even survived a lightning strike albeit with a damaged HDMI port.
Also, I used all sorts of presets with it before I started using HDR presets and hadn’t noticed any image retention before. However after moving to HDR, I got even more into the hobby and my usage balance shifted even more towards retro gaming.
So I can’t say whether the burn-in was due to a gradual, linear accumulation over that period of time, or a logarithmic one which quickly got worse from when I started using HDR and also let my guard down.
After all of these years of babying the TV and being unable to play one of the best games ever created, Civilization V, without causing image retention due to the menus, moving to miniLED was like a breath of fresh air and a weight off of my shoulders. I could now embrace “pause” and wallpaper once again! So there’s a lot more peace of mind now.
Well, it’s impossible to go back in time, and both technologies arrived on the scene in different eras. It’s only since the 2023 TCL QM850G that I think miniLED TVs have reached a point where I can say they’re good enough to be a viable alternative to OLED. Back in the day I was also looking at dual-layer LCD, but it had a false start in the consumer market, and you know, nowadays one false start and you’re out of the race.
When I bought my OLED TV, I was also looking for the best 3D experience; there’s so much that was different about that purchasing thought process, and I could also afford the OLED TV at the time. Based on my research, I felt like that was the best choice for me.
So if I had to do it over again, it would happen the same way, because my model TV still remains one of the best ways to enjoy 3D in the home. The next year, they canned 3D TVs. Being able to run CRT shaders was never factored into the equation.
The TV eventually stopped working and I needed to replace it. Or rather wanted to, but it felt like a need to me, and my best option at the time was a TCL QM751G. I also did my research on these, and the big brother TCL QM851G was giving the best miniLED TV, Sony’s Bravia 9, a run for its money at 1/3 of the price, and the QM751G was not that far off the QM851G in terms of specs. Plus I wanted a 55" and the QM851G only came in 65" and up.
I picked the TCL over the similarly priced offerings from Hisense because it seemed to have the faster and more responsive backlight system.
With miniLED technology there’s a lot of variance among different manufacturers. For example LG and Samsung seem to be at least a generation behind.
Even those miniLED monitors from KTC and Innocn may not be able to perform on the level of the TCL QM751G but at least they’re bright enough and that’s what’s most important.
On paper yes but in practice it’s not enough to be an issue for me or to even be noticeable. Remember there’s more to input latency than just the display lag/latency and there are several ways to lower this in other parts of the latency chain.
You should read the first post of the link I sent you concerning best TVs for CRT emulation to better understand the pros and cons of faster/slower pixel response.
OLED’s near instantaneous pixel response is not always an advantage as it produces more stutter like animation as opposed to the smooth interframe transitions that a little bit of ghosting/blur can produce on LCD panels.
You could say that, but it all depends on your use case. For CRT shader emulation they might be superior in some ways, for example subpixel layout and brightness, but inferior in other ways: viewing angles, backlight zone transition performance, blooming and granularity.
Plus no burn-in or automatic brightness limiter dimming stuff while you’re in game and other annoyances which are unique to OLED.