That’s def the plan, even if I have to hook it up to my TV once and just see for myself, then weigh at that point whether it’s worth the guaranteed issues that will come up later.
I have to check the prices on these. If you could go back and get the miniLED before the OLED, would you? Any input latency differences between the two?
Yeah, good point. How much mileage are we talking about here? A year? More? Less? I understand use is a big factor; just an hour or two a day, for example, or whatever your experience was.
That price is very much in play. Are these a tier under OLED or am I looking at this the wrong way? I’ve seen the videos RetroCrisis has done covering your shaders and I want an experience like that. What I would consider the best possible spot to be in without OG hardware and a CRT/PVM/BVM. I’ve seen what a modest display has to offer and now I want to get closer to the ceiling when it comes to shaders/emulation.
Just to touch back on the laptop angle: I’m away from home a lot, and being able to have a premium option that I can travel with is appealing. That said, given the OLED degradation and the $1500+ price tag of the laptop, it would be tough to justify lighting that much money on fire.
Well they’re not guaranteed. It’s just that my only point of reference was my 2016 model OLED TV. Since then, there have been great strides in the burn-in prevention department though. I just haven’t been able to test it myself and don’t have enough data to make any pronouncements.
That TV lasted a long time by modern standards. It even survived a lightning strike albeit with a damaged HDMI port.
Also, I used all sorts of presets with it before I started using HDR presets and hadn’t noticed any image retention up to that point. However, after moving to HDR, I got even more into the hobby and my usage balance shifted even further towards retro gaming.
So I can’t say whether the burn-in was due to a gradual, linear accumulation over that period of time or an accelerating one which quickly got worse from the point when I started using HDR and also let my guard down.
After all these years of babying the TV and being unable to play one of the best games ever created, Civilization V, without causing image retention due to the menus, moving to miniLED was like a breath of fresh air and a weight off my shoulders. I could now embrace “pause” and wallpaper once again! So there’s a lot more peace of mind now.
Well, it’s impossible to go back in time, and both technologies arrived on the scene during different eras. It’s only since the 2023 TCL QM850G that I think miniLED TVs have reached a point where I can say they’re good enough to be a viable alternative to OLED. Back in the day I was also looking at dual-layer LCD, but it had a false start in the consumer market, and nowadays, you know, one false start and you’re out of the race.
When I bought my OLED TV, I was also looking for the best 3D experience; so much was different about that purchasing thought process, and I could also afford the OLED TV at the time. Based on my research, I felt like that was the best choice for me.
So if I had to do it over again, it would happen the same way, because my model of TV still remains one of the best ways to enjoy 3D in the home. The next year, they canned 3D TVs. Being able to run CRT shaders was never factored into the equation.
The TV eventually stopped working and I needed to replace it. Well, more wanted to, but it felt like a need to me, and my best option at the time was a TCL QM751G. I also did my research on these: the big brother TCL QM851G was giving the best miniLED TV, Sony’s Bravia 9, a run for its money at 1/3 of the price, and the QM751G was not that far off the QM851G in terms of specs. Plus I wanted a 55" and the QM851G only came in 65" and up.
I picked the TCL over the similarly priced offerings from Hisense because it seemed to have the faster and more responsive backlight system.
With miniLED technology there’s a lot of variance among different manufacturers. For example, LG and Samsung seem to be at least a generation behind.
Even those miniLED monitors from KTC and Innocn may not be able to perform on the level of the TCL QM751G, but at least they’re bright enough, and that’s what’s most important.
On paper, yes, but in practice it’s not enough to be an issue for me, or even to be noticeable. Remember, there’s more to input latency than just display lag, and there are several ways to lower latency in other parts of the chain.
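To give a feel for the sort of knobs I mean, here’s an illustrative retroarch.cfg excerpt. These are real RetroArch options, but the values are just example starting points, not universal recommendations:

```
# Illustrative retroarch.cfg excerpt -- tune values per game/core/GPU.
video_frame_delay = "8"           # poll input later in the frame to shave off ms
run_ahead_enabled = "true"        # run-ahead hides a game's internal input lag
run_ahead_frames = "1"            # set to the game's internal lag in frames
video_hard_sync = "true"          # (OpenGL) stops the driver queuing extra frames
video_max_swapchain_images = "2"  # (Vulkan) fewer buffered frames = lower latency
```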
You should read the first post of the link I sent you concerning best TVs for CRT emulation to better understand the pros and cons of faster/slower pixel response.
OLED’s near-instantaneous pixel response is not always an advantage, as it produces more stutter-like animation as opposed to the smooth interframe transitions that a little bit of ghosting/blur can produce on LCD panels.
You could say that, but it all depends on your use case. For CRT shader emulation they might be superior in some ways, for example subpixel layout and brightness, but inferior in other ways, like viewing angles, backlight zone transition performance, blooming and granularity.
Plus there’s no burn-in, no automatic brightness limiter dimming things while you’re in game, and none of the other annoyances which are unique to OLED.
Yes, but they might look even better on a 55" TCL QM751G or Hisense U7N/U75N, which have the extra brightness headroom to open the door to using BFI/CRT-Beam-Simulator, with the TCL having the edge due to it being brighter, with more dimming zones and a faster transient backlight response.
The TCL QM851G might be a dream with its additional brightness, or even a QM850G or the Hisense U8K/U8N.
You can use Pastebin.com, or you can wrap the log in a Preformatted text block on the forum.
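For example, putting a line of three backticks before and after the pasted log keeps it formatted:

````
```
(paste the contents of your log here)
```
````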
The preset loads just fine on my end and I checked the paths and nothing seems untoward.
Be sure to update your slang shaders at least once, and you can also install one of my packs, from the W420M pack onwards, which have the correct guest shader self-contained.
Also, don’t be afraid to use the miniLED Shader Pack, as it contains many improvements and things that I’ve learned since making my earlier preset packs. You would just have to adjust the Display’s Subpixel Layout to match the subpixel layout of your display.
Which is something I recommend doing anyway. Of course, you’ll also have to adjust the Peak and Paper White Luminance to values that are suitable for your display as well.
Unless you have a display identical to the one used during the development of the preset, you’ll probably need to adjust some more settings, possibly on your TV or via the Shader Parameters, in order for things to look as good as possible on your display.
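To make that concrete, here’s roughly what such overrides look like in a preset’s text form. This is only a sketch: the parameter names below are illustrative stand-ins, so check the Shader Parameters menu for the exact identifiers your preset uses, and substitute values measured for your own display:

```
# Excerpt from a .slangp preset -- parameter names/values are illustrative only.
hcrt_max_nits = "650.000000"          # Peak Luminance: your display's measured peak
hcrt_paper_white_nits = "520.000000"  # Paper White Luminance: your reference white level
```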
Newer TVs have features like Dynamic Tone Mapping which can significantly alter the image, for better or worse.
Remember, the presets are already highly fine-tuned, so any change or difference in how the display processes things or applies its EOTF, Gamma, White Balance or Saturation can lead to things looking more washed out, or to clipping of colours, for example.
So it’s a step-by-step process to determine which combination of settings works best with your set/setup.
This is where you might need to start trying things out, seeing how they look, then reporting back with photos and descriptions of what you’re experiencing; from there you might get a little back and forth as to what to try adjusting.
You seem to be enjoying yourself. Which preset is this?
Once you dial in your specific Peak and Paper White Luminance values, you can use them as a starting point or guide for setting those values for other presets.
Good that you’ve chosen one of the newer presets to work with, and I see that your system is coping fine! Is this the old PC or the new laptop? Right now I’m working on some tweaks to these newer presets, mostly trying to fix mask alignment issues and reconfigure some of the higher-TVL Shadow Mask offerings, which show some strange colour artifacts.
So don’t be surprised if you see changes and/or improvements to presets which have looked a certain way before.
By the way, after adjusting a preset to suit your display, it’s a good idea to save a Core, Game or Directory Preset with Simple Presets on. That way, if there are any updates to the base preset, your saved preset should continue to load seamlessly and your changes to things like the Peak and Paper White Luminance will be preserved.
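If you’re curious what that produces, a Simple Preset is just a small text file that points back at the base preset and stores only your overrides, something along these lines (the path and parameter names here are made up for illustration):

```
#reference "/shaders/MiniLED_Pack/some-base-preset.slangp"
# Only the values you changed are stored; everything else loads from the base preset.
hcrt_max_nits = "650.000000"
hcrt_paper_white_nits = "520.000000"
```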
Well, it seems proper; however, it looks like this is a GPU screenshot and not a photo of the screen. Unfortunately, GPU screenshots would look mostly identical on my (or any other) display, so it’s not possible to make valid comparisons or get a good idea of what you’re experiencing unless you take some high-quality photos, then annotate the differences, if any, between the photos and what you’re actually experiencing on-screen in person.
Funny enough, after we talked previously, this cheap 4K Chinese monitor landed in my lap, and I hooked it up to the laptop and gave it a run. It doesn’t reach the recommended HDR brightness suggested in the Sony Megatron thread, but at around 490 nits and with some parameter adjustments it looks good enough for being free.
It’s a shame not to be able to adjust parameters that won’t change on my end, like peak brightness and display layout, across a whole suite of the shaders all at once. But your starting points are fine, so all good.
Is there a better way to do it? Or are you talking about setting up a pro camera at the screen and taking a pic that way?
You were my gateway to shaders a while back, so this aspect of the hobby is very much out of my wheelhouse. I think after a couple of tweaks to peak brightness, display layout, 4K instead of 8K, region, etc., they look great.
Well that’s not too bad. My Mega Bezel HDR Ready preset pack would probably work nicely with that but I don’t think the 1050Ti would be able to keep up.
You can try the CRT-Royale Preset Pack and also the SDR to HDR conversion recommendations.
Notepad++ Find (and Replace) in Files can do this.
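If you’d rather script that batch edit, here’s a minimal Python sketch of the same idea. The folder path and parameter name are placeholders for illustration; back up your preset folder before running anything like this:

```python
#!/usr/bin/env python3
"""Batch-update one shader parameter across a folder of .slangp presets.

Sketch only: PRESET_DIR and PARAM are placeholders -- point them at your
own preset folder and the parameter you actually want to change.
"""
import re
from pathlib import Path

PRESET_DIR = Path("C:/RetroArch/shaders/presets")  # placeholder path
PARAM = "hcrt_max_nits"                            # placeholder parameter name
NEW_VALUE = "650.000000"

# Match lines like: hcrt_max_nits = "700.000000"
pattern = re.compile(rf'^{re.escape(PARAM)}\s*=\s*".*?"\s*$', re.MULTILINE)

for preset in PRESET_DIR.rglob("*.slangp"):
    text = preset.read_text(encoding="utf-8")
    new_text, count = pattern.subn(f'{PARAM} = "{NEW_VALUE}"', text)
    if count:
        preset.write_text(new_text, encoding="utf-8")
        print(f"updated {preset} ({count} line(s))")
```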
It depends on what you’re doing now. Was I correct in thinking that the image you posted was a generated/captured screenshot and not a photo of the screen?
I just described how I would go about taking photos of the screen in another thread so here goes:
Now why would you want to interfere with this if you have a 4K display? On a 1440p or 1080p (or 8K) display that might be necessary, but do you really want to alter the TVL of my presets, which are already optimized for 4K displays?
So leave the parameter at 8K even if the monitor is only 4K? I thought this was a parameter dependent on one’s personal display. If I’m wrong, I’ll leave it alone moving forward.
Precisely. These presets are already optimized for 4K displays so there’s no need to customize the Display’s Resolution or CRT-Resolution (TVL).
That’s only necessary for folks using other resolutions who wish to find appropriate TVLs/consistency for those resolutions.
8K doesn’t really mean it’s for 8K displays only. It just means the TVL is divided by 2, but sometimes other characteristics are more desirable when choosing 8K over 4K, like the slot mask height used or, in some cases, scanline/mask alignment.