System requirements - Building a RetroArch PC

Comparing the specs on these:

43":
- Dimming zones: 720
- Max brightness (HDR 100% window): 683 nits
- Max refresh: 120Hz
- Price: $999

32":
- Dimming zones: 1152
- Max brightness: 1000 nits
- Max refresh: 160Hz
- Price: $999

More dimming zones, more nits, smaller size, same price. Is it worth it? I’m leaning toward no.

This one, though:

https://www.amazon.com/gp/product/B0BCK1K44F/ref=ewc_pr_img_2?smid=A2SHWIYMNRTTNZ&th=1

It tops out at 144Hz but is basically the same thing as the KTC for $720. At that price, it’s a more difficult choice. The extra brightness would allow for BFI + scanlines + aperture grille (100%) while maintaining at least 100 nits, or full-strength slot masks without the use of BFI. With the Q90C, some form of brightness mitigation with shaders would be necessary when using BFI.
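To make that brightness budget concrete, here’s a back-of-the-envelope sketch in Python. The loss factors (50% for BFI at half duty, 50% for scanlines, ~60% for a full-strength aperture grille) are my own rough assumptions for illustration, not measured values for these shaders:

```python
def remaining_nits(peak_nits, losses):
    """Multiply peak brightness by the fraction of light each effect passes."""
    out = float(peak_nits)
    for loss in losses:  # each loss is the fraction of light removed
        out *= 1.0 - loss
    return out

# Assumed losses: BFI at 50% duty ~50%, scanlines ~50%,
# full-strength aperture grille ~60% (illustrative figures only).
losses = [0.50, 0.50, 0.60]
print(round(remaining_nits(1000, losses)))  # 1000-nit panel: ~100 nits left
print(round(remaining_nits(683, losses)))   # 683-nit panel (Q90C): ~68 nits left
```

With those assumed factors, the 1000-nit panel lands right at the ~100-nit floor, while the Q90C’s 683 nits falls well short, which is why some mitigation would be needed there.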

1 Like

QD-OLED will save us, praise god. I think we’ll finally have decent monitors in 2024; hopefully we’ll get some sweet Black Friday deals, but it may be 2025 before the QD-OLED monitors are really affordable.

EDIT: matte coating on the LGs is confirmed :frowning:

1 Like

For me, I’d read lots of reviews. There are folks saying lots of good things about these on Reddit.

If it was considerably cheaper than the Samsung, plus good reviews and much better specs, I’d probably go for it but it’s good to get some independent reviews from reputable sources first.

Also, you might be lucky enough to get an RGB Subpixel layout.

Also ask yourself: do you want to be able to easily rotate it to a vertical orientation at some point?

If you’re unsure about trusting the brand/product you can probably invest in one of those 3rd party warranties.

If you clearly prefer one size over the other, that makes the decision easier as well.

1 Like

This is a major question for me atm. As cool as the idea of a single very large monitor is, there are certain benefits of a dual-display setup that a single large display doesn’t offer. I could just replace one of my current 27" monitors with the 32" and I wouldn’t be losing anything, although it might look a bit awkward. I can’t do the same with the 43" due to space constraints. So this is tipping the scales toward the 32" INNOCN a bit more. And with any luck they’ll come down in price even more, and I can just acquire a second one at some point while I wait for affordable QD-OLED.

1 Like

The KTC (which I bought at a great discount during Black Friday) has twice as many dimming zones.

The dimming isn’t good for anything RetroArch-related (I have it off by default) but is awesome in HDR AAA games.

If it matters, the KTC does not have speakers and the INNOCN does.

TBH, the biggest reason I chose the KTC is because it’s black and matches the rest of my setup. :wink:

1 Like

That is good to know. So now I’m leaning toward the KTC, because those dimming zones are pretty essential.

Have you tried it with the Megatron shader? Just curious.

There appears to have been a mini-LED breakthrough: we supposedly have 5000+ dimming zone mini-LED monitors coming out next year for less than $900 (27").

2 Likes

Yes. While not as good as an OLED, it does look very nice. Night and day compared to the HDR10 display it replaced.

Since it doesn’t have per-pixel dimming like an OLED, the dimming doesn’t look good with shaders, but the 1000 nits is awesome.

2 Likes

Have you tried Megatron on OLED? I was assuming it wouldn’t have the nits for it.

1 Like

I have not, but I thought that OLEDs had higher nits than 1000. Apparently I am mistaken.

OLEDs are better at HDR not because of nits, but because of black levels. :man_shrugging:

There have been a lot of reports that Megatron works really well on OLED.

2 Likes

Best OLED TV HDR Game Mode Brightness Compared:

- LG G3
- LG C3
- Samsung S89C
- Samsung S90C
- Samsung S95C
- Sony A95L

Best Mini-LED TV HDR Game Mode Brightness Compared:

- Sony X93L
- Sony X95L
- Hisense UX
- Hisense U8/U8K
- Samsung QN95C
- Samsung QN90C

1 Like

Yeah, it has enough nits for limited HDR but not for the HDR1000 spec, and only in small areas of the screen. That works fine most of the time for movies and modern games, but sustained full-screen brightness is actually worse than an old LCD.

I think Megatron can work well on OLED without the use of BFI, but I don’t think it will be bright enough with BFI + Megatron.

I would still recommend OLED for modern games and movies, but for Megatron we just need the sheer brightness that’s only found on HDR1000+ displays (unless you’re willing to forgo motion clarity for the sake of brightness).

1 Like

Ignorance is bliss when it comes to things like this.

Makes me not feel like ever touching a CRT again, lest I make myself sensitized to what I might be missing out on.

For me, on my lowly 60Hz (native) OLED TV with around 660 nits typical, motion and brightness are all joy!

Also note that the Sony Megatron Color Video Monitor documentation lists around 600 nits as the brightness needed to really enjoy it, with ~1,000 nits providing headroom for an even better experience.

Looks like the RetroTink 4K folks are jumping on the HDR CRT “filter” + OLED TV bandwagon!

https://www.reddit.com/r/crtgaming/s/3wx8IQu8HT

The TV used in this example is a Samsung S90C, so you can pick up some anecdotal feedback by reading the comments.

Might be a good idea; once seen, it can’t be unseen. Even my 10-year-old plasma ruins me for any modern display; only OLED with BFI comes close (about 90% of the plasma’s motion resolution). With BFI, the brightness of the OLED is barely better than the plasma’s, and the only real advantage would be the higher resolution.

Of course, it matters what type of games you play, too. Being able to eye-track an object with perfect clarity makes a huge difference in any kind of twitch gaming. Not so much if you’re playing RPGs, strategy games, etc.

I think that’s not using BFI, so 600 nits would be about right for slot masks: a 50% reduction from scanlines and a 66% reduction from the slot mask leaves you with around 100 nits.
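That arithmetic checks out, and you can run it the other way to see what adding BFI would demand. This sketch just uses the figures quoted here (50% scanlines, 66% slot mask) plus an assumed ~50% loss for half-duty BFI:

```python
# Check the figures quoted above: 600 nits peak, ~50% lost to
# scanlines, ~66% lost to a full-strength slot mask.
scanline_loss, slotmask_loss, bfi_loss = 0.50, 0.66, 0.50

remaining = 600 * (1 - scanline_loss) * (1 - slotmask_loss)
print(round(remaining))  # ~102 nits, i.e. "around 100 nits"

# Invert it: peak brightness needed to keep 100 nits if 50%-duty BFI
# is stacked on top of the same scanline and slot mask losses.
needed = 100 / ((1 - scanline_loss) * (1 - slotmask_loss) * (1 - bfi_loss))
print(round(needed))  # ~1176 nits, beyond typical OLED peaks
```

Which lines up with the earlier point that BFI + Megatron really wants an HDR1000+ display rather than today’s OLEDs.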

Yes, it’s kind of amusing to see them gushing over things that we’ve been doing in RetroArch (mask emulation!) for years now :slight_smile:

The photos are decent but they’re definitely not using BFI, and I imagine they’re lowering the mask strength a bit.

1 Like

By the way @Nesguy, I omitted the LG G3 from my list but I just got a refresher that it was actually the brightest OLED TV of 2023 beating out all other W-OLED and QD-OLED offerings thanks to its use of MLA technology.

This MLA doesn’t seem to be snake oil at all.

I’ve updated my previous post to include its HDR Game Mode Brightness.

Here’s what RTINGS and Stop The Fomo have to say about that and what’s coming in 2024!

https://www.reddit.com/r/buildapcsales/s/bT3b9FLL2U

1 Like

So after all that back and forth, here is the monitor I just bought 2 of. It came down to the fact that there are some great discounts for this monitor right now, and anything larger than 27" is just too much for a standard-size desk.

Yes, it’s ultimately still a backlit display, and yes, it still has a terrible matte finish, but that’s just the state of things right now. It should at least be a significant upgrade over the absolutely horrible 1080p office monitors that currently sit on my desk, and should be very good for the Megatron shader since it’s an HDR1000 display.

4 Likes

Congratulations! I know you’ll provide us with some real world performance data once you familiarize yourself with them!

I built a system last year with an ASUS TUF Gaming VG289Q 28" 4K matte display. It wasn’t bad at all, even though it was matte.

Glad you got something that suited your needs as opposed to just your wants.

1 Like

All PC parts have arrived! Super psyched; I haven’t done a build in a looong time. Here’s what I wound up getting. Got the GPU used on eBay for half the price. Probably complete overkill for RetroArch?

https://pcpartpicker.com/user/KingSlime/saved/#view=3V9WrH

3 Likes

Looks like a great build! I’m happy for you!

1 Like

Everything is built and running! The build went smoothly, except I forgot to remove the warning label from the cooler heatsink and had to take the cooler off and re-install it :sweat_smile:. There was also a small moment of panic when I booted up for the first time and there was no signal to the monitor; it turned out I was plugged into the motherboard instead of the GPU. :smiley:

4K, HDR, local dimming, and wide color gamut are all such a massive improvement that my trusty Panasonic plasma now looks somewhat dull in comparison, and I’m afraid I’ll have to spend several thousand dollars upgrading my living room setup :flushed:. I tried out a few games using the Megatron shader and it looks perfect. The local dimming is a bummer with shaders, like @Duimon mentioned, but awesome for modern games. Looked great in Alan Wake 2.

Now on to the negatives. Right off the bat, I’m a bit concerned that my two new monitors are showing very different colors; white is NOT the same. However, one is connected via HDMI at 144Hz and the other via DP at 160Hz, because the included DP cables weren’t long enough for both monitors, so I’m hoping the difference is just down to the different connections. I’m also noticing some pretty bad uniformity issues on one of them; I’ll probably return it and roll the dice again. Also, there’s a very slight vignette due to the panel construction. It’s not too bad, but it’s worth noting.

Exciting things happening at CES 2024, but I think we need to take “3000 nits” with a grain of salt. This is probably referring to a 10% window, so it will be a nice boost to HDR but I highly doubt it will be enough for the HDR1000 spec.

1 Like

3,000 nits seems like what the panel is actually physically capable of. It’s then up to the TV/Display manufacturers to decide how far they’re willing to push the envelope because there would be pros and cons to running it as bright as possible.

Some constraints might be power supply requirements, energy certifications and regulations, burn-in and image-retention prevention/reduction, panel longevity, reliability, and eye health.

Remember that MLA is a passive technology, so if the improvements are mainly coming from that area they should apply across the board, meaning to all window sizes. Then there are other improvements in processing which LG Display has claimed also contribute significantly to brightness.

So I’m leaning more toward optimism with regard to suitability for things like BFI with Megatron while running at high enough levels of HDR brightness.

1 Like