Yeah, I can see you're quite passionate about this, but I still don't get why you'd want to pay out $750 on an upscaler when you can have a CRT for half the price. The brightness will come to modern displays so we can have BFI and accurate scanline simulations, but until then I just think CRTs are the better option.
Yeah, I'm using the S95's BFI and the Eve Spectrum's Blur Busters-implemented backlight strobing, both of which are just too dark with accurate scanline implementations.
Maybe this year's TVs have got bright enough. I think some quantum dot LCDs could probably manage it; some go up to 2000 nits and beyond.
Yeah, I don't get the people who won't touch an emulator but then use an LCD/OLED to play their games. I see it on the MiSTer forums all the time: they'll say this emulator drops a frame in every 1000 compared to real hardware and how that's a terrible thing, and then go on to say they're using a flat screen.
The MiSTer has terrible scanline reproduction and most don't even bother with that, yet bang on about emulator vs FPGA accuracy. Of course there are bad emulators, but for a huge host of retro platforms the biggest improvement isn't the hardware/software, it's the display. Mind you, as you say, even back in the day there were concentric circles of CRTs and input signals etc. We're probably not as bad as car nerds though, right?
lol yeah, I’m also into synthesizers. We’re definitely not the worst offenders here lol
But yeah, the original hardware purists shit on everyone else because "it's not the real thing gais, I can tell by the pixels", the FPGA guys shit on the emu guys for placebo accuracy differences and on the OGHW guys for wasting money and space, and the emu guys shit on the others for wasting money.
Same thing with CRTs vs FPGA upscalers vs shaders lol
lion_king_osd.mp3
I have nearly every retro console (collector), a MiSTer, and a PC dedicated to emulation, and I pretty much only use the emulation PC now. Having to hook up all the old consoles is a major PITA and, quite frankly, I don't have the time to dedicate to games with crappy save systems, so I pretty much need save states these days to make it through longer games.
If I'm experiencing a game for the first time I will always play it on one of my CRTs, but after that I use my OLED the majority of the time. The shaders presented in this thread are outstanding; I'm very excited to try them out one day on a MiniLED TV with a high peak brightness.
That reminds me of audiophiles of the 90s (and possibly still today?): people telling you to get £200+ cables. A mixture of snake-oil salesmen on one side and the male psyche on the other.
Yeah, I haven't gone down the original hardware route (yet!), but I do have MiSTers, SNAC and CRTs, and I too have got to the point of valuing convenience and, more importantly, save states at my geriatric gaming age. How on earth does anyone get to the caves in Ghouls 'n Ghosts in their 40s? (Hmm, possibly that's just me, as I was terrible at it in my teens.)
It's not about being passionate or not. I'm just pointing out the obvious and why this thing costs as much as it does. You can't really compare a scaler to a CRT; they're apples to oranges.
And besides:

- CRTs are not really for everyone
- CRTs weigh a lot
- CRTs are limited in terms of size
- CRTs have geometry and convergence issues which many people don't want to deal with
- New CRTs are not being made anymore, so you have to scour for used ones which may not be in good condition, plus shipping is a pain in the butt
- Many people want to play their consoles on a new modern screen (be it OLED or LCD)
According to Mike Chi, the creator of this device, the initial batch of the RT4K already sold out, and there was so much demand his site crashed (which indeed happened).
Most of the people who buy this stuff know the tech isn't quite there yet to do BFI + mask/scanlines and reach CRT-tier motion (and it's the same situation on PC, really), but they don't care, since they just want something that is good enough to scale all their consoles. That said, thanks to support from Mark Rejhon of Blur Busters, there are a bunch of BFI modes in there which you can drive with custom modelines (you can do 720p@360Hz BFI, I believe), and you can compromise by using higher-TVL shaders or not using shaders at all.
Forgot to mention, btw: one of the interesting features Mark implemented on this thing is simulating a double- or triple-shutter film projector via BFI (run at 96Hz for double shutter, i.e. 48 flashes/sec, or 144Hz for triple shutter, i.e. 72 flashes/sec).
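If you're curious, the cadence itself is almost trivially simple. Here's a toy slang-style sketch of the idea (my own illustration, not Mark's actual implementation): with 24fps content, plain on/off alternation gives exactly two light pulses per film frame at 96Hz and three at 144Hz.

```glsl
// Toy BFI cadence sketch (not the RT4K's actual code): alternate lit
// and black refreshes. With 24fps content this yields 2 pulses per
// film frame at 96Hz (double shutter, 48 flashes/sec) and 3 pulses
// per film frame at 144Hz (triple shutter, 72 flashes/sec).
// frame_count would come from RetroArch's FrameCount built-in.
vec3 apply_projector_bfi(vec3 colour, uint frame_count)
{
    return ((frame_count & 1u) == 0u) ? colour : vec3(0.0);
}
```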
As for your shader (or the Cyber variant), I'm not sure whether it would work on the RT4K, but I could be wrong.
AFAIK, it doesn't have a GPU, so it's not really running shaders (and therefore can't run the megatron or any derivatives). However, the thing the megatron shader illustrates is that, once you get the brightness up via HDR, you can just draw scanlines and masks (which are simple enough to implement in FPGAs) without needing all of the complex blurs that other shaders use to preserve brightness.
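To make that concrete, the core drawing logic really can be this small. A toy sketch (not the actual megatron source; the real shader does proper beam profiles rather than hard on/off lines):

```glsl
// Toy "draw scanlines and mask" pass at an assumed integer 2x scale:
// black out every other output line, then let each output column pass
// only one channel of an RGB aperture grille. HDR peak brightness is
// what pays back the light lost to the black lines and the mask.
vec3 scanlines_and_mask(vec3 colour, vec2 output_pixel)
{
    if ((int(output_pixel.y) & 1) == 1)      // hard scanline gap
        return vec3(0.0);

    int phase = int(output_pixel.x) % 3;     // R, G, B column cycle
    return colour * vec3(phase == 0 ? 1.0 : 0.0,
                         phase == 1 ? 1.0 : 0.0,
                         phase == 2 ? 1.0 : 0.0);
}
```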
Has anyone gotten Megatron to look bright enough on an OLED w/BFI? I was assuming you’d need HDR1000+ for Megatron if you’re also using BFI.
I think it’s impossible on OLED. It’s funny how much this feels like Plasma vs LCD all over again.
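For a rough sense of why (back-of-envelope numbers of my own, not measurements): an RGB mask passes roughly 1/3 of the light, and 50%-duty BFI halves what's left, so matching even a ~100-nit CRT means sustaining on the order of 100 × 3 × 2 = 600 nits going into the mask and black frames, and that's before accounting for scanline gaps. That's well beyond what current OLEDs can hold outside of small highlights.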
Not on my C2; too dim for my liking as well.
Of course it doesn't have a GPU, since it uses an FPGA. You can technically render shaders (graphics) with a CPU too.
I imagine the shader would have to be rewritten from scratch for the RT4K's FPGA.
The RT4K's FPGA is supposedly pretty powerful, but I'm not sure if it's powerful enough for Megatron.
That said, tbh the dense triad (slot or dot) shaders in the RT4K look close enough to the 300TVL Mega/Cybertron to me, so I'm unsure if it would be worth the effort. But again, that could be just me.
Shaders are a specific thing: programs that run on a GPU and are written in a high-level shader language or GPU assembly. FPGAs can technically be on a board that also has a GPU, just as they can share a board with, say, an ARM CPU (like the DE10-Nano), but that’s not happening with the RT4K, AFAIK. Yes, you can render shaders on a CPU using an emulated/soft-GPU like llvmpipe, but that’s not an option on the RT4K, either.
It would not have to be rewritten; its functionality would have to be reimplemented entirely, which is possible (as I mentioned twice), but at that point it's not really related to the megatron shader except on a conceptual level, which is: HDR for brightness, draw scanlines and mask. And AFAIK that's already in place on the RT4K.
@MajorPainTheCactus This whole tangent is pretty far off-topic, so let me know if you’d like me to spin it out into its own thread.
Semantics. That is more or less what I meant.
Ty Cyber for everything as well!
So I had some time over the last day or so to investigate this problem where certain values end up outside the rec. 601/709 gamut triangle, i.e. they land in wider colour gamut spaces.
This happens when extremely small values (extremely dark colours) go into the ST 2084 PQ function. Presumably due to floating-point inaccuracies, it 'warps' these very dark colours out of the rec. 709 gamut.
You can 'fix' this by adding a clamp(colour, 0.000001, 1.0) before feeding the colour into the HDR10 transform, i.e. add this line:
scanline_colour = clamp(scanline_colour, vec3(0.000001f), vec3(1.0f));
at line 1669 in crt-sony-megatron.slang.
This comes at the cost of technically non-zero blacks, although it probably rounds down to zero black anyway.
Because this has no real impact on the final image (these darks are just too dark to perceive that they are slightly outside the gamut), I'm going to leave the shader as is. People can use the Windows 11 AutoHDR if they really want a solution, but I'd defy anybody to see the difference (in fact, I'd really like to know if you can, and more importantly in what scenario). I presume Windows 11 AutoHDR is using higher precision and/or a higher-precision power function, which is why it's able to resolve this issue properly rather than resorting to a clamp.
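For reference, this is roughly the maths involved; a sketch using the standard ST 2084 constants rather than the shader's exact code, with the clamp shown where it would go:

```glsl
// Sketch of the linear -> ST 2084 PQ encode with the standard
// constants (not the shader's exact code). The curve is extremely
// steep near zero, so tiny per-channel errors in near-black linear
// values get magnified in the output; with each channel drifting
// independently, the encoded colour can land outside the rec. 709
// triangle.
vec3 linear_to_pq(vec3 Y)   // Y normalised so 1.0 = 10000 nits
{
    const vec3 m1 = vec3(0.1593017578125);  // 2610/16384
    const vec3 m2 = vec3(78.84375);         // 2523/4096 * 128
    const float c1 = 0.8359375;             // 3424/4096
    const float c2 = 18.8515625;            // 2413/4096 * 32
    const float c3 = 18.6875;               // 2392/4096 * 32

    Y = clamp(Y, vec3(0.000001f), vec3(1.0f));  // the clamp 'fix'
    vec3 Ym = pow(Y, m1);
    return pow((c1 + c2 * Ym) / (1.0f + c3 * Ym), m2);
}
```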
Excellent news! This definitely explains why we needed the analysis tools to see the issue in the first place.
I will experiment with that clamp fix at some point, but by the sound of things I very much doubt that I will see anything that you didn't.
It's worth noting that even if Win11 AutoHDR is (technically) more accurate in this very specific regard, it's still using a piecewise sRGB decoding gamma for SDR-to-HDR conversion, which is incorrect for literally everything older than the mid-00s, and only correct for a smattering of PC-exclusive games past that point.
Incorrect decoding gamma would be waaaaaaaaay more noticeable than near-black gamut overshoot.
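For anyone following along, these are the two decode curves in question (textbook formulas, shown just to illustrate the mismatch; the difference is biggest in the shadows, where the piecewise curve's linear toe lifts everything relative to a pure power law):

```glsl
// Piecewise sRGB decode (what AutoHDR assumes) vs a pure power-law
// decode (what CRT-era content was actually mastered against).
float srgb_piecewise_to_linear(float c)
{
    return (c <= 0.04045) ? c / 12.92
                          : pow((c + 0.055) / 1.055, 2.4);
}

float power_law_to_linear(float c)
{
    return pow(c, 2.4);  // ~2.2-2.4 depending on display/calibration
}
```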
Ah interesting, I haven't had the chance to upgrade to Win11 to try this out. One thing I would say is that we can transform into sRGB gamma space in SDR using the existing shader settings, and so Windows 11 AutoHDR would be able to do the correct transform into rec. 2020, would it not? As I say, I haven't got Win11 to try this, so I could be missing something.
I found I was able to select Vulkan again by going into the Mupen core options and selecting the ParaLLEl-RDP option for the RDP plugin.