Mega Bezel Reflection Shader! - Feedback and Updates

Yeah I looked into it earlier and it’s probably because basic doesn’t have scalefx (so it runs faster ;))

We do have the basic-extra-passes base preset, which has all the passes like scalefx. We could easily create a basic-reflect-extra-passes preset, and then you would get the same treatment on the screen that you get with the newpixie-clone preset.

2 Likes

On checking again, it was really basic_extra_passes that looked the same as the standard preset shader-wise, while basic, potato and basic_reflect were the ones that looked low-res.

Other than that I’ve been tinkering here and there. I realized that I had left in the original setting for

HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "100.000000"

in addition to my setting which was

HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "0.000000"

and the one that had taken effect was the 100 setting. This explains why some of my games looked a little softer than expected, at least in some of the in-game text.
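For anyone who runs into the same thing, this is roughly what the duplicated entry looked like in the preset file (the rest of the file is omitted here; only the parameter name and values are the real ones from my preset):

HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "100.000000"
HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "0.000000"

With both lines present, the 100 value was the one that actually applied on my setup, so the fix was simply to delete the leftover line and make sure the parameter only appears once.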

After setting it back to 0, I had to readjust my latest halation changes, because they had been tuned while the “incorrect” HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR setting was in effect, even though at that point everything was looking good, as can be seen in the most recent screenshots.

What I noticed was that what I saw was vastly different depending on my viewing distance. I first settled on HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "0.000000" and halation = 0 for the home console cores (using Composite - Sharp) in order to extract maximum clarity in the outlines as well as the text (especially on lower resolution consoles). This looks really amazing, and I think I’m going to leave things here for a while. I still can’t believe these old games can look this good! Makes me want to play them all over!

I did worry that by leaving halation at 0 I was skipping an important step in the processing, although to me this looks closest to the kind of clarity I used to experience on the old Commodore 1702 monitor I gamed on in my youth.

As for my Arcade - Sharp preset, I couldn’t use HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "0.000000" because it leaves quite a bit of posterization in the image, which was particularly visible from close viewing distances. Setting it to 100 cleaned up most if not all of these artifacts and had the image looking much more natural in my opinion, without making things blurry or significantly softer. After that, I added a 0.2 dose of halation, and that brightened up the overall image quite nicely without adding any significant blur or bright clipping!
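As a quick recap, this is where I’ve landed (the downsample parameter name is exactly as written above; I’m guessing at the exact halation key, so double-check the parameter name in your own preset file):

Composite - Sharp (home console cores):
HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "0.000000"
halation = "0.000000"

Arcade - Sharp:
HSM_DOWNSAMPLE_BLUR_SCANLINE_DIR = "100.000000"
halation = "0.200000"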

So with that, I’m a happy camper once again! Hopefully these things can stay fixed for a very long time to come so that we can focus on enjoying the games! I’m expecting things to get confusing again with the advent of HDR support though. In the meantime, please enjoy the shader presets. Screenshots to follow at a later time.

2 Likes

Just an FYI: I tested my presets with the basic-extra-passes base preset and GPU usage was in the low to mid 60% range at 4K using my trusty GeForce GTX 970. Looking forward to testing the basic-reflect-extra-passes preset to have the best of both worlds!

3 Likes

That sounds too good to be true in 4K. :frowning_face: What processor are you running in that machine?

I have yet to find the time to do the same testing, and I won’t have 4K on those boxes in any case. I hope my results mirror yours.

1 Like

I’m running an Intel Core i5-9600K in the machine with the GTX 970 (it used to be a 970 SLI setup, but one card conked out a while ago), and an AMD Ryzen 5 5600X in the machine with the pair of GTX 1070s in SLI. This is on 8-bit and 16-bit era cores, by the way. The last time I checked, the Ryzen 5’s CPU usage was about 2%. It’s generally so low that I don’t really pay much attention to it.

@Duimon Perhaps you can share some more about your setup and your performance experiences, and maybe we can get closer to understanding why it’s the way it is, and possibly even uncover some hidden bottlenecks and quirks somewhere. That’s actually one of the things I do. I don’t want to say for a living, but it’s kind of a part of what I do for a living: trying to get the most performance and the best experience out of whatever hardware is available, even the lowest performing, for as little money as possible.

The first thing I would do is try running some tests with just one monitor hooked up to your PC. I can just imagine how detailed and defined my CyberLab Mega Bezel Death To Pixels presets would look on a 4K monitor that’s smaller than typical large 4K TV sizes. I assume everything is going to look even better than on a 55", just as things used to on my 13" monitor back when I was a child.

1 Like

Yeah, I’m just trying to figure out why my performance is so much worse than yours. My CPU is an i7-6700K. With my 1070, once I upgraded to 4K monitors I was having a hard time running the STD preset. I do have dual monitors, but that shouldn’t make that much difference.

1 Like

Perhaps it’s the additional resolution from running multiple monitors, or some other bug or anomaly that shows up when running RetroArch with multiple monitors? I’ve seen RetroArch not behave well at all on some setups when trying to get it to work from a laptop hooked up to an external display. In one case I had to set the external monitor as the primary, switch to single display mode, then restart the machine for it to run at full speed. Remember, RetroArch samples the primary monitor’s actual refresh rate and attempts to adjust itself to match, or at least you can tell it to. I don’t fully understand why, but in my experience it’s not always so simple and cooperative when using it in multiple monitor mode, even clone mode.
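If I remember the keys correctly, these are a couple of the retroarch.cfg entries I’d look at when chasing multi-monitor weirdness (the values here are just placeholders, not a recommendation):

video_monitor_index = "1"
video_refresh_rate = "60.000000"

video_monitor_index picks which display RetroArch prefers to use (0 lets it decide automatically), and video_refresh_rate is the refresh rate it syncs audio/video against, which you can set from RetroArch’s own runtime estimate of your primary display.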

Then perhaps it’s the particular cores you run. I haven’t tested PlayStation-era or more modern console cores with it, and I don’t intend to anytime soon. I primarily use Mednafen PCE Fast/Beetle PCE Fast, FB Neo, Genesis Plus GX, PicoDrive and Nestopia (I can’t remember if I’m currently using that or FCEUMM in my LaunchBox setup at the moment, but I’ll check).

Unhook one of those monitors temporarily, start running some tests, and send some numbers, data and observations. I’m particularly interested in your CPU and GPU usage figures when running particular cores and also when idle. That alone can tell you quite a lot. I use RTSS, but you can send whatever data you wish using whatever tools and methods you prefer, and we can do some analysis.

You can probably try running the same cores I’ve listed above for a more apples to apples comparison.

Even when I had a Core i5-2500K with GTX 970s in SLI, I used to run RetroArch in 4K with no issues. Of course, that was long before I came across the HSM Mega Bezel Reflection Shader, but I was still getting some nice image quality using the Analog Shader Pack. So the CPU shouldn’t really be the bottleneck at all in your case.

So you have two GTX 1070s in the same machine? Does that mean that RetroArch can use the combined processing power of both cards at once?

1 Like

Unfortunately it can’t, or at least it doesn’t with the cores I use. I don’t think RetroArch has ever supported SLI or multi-GPU, or that it ever will. If threaded video can add latency and other undesirable effects, imagine trying to keep latency down with a pair of GPUs working on multiple frames or parts of frames while keeping everything in sync. I don’t know of any emulator that currently supports multi-GPU tech.

1 Like

From what I understand, last year you could have used both implicitly through the DirectX driver, without software support. Going forward, only explicit multi-GPU (i.e. with software support) will work.

Nvidia nixed SLI support in their driver.

1 Like

All they did was stop adding or working on new profiles. The technology still works and can be enabled or disabled in the driver. I never suspected that they might have stopped it from working altogether; perhaps that day might come eventually. People might say that SLI is dead and so on because many if not most current games don’t support it, but you’d be surprised at how many still do! Also, I’m a real person, not a statistic, and I actually play games (well, I have them, even if I don’t always actually play them) which came out over a very long time period, so it’s a mix of old, not so old, new/current and not so new, and in that mix quite a large number of games support SLI, including many modern ones. Remember, SLI is something that can be implemented at the game engine level, and many games using the same engine will also benefit from the technology. So while it doesn’t always work, that’s not so bad, because I can just lower my resolution or use resolution scaling and be just fine, but when it does work, I get performance that can be just as good as or better than a GTX 1080 Ti, which is similar to an RTX 2080 and also an RTX 3060!

2 Likes

When Nvidia acquired the SLI tech along with 3DFX, they could have implemented it in hardware, like 3DFX did. I wish that had been the case.

I have a 1997 Titan Tyan server with two Voodoo2s in it that flies when playing 3DFX games.

Don’t mind the mess; I was working on this while I was a renter, before I joined the ranks of homeowners. :grin:

6 Likes

Wow! Talk about old skool! You actually play games on that thing? Shouldn’t that be preserved in a museum?

1 Like

Yeah it is a very cool box. I actually built it with all new parts. :grin:

It is dual-booting FreeDOS and Win98, using all the unofficial Win98 service packs.

I have an FTP server running in both OS’s so I can transfer files easily.

Edit: It is part of my own personal WIP museum. I have ~47 computers waiting for it to be completed.

4 Likes

I see, very impressive!

Wow! Even more impressiver! Lol

1 Like

Back on topic, I will try to find time to do some single monitor tests along with my under-powered PC tests, ASAP.

2 Likes

Great! Your PC isn’t under-powered, just misunderstood. Lol

Remember to completely disconnect the second monitor if you can. The remaining monitor must be set as the primary, and you have to reboot after setting it as the primary before testing as well.

2 Likes

I’m not sure I will get the chance to do any single monitor tests that will be conclusive, now that I am running the 3070 Ti. I don’t want to swap GPUs just for testing.

I can turn up some internal scaling on some 3D cores and we will see what we see.

1 Like

Well, at least you might be able to spot some bottlenecks which might be limiting your new GPU’s potential, and based on your experience you now know that you can give the 1070 a little more credit than it was getting before.

1 Like

The 1070 will be replacing the 960 in my HTPC so it will definitely do some good there.

I did some R&D on my next upgrade and it isn’t as expensive as I was thinking. For around $2000 I can upgrade to a new motherboard with an 11th-gen i9 Rocket Lake CPU (10 cores) and 128GB of RAM.

It will not be happening tomorrow. :grin:

2 Likes