I changed the colour temperature; it’s better, but still far less vibrant than the previous preset.
Edit: changing Saturation back from -0.10 to 0 is better!
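For anyone copying this, I saved the change into the preset itself instead of redoing it in the Shader Parameters menu each time. This is just a rough sketch; the #reference path and the parameter name are placeholders, so copy the exact name shown in your own Shader Parameters list:

```
# Appended .slangp preset - names below are placeholders, not real files
#reference "path/to/the_preset_I_started_from.slangp"
# Saturation back to 0 instead of -0.10
# (in my copy of the guest pass it shows up as g_satr; double-check yours)
g_satr = "0.0"
```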
Perhaps you should take a little read about it; there are bugs in the new version. You can copy the preset from the old version and replace the new one, or you can give my CyberLab Mega Bezel Death To Pixels (Composite - Sharp) or CyberLab Mega Bezel Death To Pixels (Arcade - Sharp) presets a try. Keep checking back for updates and tweaks.
I’m just playing some games now that the image quality issues have been sorted out, and I’m noticing some performance degradation when using CyberLab Mega Bezel Death To Pixels Composite - Sharp in the new version compared to the previous one. If I swap versions using the same shader preset, I’m seeing examples where GPU usage goes from the low 70s to the high 90s when switching from the old version to the new one. Is this expected behaviour?
Thank you @Cyber, I tried your presets and they are really close to what I’m trying to get!
I really appreciate that. I’m glad that I could help and contribute! I don’t know if you’ve checked back since my latest update, but I now have the halation set to 0.08 (well, 0.075 actually). Just by playing around with the halation setting alone, between 0.0 and 0.5 or even just past 0.5, you can achieve a wide range of looks, from very sharp to very soft.
These presets also work on both the latest and previous versions of HSM Mega Bezel Reflection Shader. The previous version is faster but the halation setting might have to be set higher to about 0.5 to achieve a similar look due to a bug.
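If you’d rather bake that in than adjust it in the menu, it’s a one-line parameter override in the preset. Treat this as a sketch: the parameter is called halation in my copy of the guest shader pass, but confirm the name in your Shader Parameters list before saving it:

```
# Newer Mega Bezel version - around 0.075-0.08 gives the look I'm after
halation = "0.075"

# Previous Mega Bezel version - needs roughly 0.5 to look similar, due to the bug
# halation = "0.50"
```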
Happy retro gaming and spread the news of these new shader presets and this incredible HSM Mega Bezel Reflection Shader!
Just sharing some new screenshots!
CyberLab Mega Bezel Death To Pixels (Arcade - Sharp) - 11-09-21
Created with:
and:
Mega Bezel is updated to V 0.9.023 2021-09-11
Some shots from the Newpixie-Clone_Smoothed_Rolling-Scanlines__STD__GDV.slangp
Good looking screenshots!! Impressive!!
Wow! Is that what it started out at?
Oops, I forgot to take out the examples! This is why the examples are packaged separately
It’s updated now and back down to 15mb
Did you notice the ~20% performance hit between v0.9.21 and v0.9.22? I’m using GTX 1070s in SLI, though RetroArch only uses one GPU. I recently started using Run Ahead and Frame Delay to reduce input latency where possible, and with the newer version of HSM Mega Bezel Reflection Shader some games which ran fine with that setup are now struggling, with not much difference or improvement in image quality between versions.
There might be some performance hit, as I had to adjust a bunch of stuff to gain consistency in the image handling between the STD presets and the ADV presets. I hadn’t noticed a performance hit that big, though. The caching feature I’m working on should improve performance a lot and get us to something much faster than v0.9.21.
About performance testing the shader, the most reliable way to do this is to use the imageviewer core, load an image, then turn on fast forward and see how many fps you get.
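If you want to script that, something like this gets you straight into the test from the command line; the image and preset paths are placeholders, and the --set-shader option needs a reasonably recent RetroArch build:

```
# Load a static image with the shader applied, then toggle fast forward
# in RetroArch and watch the fps counter.
# (If your build ships the image viewer as a separate core, add
#  -L imageviewer_libretro.so / .dll before the image path.)
retroarch test_image.png --set-shader "path/to/preset_under_test.slangp"
```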
Well I’m definitely looking forward to this!
I understand. I didn’t pick it up doing performance testing, though; I noticed the difference during real-world gameplay. Once I realized something was amiss I turned on my RTSS OSD and immediately saw the differences in GPU usage, which were very consistent and reproducible between games and shader versions. The ~20% difference I was referring to was in GPU usage, not FPS. Using my graphs and numbers, I could see the GPU usage causing the dips in fps and the increases in frametimes.

I won’t argue that your method is more reliable at assessing performance, but I’ve been running RTSS for years, profiling and observing the performance of many, many games, so it’s very easy for me to see any performance issues while I’m playing. I run everything capped at 60fps. When GPU usage reached the threshold of 97 to 100%, that’s when the fps dipped below the magical 60fps. In the older version of the shader, in the same scenarios, GPU usage was consistently about 20% lower than in the newer version, all else being equal, and as a result my frame rate never dipped below 60fps, nor did the audio go out of sync with the video over time.
When I first discovered the Mega Bezel, the frame and bezel were actually images, not code. The main issue @HyperspaceMadness was trying to solve was the difficulty of setting up overlays using coordinates. He was still in the middle of developing the reflection, and it went through a lot of changes.
Over the last year and a half it has grown into a real beast with a ton of magical features. (From an artist’s perspective.)
The advanced preset requires a fairly beefy system. Personally I don’t have a problem with that. (The potato and basic presets still solve the original issue and the basic_reflect adds the reflection.)
If we end up keeping the ADV preset and adding some more magic to it, I would be comfortable with it requiring an i9 and an RTX 3090. (As long as the documentation reflects the requirement.)
In any case, if @HyperspaceMadness is successful with the caching, real-world performance will only be an issue for artists setting up presets. (Scrolling through the parameters list at 22fps.)
@Cyber Thank you again for your diligence my friend. (Man I love this community.)
What you have to understand is that retro gaming and IT on the whole are a huge part of my life and, indirectly, my livelihood. It definitely helps with my mental health, especially in these difficult times we are facing around the world these days. So these things matter quite a bit to me.
The fact that I can’t do what @HyperspaceMadness or @guest.r or even you are doing in terms of coding and graphic design gives me the utmost respect and appreciation for your work. Videogames mean so much to me, especially the 8- and 16-bit era games. I love the amalgamation of the arts present in games, marrying the very basic and ancient forms of entertainment and enlightenment, artwork, music, animation and storytelling, with advanced technology and science.
With that said, performance is something that needs an eye kept on it, because if not, how can this beauty be enjoyed? I’m not trying to nitpick, complain or even influence here, and I do appreciate the continuous advancement, but the differences I observed in visual quality between the old version and the new version using the STD base preset were so small that I really couldn’t understand them being worth so much extra GPU usage. Of course I don’t fully understand the whys and hows, but @HyperspaceMadness did explain what happened, so I would think the foundations are being laid for bigger and better things. Now is just a very funny time to be needing a GPU upgrade, so anything that worked fine moments before but now reminds me that the venerable GTX 1070 is reaching its limits running an emulator gives me the chills. Lol
Thank you again to all who have contributed. I wish this could reach the masses, that they might also be able to experience the joy that I am experiencing.
Yeah, performance is something I always keep an eye on.
It’s no problem. I think your approach of telling me what you see as issues and giving explicit examples of old vs new versions, along with testing info, has been great and is much valued and appreciated, so please keep letting me know how things are working for you.
Yes, in the standard presets you shouldn’t actually have seen any difference in visual quality between v0.9.21 and v0.9.22. The core changes were a bit more about how image scaling and other image features were handled, to make things easier for preset creators, as well as enabling the split background, which allows adapting to different monitor aspect ratios.
Thanks! I think everyone appreciates your words of appreciation. I’m glad it’s adding to your enjoyment of games!
My bad, I could have stated that better. I was actually referring to the variations which reference the standard base preset, not the base preset itself. Using my CyberLab Mega Bezel Death To Pixels presets as an example, with identical settings the old Mega Bezel v0.9.21 looks slightly different to v0.9.22. Remember the new one uses an updated guest shader package, and there were bug fixes and changes to the halation behaviour, among who knows what other minute changes. Even after adjusting my halation settings to account for this, I still don’t think it’s 100% identical. I actually settled on an in-between value because I liked both the 0.1 setting and the 0.5 setting.

I love the way things look now; I would even say I prefer it. When I go back to the old version of the Mega Bezel, I think it looks slightly softer with the halation set back to 0.5. It was clearly different when switching between versions with the halation left at 0.8, but everything still looks good. I even looked back at some screenshots from when things were a bit softer and thought, wow, I really love the way this looks too, but I’m extremely satisfied with where the presets are now. They might be a tad darker than I would like, but it’s not a dealbreaker, and I can always turn up the brightness on my TV. Then again, my blacks end up floating in other areas of my TV viewing experience (although not noticeably during gameplay).
I’m very hesitant to adjust the brightness or gamma because I don’t want to break anything or cause clipping, and I understand the limitations of current OLED TVs when applying all of these layers of features which each darken the original image in some form: scanlines, fake scanlines, masks, vignettes, tube textures and other stuff. I also remember reading through some of @guest.r’s and @Nesguy’s conversation about seeking the “holy grail” of LED TVs capable of driving enough brightness to be able to use masks and not have everything end up darker than a real CRT. I remember it being said somewhere that increasing the brightness using the shader parameters caused clipping.
Anyway, the point is that my optimal shader settings (currently) are the way things look in v0.9.22 and v0.9.23. Switching back to v0.9.21 now feels like a slight step down, apart from the excellent performance and the nicer-looking static bezel highlights.
Since you brought this up, I tried using my settings with different base presets and noticed that the potato and basic looked the same shader-wise, but the basic_reflect looked very different and much lower resolution. I’m going to research what was changed again, but is this supposed to be a trade-off in shader quality to get the reflections? I can’t remember if I tested the potato preset, but I probably did and it looked the same as the basic shaders; I’ll have to verify. GPU usage went down from about the 90s to the 30s going from standard to basic. I’m going to try this on my GTX 970 system next. All of my testing and experiences are at 4K.
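For anyone wanting to repeat this test: I wasn’t rebuilding anything, I just pointed the same preset at a different base preset via its #reference line and left all the parameter overrides untouched. The file names below are placeholders, so match them to the base presets shipped with your Mega Bezel version:

```
# Top of my .slangp preset - swap which Mega Bezel base preset is referenced
#reference "Mega_Bezel/Presets/<standard_base_preset>.slangp"
# ...or point at the basic / basic_reflect / potato base instead, e.g.:
# #reference "Mega_Bezel/Presets/<basic_reflect_base_preset>.slangp"

# parameter overrides below stay exactly the same
halation = "0.075"
```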
I have been up to my elbows creating basic_reflect presets and have not noticed any difference in any of the reflect base presets. I will have another look using the newest release.
I am not surprised you are having performance issues at 4K using a 1070. Even the 2060 is only recommended for 1080p in AAA games.
One of the items on my shortlist is to do some testing with the basic_reflect on older hardware. I have a 950 and a 750Ti system available. I also want to test my Hybrid presets on an Intel iGPU using glcore.
My 3070 Ti arrives on Tuesday. (It is replacing my 1070.)
So you’re saying that my shader presets should look the same if I switch between standard, basic and basic_reflect base reference shaders with changes only noticeable in the Bezel?
Why aren’t you surprised? My GTX 1070 wasn’t complaining while I was using v0.9.21. The only thing I had to give up was Run Ahead with one particular core, but I could still keep using Frame Delay. Run Ahead using a second instance, plus Frame Delay, remained on with all the other cores I used, and I still had lots of GPU performance headroom.
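For context, that latency setup boils down to a handful of settings in retroarch.cfg or per-core overrides. The keys below are from my config; the values are just examples, since Run Ahead frames and Frame Delay get tuned per core and per game:

```
# retroarch.cfg / core override - example values only
run_ahead_enabled = "true"
run_ahead_frames = "1"
# second-instance Run Ahead is what I kept using with most cores
run_ahead_secondary_instance = "true"
# Frame Delay in ms, tuned per game until just before frames start dropping
video_frame_delay = "8"
```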
You lucky…j/k. Congratulations! My 1070s have to last me until prices come down and availability goes up unless I get a good deal on a sale or trade.
Are you using threaded video?
No, I never needed to. Doesn’t that introduce additional latency/input lag?
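(For anyone reading along, the setting being discussed is a single toggle in retroarch.cfg; my understanding is that threaded video can improve throughput at the cost of frame pacing and added latency, which is why I’ve kept it off.)

```
# retroarch.cfg - left off here to keep Frame Delay / Run Ahead timing predictable
video_threaded = "false"
```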