Duimon - HSM Mega Bezel Graphics and Presets - Feedback and Updates

If you’re thinking of using an adapter to connect the laptop’s HDMI port to the DisplayPort cable, that wouldn’t help. Does the laptop at least have a Mini DisplayPort port?

1 Like

@Cyber is also correct that the external display is only powered by the Intel IGP.

2 Likes

@Cyber no, I have 3 USB ports, an HDMI port, and a card reader slot.

1 Like

Don’t confuse “powered” with “connected to”, eh. The dGPU can still “power” games and stuff on an external display even though the external display is connected to a port that’s wired to the IGP. The rendering can still take place on the dGPU, but the output is then forwarded by the IGP to the external display at a very minimal performance cost.

Are you really going to lecture me about semantics on my own thread?

Apologize.

You made a guess and I confirmed its accuracy through quick research.

1 Like

Awww… shucks, that sucks… I suggest you run at half your external monitor’s vertical and horizontal resolution if available, to make sure you still get excellent scaling, or perhaps you could even set up a custom resolution of 1,920 x 1440. That might actually fit within the HDMI 1.4 port’s limited bandwidth at 60Hz. It’s so crazy it just might work!

2,560 x 1080 = 2,764,800

1,920 x 1,440 = 2,764,800

That’s exactly the same number of pixels to push as the 2,560 x 1080 that’s already working fine for you at 60Hz. Just look up custom resolutions in the nVidia Control Panel.
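A rough sanity check, assuming 24-bit colour, roughly 20% blanking overhead, and HDMI 1.4’s roughly 8.16 Gbit/s of usable video bandwidth (ballpark figures; the exact timings your port and driver accept may differ):

2,764,800 pixels x 60 Hz x 24 bit x 1.2 ≈ 4.8 Gbit/s

That leaves plenty of headroom, which lines up with 2,560 x 1080 already running at 60Hz over the same cable.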

1 Like

I’m not lecturing, just clarifying; that’s why I put the term in quotes. I felt that “powered” could easily have been misconstrued: it could have meant “rendered by”, or it could have meant connected to without necessarily being “rendered by”.

I didn’t mean any offense whatsoever. I don’t quite understand what the issue is here.

This wasn’t really a guess.

That didn’t sound like an “Apology”. :no_mouth:

@Neofuuma A quote from Dell.

This is a software-controlled hybrid video system - only the Intel GPU has a connection to the internal or HDMI display panel. Though the nVidia chip is used when the application and Optimus support it, all video passes through the Intel GPU on its way to the display panel – this is how most systems other than those designed for high-end gaming or workstation use are designed.

There is no provision for hardware or firmware control of the GPU on this system - for that, you’ll need a system that has hardware-switchable graphics (you’ll find that in Precision workstations, XPS 17 models with RTX GPUs, and many Alienware systems).

Apparently it can really depend on the software/game that is running. As an experiment you can try running RA in a large window or switching to the glcore video driver, just to see if it makes a difference.
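If it’s easier to flip those outside the menus, the same switches live in retroarch.cfg (edit it with RetroArch closed). Something along these lines should do it, though changing them under Settings > Drivers and Settings > Video works just as well:

# use the glcore video driver for the test
video_driver = "glcore"
# run in a window rather than fullscreen
video_fullscreen = "false"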

I’d be interested to know the results.

2 Likes

@Duimon I managed to get the monitor to run at 2560 x 1440 @ 60Hz, although everything looks really stretched. RA seemed to run fine in fullscreen and with the glcore driver. I had 60fps playing Boogerman on the bsnes core using the basic MB preset.

2 Likes

You might have to make sure that your display isn’t stretching the image to fill the screen. There might be a setting somewhere in the monitor controls or OSD. What you would want is to see vertical black bars on either side of the screen even on your desktop once you switch the resolution to 2560 x 1440. Once that is done, RetroArch should treat the display like any other 2560 x 1440 16:9 display and everything should scale properly.

You might also have to enable GPU scaling if you can’t set this properly using the display’s controls.

1 Like

I’m not surprised; it would probably work in Vulkan also. The bandwidth, as @Cyber mentioned, is probably within limits.

As far as the stretching goes, if you want to take advantage of the 21:9 aspect I think 2560×1080 would be your best bet. (Although it will look a bit fuzzy, since 1080 doesn’t divide evenly into 1440.)

Then…

In RA settings make sure you use “FULL” as your video aspect.

Then load one of my presets. Under the Background layer shader parameters…

Set Scale Aspect to INHERIT FROM SCALE MODE (i.e. 0.00)

And Image Fill Mode to STRETCH HORIZONTAL (i.e. 1.00)

And turn off Mirror Wrap.

and it should look like this…

For graphics that have LEDs in the background do the same with the LED layer.

Or paste these into a background only preset.

HSM_BG_SCALE_KEEP_ASPECT = "0.000000"
HSM_BG_FILL_MODE = "1.000000"
HSM_BG_MIRROR_WRAP = "0.000000"

Or these into a background with LEDs.

HSM_BG_SCALE_KEEP_ASPECT = "0.000000"
HSM_BG_FILL_MODE = "1.000000"
HSM_BG_MIRROR_WRAP = "0.000000"
HSM_LED_SCALE_KEEP_ASPECT = "0.000000"
HSM_LED_FILL_MODE = "1.000000"
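Or, if you’d rather not edit my files at all, save a tiny preset of your own that points at one of mine with #reference and just overrides those values. Something like this (the path is only a placeholder; point it at whichever of my presets you actually load):

#reference "path/to/one/of/my/presets.slangp"
HSM_BG_SCALE_KEEP_ASPECT = "0.000000"
HSM_BG_FILL_MODE = "1.000000"
HSM_BG_MIRROR_WRAP = "0.000000"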
2 Likes

@Duimon

Went back and did switch to 2560 x 1080. I did not see those settings in the Basic version, so I went to the Standard version and was able to replicate. I only got 40-45fps though so it was not really playable. I moved back to the basic version, and while I may have some black bars on the side, I managed full speed. It may just have to be the concession made to be able to play on a larger, external monitor vs. my 15" screen with my current laptop. I can live with that for now :grin: Thank you to you and to @Cyber for helping me try and get all that figured out!

3 Likes

Very soon, when the shader is updated, the Standard will perform as well as the current Basic, and have all the graphics-related features of the Advanced. These widescreen scaling settings will be my default where they are supported.

@HyperspaceMadness has just a few more bugs to iron out.

3 Likes

Does 2560 x 1440 work? You can also try 1720 x 720 or 1280 x 720, since 720 divides evenly into your vertical screen resolution, so you should get perfect scaling, just with black bars on the sides.

I’m glad you got something that works though!

https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/To_create_custom_timings_for_your_HDTV_display.htm

https://www.nvidia.com/en-us/drivers/custom-resolutions/

2 Likes

The Holiday season is back and I’m in the spirit! Thanks again for this wonderful shader!

5 Likes

Hi there, I have tested the Duimon Mega Bezel shaders with many versions of RetroArch, even the latest version (downloaded from the RetroArch website) in a fresh install, but I simply cannot load these shaders… I’m on Win10 with a Ryzen 5700G and a GTX 1080, using Vulkan. It always shows “shader preset couldn’t be applied”. What am I doing wrong? Thanks in advance!

2 Likes

First of all, do you have the Mega Bezel shader from @HyperspaceMadness installed, and can you load one of the base presets?

Link to shader thread.

Make sure you follow his installation instructions carefully.

It is the HSM Mega Bezel Shader. :grin:

These are the Duimon Mega Bezel Graphics and Presets. I only take advantage of its features. :innocent:

@HyperspaceMadness is the wizard behind the magic.

If you do have it installed, please post a log and he and I can help you sort it out.

5 Likes

Got it to work, thank you! :slight_smile:

4 Likes

Wow! 2000 posts, 38,000 views, and 1091 clicks on my GitHub link. That’s amazing! :star_struck:

Looking back, I would never have thought that all that has happened since would ever happen.

I have completed 89 graphics, plus miscellaneous supporting graphics like monitors, etc.

I have created overlay versions for almost every one.

I even created day and night no-curvature overlay versions by request on Discord.

I have created 1720 presets!

By request I have created EmulationStation Carbon theme controller graphics for newer systems.

I have lived and breathed this project for 551 days. If I spent only 2 hours a day that would be 1100 hours, but it has probably been more than 2000 hours. :exploding_head:

I can’t begin to tell you how great of an effect this project has had on my quality of life. It has been one of the most enjoyable journeys I have ever had.

Just the other day I fired up Batocera on my new Pi400, to continue my R&D on my intended overlay contributions. I fired up one of the homebrew SNES games to give it a whirl, and what do you know?

Doesn’t that look familiar? :grin: My graphics make up roughly 90% of the Batocera default overlays!! How cool is that? An operating system dedicated to emulation, distributed globally for dozens of hardware platforms!

Now that is cause for celebration.

Problem is…

I had nothing to do with it. :frowning_face:

Apparently this post…

One single post, by some curious, brand-new (and never heard from since) user, was much more than just a simple question.

He has a GitHub repo with my graphics in it with not a single credit to my name, or link to my page, or anything else you might expect.

He didn’t fork my repo, which you might expect, since he is on GitHub.

2000 hours of hard, dedicated work, just so some damn poser could take my graphics and butcher and Frankenstein them to the point of embarrassment.

Given how active I am in the community and how generous I am with requests, I would think he might approach me and ask to collaborate.

Nope.

I had been planning for over a year to approach the Batocera team with my overlays. I even uploaded pre-built decorations to a repo for the son of a bitch. :face_with_symbols_over_mouth:


As much joy as this project has given me, this naked betrayal is causing me even more pain. It has been dominating my thoughts, my mood, and keeping me awake at night.

That, my friends, is NOT quality of life. It is to the point that it is endangering my 27 years of sobriety.

It’s not just the betrayal, it is shame.

I wanted to do something different. I put my source online, unlike any other artist I have ever known.

I feel like a damn fool.


I am taking a break. I don’t know for how long. At least until next year.

I don’t want to keep my team hanging so I have uploaded my newest graphics, and my newest Advanced presets to a “New” folder in the Duimon-Mega-Bezel repo.

For now I have kept the current CC license and the source online. (Although I may make the source repos private.)


I have some new computer parts arriving soon for an upgrade, some new server parts for a second server, and I may paint my living room.

Take care.

9 Likes