I have been thinking about getting an Arc A750 myself, but from what I have seen, most people on the forum have Nvidia cards. It's good to have multiple types and brands of cards for testing to help optimization.
I really like the card and its features so far, and it's extremely good for the price. But it is very much a niche thing. I'm trying to find the things that it struggles with, and so far the Mega Bezel is the one.
I tried with different video drivers, but so far it just failed to load the shaders on the others.
Thanks
Hmm, the V2 version of the shader might be a bit faster once I finish the optimization phase.
One thing I'm thinking about is possibly making a script which could turn all the tunable parameters into static values. I think that would give a big performance improvement. It would result in a Mega Bezel package with static values, meaning it would no longer be configurable after the script was run. This would be a bit like the BigPEmu versions of the Mega Bezel.
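Roughly what I mean, with a made-up parameter name just to illustrate the idea (the real shaders wire these up a bit differently, so treat this as a sketch rather than actual Mega Bezel code):

```
// Before: a runtime-tunable parameter in a .slang shader
// (#pragma parameter NAME "Label" default min max step)
#pragma parameter HSM_EXAMPLE_SCALE "Example Scale" 1.00 0.00 2.00 0.01
#define HSM_EXAMPLE_SCALE global.HSM_EXAMPLE_SCALE

// After "locking": the pragma is removed and the value is baked in,
// so the compiler can treat it as a constant and optimize around it
#define HSM_EXAMPLE_SCALE 1.00
```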
I think for us guys that like to tinker with the parameters all the time that might be more trouble… unless I'm not understanding how it would work. Once you've "locked in" the values with the script, would there be a way to "unlock" them to make more tweaks?
As I'm sure you are aware, it is widely reported that the Arc drivers are very much first gen and need LOTS of tweaking by the Arc software team. I am sure once the drivers mature, the performance will improve. Of course, waiting will be the hard part.
Yeah of course, I was just wondering if anyone else had tried the Mega Bezel with an Intel GPU before. I'm gonna dive into the settings on my own and see if I can get it working with any tweaks.
I managed to make one in bash; it traverses the preset chain up to the main reference.
You have to tell it which file contains all the pragmas too, so that it can lock even the default parameters that aren't changed by the presets themselves.
It is not perfect every time, but it mostly works.
It would be cool if 'non-simple' presets could save the defaults too; it seems they skip those parameters.
I'm currently using it to run koko-aio smoothly on this underpowered Samsung phone.
Maybe it works under Windows too with Cygwin/WSL/whatever.
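A very stripped-down sketch of the idea (not the actual script, which also has to handle the pragma file mentioned above; this only shows walking the #reference chain and merging the parameter overrides, with the value nearest the top preset winning):

```bash
#!/usr/bin/env bash
# Sketch: flatten a chain of RetroArch "simple presets" (.slangp files that
# start with a "#reference" line) into one preset that points straight at the
# root preset and carries all parameter overrides with it.
# Usage: ./flatten_preset.sh SomePreset.slangp > SomePreset_flat.slangp
set -e

preset="$1"
declare -A merged   # parameter name -> value (value nearest the top preset wins)
declare -a order    # first-seen order, for stable output
root=""

while true; do
    dir=$(dirname -- "$preset")
    ref_line=$(grep -m1 '^#reference' "$preset" || true)
    if [ -z "$ref_line" ]; then
        root="$preset"                 # no #reference -> this is the main preset
        break
    fi
    # collect the NAME = "value" overrides from this simple preset
    while IFS= read -r line; do
        case "$line" in
            \#*|'') ;;                 # skip directives and blank lines
            *=*)
                name=$(echo "${line%%=*}" | xargs)
                value=$(echo "${line#*=}" | tr -d '"' | xargs)
                if [ -z "${merged[$name]+x}" ]; then
                    merged[$name]=$value
                    order+=("$name")
                fi
                ;;
        esac
    done < "$preset"
    # follow the reference; the path is relative to the current preset's folder
    ref=$(echo "${ref_line#\#reference}" | tr -d '"' | xargs)
    preset="$dir/$ref"
done

echo "#reference \"$root\""
for name in "${order[@]}"; do
    echo "$name = \"${merged[$name]}\""
done
```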
I don't have a powerful computer, so I applied the Potato presets from the Mega Bezel and was happy with the result, but I wanted to improve it a little more. Does anyone know what settings I have to change to make the game screen straight, without curvature, and to reduce the black borders so I can use more space on my screen? Sorry if my English is bad.
To turn off the curvature of the screen edges, use the curvature setting and change it to zero. To use more screen space (I'm assuming you mean making the game area larger), increase the viewport zoom setting. I don't use the Potato presets, so I'm not sure what is available in there, but if those settings aren't exposed then you can't change them.
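If you end up editing the saved preset file by hand instead of going through the menu, the overrides are just lines appended after the #reference line. Something like this, though the parameter names and values here are from memory and only illustrative, so go by what the shader parameters menu actually shows:

```
#reference "path/to/the/potato/preset/you/loaded.slangp"
HSM_CURVATURE_MODE = "0"
HSM_VIEWPORT_ZOOM = "110"
```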
Thank you, I was able to greatly reduce the curvature of the screen and increase the playing area. Thanks for the instructions.
Just an update: I reset my RetroArch configs and used the glcore driver, and the Mega Bezel works flawlessly with the A770! I must've messed something up before, but now I'm gonna go back and try the vulkan driver to see if it works now.
Just wanted to update anyone coming here in the future. Gonna keep testing different settings as well.
The first thing that comes to mind then would be resolution upscaling.
Back one more time to say that I saved my current settings exactly as they are but switched to the vulkan driver, and it once again cannot run the Mega Bezel. Back to glcore and it's flawless again. I thought Intel had a pretty good handle on their Vulkan drivers, but something weird must be going on. Either way, glcore works like a dream on all the cores I tested.
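For anyone else testing, the driver switch is just the video_driver setting, either in Settings > Drivers > Video or directly in retroarch.cfg (change the value to "vulkan" to compare):

```
video_driver = "glcore"
```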
Ah this is great news!
Yeah, it's a good question. Sometimes there is a difference in performance between Vulkan and GLCore (in either direction) on different types of GPUs.
Looking forward to all the amazing upcoming updates!
Would love to see some Ambilight style shader presets with console specific backgrounds.
It would be particularly cool to see an effect where the LED light on the physical console (if the background is of a Sega Genesis or PlayStation, for example) has its own realistic lighting behavior, independent of the ambient glow emanating from the TV screen.
Yes this will be supported
Hi @Hyperspacemadness, first of all thank you for the work you do and share with the community, to you and to everyone who collaborates with you. Since curiosity is strong in me, could you give us some more hints about what V2 of your shader pack will consist of? I also wanted to ask if you have a specific time frame in mind for its release. Always grateful for your work.
I think I found a bug in the current version. I upgraded to 1.16.1 from 1.12.0. Previously this worked fine and looked as I wanted. With shadowMask 1, fake scanlines forced on, and any amount of bloom higher than 0, you get this:
I should mention I first noticed it in the ffmpeg core after loading content higher than 400p, because that forces the fake scanlines to trigger. This is all in MBZ__1__ADV__GDV-NTSC. My temporary fix for the time being will be to turn off bloom.
Screenshot from 1.12.0 (fake curved scanlines triggered on interlaced content, and some bloom):
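In case it helps anyone reproduce it, the combination boils down to roughly these overrides on top of the ADV NTSC preset. The parameter names are from memory and may not match exactly, so double-check them against the shader parameters menu:

```
#reference "path/to/MBZ__1__ADV__GDV-NTSC.slangp"
shadowMask = "1"
HSM_FAKE_SCANLINE_MODE = "2"
bloom = "0.10"
```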
I was wondering if this is possible or not?
During gameplay, is it possible to access the current shader file (via ftp on my tablet) and adjust parameters directly? If so, what file should I look for?
They are in the shaders folder if you saved them… or in the presets folder inside Mega Bezel, but I have no idea whether the changes will take effect while you have the game open. I'm guessing you will have to reload the shader file. It's probably way easier to do it through the F1 menu.
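If you do go the FTP route, a saved simple preset is just a plain text file along these lines (the path and parameter names below are only illustrative): a #reference line pointing at the preset it was saved from, followed by the parameter values you changed. You can edit the value lines, save, and then reload the preset from the quick menu for the changes to take effect:

```
#reference "path/to/the/Mega_Bezel/preset/you/saved/from.slangp"
HSM_VIEWPORT_ZOOM = "110"
HSM_CURVATURE_MODE = "0"
```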