ScummVM only utilising a software renderer

Hi all,

I’ve spent some time tackling this now. Is it normal for ScummVM to only utilize the software renderer? I first noticed this as lag while using shaders with ScummVM, and since that shouldn’t happen on my machine, I investigated and narrowed it down to the fact that my ScummVM core is only using the software renderer.

When I go into the ScummVM GUI menu, no option for OpenGL is presented. Setting the renderer line to OpenGL in scummvm.ini does nothing. I have tried both gl and glcore in the video settings of RetroArch.

I haven’t been able to find anything online about this at all.

The only thing I could find is in the scummvm-wrapper git for libretro (assuming that is even the repo the core gets built from): the bundle_datafiles.sh in that repo has the line

hw_render = "false"

So is this simply intended, and there is no GL renderer support for ScummVM in RetroArch, or am I derping out somewhere along the way here?

Thanks in advance.

I am using the Vulkan video driver. When I run ScummVM and check the driver it says Vulkan.

The driver settings are in the Retroarch menu, not the ScummVM UI.

You may want to see if you have a scummvm.cfg in \config\scummvm or a scummvm_libretro.cfg in \config that is setting a software override.
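A quick way to check from a terminal is to grep your config directory for any video_driver override. The snippet below creates a throwaway file so the example is self-contained; on a real install you would point grep at your actual RetroArch \config directory instead (paths here are illustrative):

```shell
# Create a sample override file purely for demonstration;
# on a real setup, skip this and grep your own config directory.
mkdir -p /tmp/ra-demo/config
printf 'video_driver = "gl"\n' > /tmp/ra-demo/config/scummvm_libretro.cfg

# Look for any video_driver override in core/content config files
grep -r 'video_driver' /tmp/ra-demo/config
```

If this turns up a video_driver line in a core or content override, that file wins over the main retroarch.cfg setting.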

Thanks for the reply - Love your bezel overlays with the HSM pack by the way!

I have tested it with Vulkan, gl and glcore. I do have an override config for ScummVM, but only for testing the three different video drivers in RetroArch. I have tested all three combos in the override config, and also removed the override and tested all three drivers as the main retroarch.cfg setting. My experience matches yours: when the core is running, it says it’s utilizing the driver I set in retroarch.cfg.

But ScummVM itself is running in software mode. For example, if you try to load a game with mods, it gives a warning on load (“Mods cannot be used with software mode”); the ScummVM menu shows it running in software mode only, with GL not an option; and both visually and performance-wise it is very apparent that ScummVM is running in software mode instead of taking advantage of the GL renderer.

To confirm: the output is much more pixelated than standalone ScummVM using the GL renderer, whereas if you set standalone ScummVM to the software renderer, it produces the same pixelated output as RetroArch. That further confirms the RetroArch core is not actually utilising a graphics driver but just running in software mode, and the performance hit is very substantial.

I believe the difficulty is in describing this precisely: ScummVM has its own internal renderer backend, and from my testing that backend is always the software renderer in the libretro core. RetroArch’s abstracted video output does use the selected video driver (which is why shaders can be applied), but that driver is not actually plugged into the ScummVM engine itself, so ScummVM always renders in software even though RetroArch outputs the result through a hardware driver. In other words, the core seems to wrap ScummVM’s software renderer in RetroArch’s output, rather than hooking RetroArch’s hardware renderer into ScummVM. I’m not a very bright man, so this is probably oversimplified and off the mark, but if a dev or somebody with a bit more know-how could confirm this behaviour, or tell me what I’ve done wrong so that I can use ScummVM to its fullest potential in RetroArch, that would be great.

I hope this makes sense and can lead to further assistance! ^.^

An OpenGL-accelerated core would have code akin to

glsm_ctx_params_t params = {0};
params.context_reset = context_reset;      /* called when the frontend (re)creates the GL context */
params.context_destroy = context_destroy;  /* called before the context is torn down */
params.environ_cb = environ_cb;
params.stencil = true;
params.context_type = RETRO_HW_CONTEXT_OPENGL_CORE;  /* request a core-profile context */
params.major = 4;                          /* GL 4.2 */
params.minor = 2;
glsm_ctl(GLSM_CTL_STATE_CONTEXT_INIT, &params);

I see nothing of the sort at https://github.com/libretro/scummvm-wrapper/tree/main/src (assuming it’s the correct repo) so my guess is that it only has a software renderer indeed.

If it does not appear in the ScummVM menu, it does not have it.

This implementation is new compared to the one we had before, which was very old, and they are correcting many things. You can see what they are adding in the releases.

If you want to improve speed, the Vulkan video driver is the fastest. And if you enable Threaded Video, you’re going to get considerably better performance at the cost of losing some accuracy.
Settings » Video » Output
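In retroarch.cfg those two settings correspond to the following keys (key names as in a stock retroarch.cfg):

```ini
video_driver = "vulkan"
video_threaded = "true"
```

Setting them from the menu writes the same values, so editing the file directly is only needed if you prefer it.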

We usually try to stick with software renderers whenever possible, since they work the best with RetroArch’s various video and sync technologies (including shaders).

I always thought that threaded video added input latency without affecting accuracy and things like that? Yes, I just checked.

On a modern PC hardware acceleration is not necessary; perhaps on more modest devices it is useful.

@aorin1 I can tell you from my own experience that the latency is almost nil; where I have seen it most affected is that it drops frames. It’s a great option either way, and I don’t think it affects an adventure game too much.

Thanks for getting back to me all!

I’ll have a play around with Threaded Video and see if it allows me to run the shader with the content.

Edit: Threaded Video was definitely the money ticket here, yikes.

I’ve honestly never used that setting before, so it didn’t even occur to me to give it a go, but the performance gain has been very significant in this instance. The output looks great with the shader as well, so I can’t identify a downside to the software renderer on my end anymore. Gonna have to read up on threaded video now and figure out whether I should have it on by default :thinking:

Thanks again for the help!

The big thing that threaded video does for shaders is that they will no longer block the emulator from progressing to the next frame even if they aren’t finished rendering. The result is something like frameskip, which is generally not great in high-action games but shouldn’t be a problem at all for point-and-click adventure games (though cut scenes may suffer a bit). The aforementioned latency is also not really a problem for these games, and the last noticeable side effect is reduced smoothness of framerate sync (most noticeable on smooth pans/scrolls; again, not likely to be a big problem in these games).

So, probably not a great idea to enable it by default everywhere but ScummVM is a pretty good fit for it, insofar as the drawbacks won’t really affect it.

You’ll still need to be a little careful with the shaders, as using one too beefy will push the frameskipping high enough to be distracting even on these games. It’ll look like you’re running them on a 386 instead of a 486 :stuck_out_tongue:

Yes, hardware-acceleration support is not currently implemented in the RetroArch core; only software rendering is available.

That is an old backup repository, to be removed sooner or later. The correct one is https://github.com/libretro/scummvm

Regarding performance issues: before switching to threaded video mode, I would check whether the core options can help. Try enabling “Timing > Allow Timing Inaccuracies” and/or “Frameskip > Auto”. You can also enable the “Auto Performance Tuner” setting, which is supposed to switch the relevant core settings automatically when needed.
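Once toggled in the Quick Menu, those core options end up in RetroArch's core options file, where they would look something like the fragment below. Note the key names here are my assumption of how the ScummVM core stores them; check your own file after changing the options once to confirm the exact keys:

```ini
# Illustrative entries in retroarch-core-options.cfg -- key names
# are assumptions, verify against your own file.
scummvm_allow_timing_inaccuracies = "enabled"
scummvm_frameskip_type = "auto"
scummvm_auto_performance_tuner = "enabled"
```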