Possible to add new resolution settings: max_video_fullscreen_x/y?

Hi!

My quest for setting up a dedicated Apollo Lake based RetroArch system using Ubuntu 16.10 with KMS continues. My aim with this system is to basically set everything up and be done with it. I plan on using different TVs and monitors with the system (for example when bringing it to friends), so different resolutions need to be handled well. Without further ado, here’s the particular “issue” I have:

This system is 4K capable and the Intel video driver will output 4K if it detects such a screen on HDMI. The problem here is that I don’t want it to run in 4K, ever. I suspect I’m not alone in this. There are some drawbacks with 4K and they’re pretty much all related to performance. Everything from rendering to framebuffer handling in the driver slows down. This may be an issue for 3D rendered games, but it’s also an issue for the 2D consoles if one wants to reduce input lag by using performance sensitive settings such as max_swapchain_images and frame_delay. Besides, there really is very little point in using 4K with these old consoles/games.

So, why not just use the existing video_fullscreen_x/video_fullscreen_y settings in RetroArch? The problem with that solution is that it hardcodes the resolution. It’s also a problem for lower resolution screens that can take higher resolution content and scale it down. For example, my Samsung plasma has a native resolution of 1360x768, but also accepts 1920x1080. When I boot the system, the Intel driver correctly identifies the native 1360x768 resolution and uses it for the command line. However, with video_fullscreen_x/video_fullscreen_y set to 1920x1080, RetroArch will force this resolution when starting. The problem with that is twofold: 1) it actually usually looks worse than running at the lower native resolution and 2) it adds a couple of frames of input lag since the picture now needs to be passed through the TV’s built-in scaler before it can be shown.

I think two new video settings would be useful to handle this case: max_video_fullscreen_x and max_video_fullscreen_y

The way I see it working is:

From what I’ve seen, RetroArch normally applies the same resolution as the system is currently using for the display. The new settings max_video_fullscreen_x/y would have no effect at all for system resolutions equal to or lower than the one configured in these settings. However, if the system is currently using a resolution higher than the one used in these settings, RetroArch would constrain itself to the configured max resolution.

For my system, I would set max_video_fullscreen_x=1920 and max_video_fullscreen_y=1080. For any monitor or TV with 1920x1080 resolution or below, the system would behave exactly as today. For any higher resolution screen, such as 4K TVs, RetroArch would constrain itself to 1920x1080.
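To make the proposal concrete, here's a rough sketch of the selection logic as a shell function (a hypothetical illustration, not RetroArch code; this version caps the resolution when either dimension exceeds its configured maximum, though the exact comparison rule is debatable):

```shell
# Hypothetical sketch: pick the fullscreen resolution RetroArch would use,
# given the system's current resolution and the proposed maximum settings.
pick_resolution() {
    local sys_x=$1 sys_y=$2 max_x=$3 max_y=$4
    if [ "$sys_x" -gt "$max_x" ] || [ "$sys_y" -gt "$max_y" ]; then
        echo "${max_x}x${max_y}"   # above the maximum: constrain
    else
        echo "${sys_x}x${sys_y}"   # at or below the maximum: leave as-is
    fi
}

pick_resolution 1360 768 1920 1080    # prints 1360x768 (unchanged)
pick_resolution 3840 2160 1920 1080   # prints 1920x1080 (constrained)
```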

These settings are probably mostly interesting in the Linux/KMS case, which is what you’re most likely to use when setting RetroArch up in a console-like installation (meaning you don’t have access to a GUI/desktop and keyboard + mouse).

By the way, this is pretty much a continuation of the discussion held in this recent thread and suggests a way to handle this automatically instead of using manual re-configuration. I can also mention that I haven’t been able to find any automatic way of handling this directly in Linux.



I’d be willing to add this setting myself, at least to the OpenGL driver, if you (the LibRetro Team, that is) think you’d be able to accept it. I don’t think it’s an overly obscure feature and I’d recommend that it’s only available in retroarch.cfg and not exposed in the GUI.

I very much doubt it would be turned away if it doesn’t affect anything else. However, I’m not seeing how it would fix your 720p/1080p problem. Wouldn’t it still try to push out 1080p in that case?

EDIT: I think someone (radius?) also added the ability to change among some common resolutions from the menu…?

Okay, I’ll give it a shot.

The GPU driver’s default behavior is to use the display’s native resolution, even if the display reports that it can take higher resolutions. So, I’m actually quite happy with how it works for all displays with 1920x1080 or lower resolution. The only additional thing I want to do is to constrain the maximum resolution RetroArch tries to run at. So, if I connect a 4K display, RetroArch would check the values of the proposed settings and compare them to the system’s current resolution (which will be 3840x2160). For example, when it sees that the system resolution is higher than max_video_fullscreen_x=1920 and max_video_fullscreen_y=1080, it will instead use 1920x1080.

Hmm… I haven’t seen that. I’ll take another look.

Hi Brunnis,

I have a similar problem with a 4k system, only in my case instead of switching the actual fullscreen resolution (some of my external displays can’t actually set < 4k resolutions) I would prefer to keep the fullscreen resolution at 4k so I can continue to use X normally, but have RetroArch render to a smaller sized framebuffer that is then scaled up by the GPU, so basically giving the increased performance (letting me actually game at 60fps) without changing the screen resolution. I suppose there are use cases for both situations.

Okay, here’s a first implementation attempt: https://github.com/Brunnis/RetroArch/commit/d8eb0c39d07bf12527413399e1c0f4b43a814e41

Take a look and see what you think. The new settings are called video_max_fullscreen_x and video_max_fullscreen_y. I’ve only tested them on Windows so far and they seem to be working as expected.

EDIT: Please note that when using Linux+KMS, the setting video_fullscreen needs to be “true” for any of the settings video_fullscreen_x/y and video_max_fullscreen_x/y to take effect. This has been true all along, but it’s worth mentioning since video_fullscreen defaults to false in Linux+KMS mode even though it actually runs in fullscreen.
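For reference, a minimal retroarch.cfg fragment for the Linux+KMS case described above might look like this (note that the video_max_fullscreen_x/y settings exist only in the linked commit, not in mainline RetroArch):

```
video_fullscreen = "true"
video_max_fullscreen_x = "1920"
video_max_fullscreen_y = "1080"
```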

[QUOTE=bparker;51631]Hi Brunnis,

I have a similar problem with a 4k system, only in my case instead of switching the actual fullscreen resolution (some of my external displays can’t actually set < 4k resolutions) I would prefer to keep the fullscreen resolution at 4k so I can continue to use X normally, but have RetroArch render to a smaller sized framebuffer that is then scaled up by the GPU, so basically giving the increased performance (letting me actually game at 60fps) without changing the screen resolution. I suppose there are use cases for both situations.[/QUOTE] I see what you mean. However, the fix for this sounds a bit more involved than what’s needed to fix my case (see code in commit above). I don’t think I’m the right person to attempt that.

On this line:

Could you end up with a problem if you hook up to a display that’s a weird aspect ratio (for example, taller than wide)? Would it be safer to compare the dimensions individually?

This is a bit tricky… My current implementation compares the number of pixels, which means that the individual x/y values themselves aren’t actually what’s important, but rather the product x*y. This can be a bit confusing, but it should handle different aspect ratios well.

An alternative solution would be to compare x and y separately. If the current system resolution exceeds the set maximum in either direction, the resolution would be changed to that specified by the settings. This has some side effects; for example:

System resolution: 1600x1200
video_max_fullscreen_x/y: 1920x1080

Result: Since 1200 exceeds 1080, the actual resolution used by RetroArch is 1920x1080. With my current implementation, the resolution will actually remain at 1600x1200, since it contains fewer pixels than 1920x1080.

If we choose to measure each side individually, like illustrated above, we can combine it with checking the current aspect ratio and compare long side to long side and short side to short side. For example:

video_max_fullscreen_x/y: 1920x1080

With a screen in landscape, we’d compare video_max_fullscreen_x to the horizontal resolution. With a screen in portrait, we’d compare video_max_fullscreen_x to the vertical resolution.

There are quite a few ways in which to attack and solve this problem. I’m kind of leaning towards the simple method currently implemented, combined with an extra description row in retroarch.cfg that explains that it’s the resulting number of pixels that is used in the comparison.
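To make the trade-off concrete, here's a small sketch of the two comparison strategies applied to the 1600x1200 example above (hypothetical helper names, not RetroArch code):

```shell
# Pixel-count comparison: cap only if width*height exceeds max_w*max_h.
exceeds_by_pixels() {
    local w=$1 h=$2 max_w=$3 max_h=$4
    [ $((w * h)) -gt $((max_w * max_h)) ]
}

# Per-axis comparison: cap if either dimension exceeds its maximum.
exceeds_by_axis() {
    local w=$1 h=$2 max_w=$3 max_h=$4
    [ "$w" -gt "$max_w" ] || [ "$h" -gt "$max_h" ]
}

# 1600x1200 has fewer pixels than 1920x1080 (1,920,000 vs 2,073,600),
# but its vertical dimension exceeds 1080, so the two strategies disagree:
exceeds_by_pixels 1600 1200 1920 1080 && echo "pixels: capped" || echo "pixels: kept"
exceeds_by_axis   1600 1200 1920 1080 && echo "axis: capped"   || echo "axis: kept"
```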

After a few hours, I now have a solution to my problem and it involves appendconfig. The solution is not completely obvious, so I’ll post it here for anyone else that wants to solve the same problem. I used Ubuntu 16.10 Server and I have configured the script from step 3 below to auto-run when I start the machine.

Note: The script below limits RetroArch to 1920x1080 if the system is using a resolution that exceeds that, for example when a 4K TV is connected. You can easily change the limit by editing the lines in step 2d and the maxWidth and maxHeight variables in step 3.

1. Install fbset: sudo apt install fbset

2. Create the necessary RetroArch appendconfig files:
   a) sudo nano retroarch_no_res_append.cfg
   b) Insert the following lines and save the file:
      video_fullscreen_x = "0"
      video_fullscreen_y = "0"
      video_fullscreen = "true"
   c) sudo nano retroarch_max_res_append.cfg
   d) Insert the following lines and save the file:
      video_fullscreen_x = "1920"
      video_fullscreen_y = "1080"
      video_fullscreen = "true"

3. Create a new shell script and add the lines below to the file. Substitute {user_password} with the user’s login password.

#!/bin/bash

maxWidth=1920
maxHeight=1080

width=$(echo "{user_password}" | sudo -S fbset | awk 'NR==2{sub(/.*mode "/,"");sub(/x.*/,"");print;}')
height=$(echo "{user_password}" | sudo -S fbset | awk 'NR==2{sub(/.*x/,"");sub(/"/,"");print;}')

if [ $((width * height)) -gt $((maxWidth * maxHeight)) ]; then
    echo "Current resolution is ${width}x${height} which is higher than the specified maximum of ${maxWidth}x${maxHeight}. Starting RetroArch in ${maxWidth}x${maxHeight}."
    retroarch --appendconfig "/path/to/retroarch_max_res_append.cfg"
else
    echo "Starting RetroArch in ${width}x${height}."
    retroarch --appendconfig "/path/to/retroarch_no_res_append.cfg"
fi

if read -r -s -n 1 -t 5 -p "Press any key to abort system shutdown and return to the command line..."; then
    echo " Shutdown aborted."
else
    shutdown now
fi

4. Use the script created in step 3 to launch RetroArch. When you quit RetroArch, the script will shut down your computer unless you press a key within 5 seconds.

Hi,

Are those parameters implemented? video_max_fullscreen_y is exactly what I need for my issue: only a limited part of my screen is viewable (inside an arcade cabinet).

Thanks

Bubs

Have you tried to use the parameters? What happens when you do?

Tried, yes, but nothing happened…

I looked closer at the link above and while this code is still in @Brunnis github repo it must not have been merged into RetroArch.

It could use a little tweaking based on the comments upthread but this feature makes a lot of sense in the context of Lakka with low-powered devices. I definitely see the value.

Maybe Brunnis or someone else will pick the project back up.

I am not sure if this is the same problem that I have. My problem is that I run 4K resolution @ 24Hz as the desktop resolution on my HTPC, but when I start RetroArch I want to have 1920x1080. So I changed the config to start RetroArch with 1920x1080. But after I quit RetroArch, the desktop resolution stays at 1920x1080. I want it to switch back to 4K resolution when I quit RetroArch.

RetroArch won’t switch modelines back on exit. You can use a utility like 12noon to handle it, though.

Sorry, I forgot to add that I am running Ubuntu, and I was not able to edit my post before it was approved. But I found a Python script after a lot of googling that may help me.


What I usually do in Linux is just use a launch script for RetroArch that sets the modeline via xrandr and then returns it to the normal modeline on exit.
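A hedged sketch of that wrapper approach (the output name HDMI-1 and the 1920x1080 mode are assumptions; substitute whatever xrandr reports on your system):

```shell
#!/bin/bash
# Sketch: switch to a fixed mode before launching RetroArch, then
# restore the originally active mode on exit.

# Parse the active mode (the one marked '*') for the given output,
# reading xrandr's text output from stdin.
current_mode() {
    awk -v out="$1" '$1 == out { found = 1; next }
                     found && /\*/ { print $1; exit }'
}

if [ "${1:-}" = "--run" ]; then
    OUTPUT="HDMI-1"         # assumption: adjust to your output name
    GAME_MODE="1920x1080"   # assumption: the mode you want while gaming

    ORIG_MODE=$(xrandr | current_mode "$OUTPUT")
    xrandr --output "$OUTPUT" --mode "$GAME_MODE"
    retroarch
    xrandr --output "$OUTPUT" --mode "$ORIG_MODE"
fi
```

Invoke it as `./launch_retroarch.sh --run`; without the flag it only defines the helper, which makes the parsing easy to test.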

I’d like to know how to specify resolutions on a per-core basis on my Windows build. This, from Brunnis’ post earlier,

…is exactly what I’m looking to do. I have several cores (GB, GBC, GBA, VirtualBoy) that use overlays that require specific placements and scaling of the video in order to fit “inside” the overlay. Other cores can just go fullscreen with no issues, but since I set up my system when I had a resolution of 1920x1080, my scaling and position settings no longer work on my new 4K monitor. I’m looking to force RA to use the specific resolution of 1920x1080 ideally on a per-core basis, but I’d accept RA using that resolution across all cores if that’s the only way for this to work…