"radeon: Failed to allocate a buffer" on RHD 7750

I'm running into an odd issue with the open source Radeon driver on Arch Linux. If I try to start a game in the MAME (2014) core for RetroArch, I get…

radeon: Failed to allocate a buffer:
radeon:    size      : 1073741824 bytes
radeon:    alignment : 16384 bytes
radeon:    domains   : 4
radeon:    flags     : 4
[the same five-line allocation failure is logged five more times]
RetroArch [ERROR] :: gl_create_fbo_targets :: Failed to set up frame buffer objects. Multi-pass shading will not work.
RetroArch [ERROR] :: gl_init_fbo :: Failed to create FBO targets. Will continue without FBO.
RetroArch [ERROR] :: gl_check_error :: GL: Out of memory.
RetroArch [ERROR] :: gl_check_error :: Non specified GL error.
Segmentation fault (core dumped)

Now, the system has 4GB of RAM and was only using about 300MB when I tried this, and the GPU has 1GB of VRAM, so an actual memory shortage seems unlikely. So what exactly would cause OpenGL to report "out of memory" when it clearly isn't?
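One detail that jumps out of the log: each failed request is for 1073741824 bytes, which is exactly 1 GiB, i.e. the card's entire VRAM in a single buffer. A quick check of the arithmetic (plain C, using only the logged size and the 1GB figure above):

#include <stdio.h>

int main(void)
{
    const long long request = 1073741824LL;         /* size from the radeon log */
    const long long vram    = 1024LL * 1024 * 1024; /* 1GB card */

    printf("request = %lld MiB\n", request / (1024 * 1024)); /* prints 1024 MiB */
    printf("request == total VRAM? %s\n", request == vram ? "yes" : "no");
    return 0;
}

So maybe the real question is what is asking for a buffer that large in the first place.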

Are you running any shaders?

Yes, but I got the same result after turning it off.

Ok, I’ll test when I get home to my laptop running the open source driver.

Looks like it may be a shader issue (it wasn’t deactivating when I tried to turn it off before).

It’s a bit weird though. Here’s an example case:

Load 1944 in MAME 2014, go into the shader options, load xBR HLSL (5xbr-v3.7a.cg), set it to 5x, and apply. It applies fine and you can see it working.

If I now quit RetroArch, reopen it and load the game again, it’ll bail out with the error messages originally listed.

So it works if you apply it to a currently running game, but not if you try to load a game with the shader already activated.
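In config terms, the failing case is starting RetroArch with the shader already set, roughly like this (the paths and the preset layout here are illustrative, not copied from my setup):

# retroarch.cfg: shader pre-enabled at startup (paths are placeholders)
video_shader = "/path/to/5xbr.cgp"
video_shader_enable = true

# /path/to/5xbr.cgp: single-pass preset at 5x scale
shaders = 1
shader0 = /path/to/shaders/5xbr-v3.7a.cg
scale_type0 = source
scale0 = 5.0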

Hmm. Yeah, that’s weird. I wonder if it’s initially making a larger framebuffer than normal that then shrinks down to the actual size or something like that. Radius may know for sure, since he’s done a fair bit of work on the MAME core(s).
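For context, creating an FBO target boils down to allocating a texture and attaching it to a framebuffer. A minimal sketch (not RetroArch's actual code, just the general shape; assumes a current GL context and glewInit() already done) of where an oversized request turns into the "GL: Out of memory" from the log:

/* Minimal FBO-target sketch showing where an oversized buffer
 * request surfaces as GL_OUT_OF_MEMORY. */
#include <GL/glew.h>
#include <stdio.h>

GLuint create_fbo_target(GLsizei width, GLsizei height)
{
    GLuint tex = 0, fbo = 0;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Backing storage is allocated here; this is where the radeon
     * driver refuses a request bigger than it can satisfy. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    if (glGetError() == GL_OUT_OF_MEMORY)
    {
        fprintf(stderr, "GL: Out of memory (%dx%d target)\n",
                (int)width, (int)height);
        glDeleteTextures(1, &tex);
        return 0;
    }

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    {
        glDeleteFramebuffers(1, &fbo);
        glDeleteTextures(1, &tex);
        return 0;
    }
    return fbo;
}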

I tried to make the buffer size WxH, but I got black screens in several games, so it's using a baked-in 1600x1200 buffer by default, IIRC. I guess you're exceeding your GPU's capabilities doing 5x with that buffer size.
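The numbers are at least consistent with that. 1600x1200 at 5x is 8000x6000, and if the driver then pads each dimension to the next power of two (8192x8192) and the pass uses a 32-bit-float framebuffer at 16 bytes per pixel (both of those are guesses on my part, not anything the log confirms), you land on exactly the 1073741824 bytes from the log:

#include <stdio.h>

int main(void)
{
    /* Baked-in 1600x1200 buffer scaled 5x by the shader preset. */
    long long w = 1600 * 5;   /* 8000 */
    long long h = 1200 * 5;   /* 6000 */

    /* Guess: the driver pads each dimension to the next power of two. */
    long long pw = 8192, ph = 8192;

    /* Guess: RGBA32F float framebuffer, 16 bytes per pixel. */
    long long bytes = pw * ph * 16;

    printf("%lldx%lld -> padded %lldx%lld -> %lld bytes\n",
           w, h, pw, ph, bytes); /* prints 1073741824 */
    return 0;
}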

I asked R-Type for a way to fix that, but he suggested using an alternate video approach, which looks terrible IMHO, so I dunno…

Ok, I -don't- get this issue with the official Catalyst driver, so it seems to be a limitation of the open source driver. Which is a shame, as the open source driver is a lot nicer than the bulky, ABI-breaking official driver!