RetroArch slows down after selecting a shader

Computer Specs:
Macmini2,1, Mac OS X Version 10.7
Processor: 2.16 GHz Intel Core 2 Duo
Memory: 2 GB 667 MHz DDR2 SDRAM
Graphics: Intel GMA 950, 64 MB

RetroArch Version: 1.3.6 Stable

When I load any shader, either in-game or through retroarch.cfg (video_shader = ""), RetroArch slows down a lot. Even the XMB interface with no core loaded becomes sluggish, and navigating the menu is very slow. Once I go back into retroarch.cfg and clear the shader from video_shader = "", everything is back to normal.
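For reference, the lines I'm editing look roughly like this (the shader path is just an example, not the exact file I'm using):

video_shader = "/path/to/shaders/crt-easymode.glslp"
video_shader_enable = "true"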

Has anyone experienced this?

I've attached my cfg file.

Your GPU won’t be able to handle much of anything, unfortunately. You might be able to squeeze some more performance out of it if you disable the ribbon shader in settings > menu.
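If you'd rather do it straight in the config file, something like this should turn the animated ribbon off (I'm assuming the key name is the same in 1.3.6):

menu_shader_pipeline = "0"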

I guess it is pretty old. I also installed GMABooster to overclock it to 400 MHz, but that didn't seem to work out too well.

Would a Raspberry Pi 3 do better with shaders than my Mac mini?

Slightly better, but its GPU isn’t exactly a powerhouse, either.

I'm having issues with N64 turning the screen white and then crashing (no shaders selected). I'm able to play NES/SNES.

The core does not start, so nothing is added to retroarch-core-options.cfg.

I see the command line in your signature for getting a log. How do I do the same in OS X?

I do have this from OS X, though it's probably not helpful: http://pastebin.com/0VtwPhMH

I was able to pull a log: http://pastebin.com/40vyKh5x
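In case anyone else needs it, I got the log by launching the binary inside the app bundle from Terminal, something along these lines (assuming RetroArch is installed in /Applications):

/Applications/RetroArch.app/Contents/MacOS/RetroArch --verbose > ~/retroarch-log.txt 2>&1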

Hmm, that’s a weird error :confused:

We’ll look into it. I don’t think it’s anything you can deal with on your end.

The same thing happens with Nestopia? Did I miss a prerequisite before installing RetroArch?

http://pastebin.com/2kf6FmRN