Why is RA so CPU intensive? - SOLVED

Nestopia by itself is nowhere near as CPU-intensive as RA running Nestopia. Why is this?

My processor runs at full speed the entire time while running RA and the fan has to blow really hard. This isn’t the case with Nestopia. Why is the CPU having to do so much more work with RA?

Are you sure it’s the CPU that’s running so hot and not your integrated graphics? I have an ION board and it would cook if I had any shader enabled.

Is your performance monitor showing your CPU to be maxed, as well?

Core Temp shows the processor is maxed out - running at top speed - and both cores hover between 25% and 75% load. The processor runs at a near-constant 2.2 GHz, only rarely dropping to 2.1 or 2.0; the max is 2.2.

Task Manager shows that 50-60% of the CPU is being used.

Regardless, this is a lot more than Nestopia standalone in both cases. Task Manager shows 10-20% of the CPU being used with Nestopia standalone, and Core Temp shows less than 25% load for both cores. When running Nestopia standalone, the processor usually fluctuates between 1.1 and 2.0 GHz.

I’m only using a single-pass shader; does that really require this much additional power?

Single-pass shaders can be intense, but crt-easymode is fairly light (considering how nice it looks). That should be irrelevant for CPU usage anyway, though. Some guys on IRC tried it out and they’re getting less CPU usage in RA than in standalone Nestopia, so something else is amiss with your configuration.

Any ideas what it could be? Overlays? Drivers? Video settings? OS?

My graphics card settings are as follows:

  • Ambient occlusion: off
  • Anisotropic filtering: application-controlled
  • Antialiasing - FXAA: off
  • Antialiasing - gamma correction: on
  • Antialiasing - mode: application-controlled
  • Antialiasing - transparency: off
  • CUDA - GPUs: all
  • Maximum pre-rendered frames: 1
  • Multi-display/mixed-GPU acceleration: multiple display performance mode
  • Power management mode: prefer maximum performance
  • Shader cache: on
  • Texture filtering - anisotropic sample optimization: on
  • Texture filtering - negative LOD bias: allow
  • Texture filtering - quality: high performance
  • Texture filtering - trilinear optimization: on
  • Threaded optimization: auto
  • Triple buffering: off
  • Vertical sync: use the 3D application setting

Here are my RetroArch settings:

Video:

  • Integer scale: off
  • Fullscreen: on
  • Vsync: on
  • Hard GPU sync: off
  • Everything else default

Shader:

  • Default filter: nearest
  • Shader passes: 1
  • Shader #0: crt-easymode.cg
  • Shader #0 filter: don’t care
  • Shader #0 scale: don’t care

Drivers:

  • Video: gl
  • Audio: sdl
  • Audio device: (blank)

Everything else default.
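
In retroarch.cfg terms, I believe those settings work out to roughly the following (a sketch only; the key names are from memory and the shader line is just an example of the idea, so check them against your own config):

video_scale_integer = "false"
video_fullscreen = "true"
video_vsync = "true"
video_hard_sync = "false"
video_smooth = "false"
video_shader_enable = "true"
video_shader = "shaders/crt-easymode.cgp"
video_driver = "gl"
audio_driver = "sdl"
audio_device = ""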

I just checked again, and Task Manager shows up to 60% CPU usage with RA but only 20-30% with Nestopia standalone.

OK, I took some more readings with Core Temp by leaving it running and visible while playing in fullscreen mode.

Question: if I leave Core Temp running so it’s visible, does this automatically disable vsync? I assume it would still be on, but I was getting tearing while playing. Is this because vsync is disabled, or because it’s trying to sync back and forth between the desktop (the Start menu is visible) and the game while Core Temp is visible? If vsync is being disabled, then the readings I got weren’t accurate for vsync enabled.

With Nestopia by itself, the frequency is around 1.2 GHz and sometimes cycles up to 2.2, but usually stays below 1.5. With Nestopia in RetroArch it’s the opposite: the processor stays between 1.5 and 2.2 most of the time and only rarely dips to 1.2. Core Temp shows both cores fluctuating between 40-60% most of the time, sometimes reaching up to 70%, sometimes dropping to 20%. Then I experienced the “A/V burp” and it showed both cores at 100% for about a second (probably less).

So, the A/V hiccup is caused by a sudden spike in CPU usage where the processor is taxed beyond its limits. Why would this be happening? It never happens with Nestopia standalone, and I also get lower CPU usage overall with Nestopia SA. It’s confusing because some people report lower CPU usage with RA.

I tested here: Core i7-4700HQ 2.40 GHz, Nvidia GT 740M, driver 331.38, Xubuntu 14.04, kernel 3.14.21, on Xorg, using compton as the window compositor. Measurements were taken with the top command (example after the numbers below). Clean config in RetroArch (the only tweaks were aspect ratio set to Core provided and video_disable_composition = “false”).

  • CPU usage of 6~8 % in Nestopia standalone

  • 13~14 % with the Nestopia core in RetroArch (RGUI)

  • 31~36 % in bsnes/higan standalone, balanced profile (tested with a CX4 game)

  • 44~48 % with the bsnes balanced core in RetroArch (RGUI)

  • 21~24 % in VBA-M standalone, OpenGL 2x

  • 12~14 % with the VBA-M core in RetroArch (RGUI)

  • ~29 % in PPSSPP, Qt version

  • ~23 % with the PPSSPP core in RetroArch

RetroArch idle: 9~13 %
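
In each case I just watched the emulator’s process in top, along these lines (the process name is whatever the binary is called on your system, so adjust as needed):

# refresh once per second and show only the emulator's process
top -d 1 -p "$(pidof retroarch)"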

SOLVED.

I found the culprit: it was “Threaded optimization” in the Nvidia control panel settings. Setting this to either ON or AUTO resulted in horribly high CPU usage in RetroArch. I created a custom per-program profile for RA and set “Threaded optimization” to OFF, and now my CPU usage is much lower; RA is showing CPU usage similar to Nestopia standalone (if not better, though I haven’t looked at it closely enough yet to tell).
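
For anyone hitting the same thing on the Linux driver: as far as I can tell, the equivalent toggle there is the __GL_THREADED_OPTIMIZATIONS environment variable, so forcing it off just for RetroArch would look something like this (untested on my end, since I’m on Windows):

# launch RetroArch with the driver's threaded optimizations disabled
__GL_THREADED_OPTIMIZATIONS=0 retroarch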

The question remains: why did Nestopia standalone not suffer this problem?

Is threaded optimization a setting that is enabled in Retroarch by default, but not in Nestopia SA?

I’m thinking that my driver might not be playing nice with something, either my OS or my hardware. If I was having basic problems with threaded optimization, then it would make sense that I’d see high CPU usage in RetroArch if RetroArch is designed to take advantage of threaded optimization by default.

Or it could be that RA is not designed to play nice with threaded optimization, or there is a conflict there somewhere. What I need to do is find a program that I know uses threaded optimization and see if I get similar performance problems - high CPU usage and CPU spikes.
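
On the Linux driver, that comparison might look something like this sketch (again untested by me; glxgears is only a stand-in for “some OpenGL program that uses the Nvidia driver”):

# __GL_THREADED_OPTIMIZATIONS is (as far as I know) the Linux driver's
# threaded-optimization switch; run the same GL program with it on, then off,
# and compare the CPU column reported by top
__GL_THREADED_OPTIMIZATIONS=1 glxgears &
top -b -n 3 -p "$(pidof glxgears)"
kill "$(pidof glxgears)"
__GL_THREADED_OPTIMIZATIONS=0 glxgears &
top -b -n 3 -p "$(pidof glxgears)"
kill "$(pidof glxgears)"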

Anyway, it’s still a mystery to me, but by creating a custom profile and forcing threaded optimization OFF for RetroArch, my CPU usage with the Nestopia core now seems on par with Nestopia SA.

I will need to play-test this for a while to see if the A/V hiccup problem is also solved, as I think it’s related to the high CPU usage.

PatrickM, could you share some hardware specs? In the terminal:

lspci
cat /proc/cpuinfo | grep "model name"
uname -a

Sorry, are you referring to the RA terminal? I don’t know how to enter text into it, typing does nothing. :confused:

Oh, sweet. Glad you got it figured out. I don’t know wtf “threaded optimization” is/does. Do you have a link to an explanation?

> Sorry, are you referring to the RA terminal? I don’t know how to enter text into it, typing does nothing.

No, I’m speaking about the Linux terminal. What’s your OS?

I can’t find anything official, but the description given in the control panel says it “allows applications to take advantage of multiple CPUs”. A Google search turns up a lot of forum posts from people who have had problems with it.

I don’t know why it’s even necessary; my applications still take advantage of both CPU cores as needed. I think it’s just really buggy. But it’s still a mystery why Nestopia SA didn’t have this issue. Lots of possibilities come to mind, but my head hurts and I’m just glad I solved the high CPU usage issue. :slight_smile:

> No, I’m speaking about the Linux terminal. What’s your OS?

I’m running Win 7 64 Bit.

2.2 GHz Intel T4400, 4 GB RAM, Nvidia ION 9300