Emulation slows down on TV, but runs fine on PC monitor

hey guys,

i just recently discovered this awesome emulator/frontend and am really loving it. unfortunately, as the title says, i'm experiencing slowdowns when using it on my TV. the weird thing is, on my PC monitor it runs perfectly fine, smooth and without any slowdowns.

this is my setup:

windows 7 x64
32GB RAM
nvidia GTX 980 (4GB)
intel i5-3570K (OC'd to ~4GHz)

the pc is connected via hdmi (nvidia port) to my denon AVR, which in turn sends the picture to the TV (hdmi)…

before running retroarch i use the win+p shortcut to make the TV the only active screen. retroarch detects a refresh rate of 60.002hz, which is what i've set. i use the crt-royale shader…but my GPU should be able to handle it, right?

i've been testing mostly SNES (bsnes accuracy) and PSX (mednafen software) so far. for example, in super mario world it slows down every time i clear a stage and that black vignette closes in on mario. sometimes the slowdown will "stick" and the game then only runs at exactly 30fps until i open the GUI and go back to the game…then the slowdown disappears!

similar slowdowns happen with final fantasy IX on PSX…however, they seem much more random. they can pop up at any time, even when i'm not doing anything…just walking around the screen, nothing changes, but suddenly it slows down to 40, sometimes 50fps. sometimes it recovers on its own, sometimes i have to open the GUI for it to calm down.

when i go into the video menu during such a slowdown, retroarch also detects a lower refresh rate (e.g. ~30hz or ~40hz).

i can avoid the super mario world slowdowns for the most part if i disable the shader or select a more lightweight one. so it does seem to be a performance issue(?)…the question is, why does it only show up when i use the TV? my pc specs should be more than sufficient, i think…

anybody have any ideas? i’d really love to get this thing working…

thx

Well, my first question is: what is an AVR? And is it still in the chain when you attach the HDMI to your PC monitor?

it's an "audio-video receiver"…it receives signals from different sources, decodes them where necessary (in the case of bitstream data, e.g. dolby digital), splits them and sends them on to the corresponding devices (speakers, tv, projector…)

it receives and sends video (RGB) to my TV via HDMI. the video signal to my PC monitor, however, comes directly from the GPU via DVI. in both cases the sound (PCM, 24-bit/192khz) is received by the AVR via HDMI and then sent to the speakers.

so you're suggesting the AVR is the root of the problem? i was also thinking it might be causing it…maybe it's confusing the "dynamic rate control" in retroarch somehow? i already disabled every kind of "processing" (auto lip-sync and the like) in the settings of both the AVR and the TV…unfortunately that didn't help.

Perhaps you could try bypassing the receiver and plugging the HDMI cable directly from your video card into your TV, just to test. Also, make sure Game Mode is enabled on your TV; it's mandatory.

hey guys,

so i believe i found the culprit, and surprisingly enough it wasn't the AVR after all…the slowdowns were still there with a direct connection to the TV. what seems to be causing them is the "hard gpu sync" option…maybe "hard gpu sync" hates HDMI connections? or maybe it just hates my TV. anyway, problem solved, kind of…i would have liked to use it since it supposedly reduces input lag, but it doesn't feel so bad with it turned off, really.


Did you try setting it to 1? That will still negatively affect performance, but not as badly as 0.


hm…no i haven’t. what does that setting do exactly?

i have to say the difference in input lag feels very small to me whether “hard gpu sync” is enabled or not…

but the real question is…why is there even a performance problem when i display retroarch on my TV / via HDMI while it’s not there on the PC monitor?

hard gpu sync forces it to hold off and try to emulate the frame as close to vblank as possible, which is more demanding on the CPU. A frames setting of 1 gives it more leeway. No hard sync at all is still fine, and if it doesn't bother you, go with it.

Dunno what’s up with your TV setup. RetroArch tries to detect your refresh rate, so if it’s getting something weird from your splitter/distributor, that could cause the issue.


ok, i just did a quick test with the bsnes accuracy core, "hard gpu sync" on, and frames set to 1, and that seems to help with the slowdowns…it's kinda weird though, right? i mean, it doesn't seem like a "performance issue" in the sense that there's a heavy load on the cpu and it speeds back up once the load is gone. once the slowdowns happen they stay…until i exit to the UI and return to the game. then it runs fine again until the next slowdown.

oh well…as long as i can get it working i'm happy :) thanks for your help!

btw, is there a way to get a mame (0.176) playlist in retroarch? scanning the romset doesn't seem to do anything…

Native MAME scanning isn’t implemented yet. You can scan for FBA, though. There are also some utilities floating around on the forum that can scan MAME ROMs and make playlists against their databases.

ok, guess i’ll go hunting for those utilities then…thanks again!

I noticed something kinda similar when hooking up my computer to a receiver and then a TV via HDMI. I had frame issues in every game, no matter the system. Then I disabled transparency (Aero) in the Windows 7 settings and it worked fine.