I play at a custom aspect ratio at 1920x1080 for my gaming channel, and many shaders on the Snes9x core cause audio distortion and game-speed distortion, especially my favorite, the Analog TV Shader Pack. I've tried alt-tabbing, etc., but I don't know what to do. I've tried a ton of shader presets, both CGP and GLSL or whatever, and about 50% have the same issue. Is there some fix for this? The previous RetroArch version didn't seem to have as much of an issue, so maybe that's the problem, but I'm not sure.
If your GPU just isn’t fast enough to get full speed, there’s not much that can be done about it, short of buying a faster/stronger one.
My machine is a bit dated, but surely not that dated, right? I mean, I can play Skyrim at max graphics, and this is SNES. I dunno, it had occurred to me, but I dismissed it.
Different shaders vary in their graphics card requirements. What kind of GPU do you have in your system? If a specific shader worked in 1.3.2 it should in theory still work in 1.3.4, but I don't know what changes were made that could slow it down.
[QUOTE=lordmonkus;39313]Different shaders vary in their graphics card requirements. What kind of GPU do you have in your system ? If a specific shader worked in 1.3.2 it should in theory work now in 1.3.4 but I don’t know what changes were made that could slow it down.[/QUOTE]
Actually, I'm not sure about the GPU, but turning off “Hard GPU Sync” seemed to fix my problem. I had gone into this thinking RetroArch would be hard to set up, so I followed a guide that said to turn that on.
Yeah, Hard GPU Sync is preferred to be on; it reduces input lag. You could try turning on Threaded Video, but from my reading up on things I gathered it's something that's preferred to be set to off. I could be wrong on that one, though; I have mine set to off with no problems. If I am wrong, I hope someone can correct me on it. Still learning as I go with RetroArch.
Threaded video is usually a last resort because it will make your scrolling stuttery. Hard GPU Sync indeed reduces input latency but it’s demanding. You can try setting it to 1 instead of 0, which is less demanding but also less effective. Still better than nothing, though. You could skip Hard GPU Sync and use the frame delay setting, which does pretty much the same thing but it doesn’t adjust automatically, so a single setting might not work for all games/cores.
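For reference, a rough sketch of how these settings look in retroarch.cfg (key names as I understand them from the 1.3.x-era default config; double-check against your own file, since names can change between versions):

```ini
# Hard GPU Sync: true = on. The frames value is the slack:
# 0 = strictest (lowest latency, most GPU-demanding),
# 1 = one frame of slack (less demanding, less effective)
video_hard_sync = "true"
video_hard_sync_frames = "1"

# Alternative: a fixed frame delay in milliseconds.
# Doesn't adjust automatically, so tune it per core/game.
# video_frame_delay = "4"
```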
The only advantage to threaded video in this use-case is that it has a weird side-effect whereby too-heavy shaders don't cause a reduced framerate and instead cause something like frameskip. That is, the game still runs at its normal speed; it just drops frames of video if it can't render them fast enough.
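If you want to experiment with that behavior, it's a single toggle in retroarch.cfg (again, key name assumed from the commented default config, so verify it in your version):

```ini
# Threaded video: decouples video from emulation, so heavy
# shaders drop frames instead of slowing the whole game down.
# Usually a last resort, since it makes scrolling stuttery.
video_threaded = "true"
```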
Your CPU should be fast enough for most cores but it could be suffering from throttling (emulators are weird and don’t use the CPU all the time during a frame, and this can trick your PC into thinking that it’s idle and needs to clock down, causing performance issues). You might try setting your power saving settings to ‘max performance’ or whatever to see if that’s causing problems.
Also, just to be clear: shaders don’t use the CPU, they use the GPU, so even though you’re emulating SNES or whatever, if you run a heavy shader, it’s your GPU that’s getting taxed.
Just realized GPU = graphics card. It's an NVIDIA GeForce 555M. I've not had an issue with input latency, but I am playing Chrono Trigger at the moment, and I imagine it would be more noticeable with action games than with RPGs. Some shaders still cause it, but it's not nearly as bad now.
Try making a profile for RetroArch in your NVIDIA control panel. Tell it to use the NVIDIA card instead of the Intel GPU, and under “Power management mode” choose “Prefer maximum performance”.
See if that helps.
[QUOTE=Tatsuya79;39341]Try to make a profile for retroarch in your nvidia panel. Tell to use the nvidia card instead of the intel gpu. In “power mangement mode” choose “prefer maximum performance”.
See if that helps.[/QUOTE]
That’s a good idea. Will it take away from my recordings on OBS, though?