Building new emulator PC, go with 3060Ti or 6700XT?

My new project is building a two player arcade cabinet that I am going to stick a PC into and run LaunchBox + RetroArch + all the other emu cores. While the main goal of this arcade will be to use some light guns and play various shooters on PS1 and PS2, I do plan on being able to use the joysticks and buttons to play some SNES, N64, and GameCube games as well.

I am trying to figure out what GPU to get for this emulator PC. I will be upscaling every game I can up to 1440p resolution, so that plus filters will require a bit of GPU grunt. It has been a long time since I have had an AMD card, but looking at availability, price, and performance today, I see that a 6700 XT performs a bit better on average in modern games than a 3060 Ti does, uses a little less power, and is cheaper as well. But the real question is, how do the AMD cards perform in the various emulators I plan on using? Does anyone have experience using 6000 series AMD cards to play games while upscaling in things like PS2 and GameCube emulators? Do they run into any weird graphical issues?


I cannot say for certain, but it appears from recent conversations that AMD's OpenGL drivers are lacking a bit compared to Nvidia's. There have been reports of cores that force glcore not running at all.
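
For what it's worth, a rough sketch of how you could steer around that in RetroArch: set the default video driver to Vulkan in retroarch.cfg (the `video_driver` key is the stock RetroArch one; the comments below are just my assumptions about how the glcore-only cores behave):

```
# retroarch.cfg (sketch)
video_driver = "vulkan"   # default to Vulkan and avoid the weaker AMD GL path where possible
# Cores that insist on glcore will still pull that driver in (or refuse to start),
# so this only helps with cores that actually offer a Vulkan renderer.
```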

Legend has it, you'll need at least an RTX 4090 in a 4x Crossfire setup :stuck_out_tongue_winking_eye: The 4090 only has a price tag of that of a new car these days :thinking:

No seriously, any GPU will do just fine… except when you start using additional → shaders :flushed: to polish the in-game graphics somewhat…

Then it's a different story…

I saw my GPU usage go from a 5% - 10% peak to 40 - 75%, depending on which shader preset (and how many passes ←) you use!!

And I am talking about a new RX 6600 PCIe 4.0 card I recently purchased…

Take care of yourself & each other…

cheers TD

A 3060 is overkill.

I have a 1060 and still have a lot of breathing room for more advanced emulators like Cemu and Yuzu.

For PS2 and GameCube/Wii upscaling, even a 1030 should be more than enough.

And then there is the PS3 project, which is still in its infancy! But it looks very promising; I am running many games at full speed and upscaled 3x in RPCS3 with my card (RX 6600)… Good value card, price/performance wise… :kissing_smiling_eyes: :ok_hand:

cheers TD

I guess I should have just asked "AMD or Nvidia?" so this didn't turn into a discussion on whether the card choice is overkill or not. lol. But I thought a generic "AMD vs Nvidia" thread would turn into a war too. Sure, the cards might be a bit overpowered, but I do want a very smooth, constant 60fps even when upscaling the resolution and using shaders, so I figured better too powerful than not enough.

So we have one person saying AMD's OpenGL drivers have been having some issues lately, and another saying an RX 6600 is working well. Hmm. I would like to go with the 6700 XT, as it is the same price but offers better performance and lower power draw. I was just worried about whether there was any weirdness with the emulators, as I have often seen that even though AMD has the performance in games, it would only run at like half the speed it should in various tasks like encoding or GPU compute applications and the like. I guess that was just a dumb question on my part, because emulator game rendering is still just rendering graphics.

But that's the point. If you plan to emulate the PS2/GameCube with shaders + upscaling, a 1060 is still more than good enough with plenty of breathing room left.

In most cases, the bottleneck with emulation is the CPU, not the GPU. So you could save some money on the GPU and get a faster CPU, like an i5 12400, which should be the most efficient choice: 6 very fast cores (you won't need more for emulation) with no "efficient cores" nonsense.

If I were to build a PC just for emulation, I would go with a 12400 CPU and a 1660 GPU (the fastest non-RTX card Nvidia has), which should be perfect for upscaled emulation and higher FPS for systems up to the 360/PS3 and Switch.

You can't go wrong with an even faster GPU, but I doubt it will ever reach the usage you expect.

It's your money though.

As for your on-topic question, go for Nvidia for its better GL performance.


And one last thing: I assume you plan to emulate those systems in RetroArch? Because if you do, you are not going to have a good time. The Dolphin core is based on a 2+ year old version, and even then a few things don't function as they should and there are additional graphics bugs.

The PCSX2 equivalent in RetroArch is also in alpha stage with various issues (like how it gets stuck in the background after quitting) and nobody knows when a proper core will be made.

So maybe you have to stick to the standalones for the time being. Thankfully, they both have some sort of shader support, though not as extensive as RetroArch's.


I definitely remember AMD being more troublesome with emulators and having inferior OpenGL, but this was years ago. Still, if you suspect it may cause problems, I'd just go with Nvidia. Upscaling just to 1440p is underutilizing the RTX 3060, but perhaps you want to connect or upgrade to a 4K display in the future or use downsampling. Otherwise, you might as well stick to something like an RTX 3050.



Why worry about OpenGL problems :sweat_smile:? More and more new console emulators and/or new revisions of old emulators are switching to → Vulkan :smirk: :ok_hand: … e.g. PPSSPP, PCSX2, RPCS3, the list goes on…

Planet Vulkan is the future, the needs of the many outweigh the needs of the few :vulcan_salute: :grin:

And from my experience, using Vulkan requires fewer CPU resources, I have noticed.
Which is not to say that the CPU is suddenly unimportant!

I recommended the 6000 series GPU also because it has the RSR feature, something that other AMD GPUs lack (not supported in the driver for → non-RDNA GPUs). RSR (Radeon Super Resolution) is a technique to upscale, for example, 1080p to 1440p in PC games to maintain or even improve FPS at higher resolutions.

And it's not a heavily priced card either…

I don't know if Nvidia cards have a similar upscaling technique? Not talking about IN-GAME settings like FSR, DLSS, etc…, but something purely GPU driver based…

Cuz in-game there are settings like DLSS (Nvidia) that do a similar thing to improve FPS, in a nutshell. But developers need to implement that in their games for Nvidia to make use of it…

Whereas AMD can apply RSR to ANY game with the flip of a switch :yum: :ok_hand:

So, choose wisely :woozy_face:

Cheers, TD


The 6700 XT should be fine and would have a lot of extra headroom for running at 4K with the most advanced shaders currently on the planet. You might also be able to get by with a 6600, but you may not have the extra headroom if you also want to stream at 4K and high bitrates using the GPU at the same time. I'm not even sure if having a faster 6700 XT would help in this regard, because I'm not 100% sure how encoding works on AMD cards, but what I do know for certain is that the quality and encoding efficiency are currently worse than both Intel and nVIDIA, and they have a lot of catching up to do.

If you go with an Intel non-F CPU, you can take advantage of the dedicated encoding block in the IGP to more or less negate this shortcoming. If you went with an nVIDIA GPU, you could easily use an AMD CPU and not have to worry about encoding performance.
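
To make that concrete, offloading an encode to the GPU or to the IGP's encoding block looks roughly like this in ffmpeg (the encoder names are the stock ffmpeg ones, but the file names and quality values are placeholders, and exact options vary a bit between ffmpeg builds):

```
# NVIDIA NVENC on a GeForce card
ffmpeg -i capture.mkv -c:v h264_nvenc -preset p5 -cq 23 -c:a copy out_nvenc.mp4

# Intel Quick Sync via the IGP (needs a non-F CPU with the IGP enabled)
ffmpeg -i capture.mkv -c:v h264_qsv -global_quality 23 -c:a copy out_qsv.mp4

# AMD AMF/VCE on a Radeon card
ffmpeg -i capture.mkv -c:v h264_amf -quality quality -c:a copy out_amf.mp4
```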

Then again, in general nVIDIA GPUs are more expensive for the rasterization performance they offer and they have higher driver overhead, so AMD GPUs tend to perform relatively better when paired with older/slower CPUs.

So these are most of the pros and cons here. You don't have to pay more for a 3060 Ti in order to have the ultimate emulation experience; you can step down to a cheaper nVIDIA RTX 2060 12GB or 6GB, GTX 1660 Super or Ti, RTX 3060 or RTX 3050 and be just fine.

As for the CPU, you don't need much. If you go Intel, don't leave out the IGP, because it can come in handy. Anything in the XX400 range from the last few generations should be fine. With AMD, the X600 range from the 3600 onwards would be excellent.

So just consider all things once more and make an informed decision, but don't overspend unless you know you're going to be doing other things with the system besides emulation.

You can take a look at my shader preset pack to see a good example of what can be achieved with the different performance tiers of probably the most advanced shaders/shader combinations in the world, the HSM Mega Bezel Reflection Shader.

I don't generally use curvature, but certain types of curvature need some serious GPU grunt if you're also using the highest Mega Bezel performance tiers. Do note that the highest performance tiers are not a requirement, since the lower ones can look very close in terms of the quality of the CRT effects.
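
If it helps, the usual way to pick a performance tier is a tiny .slangp that just points at one of the pack's base presets via `#reference` and is loaded like any other shader preset. Treat the path below as approximate, since the exact folder layout depends on how you installed the pack, and swap the referenced tier for a lower one (e.g. the POTATO preset) if the GPU struggles:

```
#reference "shaders_slang/bezel/Mega_Bezel/Presets/MBZ__3__STD.slangp"
```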

FYI, when you're talking about video encoding… for the best quality, encoding has to be done by the CPU, not the GPU… Although a GPU+CPU combo can speed up the process quite a bit… But in my experience, the results aren't that great compared to CPU → ONLY encodes…

i.e. artifacts, color banding, compression noise → file size, etc. etc…

Just saying…
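
For reference, the kind of CPU-only encode I mean would look something like this in ffmpeg (file names are placeholders, and the preset/CRF values are just typical choices, nothing magic):

```
# Software x264: slow preset + CRF gives the best quality per megabyte, but it is CPU bound and slow
ffmpeg -i master.mkv -c:v libx264 -preset slow -crf 18 -c:a copy archive_cpu.mp4
```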

I have 2 videos here:

https://mega.nz/file/tJwA1TwC#Jl-CPVgsHTFHcXO527ih-qUAWX3L0i_jQvkRyXMw_PM

https://mega.nz/file/cRwgUDSC#OGEH-hDzfQOq7gsjCUh0C5OsQcVMsBPjo-6I-67m3L0

One was encoded using an nVIDIA GeForce GTX 1070 and the other, an AMD Ryzen 5 5600X.

One had an average encoding speed of 3.66 fps while the other had an average encoding speed of 60 fps.

Can you tell which one is the GPU encoded file and which one is the CPU encoded file?

First off, the "flicks" look like animated GIFs, a bunch of still images stitched together LoL!! 2nd, the color palette of a game is much different from that of cartoons or live action movies!!

Really, the best example to compare GPU vs CPU is to use VCEEnc/NVEnc (depending on which card you have) on live action, motion captured content at 1080p res…

Not to mention the DIFFERENCE in →→ FILE size ←←!! I noticed at least a 100 MB gap between your samples up there…!!!

And very SHORT samples at that too; scale these samples up to the duration of a movie!!! The difference in file size will be huge!!!

Compression + quality = CPU encode, buddy…

Use CloneBD, Hybrid, among others that support GPU encoding… to create a test sample using a → very low to medium bitrate (2500 - 3500 kbit/s average, VBR), and upload these to me for comparison…

Test samples of at least 1080p res, because at that bitrate you'll notice the difference more easily…
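
If you would rather script it than click through CloneBD/Hybrid, the same kind of 2500 - 3500 kbit/s VBR test pair can be made with ffmpeg, roughly like this (file names are placeholders; on Windows replace /dev/null with NUL):

```
# CPU sample: two-pass x264 targeting ~3000 kbit/s average
ffmpeg -y -i source_1080p.mkv -c:v libx264 -b:v 3000k -pass 1 -an -f null /dev/null
ffmpeg -i source_1080p.mkv -c:v libx264 -b:v 3000k -pass 2 -c:a copy sample_cpu_3000k.mp4

# GPU sample: NVENC in VBR mode at the same target bitrate
ffmpeg -i source_1080p.mkv -c:v h264_nvenc -rc vbr -b:v 3000k -maxrate 3500k -c:a copy sample_gpu_3000k.mp4
```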

cheers, TD

All I see in that comment is a bunch of "noise" and deflection from my simple question at the end of the post. The source doesn't matter and you seem to be missing the point.

You still haven't told me which one you think is the GPU encoded file and which one is the CPU encoded file and why.

Anyway, the point is that while there might be a benefit in terms of achieving the absolute best possible encoding quality and the most extreme small file sizes by using CPU based encoding, this can come at a great cost in terms of time. If a user already has a system with a decent GPU (it doesn't have to be spectacular), they can achieve a more than acceptable level of quality, compression ratio and, very importantly, encoding time using whatever resources they have at their disposal.

Good or acceptable quality and file sizes are all subjective, but they're not things that are exclusive to CPU based encoding for all users.

I like to repurpose older systems with few, not very fast cores for use as emulation and streaming machines. The CPUs are more than up to the task of emulation + streaming (but not all of that plus encoding), and when paired with a modern GPU, the user has the option of offloading their encoding duties to the GPU, or to the Intel IGP if using an Intel CPU.

This is why my original response compared GPU encoding resources based on the different GPU choices the OP had to make, showing the pros and cons of going with either vendor and how it was possible to negate some of the cons depending on which CPU the OP went with.

So please don't take my original response out of context and attempt to turn it into a pissing contest between CPU (software) and GPU (hardware accelerated) encoding choices. The original post didn't even mention streaming. That was just a side note.

Very easy, the smallest flick should be the CPU encode… IF it was encoded with the very same preset/settings (bitrate, fps, maybe filters)!!

Well, true craft takes time… AND usually → money :smirk: :ok_hand:

Yes and NO!! You can't have BOTH, especially when using GPU only (my point exactly)!! However, GPU encodes are just great for streaming/broadcasting, yes… But for your favorite media content that has to last decades, e.g. favorite movies, TV series, family recordings, etc… you want the best of the best out of it, REALLY… → quality as CLOSE as possible to the ORIGINAL… but with the smallest file size possible… You'll never achieve BOTH with current GPU tech… NO WAY…

It was not diz duck's intention to flush your response down my toilet…
Even though my bowl wouldn't have the slightest difficulty doing JUST THAT :point_up: :smile: :pinching_hand:

Indeed, just like you were trying to make a point, diz duck merely stated the facts about the difference in low bitrate encodes between GPU & CPU. The differences in quality and in size are just huge when it comes down to very low bitrates.

And believe me, I have experimented a lot… Even using FILTERS to clean up the source right BEFORE the encode process won't favor the GPU encode in any way (macro blocks, compression noise, etc…)!!

Therefore, GPU encoding is a No-GO-SHOW… oh nose :lying_face:

cheers, TD

I would argue that the issue is that you are only trying GPU encoding in low budget software, which isn't a valid sample.

You get exactly what you pay for. Trust me when I say that GPU encoding in high end software is more than acceptable.