Question about GPU/CPU pairing for dedicated RA/retrogaming machine

Hi everyone, I’d like to ask for opinions on the following: I’ve got an old PC running an i5-9600K @ 4.9 GHz (all cores) with 16 GB of DDR4-3600 on an MSI Z390-A PRO.

I’m considering turning it into a dedicated RA/retrogaming machine and was thinking of adding a good video card. I found some good deals, and the one I like most is a 6700 XT. I plan to game on a 360 Hz 1080p monitor.

Here’s my question: is my configuration enough for this GPU, or would it be a wasted purchase? Is there a risk that the CPU bottlenecks the GPU? I know this is an issue with modern titles, but I don’t know whether it’s still the case with emulation…

AFAIK the GPU is used mostly for shaders, and I’d like to get the best out of the most demanding shaders while also making the most of the monitor’s refresh rate: is that possible with this configuration, or do you foresee any problems?

Many thanks!


With any hardware, there can exist scenarios where you can max things out and bring them to their knees.

However, for emulating systems up to the PS1, N64, Dreamcast, and even Nintendo Wii era, the hardware pairing you have there should be more than sufficient, especially at “only” 1080p resolution.

There are many high-quality shader offerings available. Some of them can put a strain on even the best GPUs depending on how they are configured, while others that are among the best looking, and offer the most authentic CRT emulation experience, might be able to run on integrated graphics or even on cell phones.
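To put a rough number on that strain (just arithmetic on my part, no benchmark data): whatever a shader chain costs, it has to fit inside the per-frame time budget your refresh rate leaves it.

```python
# Rough sketch: per-frame GPU time budget at various refresh rates.
# Pure arithmetic, no measured shader costs involved.

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 240, 360):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# 60 Hz leaves ~16.7 ms, 360 Hz only ~2.8 ms: a preset that is comfortable
# at 60 Hz can blow its budget several times over if it must run every refresh.
```

In practice the retro content itself usually only updates at around 60 fps, so the heavy pass doesn’t necessarily have to run at the full panel rate; the numbers just give a feel for the headroom involved.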

You say you plan to game on a 360 Hz 1080p monitor, but if you want this system to be geared towards CRT shader emulation, then what you might want instead is a very bright, HDR600-or-better-capable 4K display.

Some bonus features would be MiniLED backlighting/OLED technology, 120Hz refresh rate plus BFI and HDR1000 certification to compensate for the BFI.

So I would say that your monitor might be the weakest link in your proposed setup but it can still work.

You absolutely need your display to be able to use RGB 4:4:4 Full colour format at 4K at whatever refresh rate you would like to use.


Hi Cyber! Thanks for your reply. I’m familiar with your great work and I absolutely want to use the DTP presets 👍

I also know that a 4K monitor would be the best choice for CRT shaders (which are my goal for a much more challenging project); it’s just that I don’t think this PC can manage high-refresh 4K.

My main goal here is to put this old rig to use and, at the same time, test BFI at the highest refresh rate available today (short of a couple of 500 Hz monitors), which is currently only possible with 1080p monitors.

Yes, there is a 1440p @ 360 Hz model from ASUS, but it costs a fortune (~1,500€ in my country), and in my opinion that price is not acceptable for a monitor with those characteristics (or really acceptable at all, as far as I’m concerned).

For 4K we’re stuck with a miserable 160 Hz at most (IPS mini-LED), and in a few months we’ll see 240 Hz, but who knows at what price. From what I’ve been able to understand by studying the matter a little, at least 300 Hz of refresh is needed for BFI to offer a truly significant reduction in motion blur (~80%).
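To make that 80% figure concrete, here’s the simple persistence model I’m basing it on (my own back-of-the-envelope sketch, assuming the visible frame is lit for exactly one refresh period and that blur scales linearly with persistence):

```python
# Back-of-the-envelope sketch: with BFI, a 60 fps frame is lit for one refresh
# period and blanked for the rest, so perceived persistence ~ 60 / refresh rate.
# Assumes motion blur scales linearly with persistence (a simplification).

CONTENT_FPS = 60  # typical retro content frame rate

def bfi_blur_reduction(refresh_hz: float, content_fps: float = CONTENT_FPS) -> float:
    """Approximate blur reduction vs. plain 60 Hz sample-and-hold."""
    duty_cycle = content_fps / refresh_hz  # fraction of time the real frame is visible
    return 1.0 - duty_cycle

for hz in (120, 240, 300, 360):
    print(f"{hz:>3} Hz BFI: ~{bfi_blur_reduction(hz):.0%} less motion blur")

# 120 Hz -> ~50%, 240 Hz -> ~75%, 300 Hz -> ~80%, 360 Hz -> ~83%
```

That’s why ~300 Hz is roughly where the 80% mark sits, and why going from 240 Hz to 360 Hz only buys a few more percentage points.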

So at the moment, choosing 4K means getting the maximum visual quality for games and shaders while being very limited in terms of motion quality. That’s another reason I’d first like to understand how important high-refresh BFI really is to me, and whether I could instead sacrifice it in favour of higher resolution and better HDR performance.

I say this because the thing I absolutely cannot stand is the abysmal difference in motion quality between normal 60 Hz panels (or monitors where BFI cannot be implemented effectively) and a CRT.


You’re welcome.

Thanks.

Wow! Seems like you’ve spoiled yourself by going down the high refresh rate rabbit hole and now you can’t unsee low refresh rate “artifacts”.

That reminds me of myself back in the CRT days. After a while it was 85 Hz or higher or bust. I would literally see the screen flickering when using 60 Hz.

Meanwhile, there I am in blissful ignorance, happily finishing games on my motion-blur-filled slideshow of a 60 Hz OLED TV.

That’s one of the reasons why I didn’t want a 65" TV. The brain gets used to things so quickly that I knew that if I did, I could never go back to 55" or smaller.

Do know that with these things there are always going to be compromises. Even CRTs have their weight, convergence and geometry issues.

Instead of focusing solely on throwing brute-force high-refresh-rate BFI at the “problem”, why not also look into the research and work that Blur Busters has been putting into the topic?

I’ve read about their ideas concerning displays with strobing backlights and things like that. Maybe that’s the way to go. I’m not sure whether their research has ever made it into an actual product, but it’s something to look into.


Thanks, I’ll try your suggestion! While I’m at it, I’ll ask you one last thing: between a 6700 XT and an RTX 3070, which do you think is the better card for emulation / my PC configuration?

I can buy both for about the same price (6700 XT @ 280€, 3070 @ 300€), and I wonder if one offers any advantage over the other:

  1. The RTX 3070 is much better at ray tracing, even if it’s certainly not a high-end implementation: is that of any use for shaders or any emulation cores?

  2. Are the extra 4GB of VRAM offered by the 6700 XT more important?

  3. What about the width of the memory bus? (192-bit for the 6700 XT vs 256-bit for the RTX 3070; see the rough bandwidth sketch after this list.)

  4. Any other reason to prefer one over the other? (driver quality / updates, etc.)
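For reference on point 3, this is the raw bandwidth difference the bus widths translate into (a quick sketch using the commonly quoted memory specs, which I’m taking on faith):

```python
# Rough sketch: raw memory bandwidth = (bus width in bits / 8) * effective data rate per pin (Gbps).
# The specs below are the commonly quoted ones for each card; treat them as assumptions.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw VRAM bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RX 6700 XT (192-bit, 16 Gbps GDDR6)": (192, 16.0),
    "RTX 3070   (256-bit, 14 Gbps GDDR6)": (256, 14.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")

# ~384 GB/s vs ~448 GB/s on paper, though the 6700 XT's large Infinity Cache
# is meant to offset the narrower bus in practice.
```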

Thanks again for your precious advice!


Hmmm…this is a tough one, as both have their strengths and weaknesses. I think you’d be fine with either one for these purposes, so don’t overthink it too much.

I generally have a preference for nVidia because of the better feature set, and they’re usually ahead of the competition in bringing new technologies to market. The capabilities of their software stack, as well as things like better hardware encoding and very easy, high-quality image and video capture and transcoding (even in HDR), are significant advantages for me.

On the flip side, the two main things I like about the AMD card in the same bracket are the lower driver overhead (so if I were building with an older or weaker CPU, I’d be leaning towards AMD) and the extra 4GB of VRAM, which is a glaring omission on nVidia’s part.

The VRAM shouldn’t matter much if your goal is primarily emulation, but it can really hamstring things in modern PC gaming, even when ray tracing is not a factor.

So for me, I’m a bit disenchanted. I refuse to buy nVidia because they aren’t giving me a good enough feature set at the kinds of prices I’m willing to pay, and I refuse to buy AMD because they’re not giving me a good enough feature set at the prices I’m willing to pay either.

So I’m just using what I have for now.

It’s really tough for me because it’s not only about the ray tracing or the extra VRAM; it’s also about the video encoding support as well as things like ShadowPlay.

If I had never owned nVidia, it might have been easier to choose AMD, but I’d have to give up some of those things that work well, either getting nothing in return or settling for the sometimes half-baked, clunkier AMD alternatives.

So my choice is to wait until nVidia actually delivers something more compelling. Sad to say, I’d rather do that than purchase AMD for myself.

For someone else who might not use their PC the way I do, I’d recommend AMD in a heartbeat.


Thank you so much for your detailed explanation!

So, if I understood correctly, ray tracing is not a factor for shaders/emulation, and 12GB vs 8GB of VRAM isn’t either…but the latter can come in handy if I want to try some modern games.

Encoding and video capture aren’t relevant to me, given the purpose of this rig.

All in all, I’m very unsure, because the RTX 3070 performs slightly better than the 6700 XT, but I don’t know if there are other things I’m overlooking, for example heat output: my case is very well ventilated but quite small, and it wasn’t originally chosen with such (relatively) powerful GPUs in mind.
