RetroArch - Native CRT Support


#141

I have been struggling with this for the last few days… I finally got CRT_Emudriver installed, but every time I tried to run the new RA file, the resolution changed and my screen would never kick in…

I was actually getting farther with CRU resolutions for some reason…

Anyway, I figured out my problem: my CRT doesn’t support 15 kHz.

  • 1920x240 has to run at 120 Hz before the monitor will sync.
  • 1920x480 works at 60 Hz, but with double the line count that puts it at 31 kHz (rough math below).
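
Doing the rough math (assuming the usual blanking intervals, i.e. ~262 total lines for a 240-line mode and ~525 for a 480-line one):

    262 lines × 120 Hz ≈ 31.4 kHz
    525 lines × 60 Hz  ≈ 31.5 kHz

Both modes land at roughly the same ~31 kHz scan rate, so this looks like a 31 kHz-only (VGA-style) tube.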

So, mostly I am screwed lol.

What’s interesting is that the latest file would actually take my 1024x768 resolution and switch it to the 1920x480 @ 60 Hz resolution from CRU, but then it wouldn’t switch to the 1920x240 @ 120 Hz resolution from CRU; instead it just showed the image at half the horizontal size.

So, outside of adding 31 kHz support to this… I don’t think there’s anything else of value I can add testing-wise, or get much out of it usage-wise :frowning:


#142

If you are using a PC CRT monitor, then just adding the interlace shader to most of your games at 480p should simulate interlaced/progressive switching pretty well. For 15 kHz CRTs that will never be an option, and that’s really the biggest reason this work is so helpful.


#143

Here are a few testing updates as well. One important thing is that this build seems to take a performance hit. Testing side by side with my original retroarch.exe, I’m getting frame drops in places I wasn’t at all previously, such as the DK64 and Animal Forest 64 intros. Most other cores don’t seem to experience the slowdown, but because N64 is so demanding it’s very apparent there. The second thing I noticed: testing your fork that runs at a 1920 super resolution side by side with the 2560 fork, there appear to be scaling artifacts.

Here it is using the 1920 exe

Here is using the 2560 exe, same config with both


#144

I did assume there would be a small overhead with my code, but it should be marginal. When it comes to N64, as you’re aware, angrylion is really, really, really, unbelievably CPU hungry lol, so I’m not surprised the slowdown shows up there!

The reason 2560 is the de facto super resolution is that nearly all console horizontal resolutions multiply up to it by an exact integer. In the case of 1920, not so much. Going up to 3840 increases the chance of an exact integer multiple even further.
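
To see why, a bit of illustrative arithmetic with common console widths:

    2560 / 320 = 8      2560 / 256 = 10      2560 / 512 = 5
    1920 / 320 = 6      1920 / 256 = 7.5     1920 / 512 = 3.75

Sources like 256 and 512 don’t divide 1920 evenly, so they can’t be sampled at an integer factor, which gives you the kind of artifacting you’re seeing.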

I would say, from all my tests and knowledge, 2560 is the option you want to use.

Just to clear my mind: you have not noticed a slowdown in any other cores?


#145

Correct, I haven’t noticed that in the other cores.


#146

OK guys, I have submitted the PR. Hopefully this will be in mainstream RetroArch soon. :grinning:


#147

I’ll leave it with the devs, but if they decide not to put this in mainstream, I will release my fork.

One issue is that function placeholders will need to be placed in other areas of the code so it compiles for other platforms like Android and PS3.

@hunterk I have started looking into Linux; my core code should work great for switching. I need to create desktop resolution switching and X-server mode-line additions. But before I do, I will need to set up my machine to dual boot with Linux. This should not take too long though :sunglasses:

Here is a teaser of the pending release. I have incorporated MAME resolution detection with nearest-resolution matching, as well as removing most of the overhead @Abwezi. Switching is super fast.


#148

whew, looks great! I’m sure it’ll make it into mainline, it just may take a little bit of work to get it merge-able. No worries, though.

For Linux, doing it all through xrandr is easy enough, and we can add precalculated modelines for it all at once, but that won’t work for Wayland or KMS. I’m not sure what the best way to deal with those guys is, but if we have to pick one or the other, my vote is for Wayland/KMS.


#149

Well, all the switch info is generated within my CRT core code. This means it will send the resolution and refresh rate to any function we choose to generate a resolution switch, whether that be XRRScreenSize / XRRScreenConfiguration (as long as mode-lines are available) or xrandr --newmode / xrandr --addmode / xrandr --output.
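
For illustration, here’s a minimal sketch of that first route, the legacy XRRScreenConfiguration API (the libXrandr calls are real; the function name, the assumption that the mode-line is already registered, and the missing error handling are mine):

    /* Sketch: switch the X screen to width x height @ rate Hz using the
       legacy XRRScreenConfiguration API. Build with -lX11 -lXrandr.    */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    static void crt_switch_x11(int width, int height, short rate)
    {
       Display *dpy = XOpenDisplay(NULL);
       Window root  = RootWindow(dpy, DefaultScreen(dpy));

       XRRScreenConfiguration *cfg = XRRGetScreenInfo(dpy, root);
       int nsizes                  = 0;
       XRRScreenSize *sizes        = XRRConfigSizes(cfg, &nsizes);

       for (int i = 0; i < nsizes; i++)
       {
          if (sizes[i].width == width && sizes[i].height == height)
          {
             /* apply the matching size at the requested refresh rate */
             XRRSetScreenConfigAndRate(dpy, cfg, root, i,
                   RR_Rotate_0, rate, CurrentTime);
             break;
          }
       }

       XRRFreeScreenConfigInfo(cfg);
       XCloseDisplay(dpy);
    }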

To be honest, it’s 98% complete for any switching method; only small changes or additions will be needed.


#150

Amazing work, man. Can’t wait to use the final version.


#151

Just a small update here. Still waiting for the devs to accept this switching version.

I have started work on the Linux version. Here is the video. Currently RetroArch loses focus on the resolution change. However, everything else is looking good.


#152

Whew! I love it!

Btw, the merging should happen soon. There are a few other big PRs that we’re working through at the same time, but this one is in the queue.

EDIT: Here’s a good reference for using DRM for modesetting, which should work in KMS and wayland (and also X?): https://01.org/linuxgraphics/gfx-docs/drm/drm-mode-setting.html
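
From a skim of that doc, the KMS route boils down to enumerating a connector’s modes and handing the one you want to drmModeSetCrtc() along with a framebuffer. Here’s a rough sketch of just the enumeration half (the libdrm calls are real; the device node and everything else is illustrative):

    /* Sketch: list every mode a DRM connector advertises (libdrm).
       A real switch would also create a framebuffer and then call
       drmModeSetCrtc() with the chosen mode. Build with -ldrm.     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
       int fd = open("/dev/dri/card0", O_RDWR); /* assumed device node */
       if (fd < 0)
          return 1;

       drmModeRes *res = drmModeGetResources(fd);

       for (int i = 0; res && i < res->count_connectors; i++)
       {
          drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
          if (!conn)
             continue;

          for (int m = 0; m < conn->count_modes; m++)
             printf("%dx%d @ %d Hz\n",
                   (int)conn->modes[m].hdisplay,
                   (int)conn->modes[m].vdisplay,
                   (int)conn->modes[m].vrefresh);

          drmModeFreeConnector(conn);
       }

       drmModeFreeResources(res);
       close(fd);
       return 0;
    }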


#153

Does this only work if you have a single graphics card? I have 2 screens and 2 GPUs: an Nvidia powers my LCD and an AMD powers my 15 kHz CRT. The CRT works fine with Switchres + GroovyMAME (Windows 7), so I know it works. I also have all the appropriate mode lines. However, I cannot get RetroArch to switch CRT resolutions correctly. It always appears squashed (like it’s applying the literal super resolution). The only way I can get it to display correctly is if I enable 2560x224 (for example) on my desktop. Then I can’t really see the RetroArch menu properly, but the games load at the correct resolution.


#154

If you have multiple monitors, you need to make sure RetroArch is set to use your CRT’s monitor in the video settings. Also make sure integer scale is set to off.
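
For reference, both of those live in retroarch.cfg (the option names are from a stock config; the monitor index here is just an example, use whichever number your CRT shows up as):

    video_monitor_index = "2"
    video_scale_integer = "false"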

I’ve also had issues with retroarch.cfg becoming corrupt (before this codebase, too), so try it with a fresh install of RA, or just back up the cfg so it creates a new one.


#155

My monitor is set correctly. The resolution just doesn’t switch. Going by what I read in the log files, it seems it’s trying to switch the resolution on the wrong GPU (the one that’s powering my LCD, not my CRT).


#156

Actually, it looks like it does set full screen on the correct monitor, but it just doesn’t switch to the proper resolution:

Log excerpt (edited for length):

    [INFO] [Video]: Video @ fullscreen
    [INFO] [GL] Found GL context: wgl
    [INFO] [GL] Detecting screen resolution 640x480.
    [INFO] [GL] Vendor: NVIDIA Corporation, Renderer … <= this is the wrong card
    [INFO] [GL] Version: 3.3.0.
    [INFO] [GL] Using resolution 640x480


#157

Make sure you switch your menu driver to RGUI so it stretches properly. Also make sure your aspect ratio settings are set like so.


#158

Hi, I have not considered multiple GPUs; I would have to look into this. My assumption would be that it is only trying to set your default output, in this case your desktop on your Nvidia card. If you set the ATI card as your default output instead of the Nvidia, that should solve the problem. Let me know if it does.

Most people will have multiple monitors connected to one card, and then CRT_Emudriver can take care of the rest.


#159

It seems there is no trivial answer. Either I have to run multiple monitors off one card, or RetroArch has to provide configuration support and implement the options using Nvidia’s and/or AMD’s extensions.


#160

I’m not sure, but I don’t think this is limited to RetroArch! Most applications will use your default display device to output any fullscreen resolutions.

It could be as simple as using a batch file to set which card you want as the default before launching RetroArch.