Optimal shaders, AR and visual settings

In any case, I got to thinking: would it be possible to somehow change the CRT shader code to make the scanlines thinner, so they look akin to a CRT computer monitor in 640x480 mode? I can actually more or less get such a look on the version of the shader I ported to PCSX-R by increasing the internal resolution. Check this out:

I do think it would be interesting to have a version of the shader that simulated a CRT monitor rather than just a CRT TV.

Edit: CRT HDTVs also had a 480p mode that looked similar to that. Just thought I’d bring that up.

That appears to turn off the shader’s Lanczos filter. Although yes, a CRT Monitor version would be interesting.

I liked the look of the CRT.D3D.br variant of the shader (which is used for DOSBox), since it has much lighter scanlines. The differences between the two lie in these lines: monitorgamma = 2.2; and vec4 weights = vec4(distance / 0.3);

I left monitorgamma alone, but changed the divisor in vec4 weights from 0.3 to 0.4. Results:

(Ignore the glitching Agrias sprite for the time being. Also note that this .jpg version is a little darker than the .png original.)

I haven’t changed the gamma. The old scanlines made the image darker, but since I’ve made them lighter, the overall image is now brighter. Not sure how I should compensate.
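For reference, the whole tweak amounts to changing the divisor in the scanline weights line. A minimal sketch of the before/after (everything around it is untouched):

// CRT.D3D.br original: vec4 weights = vec4(distance / 0.3);
// A larger divisor gives wider, lighter scanlines. This is what I’m using now:
vec4 weights = vec4(distance / 0.4);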

Hi,

I’ve tried to integrate several shaders for RetroArch on my Raspberry Pi, but none of them seems to work. I always end up with the following error message:


GLES context is used, but shader is not modern. Cannot use it.

Ideally, I’d like to use blargg’s NTSC (RGB) with RetroArch on an RPi. Does anyone know whether I could achieve that, and how?

Thank you in advance!

@geekmiki Yes, many of the shaders are based on older OpenGL conventions. However, you can use maister’s cg-to-xml Python script to convert the Cg shaders to GLES versions that should be more compatible. These are the same shaders that are distributed with the Android version of RetroArch.

As for blargg’s NTSC, it’s a filter (i.e., runs on the CPU) rather than a shader (i.e., runs on the GPU). Since the RPi has a rather anemic CPU already, it’s unlikely that you’ll be able to run it on top of any emulators.

Yep, increasing internal resolution basically gets rid of the blur, which kind of sucks, but still, it does look remarkably similar to a good Trinitron CRT monitor.

Thing is, the only way I could get that effect to work without any scanline distortions whatsoever was using gpuBladeSoft. With the OpenGL2 plugin, either the scanlines disappear completely, or they’re there but there are two or three spots on the screen where they disappear. I’m guessing the two plugins handle texture size differently or something. Weird stuff.

I can more or less get the effect on RetroArch as well by multiplying the texture size by two and veeeeeeeeeeeery slightly increasing vertical resolution (otherwise the lines disappear), but this results in there not being any lines at the very top and bottom of the screen.

I’m thinking that if I could identify what part of the shader code makes every scanline two pixels thick, I could change it so they’re one pixel thick instead. Unfortunately, I suck at programming and figuring out code. I more or less end up shooting in the dark and randomly changing shit until I get the desired effect.

Working the other way, is there a way to remove the scanlines and phosphor emulation from the shader entirely? The Lanczos filter and gamma correction would be of interest to many others, especially those who don’t care about CRT emulation.

I’ve tried a standalone Lanczos filter, but it was much blurrier than the one found in the CRT-Geom shader.

I’ve also tried out the bsnes-gamma-ramp filter.

As with pretty much everything else, bsnes has taken some sophisticated steps to achieve an authentic gamma ramp that reflects the actual appearance of games.

So what does it actually do? I don’t see much difference.

You can remove the phosphor emulation by commenting out the following lines:

vec3 dotMaskWeights = mix( vec3(1.0, 0.7, 1.0), vec3(0.7, 1.0, 0.7), floor(mod(mod_factor, 2.0)) );

mul_res *= dotMaskWeights;
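In context, that part of the fragment shader would end up looking something like this (a rough sketch; the exact surrounding code varies between variants):

// Dot mask / phosphor emulation, disabled by commenting it out:
// vec3 dotMaskWeights = mix( vec3(1.0, 0.7, 1.0), vec3(0.7, 1.0, 0.7), floor(mod(mod_factor, 2.0)) );
// mul_res *= dotMaskWeights;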

As for the scanlines, unless I’m missing something, the only way I’ve seen to remove them is to multiply both rubyTextureSize and rubyInputSize by 2, but doing so gets rid of the Lanczos scaling.
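If you want to experiment with that anyway, the brute-force version is just to double the values wherever the shader reads those uniforms. An untested sketch (the variable names here are mine, not the shader’s):

// Untested sketch: feed the rest of the shader doubled sizes so the
// scanline math sees twice the vertical resolution.
vec2 texSize = rubyTextureSize * 2.0;
vec2 inSize = rubyInputSize * 2.0;
// ...then use texSize / inSize wherever rubyTextureSize / rubyInputSize were used.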

Also, I swear I’m gonna lose my mind. I really, really want this shader to output a monitor-esque image on RetroArch. I’ve been trying everything, but this is the closest I can get:

And the sad thing is, I only accomplished this by almost doubling the texture size, but not quite. That means the lines are a result of non-integer texture size scaling or whatever. And as you can see, it’s not perfect, as the lines slowly fade and finally disappear near the top and bottom.

And as it turns out, the only reason I got it to work perfectly on PCSX-R was through some weird fluke with FFT’s resolution as recognized by gpuBladeSoft at 4x scale. I could not replicate it on FFIX, for example.

So yeah, I’m pretty much lost. Although I like the shader as it is, I really wanted to get that CRT monitor effect working.

Edit: Ok, I pretty much lost sleep over it, but here it is at last.

Turns out getting it to work right was so simple, I’m actually a little pissed I didn’t think of it earlier. Thank god for the XML spec’s outscale attribute.
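For anyone wondering what that looks like, the scaling is just an attribute on the fragment pass tag. Something roughly like this (illustrative only; the attribute value is a placeholder, not what my shader actually uses):

<fragment outscale_y="1.0"><![CDATA[
    // fragment shader source goes here
]]></fragment>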

Link to the shader:

http://www.mediafire.com/download.php?42oj7soib3pzjnv

I might see if I can get a halation version up.

@hunterk

Thanks for your help! I used the cg-to-xml script and managed to convert the Cg shader I wanted to use (cgw-CRT-flat.cg) to XML. It’s working, but far from full speed. Actually, it’s not playable like this… It’s like 1 frame/sec :slight_smile:

I do realize that I’m trying to achieve this on an RPi that has very limited resources. I’m just wondering whether the performance issues are related to the shader I’m trying to use or whether they’re RPi-related (meaning RPi + shader = impossible at full speed).

Thanks for your help!

@geekmiki np. I’m glad you got it going :slight_smile:

cgwg’s CRT shader is pretty srs bzns, so I’m not surprised it doesn’t run full speed. You might try something really lightweight, like a scanline shader, to see if anything hits 60 fps. It’s probably going to be a similar situation to cell phones where they just lack the horsepower.

Looks like this monitor variant works with halation as well. However, I find the effect a bit too strong. Is there any way to reduce it?

On line 430:

mul_res += pow(texture2D(rubyTexture, xy2).rgb, vec3(monitorgamma))*0.1;

You can play with the last number there to modify the intensity. Smaller values = less halation.
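For example, something like this should make the effect roughly half as strong (only the trailing constant changes):

// Halation contribution; the trailing constant controls the intensity.
mul_res += pow(texture2D(rubyTexture, xy2).rgb, vec3(monitorgamma))*0.05;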

Hey everyone, I’ve been working on the OpenEmu project for a bit now. Specifically implementing Cg shader support and also Themaister’s Cg meta-shader format. We also added our own option that applies Blargg’s NTSC filter if set.

The NTSC filter definitely works with both Game Boy and Game Boy Advance if set correctly (basically everything with less than 256 pixels per row should work perfectly). Here is an example (can’t have more than 1 link yet it seems):

Exactly 512 pixels per row should also work perfectly. The problem is everything between 256 and 512. For 320 pixels per row the filter works, but the image quality is degraded by A LOT. For everything over 512 pixels per row you get a weird overflow effect.

Using the Mega Drive NTSC filter would solve the issue for 320 pixels per row, though ideally we would just write a new blitter. For everything above 512 pixels per row a new blitter is needed.

@PGGB Looks great :smiley:

I think a built-in NTSC option would be a good thing for RetroArch as well, since it would pretty much eliminate the need to support the old CPU filter libraries.

That’s really cool that you’re supporting Cg shaders, too. I played around with a build from clobber over the summer and I gotta say: you guys have a really awesome project there.

NTSC filters tend to be very system-specific. The SNES one is hardcoded for 256px and 512px (NTSC + hi-res blending), and the two act differently. There’s one for Genesis as well, and it’s hardcoded for whatever that system uses (256 and 320?). I don’t know all the details of the NTSC CPU filter, but I’m not sure you can easily make it totally generic.

Yeah you can’t make it totally generic, but the standard SNES blitter seems to work well enough for Game Gear/Game Boy/Game Boy Advance.

I’ve started porting the Mega Drive blitter (so we don’t need all those md_ntsc files and tables):

Not sure what causes those lines but this is just the result of some brute-force copy-pasting. :smiley: One difference between the MD and the SNES filter is that the SNES one reduces color depth, which I don’t think the MD one does.

Would it help to get away from blargg’s implementation? The bit on the nesdev wiki says it’s applicable to any resolution up to 2048.

Well, that one is rather specific to the NES hardware. It’s basically the same idea (converting the signal to YIQ and back), but it doesn’t use a look-up table to do the expensive computations once at launch.

The only thing really specific to the SNES in the snes_ntsc filter is the color palette and the blitters (optimized for 256 and 512 pixels per row). If you compare md_ntsc_impl.h and snes_ntsc_impl.h, they are exactly the same. Even the other files overlap a lot in terms of logic, so the system-specific stuff is actually not that important. We can’t increase the color palette anyway (blargg already reduced it from the SNES’s 15 bits to 13 bits, because the look-up table grows exponentially in size), and I think it’s even possible to write a universal blitter.

edit: Some progress, I isolated the code responsible for the black lines:

Now I just need to fix it so it doesn’t simply double the previous pixel.

Fully agreed. My ideal would be something like Nestopia, with presets for RGB, S-Video, composite, and custom, plus an option for scanline intensity.

Which is what I thought.

Soo… I’ve got some questions: in hunterk’s blog (http://filthypants.blogspot.com/2012/07/customizing-cgwgs-crt-pixel-shader.html) he talks about modifying CRT-geom settings at lines 87 and 135. However, in the halation varieties of the shaders, neither those lines nor the ones in their vicinity contain any useful settings. Instead, I found them at lines 19 to 20, 57, 163 to 187, and 213 to 223. Now, using crt-geom-halation-interlaced-flat.shader as an example, what confuses me is:

On lines 19 and 20, “CRTgamma” is defined as 2.5 and “display_gamma” is defined as 2.2.

At line 57, “display_gamma” is again defined as 2.2.

But: At line 166 “CRTgamma” is defined as 2.4

At line 168, “monitorgamma” (not “display_gamma”!) is defined as 2.2.

wtf? What’s the difference between monitorgamma and display_gamma, and why is CRTgamma defined with different values? Furthermore, how would I get the most faithful gamma reproduction when playing SNES games? blargg’s NTSC CPU filter applies some gamma correction, cgwg’s CRT shader applies some, and byuu came up with his own ramp to faithfully recreate the SNES gamma, which, in RetroArch, is yet another separate shader.

@A for Anonymous The halation version is a complete rewrite of the shader, so any of the specific line references won’t apply to it. That post only covers the non-halated version, since it works in higan as well as RetroArch and includes a lot of helpful comments.

Those gamma variables are redefined in each shader pass that uses them, so you’ll see them in multiple places. I’m not sure why CRTgamma is 2.5 in some places and 2.4 in others, but that’s a pretty small difference, so it likely doesn’t matter much. “monitorgamma” and “display_gamma” refer to the same thing, i.e. the baseline display gamma that the correction targets vs. the assumed 2.4 CRT gamma, which results in a difference of 0.2 either way.

Gamma is going to be something that is very subjective, since it varies from monitor to monitor and television to television (and even the same television over time as it wears out), and there’s no One True Gamma. If you’re using multiple filters/shaders with gamma correction in each, you’ll probably want to disable it in all but one of them, unless you like your image to be hella saturated. byuu’s NTSC gamma ramp code, which maister ported to a shader, is supposed to do pretty much the same thing as blargg’s NTSC gamma correction, AFAIK.
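For what it’s worth, the basic flow in cgwg’s shader is: linearize the input using CRTgamma, do the scanline/mask math in linear light, then re-encode for the display you’re actually looking at using monitorgamma (a.k.a. display_gamma). A simplified sketch, with illustrative variable names rather than the shader’s exact code:

// Undo the assumed CRT gamma so the filtering happens in linear light.
vec3 col = pow(texture2D(rubyTexture, texCoord).rgb, vec3(CRTgamma));
// ...scanline, mask, and Lanczos math operate on col here...
// Re-encode for the display the image is actually shown on.
gl_FragColor = vec4(pow(col, vec3(1.0 / monitorgamma)), 1.0);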

I suppose this is the best place to ask. In GLSL, is there any way to get the same effect as the outscale attribute, particularly outscale_y, that’s available in XML shaders? I’m not that great at writing shader code, but I have something in mind I’d like to get working with GLSL shaders that don’t take advantage of the XML shader format for scaling.

Basically, I want to add GLSL code that is equivalent to the XML spec’s outscale attribute. Is this possible?