Combining adaptive sharpen with CRT shaders

Hello All,

I’m confused about combining CRT-Lottes and CRT-Hyllian with the Adaptive Sharpen shader. I’m chaining several shaders together and found that CRT-Lottes doesn’t work with Adaptive Sharpen, while CRT-Hyllian does work with Adaptive Sharpen but [B]doesn’t work without it.[/B]

Attached are the shader presets I’m trying to set up.

Screenshots are attached, named accordingly:

aa4.0-gtu-lottes | aa4.0-sharp2x-gtu-lottes | aa4.0-sharp2x-gtu-hyllian52 | aa4.0-gtu-hyllian
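Here is roughly what the chain looks like as a .cgp, just as a sketch: the paths are only examples (adjust them to wherever your shaders folder lives), and GTU is really several passes of its own, so the single GTU line below is just a placeholder. Swapping the last pass between crt-lottes.cg and crt-hyllian.cg gives the two combos above.

[CODE]
# Rough sketch of the chain: Adaptive Sharpen -> GTU -> CRT-Lottes.
# Paths are only examples; GTU is really several passes of its own,
# so the single GTU line below is just a placeholder.
shaders = 3

shader0 = sharpen/adaptive-sharpen.cg
filter_linear0 = false
scale_type0 = source
scale0 = 1.0

shader1 = gtu-v050/gtu-pass1.cg
filter_linear1 = false
scale_type1 = source
scale1 = 1.0

shader2 = crt/crt-lottes.cg
filter_linear2 = true
scale_type2 = viewport
[/CODE]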

Some of the GLSL shaders or presets don’t work… try the CG versions instead in that case to make sure. I’m not sure what look you are trying to go for, but IMO the best CRT shader is CRT-Royale-Kurozumi.cgp in the CG > CGP folder. It’s a modified CRT-Royale meant to replicate Sony PVMs, assuming your GPU can handle CRT-Royale.

Here it is on Neo-Geo, make sure to enlarge to 100%.

The shader preset that looks most like consoles on the TVs I had growing up is crtglow_gauss_ntsc_3phase.cgp or .glslp (leave it as composite for NES; switch the first shader to ntsc-pass1-svideo-3phase.glsl for SNES, PS1, N64, etc.).
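If it helps, that switch is just a one-line edit to the first pass of the preset; everything else stays as it is. The composite pass filename below is from memory, so double-check it against your copy of the preset.

[CODE]
# crtglow_gauss_ntsc_3phase.glslp -- only the first pass changes;
# keep the rest of the preset exactly as it is.

# Composite look (NES):
#shader0 = ntsc/shaders/ntsc-pass1-composite-3phase.glsl

# S-Video look (SNES, PS1, N64, etc.):
shader0 = ntsc/shaders/ntsc-pass1-svideo-3phase.glsl
[/CODE]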

Thanks for the suggestion, but unfortunately I don’t like it very much. Yeah, it’s a great shader and my GTX 970 handles it with no problem, but I just don’t like it. The scanlines are way too strong.

The main idea behind my experiment is to actually improve the picture’s visual quality without ruining its authentic look, not to replicate that crappy back-in-the-day look. That is why I’m using the AA4.0 shader instead of xBR. I’m using CRT shaders not because I want to achieve an old-school look, but just to hide picture imperfections. Just look at how much better CRT-Lottes handles dithering compared to CRT-Hyllian.

At the same time I want to use as few shaders as possible. Twelve shaders for CRT-Royale, really? Is it worth making it look like a Sony PVM while leaving all the jaggies untouched?

Anyway, back to the main question: does anyone know why I can’t combine the Adaptive Sharpen and CRT-Lottes shaders?

Might be a glsl thing. I can get adaptive sharpen to work with crt-lottes using cg.

[QUOTE=Great Dragon;42530]Thanks for the suggestion, but unfortunately I don’t like it very much. Yeah, it’s a great shader and my GTX 970 handles it with no problem, but I just don’t like it. The scanlines are way too strong.

The main idea behind my experiment is to actually improve the picture’s visual quality without ruining its authentic look, not to replicate that crappy back-in-the-day look. That is why I’m using the AA4.0 shader instead of xBR. I’m using CRT shaders not because I want to achieve an old-school look, but just to hide picture imperfections. Just look at how much better CRT-Lottes handles dithering compared to CRT-Hyllian.

At the same time I want to use as few shaders as possible. Twelve shaders for CRT-Royale, really? Is it worth making it look like a Sony PVM while leaving all the jaggies untouched?

Anyway, back to the main question: does anyone know why I can’t combine the Adaptive Sharpen and CRT-Lottes shaders?[/QUOTE]

I already told you why it probably doesn’t work and Axiphel said the same thing.

I don’t think you understand the purpose of scanlines (or the word “authentic”). They certainly have nothing to do with making the image look “old and crappy”… quite the opposite, in fact. I see what you are trying to do now; I misunderstood and thought you were going for CRT simulation.

The Aladdin screens you posted don’t look good at all. You are trying to remove pixelated edges in retro 2D games. Those aren’t “jaggies”; they are supposed to be there. However, the beauty of the RetroArch shader system is that you can go for any look you want… but removing the pixelated edges of sprites is not “authentic”.

Anyway, whether it improves the image or not is another story… but use CG shaders and what you are trying to do should work.

You were all right. The problem was in the GLSL version of the shaders. Using CG shaders allows combining Adaptive Sharpen and CRT-Lottes. However, the CG CRT-Lottes shader is an updated version of the original with a bunch of new features, which makes it very GPU intensive. The combination of CG AA 4.0 + GTU + Lottes loads my GTX 970 to 100%, while the GLSL combination of the same filters loads the GPU to only about 25%.

So I’ll stick with the “aa4.0-sharp2x-gtu-hyllian52” combo, which provides those strong black outlines on the edges that I really like.

Yeah, maybe “authentic” wasn’t the right word to use. I just want to make the picture look less pixelated, as I don’t feel that nostalgic about the pixels. Actually, I was quite surprised when I saw Aladdin on a PC for the first time. I didn’t understand why it was so pixelated. As a kid I was sure all PCs were better than consoles and had to produce a better picture than my Genesis. And I didn’t see (or, better to say, I don’t remember) that much pixelation on my crappy TV.

That shader Lex suggested is quite good and makes me feel nostalgic. But I want to take the best from the old days and combine it with the new to make the picture shine without ruining it much. Yes, it has nothing to do with authenticity. I’m sorry I confused you.

I think GTU is the performance-heavy one. I can run Lottes at full speed, but GTU by itself produces audio crackles for me.

I’ve tested it with MSI Afterburner (GPU Load)

  • CRT-Lottes.cg - 25%
  • CRT-Lottes.cg + Blargg_NTSC_Filter - 64%
  • CRT-Lottes.glsl - ~5%
  • CRT-Lottes.glsl + Blargg_NTSC_Filter - ~7%

The difference is huge. I also noticed that Lottes produces different results with and without Blargg. Without Blargg it looks like a very low-resolution shadow mask.

with Blargg RGB | without Blargg Filter

GTU consumes around 10% GPU load with Blargg and 5% without, so it’s not heavy at all.

CG Adaptive Sharpen produces artifacts. The GLSL version is doing fine.

Here is a comparison of the predefined aa-shader-4.o preset, which includes the Adaptive Sharpen shader. CG | GLSL

The multipass version of Adaptive Sharpen is working fine.
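For anyone who wants to reproduce it, the multipass variant is just wired up as two passes in the preset. The pass filenames below are approximate, so check your sharpen folder for the exact names:

[CODE]
# Two-pass Adaptive Sharpen instead of the single-pass adaptive-sharpen.cg.
# Pass filenames are approximate -- use whatever is in your sharpen folder.
shaders = 2

shader0 = sharpen/adaptive-sharpen-pass1.cg
filter_linear0 = false
scale_type0 = source
scale0 = 1.0

shader1 = sharpen/adaptive-sharpen-pass2.cg
filter_linear1 = false
scale_type1 = source
scale1 = 1.0
[/CODE]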

So now we already have two CG shaders that are not working correctly:

  • Adaptive Sharpen
  • CRT-Lottes

I’m using CRT-Lottes from the “old school Analog TV pack” thread, which is something in between the newest CG implementation and the “outdated” GLSL one. The basic difference is that the CG shader has the bloom effect while the GLSL one doesn’t.
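By the way, the CG version’s extra features show up as runtime shader parameters, so they can be overridden right in the preset when comparing the two. For example, to tone the bloom down (the parameter name below is a guess; check Shader Parameters in the RetroArch quick menu for what your copy actually calls it):

[CODE]
# Appended to the .cgp to override a runtime parameter of the Lottes pass.
# "bloomAmount" is a guess at the parameter name -- check Shader Parameters
# in the quick menu for the real one.
parameters = "bloomAmount"
bloomAmount = "0.0"
[/CODE]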

aa4.0 CG

No artifacts for me.

So what could be causing the problem for me? I’m using the latest shaders from the 1.3.6 release, but 1.3.5 actually behaves the same for me.

UPD: MD5 checksum of my shader: 13a7d2053e08daac28800c56dc03d088 *adaptive-sharpen.cg