[QUOTE=Great Dragon;42530]Thanks for your suggestion, but unfortunately I don’t like it very much. Yeah, it’s a great shader and my GTX 970 handles it with no problem, but I just don’t like it. The scanlines are way too strong.
The main idea behind my experiment is to improve the picture’s visual quality without ruining its authentic look, not to replicate that crappy back-in-the-day look.
That is why I’m using the AA4.0 shader instead of xBR.
I’m using CRT shaders not because I want to achieve that old-school look, but just to hide picture imperfections. Just look how much better CRT-Lottes handles dithering compared to CRT-Hyllian.
At the same time, I want to use as few shaders as possible. Twelve shaders for CRT-Royale, really? Is it worth it to make it look like a Sony PVM and leave all the jaggies untouched?
Anyway, back to the main question: does anyone know why I can’t combine the Adaptive Sharpen and CRT-Lottes shaders?[/QUOTE]
I already told you why it probably doesn’t work, and Axiphel said the same thing.
I don’t think you understand the purpose of scanlines (or the word “authentic”). They certainly have nothing to do with making the image look “old and crappy”… quite the opposite, in fact.
I see what you are trying to do now; I misunderstood and thought you were going for CRT simulation.
The Aladdin screens you posted don’t look good at all. You are trying to remove pixelated edges in retro 2D games. Those aren’t “jaggies”, they are supposed to be there.
However, the beauty of the RetroArch shader system is that you can go for any look you want… but removing pixelated edges of sprites is not “authentic”.
Anyway, whether it improves the image or not is another story… but if you use Cg shaders, what you are trying to do should work.
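Something like the two-pass .cgp preset below is roughly what I mean; this is just a sketch, and the shader paths are assumptions based on the common-shaders repo layout, so adjust them to wherever your copies actually live.
[code]
# Hypothetical two-pass preset: adaptive sharpen first, CRT-Lottes second.
# Paths assume the common-shaders folder structure; change them as needed.
shaders = 2

# Pass 0: sharpen at 1x source scale so the CRT pass still sees the original pixel grid
shader0 = sharpen/adaptive-sharpen.cg
filter_linear0 = false
scale_type0 = source
scale0 = 1.0

# Pass 1: CRT-Lottes handles the mask/scanline work and scales to the viewport
shader1 = crt/crt-lottes.cg
scale_type1 = viewport
[/code]
Load it via Shader Options -> Load Shader Preset (or append the passes manually in the shader menu) and tweak from there.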