xBR algorithm tutorial

An informative whitepaper for a technique similar to xBR:

Thought you would be interested, Hyllian :slight_smile:

Yes, Hunter, an interesting read.

It reminds me more of that old Microsoft research paper about pixel art upscaling, though it doesn't use splines.

I don't know if it could run using only the RetroArch shader specs. If I'm not mistaken, it seems to use polygons instead of only framebuffers.

Finally managed to get xBR LV3 and LV4 right (only in the test app for now).

Some comparisons: http://imgur.com/a/Xbu5W#0

For those who don't know what levels mean in xBR filtering, here's an image about it:

Which SABR shader would be best when running at 960x720 resolution? I tried this one: http://gitorious.org/bsnes/xml-shaders/source/10dc41dc22b7e1f80ae5c7ec66458cc5bcaf4fb4:shaders/OpenGL/v1.0/SABR.shader and it's very good, although I get a line down the middle and a few artifacts here and there.

Could someone suggest what would be best at 960x720? Also, it must be a shader that uses XML 1.0, as it's for NeoRaine.

Thanks!

There's only one SABR, so you have no choice other than that.

Can't wait to test out the LV4 shader. It looks really nice!

The translation to the shader paradigm will be a hell of a lot harder, because performance is a big issue for these two new levels. While LV1 and LV2 do not interpolate beyond their pixel borders, LV3 and LV4 bleed their interpolations beyond the center pixel into its neighbors. So it adds a new level of complexity to implement.
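For anyone curious what the levels have in common: every xBR level rests on the same luma-weighted color distance used for edge detection; the levels differ in how far the detected edge is allowed to reach. Here's a rough Python sketch of that metric (my own illustration, not Hyllian's shader code; the 48/7/6 weights are an assumption taken from common xBR write-ups):

```python
# Hedged sketch of the edge-detection metric xBR-style filters build on:
# a luma-dominated YUV color distance. The 48/7/6 weights are the ones
# commonly cited for xBR; treat the exact constants as an assumption.

def rgb_to_yuv(c):
    """Convert an (r, g, b) tuple to (y, u, v)."""
    r, g, b = c
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b
    v = 0.500 * r - 0.419 * g - 0.081 * b
    return y, u, v

def pixel_distance(a, b):
    """Perceptual distance between two RGB pixels, weighted toward luma."""
    ya, ua, va = rgb_to_yuv(a)
    yb, ub, vb = rgb_to_yuv(b)
    return 48 * abs(ya - yb) + 7 * abs(ua - ub) + 6 * abs(va - vb)

# LV1/LV2 decide each output pixel from distances confined to the center
# pixel's own quadrant; LV3/LV4 let the interpolated edge extend into
# neighboring pixels, so a single pass can no longer treat each output
# pixel independently of its neighbors' decisions.
```

The point of the luma-heavy weighting is that edges in pixel art are mostly brightness edges, so a large Y difference should dominate any chroma difference when deciding where an edge runs.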

Ah well, I hope you can make it happen someday. In the meantime, is your test program still publicly available? I'd like to try them out on a few pictures if so.

This is only a 4x version of the LV4 I just developed:

https://anonfiles.com/file/95553996f5223220944d8d258202c13e

Hi Hyllian, I don't know if you're checking PMs in this forum. I have a request regarding the standalone exe you released a while back in the ZDoom forum. Since it's probably not too interesting for others here, I thought it would be better to discuss it in private.

xBR LV3 added to the repo:

https://github.com/libretro/common-shaders/tree/master/xbr/xbr-lv3-multipass

It's the first complete xBR LV3 I've ever done.

EDIT: Only tested on PS3. It's working flawlessly.

Sweet man, I just saw it almost as soon as you posted it. Just converted it to GLSL so I can try it in KMS. Can't wait to test it out. Thank you!

Edit: both the Cg and GLSL versions seem to have some glitches in Linux with the GLX and KMS drivers. Can't really test in Windows since I got rid of it, but it probably works better there, lol.

I think maister will have some more work on his cg2glsl script. :stuck_out_tongue:

I already posted this on byuu's forum, but it may be helpful to repeat: I had problems with it on my Radeon card using the OpenGL driver, but using the D3D driver in Windows works great. So, probably chalk it up to weirdness with AMD's OpenGL shader compiler.

Try this 3xBRLV4 No Blend standalone app: https://anonfiles.com/file/ca64833f5adb4bd6bb1a9aa6e8ddb62c

Original:

3x:

Wow, thank you very much. State-of-the-art xBR right here :wink: I'll compare it to the one I used before.

BTW… where did you find dat dragon? This sprite is awesome.

You know, I've never been able to get this working properly on Windows. Using the GL video driver on a GTX 570 with the latest WHQL NVIDIA driver (320.49). I load it in my config file with video_shader = "./Shaders/xbr-lv3-multipass/xbr-lv3.cgp" and this is the output I get: http://abload.de/img/retroarch-0919-230603ind1u.png

I'm not usually a fan of smoothing filters like this anymore, but I'd at least like to try it out since this seems better than what I used years ago, HQ4X. hq4x.cg just crashes RetroArch for me, btw.

@Awakened have you tried it with the D3D driver? On my AMD system, it works fine with D3D but OpenGL looks similar to your screenshot.

Can't seem to get the d3d9 driver to work anymore. I tried disabling stuff that only works with the GL driver, like hard sync, and turning off the shader in my config, but I always get this:

RetroArch [WARN] :: [D3D9]: Failed to init device with hardware vertex processing (code: 0x8876086c). Trying to fall back to software vertex processing.
RetroArch [ERROR] :: [D3D9]: Failed to init D3D9 (Failed to init device, code: 0x8876086c).
RetroArch [ERROR] :: Cannot open video driver ... Exiting ...
RetroArch [ERROR] :: Fatal error received in: "init_video_input()"

Edit: Huh, got d3d9 to work by setting windowed fullscreen to false. The shader works with that driver. Thanks for the suggestion!