Gameboy shader

From Emulation General:

Alright, I got the Gameboy shader into decent enough shape that people should be able to use it without too many problems. You can get it here:

https://github.com/HarlequinVG/shaders/tree/master/gameboy_shader

Make sure you run it in OpenGL mode with integer scaling enabled and auto aspect ratio. Additionally, you’ll need a Gameboy emulation core for RetroArch that outputs grayscale video (i.e., one that doesn’t emulate the GBC’s automatic colorization of classic Gameboy games). If you need one, you can get an older version of Gambatte here:

https://www.dropbox.com/s/htuztl6go2o6sha/libretro-git-gambatte-x86_64.dll
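
For reference, the OpenGL/integer-scaling setup above can also be applied directly in retroarch.cfg. A minimal sketch, assuming current RetroArch option names (check your own config for the exact keys):

```ini
# Use the OpenGL video driver (required for the Cg shader pipeline here)
video_driver = "gl"
# Scale the core's output only by whole-number factors
video_scale_integer = "true"
```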

Graphically, it still has a ways to go. I’m currently working on a couple of new gimmicky solutions that might do the trick. I’ve also toned down a few of the effects like motion blur and shadows for this release, but you can open the .cg shader files in the /shader_files/ directory with Notepad and play around with the variables for those if you want.

Oh, and the included backgrounds are pretty lame, but I haven’t been able to find a decent image yet.

Very nice use of CGP I must say :slight_smile: I’ll give it a shot later today.

EDIT: Btw, with Gambatte we now have core options to disable/enable GB colorization. Fixed a couple of issues I found, and I’d like to know if “GBC - Grayscale” in gambatte is true GB grayscale.

Ha, didn’t expect anyone to post this here since I’ve only discussed the project on /vg/. Basically, I’m working on a shader that will hopefully simulate an actual classic dot matrix Gameboy screen to whatever extent that’s possible on an LCD monitor (eventually). It currently includes functionality for a user-defined color palette, motion blur, textured backgrounds, drop shadows, and a matrix effect overlay softened by color and alpha blending with neighboring pixels.

The intent is that eventually all these features will be used to generate an “authentic” looking simulation of the classic Gameboy, but they can also be defined by the user to allow whatever color palette, background, and other settings they prefer. The image quality is still a long ways off from my goal - the grid effect especially still needs a lot of work.
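
The user-defined palette feature boils down to mapping each grayscale value to one of four colors, the way the original DMG’s 2-bit display selects among four shades. A minimal Python sketch of the idea (the function name and palette values are my own illustration, not the shader’s actual code):

```python
# Map a grayscale luminance (0.0-1.0) to one of four palette colors,
# mimicking the DMG's four-shade display.
def quantize_to_palette(luminance, palette):
    # palette is ordered darkest to lightest; snap to one of 4 levels
    index = min(int(luminance * 4), 3)
    return palette[index]

# A hypothetical green-tinted DMG-style palette (RGB tuples, darkest first)
dmg_palette = [
    (15, 56, 15),    # darkest
    (48, 98, 48),
    (139, 172, 15),
    (155, 188, 15),  # lightest
]

print(quantize_to_palette(0.0, dmg_palette))
print(quantize_to_palette(0.9, dmg_palette))
```

The real shader does the equivalent per-fragment in Cg, but the selection logic is the same, which is what makes the palette freely user-definable.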

This is my first Cg project and essentially my first non-trivial shader project, so if anything in the source looks weird, well, it probably is. If anyone has any suggestions for features or improvements, I’m always open to ideas.

--edit-- From some of the feedback I’ve received, it seems that the shader uses a couple of features not supported by some GPUs. They seem to be the same features preventing the shader from running in D3D. If you’re following the instructions and it’s not working, this is most likely the cause. I’m currently working on a more portable solution. --edit2-- Solved the issues. See the post below for the link.

Here’s a shot showing off some of the shader’s features: --edit3-- Updated with an image of the most recent version at default settings.

Well, a real dot matrix shader was long overdue, and I actually find it a bit baffling that no one had made one thus far. There are a couple that supposedly went for a similar look, but they are not authentic in the slightest. This one is basically the equivalent of cgwg’s CRT shader to me: it might not be perfect, but damn does it get close to the real thing, close enough for me in any case.

@Harlequin The screenshot looks awesome. :smiley:

The code looks great, too. Organized, easy to read and tons of comments!

Unfortunately, it fails with my radeon 6310 using the open source driver in Linux, so hopefully the next revision will treat me a little better.

Did a quick hack job to fix the compatibility issue for now (https://github.com/HarlequinVG/shaders/tree/backwards_compatible/gameboy_shader). Try this if the first version still doesn’t work for you after following the directions. Going to work on a more elegant solution that still retains the backwards compatibility. It should run with D3D as well, but it will look awful, so I’d recommend sticking with OpenGL for now. I don’t have any alternate rigs to test this on, so I still can’t guarantee it’ll work with all GPUs.

--edit-- Still getting reports of errors with Radeon cards not supporting Cg’s partial derivative functions. Going to take me a bit to fix that. --edit2-- Yeah, it definitely won’t work with Radeon cards currently because the arbfp1 profile doesn’t support some of the Cg standard library functions I used. Working on a fix, but it’s going to take a little time. --edit3-- Making progress. The shader will actually run on Radeon cards now, but the dot matrix overlay will not be properly applied. Working on identifying the cause.

--edit4-- Shader should be working with AMD cards now. Make sure you download the version linked in this post if you’re using a Radeon. It’s more of a really shoddy patch job than a fix, but it should behave identically to the one linked in the OP (hopefully). Credit goes to the guys on Emulation General who tested my changes out line by line (and variable by variable in some cases) to track down the issues.

Yeah, still failing here. No hurry on the fixes.

It’s always such a hassle writing shaders that perform comparably on all 3 GPU flavors.

Damn it AMD, get it together!

Looking through the source atm since I cannot run it for reasons stated above.

I notice you use samplers in vertex shaders, which for one doesn’t work on arbvp1. Then I saw this:


    fixed4 color_array[4] = fixed4[](COLORS.color_0, COLORS.color_1, COLORS.color_2, COLORS.color_3);    //why is this not working if the color palette uniform is removed...?

This is because RetroArch doesn’t check for vertex sampler uniforms, only fragment ones (this is the first shader for RetroArch which uses that :P). I suppose this is a bug, and I’ve fixed it in:

Testing that would be nice.

As for partial differentials, you can probably use IN.texture_size and/or IN.output_size uniforms for the same effect.
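
Because the mapping from screen pixels to texture coordinates is linear, the UV step per screen pixel that ddx/ddy would report can indeed be derived from those uniforms. A rough Python sketch of the arithmetic (names mirror the IN.* uniforms; the sizes are just an example, assuming the frame fills the viewport):

```python
# Analytic replacement for ddx/ddy on a texture coordinate:
# the visible part of the texture spans video_size / texture_size in UV
# space, stretched across output_size screen pixels, so the UV change
# per screen pixel is a constant and needs no derivative instructions.
def uv_step_per_pixel(video_size, texture_size, output_size):
    return (video_size[0] / (texture_size[0] * output_size[0]),
            video_size[1] / (texture_size[1] * output_size[1]))

# Example: a 160x144 GB frame stored in a 256x256 texture, shown at 1120x1008
step = uv_step_per_pixel((160, 144), (256, 256), (1120, 1008))
print(step)
```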

Ah, got it working on my Intel HD4000 and it looks fantastic. Great work, Harlequin. EDIT: I can confirm that it works on my Radeon card now, too. Thanks for working on the fix :slight_smile:

I like it.

I spent some time experimenting with different approaches to blending, trying to make the image look a little more like a dot matrix and a little less like someone just drew a grid over the screen. I should have a new version out in the next couple of days that implements some of these ideas. I’ve also gone through and simplified a couple of the trouble areas that were causing compatibility issues with different GPUs.

Here’s a shot of what I’m currently working on.

Very nice … Was able to test just now. I’ve updated the cg2glsl script in tools/ in the RetroArch repo. This shader appears to have been correctly converted to GLSL (ES) as well (lots of new edge cases found in this one). Ran it through a GLSLP (replace .cg with .glsl).

Ha, glad that my awkward implementations are at least helping out the project. Aside from a little work with C# and XNA I have practically no programming experience, so I’m basically going off nVidia’s documentation and luck.

I think I found an issue with the uniform IN.output_size while playing around with borders. It’s incorrectly reporting the size of the entire display window rather than just the output size when running in OpenGL with integer scaling enabled and scale_type set to viewport at 1x scale (haven’t really tested other scale_types and scales). For example, with integer scaling enabled, a Gameboy game running fullscreen on a 1920x1080 monitor outputs a 1120x1008 image (7 times the size of the video input). IN.output_size reports this value as (1920, 1080) under the conditions listed above, whereas under D3D it reports it as (1120, 1008). Both seem to report the correct numbers with integer scaling disabled (1200, 1080).

I noticed the issue while working on the shader file here (it’s just outputting debug stuff so the image isn’t going to make a lot of sense anyways): https://github.com/HarlequinVG/gameboy_shader_v0_2_2/tree/master/experimental

In other news, I kind of realized that the new methods I was working on were sort of drifting away from my goal of an accurate DMG shader and more towards an idealized Gameboy shader. I spent several hours last night and today photographing the Gameboy’s screen and analyzing it in Photoshop, and I believe I’ve come up with several changes that should bring the shader back on track. The most notable is that the colorizing mechanics (which much of the shader relies on for processing) are going to have to be revamped to a new system. Still hoping to have something out in the next day or two.

Should be fixed in this commit: https://github.com/Themaister/RetroArch/commit/d2ea83729eab86154aa00eefc50a9cfdac99b8e6 References for viewport scale as computed before didn’t take into account integer scale (so you’d see window size instead of viewport size). Seems like you hit edge cases a lot, which is nice. :stuck_out_tongue:

Yeah, I figured that would be a rare case. I nearly ruptured something in my brain trying to reason this part out:


#define video_scale floor(IN.output_size.y / IN.video_size.y)
#define scaled_video_out (IN.video_size * video_scale)
#define uv_ratio (IN.output_size / scaled_video_out)
#define texel_offset ((IN.output_size - scaled_video_out) / 2.0)
#define tex_offset (texel_offset / (IN.texture_size * video_scale))  //make it stop

These values resize the video input to the largest integer scale that will fit in the current output and calculate the offset for centering it in the viewport, basically giving me free screen real estate to use for a border when the shader is run in non-integer scaling mode. The plan is to simulate some of the excess screen space on the actual Gameboy screen, which should really help soften things like text directly bordering the edge of the screen (see the “B” in the screenshot above, or opponent names in Pokemon during battles).
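
Worked through in Python with the 1920x1080 fullscreen numbers mentioned earlier (the helper name is mine; the math matches the #defines above):

```python
import math

def integer_scale_layout(video_size, output_size):
    # Largest integer multiple of the frame height that fits the viewport
    video_scale = math.floor(output_size[1] / video_size[1])
    scaled = (video_size[0] * video_scale, video_size[1] * video_scale)
    # Leftover space on each side, used to center the scaled frame
    offset = ((output_size[0] - scaled[0]) / 2.0,
              (output_size[1] - scaled[1]) / 2.0)
    return video_scale, scaled, offset

# 160x144 GB frame on a 1920x1080 viewport: 7x scale, 1120x1008 image,
# centered with a 400-pixel border left/right and 36 pixels top/bottom
scale, scaled, offset = integer_scale_layout((160, 144), (1920, 1080))
print(scale, scaled, offset)
```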

I’m still not really sure if this approach has any benefit over just using an absolute scale type, though.

Here’s a shot of my progress on the new border and colorizing system. You can find the current shader files here: https://github.com/HarlequinVG/gameboy_shader_v0_2_2/tree/master/experimental/shader_files. I think I approached the border implementation a dozen different ways before I found something that worked consistently and didn’t mess up the grid effect at arbitrary viewport sizes. Still working on integrating other features back into the shader.

The colors look surprisingly accurate side by side with my Gameboy, although I could understand why some people wouldn’t like them. You can still change the palette, but you might have to play around with colors a bit before you find something that works. Also of note is that since it now uses more of a value gradient for colors rather than hardcoded value-to-color mapping, the shader will actually work with non-Gameboy games. SNES seems to suffer from a strange image stretching issue, but NES and Genesis work fine. Not sure why anyone would want to play other systems with this shader, but it’s an interesting novelty I guess.
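
The value-gradient approach can be sketched as linear interpolation between palette stops instead of snapping to one of four fixed shades, which is also why arbitrary non-Gameboy input works now. A hypothetical Python version (my own illustration, not the shader’s actual code):

```python
# Map a luminance (0.0-1.0) onto a continuous gradient through the
# palette stops, rather than a hardcoded value-to-color table.
def gradient_color(luminance, stops):
    pos = luminance * (len(stops) - 1)
    i = min(int(pos), len(stops) - 2)   # index of the lower stop
    t = pos - i                          # blend factor between stops
    a, b = stops[i], stops[i + 1]
    return tuple(round(a[c] + (b[c] - a[c]) * t) for c in range(3))

# Same hypothetical DMG-style stops, darkest to lightest
stops = [(15, 56, 15), (48, 98, 48), (139, 172, 15), (155, 188, 15)]
print(gradient_color(0.5, stops))
```

Any shade a core outputs lands somewhere on the gradient, so sources with more than four gray levels still get sensible colors.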

--edit-- Updated with a new picture since I made some progress. A lot of it’s sort of patchwork right now, so there might be compatibility issues with some GPUs on the latest commit.

Oh man … That takes me back.

Looks kind of decent on higher resolution games too. Not sure if there’s really any demand for dot matrix shaders, but I might release a version that supports color after I’m done with this. Should work well with the GBC and GBA at the very least.