New CRT shader from Guest + CRT Guest Advanced updates

Yes, with temporal effects at your disposal, you can use more pronounced artifacting values to compensate.

Did you try the latest update (2025-11-30-r1)?

I did use your settings and got:

Not yet.

Wow! Really?

Not sure if things have improved but I wouldn’t be surprised based on @guest.r’s rep.

The real challenge isn’t just getting that “m” to look right, though. It’s getting it to look clear and legible, with minimal noticeable flickering and strange artifacts, while at the same time getting all checkerboard dither patterns to blend into new shades/colours without the flickering giving away the illusion of more native colours.

When you increase Artifacts and other settings which improve the clarity of the font, they either sharpen the image and reduce the blending effect or increase flickering/noise/movement in the blended areas.

Reduce those settings and the font gets blurrier and more blended. That’s because the “m” in the font itself isn’t much different from the vertical-line dither pattern used in the Sonic The Hedgehog waterfall. With fonts, though, certain characteristics can be used to differentiate them from the dither patterns used in sprites and backgrounds, which we actually want to remain blended. That’s where @guest.r’s clever algorithm comes into play.

So I’m definitely going to try it out when I get a chance.

This could also be a factor:
```
#reference "shaders_slang/crt/crt-guest-advanced-ntsc.slangp"
cust_artifacting = "0.800000"
cust_fringing = "0.000000"
ntsc_phase = "5.000000"
ntsc_charp = "10.000000"
ntsc_charp3 = "10.000000"
ntsc_cscale = "3.299998"
ntsc_cscale1 = "1.699999"
ntsc_sharp = "10.000000"
ntsc_fonts = "1.000000"
```

Just sharing these to provide a little more insight into some of the things we have discussed concerning Font Preservation.

Unfortunately they’re in the wrong colourspace/format or something. That sometimes happens when using ShadowPlay to capture HDR images. Sometimes HDR capture isn’t enabled and you have to either restart it or disable and re-enable it.

Anyway, corrected screenshots can come at a later date. These screenshots show that XBR-LVL-2 settings might have been interfering with the font preservation process and contributing to the artifacts that I was seeing.

All of these tests were conducted with ver 2025-11-30-r1.

Note: 2025-11-11-r1 didn’t have any issue with XBR-LVL-2.

I realized I can just use a vertical line overlay and get the same or better results. The overlay is still a fiddly and less than ideal solution, obviously, but it illustrates the concept.

Sometimes it just works, other times I have to turn integer scale off and adjust the custom aspect ratio width one pixel at a time until the overlay is aligned with the triads correctly.

overlay 70%

I have only been able to get this overlay trick to work with XRRGGBB, I’m not sure why.

XRGB and XRYCB don’t work; it’s like the overlay gets blended by the shader (even though this shouldn’t happen). I’m stumped.

Why does “Internal Resolution” on the HD shader max out at 8? Shouldn’t this match “Internal Resolution Y”, which caps at 10? For example, if I’m adjusting for 384x216 on a 3840x2160 monitor. I’m using the ReShade port. Is it the same in the Slang version?

Hey! Nice of you to stop by. With 384x216 you’re fine with a slightly increased internal resolution. You would only need tremendous amounts of Internal Resolution if you exaggerate the input resolution beyond reasonable means. Basically, monitor resolution isn’t what determines a nice-looking setup as far as Internal Resolution is concerned.

For example, if you run a 256x224 Mario game at 8K, your preferred target Internal Resolution would still be around 1.0.

A large Internal Resolution is basically only needed if you use the shader on point-upscaled graphics, where one original pixel is a rectangle of, say, 8x6 pixels. Then you would need “Internal Resolution”. But usually a more elegant way is to adjust the input x and y resolution.

But if you have some special configuration going on, you can still edit the shader and increase the UI parameter range for Internal Resolution to whatever you like.

Hi guest! Thanks for your detailed response. I’m going off this post of yours that said to match them:

The game I had in mind is a modern PC game with retro graphics, Cast n Chill, which seems to have a native pixel resolution of 384x216. Since this is upscaled to my monitor’s resolution, I thought I’d set both internal resolution params to 10x with Resolution_X and Resolution_Y at 3840x2160. But in practice, I’ve gotten good, sharp results for a game like this with 5x for the internal resolutions and resolution x/y at 3840x1080. If I do the math and divide, the result is 768x216? I could also do 10x for internal resolution and make resolution x/y 7680x2160, and the result would be similar… is that supposed to be any better? I’m not sure if I’m doing this all correctly… mostly just putting together tips I’ve found lurking the forum here, heh. Anyway, thanks for clearing that up a bit.

Increasing Internal Resolution also increases filter width or height (with no scanlines). The ReShade version offers some more options like setting the x/y internal resolutions and you can use this for some visuals.

Usually, when using masks and filtering, an emulator internal resolution of 5x is quite enough, so you can just multiply 384x216 by 5.0 and set the x/y shader parameters accordingly. 10x for X and 5x for Y can also work nicely; it’s a bit slower though.
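As a rough illustration of the multiplication above (a hypothetical helper, not part of the shader itself):

```python
# Multiply the game's native resolution by the chosen scale factor
# to get the x/y values to enter as shader parameters.
def shader_resolution(native_w, native_h, scale_x, scale_y=None):
    if scale_y is None:
        scale_y = scale_x  # same factor for both axes by default
    return int(native_w * scale_x), int(native_h * scale_y)

# 5x on both axes, as suggested above:
print(shader_resolution(384, 216, 5.0))        # (1920, 1080)
# 10x for X and 5x for Y also works, but is a bit slower:
print(shader_resolution(384, 216, 10.0, 5.0))  # (3840, 1080)
```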

The next good question is whether you want scanlines or not, and whether there are some upscaled elements (like xBR or 3D elements). With scanlines, just set the y internal resolution to around 5.0 and you’re good.

With upscaled elements the Internal Resolution parameter should be a bit lower, or there will be a lot of blur involved.

I usually use a core/emulator internal resolution of 4x, no scanlines, xBR filtering for textures, and Internal Resolution slightly below 3.0. Works very nicely for me. With this specific shader, emulator internal resolutions don’t add much to the visuals beyond some point, though I know some folks like to max out emulator internal resolutions for 4K displays etc. For official changes to guest-advanced-hd for ReShade, like increasing the Internal Resolution parameter range, you can post in @DevilSingh’s thread.

It looks like it might be just a 2D game, in which case you wouldn’t need to mess around with internal resolution at all (if you’re a beginner, I wouldn’t). “Upscaling” is really only relevant for games with 3D elements, since those can be rendered at higher resolutions.

You can try setting X Resolution to 3840 or lower; lower will make the image blurrier. Y can be set in steps of 216. This affects how the scanlines are set.

Funny this has come up - I’ve also been experimenting with CRT shaders in modern pixel art games.

I’ve found that Guest HD has a large enough number of parameters to get the scanlines and mask I like; however, on my mini PC it’s a bit of a struggle to maintain FPS with Guest on.

So Megatron is usually a good compromise if I can convert to HDR; however, I can’t get the same mask size and shape as I do with Guest. Frustrating!

So my question is: how does the math work between internal resolution, mask size and mask zoom? I think I can approximate the Guest HD mask in Megatron by using ReShade’s global resolution, but so far I haven’t managed to find the right resolution.

I realise this is probably a tough question to answer…I’ll follow up with some example parameters in a bit.

The other question I have is how the Guest shader deals with HDR. Prepending an HDR plugin and using the AutoHDR add-in in ReShade doesn’t seem to produce the right colours.

Of course the ideal scenario is for @DevilSingh to port Guest Lite to Reshade :slight_smile:

Thanks as ever for all your efforts!

After a bit of testing, I think the mask size I’m after is basically the height of the resolution divided by 3.

So for 1080p, ReShade’s height should be 360; for 2160p it should be 720. This works for Megatron, and I might see if using those lower resolutions for Guest improves performance.
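The divide-by-three rule above can be sketched like this (an illustrative snippet assuming a 3-line-tall mask triad, not code from either shader):

```python
# If the mask repeats every 3 display lines, the height the poster is
# matching is simply the display height divided by 3.
def mask_height(display_height, triad_lines=3):
    return display_height // triad_lines

print(mask_height(1080))  # 360
print(mask_height(2160))  # 720
```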

I hope this makes sense to someone…

Edit: fixed my numbers…

Vaguely. Changing ReShade’s height affects the scanline gaps.

Your math on the other hand does not make any sense. Just check your numbers.

Done. Brain fart.

I was running Jack Move in 4K and it looked best with a height of 540…

Just read through the Reshade thread and the mind boggles at getting these numbers right!

Just to follow on from my previous post: Blasphemous is on Steam for £2 at the moment, and I thought I’d give it a go with Guest HD in ReShade. It’s basically perfect for this application, as the whole game has essentially been developed entirely in pixel art (including the text), so it scales perfectly with the shader.

Another benefit of pixel-art scaling is that the game looks the same in 1080p as it does in 4K, so you can run it at lower resolutions for improved performance. This goes for Guest HD in ReShade too: the game’s native res is 640x360, so setting ReShade’s global preprocessing resolutions to that, and then leaving internal resolution at 0, mask size at 1 and mask zoom at 0, results in perfect masks and scanlines.
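A quick sanity check of why 640x360 scales so cleanly (illustrative only; the function name is my own):

```python
# 640x360 divides evenly into both 1080p and 4K, so the pixel art maps
# onto the display with a clean integer scale factor and no shimmer.
def integer_scale(native, display):
    scale, remainder = divmod(display, native)
    return scale if remainder == 0 else None  # None = not an exact fit

print(integer_scale(640, 1920), integer_scale(360, 1080))  # 3 3
print(integer_scale(640, 3840), integer_scale(360, 2160))  # 6 6
```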

Enabling VGA double scan provides a lovely high-TTL result as well, just like playing on a decent VGA monitor (which is what the setting is intended for, I assume!)

Here’s a close up of mask 7 on my LG BX OLED. I really like seeing the masks rather than just the straight scanlines :slight_smile:

One last general question for @guest.r, and apologies if this has been asked before: what is the difference between each version of the shader? NTSC is self-explanatory, but I’m unclear on the difference between HD and Advanced, for example.

It’d also be great to one day have a wiki or explainer of each setting for those of us just starting out on our fiddling-with-shaders-and-not-actually-playing-the-games journey :smiley:

Thanks again!

Hey there!

I’ll go straight to the differences. The HD version also includes a vertical filter, which gets enabled either with interlacing or with the high-resolution scanlines parameter ON. The HD filter can also have much greater “strength and reach”, which can come in handy in some situations.

The main benefit of the regular Advanced version is TATE mode (vertical flow of scanlines); otherwise it’s a version with a complete set of features. It also includes raster bloom and smart horizontal filtering, but I guess this can be more or sometimes less important…

That’s great, thanks! I know what TATE mode is, but I’m not sure about the other stuff. HD works well with modern games though, that’s for sure.

As far as stress on the system is concerned, which is the most and least intensive? It’s become a hobby of mine to get the most out of the least, and interest in this is becoming more universal with the advent of the Steam Deck.

The HD version can be the least intensive; if you stretch the filter range, e.g. with Internal Resolution, things become slower. The NTSC version can be the most intensive and demanding, especially if you crank up x-resolution and y-resolution.

With RetroArch there are also fast and fastest versions, where fastest is also suitable for “low-power” mobiles. The fast version, though, has a much better feature set.