That’s a good point I hadn’t considered at all. I know CGA graphics used a composite connection, but it seems EGA was actually RGB, like the later VGA standard. Which raises the question: how did dithering work in games with EGA graphics?
Did it? Honestly, I have no idea.
Not to my knowledge. In all my years of gaming on MS-DOS computers, dithering was there, but it never blended; sharp pixels were par for the course.
This is what I know.
There’s a recent thread which talks about this in detail.
Apparently folks did some wild stuff with CGA graphics back in the day. Getting more colours and stuff through composite.
If you still want to experiment you can try some Blargg Video Filter presets as well. Included with RetroArch are some “Blargg_NTSC_SNES_Custom_pseudo…” presets and I have a more comprehensive and updated filter preset pack called CyberLab Custom Blargg NTSC Video Filter Presets on my thread.
Those patterns essentially get lost in the mask/scanline interaction. You’d still see them on a VGA monitor at 31 kHz. Alternatively, you could use TV-out, but this is non-standard for EGA and probably not common on the typical VGA cards used with DOS either. Of course, it was a common feature of later Windows-era cards, but then you’d probably mostly use it with S-Video.
I already tried that, didn’t make a difference. But I think I found the problem, which was of my own making: I was using the ScummVM core, which for some reason or another doesn’t play nice with the dithering effect. Even trying to use dedicated dithering shaders like mdapt didn’t do a thing. But now I switched to the DOSBox Pure core and, suddenly, everything works flawlessly. The results are stunning. Here’s an example with Loom.
Raw, naked source resolution (1:1 PAR), with uncorrected aspect ratio:
Monitor-New_aperturegrille_gm
Monitor-FXAA_sharp-aperturegrille
Monitor-VGA-DoubleScan
In all examples, I enabled dedithering and changed the strength to 1.
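For reference, that boils down to a couple of override lines on top of whatever monitor preset you load. The identifiers below are illustrative placeholders for the menu labels, not necessarily the shader’s real parameter names, and the reference path is just an example:
#reference "shaders_slang/bezel/koko-aio/Presets-ng/Monitor-VGA-DoubleScan.slangp"
# Placeholder names: check the shader parameters menu in RetroArch
# for the actual dedithering toggle and strength identifiers.
DO_DEDITHER = "1.0"
DEDITHER_STRENGTH = "1.0"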
Sure thing!
The checkerboard patterns suit some games, but especially for a lot of Sierra adventures I’d rather the colors blended.
Apparently folks did some wild stuff with CGA graphics back in the day. Getting more colours and stuff through composite.
They sure did, and I was able to emulate a lot of the crazy things they did with shaders. Here’s the raw CGA input of Agent USA vs what I was able to accomplish with the artifact-colors shader:
Koko’s shaders are actually quite capable of replicating some of these effects. Here’s what I was able to do with the B&W version of Borrowed Time (Apple II) after fiddling with the NTSC Color Artifacts parameters a bit:
But I think I found the problem, which was of my own making: I was using the ScummVM core, which for some reason or another doesn’t play nice with the dithering effect
Not an expert, but I think the ScummVM core always outputs 480 lines rather than the original resolution, so that might be the cause of the “problems” with shaders (and scanlines).
Off topic: in the ScummVM core, Loom for Amiga does not work for me (the DOS version does). Is it working for you guys?
That’s great. I had asked you for 1:1-resolution screenshots precisely so I could try to do what you achieved on my own.
Indeed, as @Hari-82 suggested, the 2x resolution was probably ruining everything.
Btw, great work, care to share your findings in regard of cga to color parameter settings?
Of course! Here’s how I got the B&W Borrowed Time to look like that:
#reference "shaders_slang/bezel/koko-aio/Presets-4.1/tv-NTSC-1-classic_take.slangp"
NTSC_MIX = "3.000000"
NTSC_FILTER_SCF = "3.75"
DO_SPOT = "0.000000"
The Subcarrier Frequency Mul (NTSC_FILTER_SCF) parameter is the key. As you increase or decrease it, the colors change wildly until you find a sweet spot where they “stabilize”.
Here we have The Bard’s Tale (Apple II), in B&W:
Now, the same game with the “tv-NTSC-1-classic_take.slangp” preset and the parameters specified above:
If I change the “Subcarrier Frequency Mul” from 3.75 to 3.74, I get this mess instead:
There are other spots where the colors get stable, although the palette ends up quite different. I’d guess that for some other games/systems you’d have to fiddle with this value until things look right.
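For instance, to try one of those other stable spots, only the SCF line needs to change; the rest of the preset stays exactly as above:
#reference "shaders_slang/bezel/koko-aio/Presets-4.1/tv-NTSC-1-classic_take.slangp"
NTSC_MIX = "3.000000"
# Another stable spot; the palette comes out quite different from 3.75
NTSC_FILTER_SCF = "5.25"
DO_SPOT = "0.000000"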
Here’s what I get with a value of 5.25:
And here’s the rather cursed/cool 6.00:
For comparison’s sake, here’s the raw image of the game in color mode:
Oh, btw, switching gears a bit: do your handheld shaders come with color correction and an LCD ghosting effect built in? I’m trying to figure out whether I should use the options in the emulator or whether I can rely on the shader for that.
No palette- or LUT-based corrections. The Game Boy mono preset uses hue shifting; GBA relies on core corrections.
Ghosting is possible, but it’s used by the Game Boy mono preset only.
I see. So, if I understand correctly, I don’t need to use the emulator’s LCD ghosting effect simulation with the gameboy mono preset?
Yes, it’s already active in all gbmono presets because the effect was so evident.
I left it disabled in the other handheld presets, because most LCDs already emulate that… by design.
Exploring the interactions between warped glow, glow light sharpness, and FXAA.
The antialiasing works by emboldening the brightest areas, but it seems to respect the original shapes well enough; also, the typical FXAA “the dog licked the screen” effect is mostly gone:
The settings used are: FXAA enabled, glow to blur bias: 0.0, glow light gain: 4.0, glow light sharpness: 1.74, sharp x,y: -1.0, warped glow x,y: 0.75, and warped glow dynamics: 0.44.
Exceptional, congratulations! PS: which preset did you start from for these screenshots?
It’s ng/monitor-screen-hmask-screen-slotmask-taller
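If you’d rather bake those settings into an override file than set them through the menu, it would look roughly like this. The parameter identifiers are my guesses mapped from the menu labels, not verified against the shader’s actual names:
#reference "shaders_slang/bezel/koko-aio/Presets-ng/monitor-screen-hmask-screen-slotmask-taller.slangp"
# Guessed identifiers for the menu labels; verify against the
# shader parameter list before relying on them.
DO_FXAA = "1.0"
GLOW_TO_BLUR_BIAS = "0.0"
GLOW_LIGHT_GAIN = "4.0"
GLOW_LIGHT_SHARPNESS = "1.74"
SHARP_X = "-1.0"
SHARP_Y = "-1.0"
WARPED_GLOW_X = "0.75"
WARPED_GLOW_Y = "0.75"
WARPED_GLOW_DYNAMICS = "0.44"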
Just tweaked the VGA doublescan presets and added green and amber variants.
-edit-
Also, the new release 1.9.30 merge request has been submitted to the RetroArch shader repo; it should be available soon:
- REMOVED: Removed support for RetroArch versions prior to 1.16 due to the use of the rotation uniform.
- REMOVED: Removed dot matrix debug parameter
- REMOVED: Y-Tilt FOV option.
- NEW: Adjust content/bezel size by also taking width into account
- NEW: Add presets for PSP
- NEW: New parameter “Interlaced scanlines brightness push”, used to lower input gamma when emulating interlaced screens. Useless for fast-response screens such as OLEDs.
- NEW: Scanline inflation turned into a generic Warped glow setting which can be applied to x axis too.
- NEW: Add a new subfolder “Presets_HiresGames_Fast” containing speedy presets
- NEW: VGA Doublescan monochrome green and amber presets
- CHANGE: Corners now have a 4:3 aspect
- CHANGE: Y-Tilt does not require you to tweak Bezel tilting
- CHANGE: Lowered default flicker power across the presets (except the flicker one).
- CHANGE: General Presets tuning
- CHANGE: Docs update
- FIX: Keep deconvergence constant across input resolutions
- FIX: CVBS color bleed: keep the size constant across input resolutions; intended for hi-res content (640x480+) that doesn’t use 2x internal processing
- FIX: Back/Foreground image: correctly scale offset so that it is in “sync” with content position
- FIX: Fully commented config-user.txt, sync’d with config.inc
- FIX: Usual work to enable specific workarounds for D3D11,12
- FIX: Lowered RESSWITCH_SAMPLING_POINT because it was too near the screen border
- OPTIMIZED: Slightly optimized fn_border
- OPTIMIZED: Optimized halo vs scanlines blending
- OPTIMIZED: Skip unneeded alpha channel assignments to FragColor
- OPTIMIZED: Debanding: avoid useless calls to scale_to_range()
- Overall performance vs. the previous release dropped from 111 to 108 fps, as measured by my standard benchmark on Haswell; this is mainly due to the new warped glow feature.
How much faster are the “HiresGames_Fast” presets? And what are the compromises?
Basically no compromises as long as the core resolution is high enough.
To simplify: for low-res content, the shader does 320x240 -> 640x480 -> “process”, where “process” needs at least 640x480 to look good.
If the content is already hi-res, the fast presets just do 640x480 -> “process”, whereas the classic presets would upscale everything to 1280x960.
Working internally at 960p is very heavy, and even if there were visual benefits to the expert eye, they still would not justify the performance hit.
It can depend on the hardware, but I’d say the fast presets almost halve the GPU consumption at 1080p.
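In slang-preset terms, the difference is essentially the scale directive of the early passes. A rough sketch, with the pass layout simplified (this is not the actual koko-aio pipeline):
# Classic presets: the first pass upscales 2x before processing,
# so 320x240 becomes 640x480, but 640x480 also becomes 1280x960.
scale_type0 = "source"
scale0 = "2.0"
# Fast presets: hi-res content skips the upscale and is processed as-is.
# scale0 = "1.0"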
Currently working on a way to properly display as much mask as possible while retaining as much brightness as possible, with no color clipping and accurate input->output tone ramps.
The idea is to take full advantage of what the mask can give by pushing the input gain until the mask clips (3x). Only from that point on is a brightness helper used to reach full, non-masked white.
When the scene is dark, the mask alone can accurately reproduce it, but when the scene, or part of it, is too bright, the original color is used to help.
You know, this is nothing new; I’m just trying to find a handy way to ensure that the mask is helped only when needed.
Due to the (de)linearization process, one can expect to see pure mask up to about 60% of the input brightness.
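A minimal GLSL sketch of that idea; the function name and the blend curve are illustrative, not the actual implementation:
// Illustrative sketch only; works in linear light.
vec3 mask_with_helper(vec3 color, vec3 mask_rgb) {
    // Push the gain 3x so a 1/3-coverage mask can still reach full
    // brightness; clamp where it clips.
    vec3 masked = min(color * 3.0, vec3(1.0)) * mask_rgb;
    // Brightness helper: fade toward the original, unmasked color only
    // where the masked signal cannot keep up. Below 1/3 in linear light
    // (about 60% after delinearization) the weight stays at zero, so
    // the mask alone reproduces the scene there.
    vec3 helper_weight = smoothstep(vec3(1.0 / 3.0), vec3(1.0), color);
    return mix(masked, color, helper_weight);
}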
I’m using green/magenta masks, because I found that, at least on a 1080p / relatively low-dpi screen, an RGB mask does not give good results; maybe it would look better at a higher viewing distance or on smaller monitors:
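To make the dpi point concrete, here is how the two layouts differ, sketched in GLSL (illustrative, not the shader's actual mask code): the green/magenta mask repeats every 2 screen pixels, while the RGB mask needs 3, so it asks more of the panel:
// Illustrative sketch: px is the horizontal screen pixel coordinate.
vec3 gm_mask(float px) {
    // 2-pixel period: each channel is lit on one pixel out of two.
    return mod(px, 2.0) < 1.0 ? vec3(0.0, 1.0, 0.0)   // green
                              : vec3(1.0, 0.0, 1.0);  // magenta
}
vec3 rgb_mask(float px) {
    // 3-pixel period: each channel is lit on one pixel out of three.
    float p = mod(px, 3.0);
    return p < 1.0 ? vec3(1.0, 0.0, 0.0)
         : p < 2.0 ? vec3(0.0, 1.0, 0.0)
                   : vec3(0.0, 0.0, 1.0);
}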
Hi @kokoko3k - I have attached photos that speak for themselves, you are the messiah of 1080p !! Thank you for your heartfelt work !!