Take a look.
At the top is the 2007 Vectrex emulator “Vecx” running at its default resolution of 330x410. The emulator also lets you specify a resolution, which I did for the second example: the same graphics rendered at 825x1025. In both cases, while the user doesn’t appear to have any control over the visuals, the lines are drawn thick and reasonably anti-aliased; they are not composed strictly of pure black or pure white pixels.
The bottom example is the output from the vecx core. The lines are drawn digitally at Vecx’s default resolution of 330x410 (regardless of the output resolution) and upscaled with a bilinear filter. This is a poor base from which to begin adding filters, and no filter on the planet can adequately erase its digital nature.
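To be concrete about what I mean by “digital” versus anti-aliased, here is a rough sketch. This is not Vecx’s or the core’s actual drawing code; the framebuffer size and function names are made up for the illustration. A hard rasterizer forces every pixel a line touches to pure white, so a bilinear upscale afterward can only blur the staircase; a coverage-based (Wu-style) line spreads each step’s intensity across the two pixels it passes between, which is why the standalone emulator’s output already contains in-between grey values along the edges.

```c
/* Illustrative only: hard vs. coverage-based line drawing into an
 * 8-bit greyscale framebuffer. Not Vecx's or the libretro core's code. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define FB_W 330
#define FB_H 410

/* Add fractional coverage (0.0-1.0) to a pixel, clamped to white. */
static void plot(unsigned char *fb, int x, int y, double cov)
{
    if (x < 0 || y < 0 || x >= FB_W || y >= FB_H) return;
    int v = fb[y * FB_W + x] + (int)(cov * 255.0 + 0.5);
    fb[y * FB_W + x] = (unsigned char)(v > 255 ? 255 : v);
}

/* "Digital" line: every pixel the line touches is forced to full white,
 * roughly what a plain rasterizer at 330x410 gives you. */
static void hard_line(unsigned char *fb, int x0, int y0, int x1, int y1)
{
    int steep = abs(y1 - y0) > abs(x1 - x0);
    if (steep)   { int t = x0; x0 = y0; y0 = t; t = x1; x1 = y1; y1 = t; }
    if (x0 > x1) { int t = x0; x0 = x1; x1 = t; t = y0; y0 = y1; y1 = t; }
    double g = (x1 == x0) ? 0.0 : (double)(y1 - y0) / (x1 - x0);
    for (int x = x0; x <= x1; x++) {
        int y = (int)lround(y0 + g * (x - x0));
        plot(fb, steep ? y : x, steep ? x : y, 1.0);
    }
}

/* Anti-aliased line: each step splits its intensity across the two pixels
 * the ideal line passes between, so edges fade instead of stair-stepping. */
static void aa_line(unsigned char *fb, int x0, int y0, int x1, int y1)
{
    int steep = abs(y1 - y0) > abs(x1 - x0);
    if (steep)   { int t = x0; x0 = y0; y0 = t; t = x1; x1 = y1; y1 = t; }
    if (x0 > x1) { int t = x0; x0 = x1; x1 = t; t = y0; y0 = y1; y1 = t; }
    double g = (x1 == x0) ? 0.0 : (double)(y1 - y0) / (x1 - x0);
    double y = y0;
    for (int x = x0; x <= x1; x++) {
        int yi = (int)floor(y);
        double f = y - yi;                 /* fraction between the two rows */
        plot(fb, steep ? yi     : x, steep ? x : yi,     1.0 - f);
        plot(fb, steep ? yi + 1 : x, steep ? x : yi + 1, f);
        y += g;
    }
}

int main(void)
{
    unsigned char *fb = calloc(FB_W * FB_H, 1);
    hard_line(fb, 10, 10, 300, 120);   /* pure on/off pixels */
    aa_line(fb, 10, 40, 300, 150);     /* greyscale coverage along the edges */
    /* Dump as a binary PGM so the two lines can be compared side by side. */
    printf("P5\n%d %d\n255\n", FB_W, FB_H);
    fwrite(fb, 1, FB_W * FB_H, stdout);
    free(fb);
    return 0;
}
```

Build it with something like `gcc lines.c -lm -o lines && ./lines > lines.pgm` and zoom in: the hard line stays pure black/white no matter what filter you run over it afterward, while the coverage-based one already has the smooth falloff built in.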
I have tried plugging in Vecx’s command-line arguments for specifying the render resolution. This wouldn’t solve the fundamental issue of the lines being drawn digitally, but it could alleviate it somewhat. In any event, it doesn’t work, and the core doesn’t seem to expose any options of its own.
Right now, it seems like the correct solution is to go with the old emulator and lose out on shaders. What am I missing?