My 2c worth of observations, from my own personal perspective; feel free to disagree with everything I say, guys
Looking at CRT shader screenshots in this and other threads, it seems to me that many people are playing games on big-ass TV sets from several metres away, in which case heavier scanlines and masks make sense because your eyes will do much of the “blending”. Conversely, if you’re playing on a normal PC setup (a 24–27" monitor at arm’s length), you really need to tone back the strength of these effects and let the shader do more of the “blending”, so to speak, especially if you’re not even running fullscreen but at something around 960x720 or 1067x800 like myself. Weaker masks == far fewer colouration side effects.
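To put rough numbers on why strength needs dialling back at low output resolutions, here's a back-of-envelope sketch (my own toy model, not any particular shader's maths): with a 240-line source scaled to a 720-line window, each scanline period is only 3 output pixels tall, so darkening one of those three lines eats a large share of the total light and leaves little room for smooth falloff.

```python
# Toy model (not from any real shader): average brightness of one
# scanline period when the darkest output line is attenuated.

def average_brightness(lines_per_scanline: int, strength: float) -> float:
    """Mean brightness of one scanline period, where one output line
    is attenuated by `strength` (0 = effect off, 1 = fully black)."""
    untouched = lines_per_scanline - 1   # output lines left at full brightness
    dark = 1.0 - strength                # the one attenuated line
    return (untouched + dark) / lines_per_scanline

# 960x720 window: 720 / 240 = 3 output lines per source scanline
print(round(average_brightness(3, 0.8), 2))   # ~0.73, over a quarter of the light gone
# 4K fullscreen: 2160 / 240 = 9 output lines per source scanline
print(round(average_brightness(9, 0.8), 2))   # ~0.91, same strength costs far less
```

Same scanline strength, very different cost: at 3x scale you lose roughly 27% of the brightness, at 9x only about 9%, which is one reason the same preset can look fine on a 4K TV and crushed in a small window.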
The other contributing factor why these look nothing like you remember is the relatively recent Sony PVM/BVM hype train. These were broadcast reference monitors costing thousands or tens of thousands of dollars; I guarantee you that no one used these monitors with personal computers or consoles pre-2000 (and I’m being generous there; the cutoff is probably more like 2010–2015). It’s just that 5–10 years ago professional facilities were literally throwing them away or selling them off for $5 or something, and the hype started… They have more vertical resolution and sharpness than any regular small TV set or monitor people used with these machines in the 80s and 90s. Also keep in mind that they were not made to produce a pleasant-looking picture but an accurate one, so that errors would stand out immediately in a professional studio environment. It’s the same as with audio: I have a $1500 pair of “budget” pro audio monitor speakers sitting on my desk, and while they are super accurate, anything that’s not perfectly mixed sounds a bit like shit on them, making 90% of records unpleasant to listen to. But that’s exactly what they’re supposed to do: underline all your errors with a big fat red felt-tip marker!
My point is, no one who drew the artwork for these games had a PVM/BVM or similar quality monitor, and no one who played them in the last century had one either. The artists definitely took the lower-quality but pleasant blending, blooming and glow artifacts into account. Personally, I find the PVM/BVM look quite ugly and it’s nothing like the TVs/monitors we used back in the day; the scanlines are waaaaaaay too thick and prominent for a start, and there’s very little pleasant blooming and blending going on. Again, they’re precise but don’t have the qualities that someone who actually grew up in the 80s would want from a CRT emulation shader.
You don’t really see scanlines on 200/240p content on a 15kHz 14" monitor (13" viewable area). They’re a bit more prominent on 200-line NTSC modes, so my guess is the cause of the PVM/BVM craze is twofold: 1) many of the console folks are from the US, hence used to the more prominent scanlines of NTSC displays; 2) US people had no affordable small TVs with SCART input like we had in Europe, so they had to resort to composite output with most consoles. Many of these console-only gamers don’t have any point of reference other than composite, so they don’t know what a proper 80s RGB monitor like the Commodore 1084S would look like. I’d bet good money that many of them only know composite, high-res post-1995-2000 CRTs, and modern LCDs. To people who only know the modern LCD look, the cleanliness of the PVM/BVM shaders might look appealing (quite similar to the simplistic “blank every second line” scanline emulation of early emulators), and proper authentic CRT shaders might look “wrong” or “broken” to them.
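The NTSC point can be checked with simple geometry (a back-of-envelope sketch using my own assumed tube dimensions, not measurements): the vertical deflection covers roughly the same tube height in both standards, but NTSC draws 262.5 lines per field versus PAL's 312.5, so each NTSC scanline gets about 19% more vertical space, i.e. wider gaps between the lit lines.

```python
# Back-of-envelope: scanline pitch on a 14" TV (13" viewable diagonal),
# assuming a 4:3 tube where picture height = 3/5 of the diagonal, and
# assuming the full field height is scanned in both standards.

MM_PER_INCH = 25.4

def line_pitch_mm(viewable_diagonal_in: float, lines_per_field: float) -> float:
    """Vertical spacing between scanline centres on a 4:3 tube."""
    height_mm = viewable_diagonal_in * 3 / 5 * MM_PER_INCH
    return height_mm / lines_per_field

ntsc = line_pitch_mm(13, 262.5)   # NTSC: 262.5 lines per field
pal = line_pitch_mm(13, 312.5)    # PAL: 312.5 lines per field
print(round(ntsc, 2))             # ~0.75 mm per line
print(round(pal, 2))              # ~0.63 mm per line
print(round(ntsc / pal, 2))       # ~1.19, NTSC pitch is ~19% coarser
```

Sub-millimetre line spacing either way, which is why on a consumer tube with a correspondingly fat beam spot the gaps mostly bloom shut; the coarser NTSC pitch just leaves a bit more gap to see.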
Okay, flamesuit on…

PS: This is in no way criticism of any shader authors; most of their shaders can be tweaked to taste anyway.