We can call it what we want but you can’t get subtractive color on a backlit display. Not without using a physical mask.
What would using multiply blend with a mask look like though?
Unfortunately, the computer I’m tinkering with is not strong enough for your preset, but I appreciate the suggestions and will try adjusting the noise resolution.
Yes. I tested my monitor’s subpixel layout; it’s RGB.
However we theorize it, that shouldn’t stop anyone from laying stripes of cyan, magenta, and yellow on top of an image, which is an experiment I still want to see the results of.
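If anyone wants a quick offline preview of that experiment, something like this rough NumPy/Pillow sketch should do it (the filenames are placeholders, and real subpixel-level results will of course depend on the actual display):

```python
# Rough preview of the experiment: multiply-blend vertical CMY stripes
# over a screenshot. "screenshot.png" is a placeholder path.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("screenshot.png").convert("RGB"), dtype=np.float64) / 255.0
h, w, _ = img.shape

# One-pixel-wide stripes cycling cyan, magenta, yellow.
cmy = np.array([[0.0, 1.0, 1.0],   # cyan
                [1.0, 0.0, 1.0],   # magenta
                [1.0, 1.0, 0.0]])  # yellow
stripes = cmy[np.arange(w) % 3]          # shape (w, 3)
out = img * stripes[np.newaxis, :, :]    # multiply blend, per channel

Image.fromarray((out * 255).round().astype(np.uint8)).save("cmy_multiply.png")
```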
@Cyber This has always been a problem with nearly all masks (the exceptions being 8 and -1, for obvious reasons), well before I started using deconvergence, across two computers and monitors, and with all high-strength images posted to this site. The same problem exists with PowerSlave Exhumed’s CRT shader options.
It’s worth noting it’s not exactly a “tint” per se. There was this one time where I think I accidentally set one of the shaders to scale the image; when that happened, the “tint” became unevenly distributed, and in larger images took on a kinda “swirly” form that reminds me of spatial aliasing. This disappears completely when “mask strength” and “low strength” are set to 0.50 max. Changing the mask size can also solve the problem in some specific cases.
I always just called it color loss. I think it’s just an inherent problem with using masks. Masks 11 and 12, which mix RGB and CMY colors, do deliver “different” results, however.
So I want to see what happens with a CMY mask. I don’t expect it to be a smoking gun against something I think is intrinsic to masks. I do, however, think it would be fun to configure.
The easiest way to play around with them is with CRT-Royale, which uses png images for the masks, which you can modify in photoshop/gimp/whatever to CMY instead of RGB.
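For what it’s worth, the RGB-to-CMY conversion is just a colour inversion (cyan = 255 − red, and so on), so it can also be scripted instead of done in an image editor; a minimal sketch with Pillow, assuming the mask is a plain RGB PNG (the filename is made up):

```python
# Turn an RGB mask PNG into a CMY one by inverting it.
# "mask.png" is a made-up filename; use the actual mask texture
# from the crt-royale shader directory.
from PIL import Image, ImageOps

mask = Image.open("mask.png").convert("RGB")  # drop any alpha first
ImageOps.invert(mask).save("mask_cmy.png")
```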
I use a CMY mask on my setup (or rather YMC to be exact since it’s a BGR display). It results in a much-needed boost in brightness. However, it’s a bit of a mess at the subpixel level. Only reason it sort of works for my setup is because I output 1080p to a 4K display, so it’s more like YYMMCC, which appears to mitigate the subpixel issues somewhat.
In my past experience, if I were to switch from mask 10, 6, or 13 to 11 or 12, I’d have to recalibrate the brightness setting to be a tad dimmer. I expect I’d find this to be even more the case with a CMY mask.
Color loss when using masks is easily mitigated by increasing scanline saturation, I’ve found.
Colors are well-saturated here, right? (view at original size)
No.
Saturation, by its very nature, reduces the color range.
Compare that image to the raw one and you’re bound to find something that isn’t quite meant to be so saturated.
It was color loss (or perhaps better worded, “change”) in skin tones that originally drove me to create an account on this site. Many techniques that compensate for masks can easily send skin tones straight into the uncanny valley.
What masks do to colors isn’t desaturation, so increasing saturation doesn’t really help.
However, I have actually warmed up to scanline saturation lately, but that’s because of what it does to the beam shape, not the colors.
Effectively replicating the aesthetic of CRT displays while keeping the colors of the raw image is my holy grail.
Sorry, this is a bit of a rushed “face palm” post.
The colors of the raw image aren’t what gets displayed on a CRT, though. I would never use the raw colors as a target.
Honestly I’m kinda baffled as to what you’re even talking about.
Why are “skin tones” even an issue when talking about 240p video games?
I don’t have a CRT lying around, and my memories are mainly from childhood, so I disable shaders to reference the raw image displayed on the LCD screen.
In “Panorama Cotton”, Cotton’s small “corner face” can easily turn too red. In many RPGs, characters’ faces can go from a healthy beige to yellow jaundice. What seems to work great for scenery may ruin a portrait.
I don’t see these issues in pictures or video footage of actual CRTs, or on the CRTs at the local arcade bar.
Most of these issues are brightness/saturation related, though. The mask’s effect is something different (though it does contribute to jaundice). This is my last day of work before my days off; I’ll post some screenshots later showing exactly what my beef with the masks is, along with hunterk’s suggested Royale test (if complications with that png file don’t stop me).
My 2c worth of observations, from my own personal perspective; feel free to disagree with everything I say guys
Looking at CRT shader screenshots in this and other threads, it seems to me that many people are playing games on big-ass TV sets from several meters away, in which case you need heavier scanlines and masks and your eyes will do much of the “blending”. Conversely, if you’re playing on a normal PC setup (24-27" monitor at arm’s length), you really need to tone the strength of these effects back; you want more “blending” in the shader, so to speak, especially if you’re not even using fullscreen, but something around 960x720 or 1067x800 like myself. Weaker masks == far fewer colouration side effects.
The other contributing factor to why these look nothing like you remember is the relatively recent Sony PVM/BVM hype train. These were broadcast reference monitors costing several thousand or tens of thousands of dollars; I guarantee you that no one used these monitors with personal computers or consoles pre-2000 (and I’m being generous there; probably the cutoff is around 2010-2015). It’s just that 5-10 years ago professional facilities were literally throwing them away or selling them off for $5 or something, and the hype started… They have more vertical resolution and sharpness than any regular small TV set or monitor people used with these machines in the 80s and 90s. Also keep in mind that they were not made to produce a pleasant-looking picture, but an accurate one, so errors could be spotted easily in a professional studio environment. Well, it’s the same as with audio: I have a $1500 pair of “budget” pro audio monitor speakers sitting on my desk, and while they are super accurate, anything that’s not perfectly mixed will sound a bit like shit on them, making 90% of records unpleasant to listen to. But that’s exactly what they’re supposed to do: underline all your errors with a big fat red felt-tip marker!
My point is, no one who drew the artwork for these games had a PVM/BVM or similar quality monitor, and no one who played them in the last century had one either. The artists definitely took the lower-quality but pleasant blending, blooming and glow artifacts into account. Personally, I find the PVM/BVM look quite ugly and it’s nothing like the TVs/monitors we used back in the day; the scanlines are waaaaaaay too thick and prominent for a start, and there’s very little pleasant blooming and blending going on. Again, they’re precise but don’t have the qualities that someone who actually grew up in the 80s would want from a CRT emulation shader.
You don’t really see scanlines on 200/240p content on a 15kHz 14" monitor (13" viewable area). They’re a bit more prominent in 200-line NTSC modes, so my guess is the cause of the PVM/BVM craze is twofold: 1) many of the console folks are from the US, hence used to the more prominent scanlines on NTSC displays; 2) US folks had no affordable small TVs with SCART input like we had in Europe, therefore they had to resort to composite output with most consoles. Many of these console-only gamers don’t really have any other point of reference than composite, so they don’t know how a proper 80s RGB monitor like the Commodore 1084S would look. Hence I would bet good money that many of them only know composite, high-res CRTs post-1995-2000, and modern LCDs. Then to people who only know the modern LCD look, the cleanliness of the PVM/BVM shaders might look appealing (quite similar to the simplistic “blank every second line” scanline emulation of early emulators), and proper authentic CRT shaders might look “wrong” or “broken” to them.
Okay, flamesuit on…
PS: This is in no way criticism of any shader authors; most of their shaders can be tweaked to taste anyway.
Don’t worry I won’t pull out the flame thrower. I’m just going to put on my Hercules costume, give you a thumbs up, smile and say “cool story bro”.
Edit: And sometimes I think MY posts are too long.
I think the cause of the PVM craze is that a significant number of people think 240p content looks great on these displays.
Not trying to start a debate though, I like a variety of CRTs. They all look great in their own way.
Sure, there’s no arguing with taste. My point was just that it’s inauthentic, as 12-year-olds weren’t playing Super Mario on pro broadcast monitors… hard to argue with that. And the stronger scanline and mask effects necessary for more realistic PVM/BVM emulation contribute to more image degradation, such as colour shifts and weird saturation artifacts; that’s all.
So by toning those effects down, you’ll get a) fewer unwanted artifacts, and b) more period-correct CRT emulation. But it will look less like a broadcast monitor. [c) period-accurate emulation, preserving the past as it was rather than re-imagining it, is one of my interests]
I brought up the topic of “authenticity” because @HereticZero wrote: “I don’t see these issues in pictures or video footage of actual CRTs, or on the CRTs at the local arcade bar.” Yeah, so those monitors in the arcade are probably quite different from what the PVM/BVM emulation presets are trying to achieve.
I think both of these points are pretty spot on and not actually contradictory at all.
To Nesguy’s point: browse casually around here, Reddit, YouTube comment threads, etc., and it’s clear a lot of people really like this heavy-scanline PVM look. There’s no shortage of PVM overlays to go with said shaders.
To Rincewind’s point, I agree: the PVMs were, as a point of fact, not the screens we all had. I was a kid in the 80s/teen in the 90s… TVs obviously had scanlines; you can even see them in magazine photos from the era, or when hooking up old hardware to period CRTs. But they looked more like what you’re describing: toned-down, less aggressive scanlines, really.
I don’t even think anyone here is arguing really…just tossing in my two cents.
Screen size is a big factor when it comes to scanlines. Basically small BVM = Medium PVM = Large Wega.
The color shifting from masks is something that probably needs to be studied further; I don’t think there should necessarily be any color shift if you match the mask to the display’s subpixels correctly.
I’m still not sure what saturation artifacts we’re talking about, though. It’d be nice if we could get more specific about this. I’m not convinced it’s caused by the masks themselves and not some other combination of settings in conjunction with the mask. Just about every setting affects every other setting so it can be hard to pinpoint what’s causing what.
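One way to get more specific might be to actually measure it. Here’s a quick-and-dirty sketch that compares the per-channel means and the hue/saturation of the average colour between a raw screenshot and a shader screenshot (both filenames are placeholders; a proper analysis would need more than channel averages, but it’s a start):

```python
# Measure the shift instead of eyeballing it: per-channel means plus
# the hue/saturation of the average colour, raw vs. shader output.
import colorsys
import numpy as np
from PIL import Image

def colour_stats(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    r, g, b = rgb.mean(axis=(0, 1))        # average of each channel
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return (r, g, b), h * 360, s

for name in ("raw.png", "shader.png"):
    means, hue, sat = colour_stats(name)
    print(f"{name}: RGB means = {[round(m, 3) for m in means]}, "
          f"hue = {hue:.0f} deg, saturation = {sat:.3f}")
```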
I wouldn’t really call myself a “shader author”, as I’ve just started messing around with other people’s work in an attempt to produce a semi-passable 1080p (S)VGA emulation. I just know enough to be dangerous, and this is what I’ve experienced when messing around with masks and dim-every-second-line style primitive scanline emulation:
Scanlines and masks darken the picture, obviously. The usual way to counteract that in most shaders is the “brightboost” / “colorboost” param, or lowering the output gamma. Lowering the output gamma makes the colours appear faded, but the more evil of the two is “brightboost”. Usually how it works is that the linearised RGB components are just multiplied by the “brightboost” param, then the result is simply clamped to the 0.0–1.0 range. It’s easy to see how that can result in all sorts of colour shifts; it’s a simple performance-oriented colour hack that is in no way colour-accurate. During my experimentation I noticed that reds and oranges in particular will get a greenish-yellow tint when “colorboosting” too much.
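To make that concrete, here’s a toy sketch of the usual boost-then-clamp hack (not any particular shader’s actual code):

```python
# Toy version of the typical brightboost hack: scale linear RGB, then
# clamp each channel independently to [0, 1].
import numpy as np

def brightboost(rgb, boost):
    return np.clip(np.asarray(rgb) * boost, 0.0, 1.0)

orange = (1.0, 0.45, 0.05)        # a saturated orange in linear RGB
print(brightboost(orange, 1.6))   # -> [1.   0.72 0.08]
# Red is already clipped at 1.0 while green keeps growing, so the
# green-to-red ratio rises from 0.45 to 0.72 and the hue slides
# toward yellow/green -- the greenish tint on reds described above.
```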
If you think about it, multiplying all pixels with a green/magenta or similar mask won’t leave the colours 100% intact. It’s a clever hack, but it’s not something the designers of LCD panels expect people to do with the image… So naturally, colour accuracy will always suffer. As I see it, there’s really no way around this, except for keeping the mask strength low to minimise the colour/hue shifts.
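A tiny numerical illustration of why: a neutral grey pushed through a green/magenta mask pair is wildly off-colour in each individual cell, and only the average over the cells comes back neutral, at half the brightness:

```python
# Grey through a green/magenta mask: each cell is drastically wrong,
# only the average across cells is neutral -- and half as bright.
import numpy as np

grey = np.array([0.7, 0.7, 0.7])
green = np.array([0.0, 1.0, 0.0])
magenta = np.array([1.0, 0.0, 1.0])

cells = np.array([grey * green, grey * magenta])
print(cells.mean(axis=0))  # -> [0.35 0.35 0.35]
```

And the moment you try to boost that 0.35 back up to 0.7, you’re back to the clamping problem from the previous paragraph.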
I think we all need to realise that these shaders/hacks are performance-oriented, like most realtime graphics programming. Doing things “properly” would need a lot more processing power; it would probably mean converting back and forth to CIELAB space rather than just doing simple maths on linearised RGB directly. And there’s always the basic fact that we’re screwing around with the subpixel structure of the screen in ways the manufacturers did not really anticipate…
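For the record, here’s roughly what such a “proper” boost could look like; just a sketch of the idea using scikit-image’s Lab conversions, and as said, far too slow for a realtime shader:

```python
# Sketch: boost lightness in CIELAB instead of multiplying RGB, so hue
# and chroma (a*, b*) are left alone. Requires scikit-image.
import numpy as np
from skimage import color

def boost_lightness(rgb, amount=1.2):
    """rgb: float array in [0, 1] with shape (H, W, 3)."""
    lab = color.rgb2lab(rgb)
    lab[..., 0] = np.clip(lab[..., 0] * amount, 0.0, 100.0)  # L* channel only
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)
```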
Personally, I’ve given up on “colour accuracy” and have just accepted these colour/hue shifts as the nature of the beast. Besides, “colour accuracy” compared to what? There’s always gonna be a change in relation to the raw output of the LCD screen, but these CRT monitors all had their own colour rendering inaccuracies, hue shifts, and so on. So it’s not even clear what we call the “accurate” representation of colours.
I would love to hear some “real” shader authors views on the subject of colour-accuracy.
THE CMY ROYALE TEST RESULTS ARE IN
But first, let’s look at some images depicting my actual problem.
Note: There are links instead of previews because some of these images NEED to be seen in 4K, so click on the image after following the link. Your monitor may also be a factor (I use an Acer CB282K), but the problem has been consistent across 3 typical RGB LCD displays.
Also note: I had to disable HDR, otherwise the “raw” screenshot came out pitch black. The problem I’m describing does occur with HDR enabled, though.
1: https://ibb.co/2djVG5H The raw image scaled to 4K. The stone wall on the top left is a great reference point.
2: https://ibb.co/6N69nV4 crt-guest-advanced default settings; pretty great, not for everyone (because everyone likes to tinker), and not much mask.
3: https://ibb.co/KxWzjWm Crank mask 0’s strength up to MAX, set scale to 2, and things get GREEN, at least in my eyes and on my monitor.
4: https://ibb.co/ZYjc3ky TATE mode has a funny way of solving everything (spread across subpixels?). The only problem here is TATE mode itself.
5: https://ibb.co/SXfCZn8 Mask 7 (but not 8) at scale 1.0 also has a “green” problem. This baffles me given the mask’s monochrome nature. And it looks like it added a neat lens flare to Balzar’s eye for some reason.
6: https://ibb.co/4T3j2Kg Mask 6 and its RGB brethren all have this issue, almost regardless of scale.
7: https://ibb.co/ZdVTWF3 My current global preset; NTSC and a work in progress.
THE ROYALE TEST
8: https://ibb.co/Rzr2XrC First we have the default royale settings. It’s also pretty great, though there is still some “greening” going on, particularly at that stone wall.
9: https://ibb.co/3hPrbGm Switched to the modified CMY aperture grille mask 0 (note: it’s just the smaller png put through a simple color inversion) AND DAMN IS IT BRIGHT! That, however, seems to be its only problem (at least in my eyes).
10: https://ibb.co/vhX1SYh Reduced contrast to 0.25 (in addition to my DS4 reconnecting…)
In conclusion: I think CMY has great potential, but not as much potential as CMYB (CMYK proper).
Yeah, so you’re seeing the exact same issues I described: the results of simple colour/brightness hacks used to combat the dimming caused by scanlines and masks.
My suggestion is not to worry about them; it doesn’t really matter that the colours are a bit different than the raw input on your LCD. Like I said, real monitors have totally different colours than your particular LCD… or even another LCD of a different brand/type, etc.
I don’t think you’ll find a solution to them either (i.e., a 1:1 match between the raw colours and the perceived colours when using the shader).