Sure. Sounds more like you don’t have the knowledge.
Fair enough
I like coming back to this topic from time to time just as an extra reminder that it’s still so worth hanging on to a fatass CRT
Regarding arcade CRTs vs CRT TVs, there’s nothing special about arcade CRTs other than the fact that they’re RGB monitors, in the sense that they natively take RGB inputs, which was rare for CRT TVs outside of Europe. But you can turn a CRT TV into an arcade monitor for all intents and purposes by “RGB-modding” it, if it happens to have an OSD that uses RGB that you can tap into. Or find a multisync RGB monitor, or a cheap PC / 31kHz CRT combined with a scanline generator.
Regarding white balance / temperature, I’m of the opinion that, especially for gaming (we’re not talking about art film here), people should go with whatever looks nicer to their eyes. To my eyes D65 never looks good; no matter how long I’ve used it, it still feels too yellow or reddish. Not that I’m a fan of D93 either; I’d much prefer a point in between. I say just pick a white that “really feels white” to you and work from there.
This has been going on for too long and I don’t even know what you think you’re responding to or the reasons why, anymore.
How many of us in this discussion have wide gamut monitors? I’m assuming none of us do. It will be several years before wide gamut becomes widespread in gaming displays, so, practically speaking, it’s currently impossible to match CRT colors on an LCD. Sure, with a wide gamut display it’s possible, but the vast majority of LCDs today are sRGB. It just seems like you’re torturing this particular comment in order to score a point, and I’m not sure why.
Color accuracy in retro gaming is not analogous to color accuracy in film or modern games, and the entire goal seems subjective; I don’t mean that as a slight against your efforts. Hell, I adjusted my monitor’s color profile as a result of this discussion, so obviously I agree with a lot of what you’ve said. I’m not going to be convinced, however, that there is a single objectively superior approach to color correction as it applies to retro gaming.
as proven by the many side by side comparisons I have posted.
It’s been stated multiple times that side by side comparisons with photos of CRTs are of very limited usefulness when it comes to comparing colors. Again, this isn’t to slight your efforts, but you should stop trying to use such comparisons as “proof” of anything.
alright, enough squabbling
I’m not even sure what we’re fighting over, or why.
I think Squalo has made some valuable contributions/comments regarding color accuracy.
I must add that with a wide color gamut display, everything will look wrong unless the programs are color managed, which unfortunately RetroArch isn’t (like 99% of other software). I’d be curious to know how people deal with HDR monitors these days, which, if I’m not mistaken, use a version of the wider-gamut Rec.2020.
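To make the “color managed” point concrete, here’s a sketch (Python, purely illustrative) of the step a managed app performs before sending sRGB content to a Rec.2020 panel. The 3×3 matrix is the standard BT.2087 linear-light conversion from Rec.709/sRGB primaries to Rec.2020 primaries:

```python
# BT.2087 linear-light matrix: Rec.709/sRGB primaries -> Rec.2020 primaries.
M = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def srgb_to_2020(rgb):
    """Map a linear sRGB triplet into Rec.2020 space (what a managed app sends)."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in M)

managed = srgb_to_2020((1.0, 0.0, 0.0))   # sRGB pure red, correctly mapped
unmanaged = (1.0, 0.0, 0.0)               # what a non-managed app sends as-is
```

An unmanaged app skips the matrix and sends (1.0, 0.0, 0.0) straight through, so the panel renders its much more saturated native Rec.2020 red; that’s the oversaturation you see with non-managed software on a wide-gamut display. (Whites survive either way, since each matrix row sums to 1.)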
On my HDR TV with Shield ATV set to Rec.2020, they use a “good enough” generic color-grading algorithm for non-HDR content. I think Windows probably does something similar.
Since all of my displays and TVs are set to D65, D93 looks way too under-saturated and too bright to me even after giving my eyes time to adjust. I agree that something between the two looks best for retro gaming. D65 is too obviously over-saturated and dim, but D93 is too bright and under-saturated. It’s a lot like the trade-off you get with raising/lowering gamma.
After playing around with different color temps, I think 7900K looks pretty good; smack dab in the middle of 6500K and 9300K. I’m doing it wrong according to both the D65 and D93 purists, but who cares?
For those who are not aware, there is DisplayCAL (open source), an application that lets you create 3D LUTs compatible with ReShade.
https://translate.google.com/translate?hl=fr&sl=fr&tl=en&u=https%3A%2F%2Fdisplaycal.net%2F
So, I finally managed to get 9300K to look good on my display, to my eyes (with some guidance from Squalo’s posts!). Being the picky bastard that I am, I wasn’t quite satisfied with how the greens looked with the suggested color mangler settings; I prefer greens that are a bit more saturated.
The problem with 9300k is that it desaturates the reds, making the resulting image dull and under-saturated. There are multiple ways to address this. One way is to simply boost the red channel.
By boosting the red channel 10% with image-adjustment (or color mangler), the resulting image looks bright and colorful at a temp of 9300k, without any clipping. Whites are white, grey ramp shows no red tint, greens look well-saturated. Ultimately, what looks best is going to be determined by the characteristics of the display being used and individual preference.
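A minimal sketch of that kind of per-channel boost (Python, 8-bit values; the function name and the clamp are illustrative, not colormangler’s actual math):

```python
def boost_channel(rgb, gains=(1.10, 1.0, 1.0)):
    """Scale each 8-bit channel by its gain, clamping to [0, 255]."""
    return tuple(min(255, round(c * g)) for c, g in zip(rgb, gains))

boost_channel((200, 128, 64))  # red scaled 10%, green/blue unchanged
```

Note the clamp is what prevents the clipping mentioned above: any red value that would exceed 255 after the 10% boost saturates instead, so only values above ~232 lose detail.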
I believe that some CRT TVs also oversaturated the reds for similar reasons.
Edit: ultimately, the shader chain required for all this is needlessly complicated and results in only a marginal improvement (if any) vs. simply adjusting the color channels manually via the display’s OSD (IMO!). Just boosting blue to 100% and leaving everything else as-is seems to be as good as anything else I’ve tried (if not better). Sky and water colors in games are blue instead of purple, whites are white, and everything is reasonably well-saturated and bright.
As I understand it, CRTs themselves have no white point adjustment, just adjustments for the black level and red, blue and green guns. You adjust brightness until black is black and then adjust each color using an RGB gradient, and whites should be white after that. You’d wind up with different color temps depending on the monitor’s specs, usually landing somewhere between D65 and D93 but usually on the warmer side post-calibration.
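A rough sketch of how those per-gun adjustments determine the resulting white point (Python; the Rec.709 chromaticities below stand in for real CRT phosphors, which vary from tube to tube):

```python
# Stand-in phosphor chromaticities (x, y) -- Rec.709 values, illustrative only.
primaries = {"r": (0.64, 0.33), "g": (0.30, 0.60), "b": (0.15, 0.06)}

def white_point(gains):
    """Mix the three guns (relative luminances) into a white chromaticity (x, y)."""
    X = Y = Z = 0.0
    for ch, lum in gains.items():
        x, y = primaries[ch]
        # Convert each gun's xyY contribution (Y = gun luminance) to XYZ.
        X += lum * x / y
        Y += lum
        Z += lum * (1 - x - y) / y
    s = X + Y + Z
    return (X / s, Y / s)

wp = white_point({"r": 0.2126, "g": 0.7152, "b": 0.0722})
# With Rec.709 luminance weights, wp lands at roughly (0.3127, 0.3290), i.e. D65;
# raising the blue gain relative to red pulls the white toward D93.
```

This is why turning the individual gun gains is, in effect, the white point adjustment: there’s no separate “temperature” knob, just the balance of the three guns at white.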
(Off-topic fun fact) John Carmack coded Quake on a 28-inch 16:9 1080p CRT in 1995: https://www.geek.com/games/john-carmack-coded-quake-on-a-28-inch-169-1080p-monitor-in-1995-1422971/ It weighed 45 kg (99 lb).
Hello,
a new version of crt-guest-dr-venom is ready for the repo (GLSL). There are many changes and new options; most of them are described in a nice readme file by dr. venom.
To illustrate most of them:
I think it’s important to mention that you can have all the settings/looks you had in the previous version, except for the alternate scanlines, which were upgraded.
You can grab the shaders and presets here: http://eab.abime.net/showpost.php?p=1318791&postcount=235
THANKS!!!
@Squalo I would really, really love to try out your color settings, but I can make no sense of the preset you shared. I’ve tried stacking colormangler on top of imageadjustment, both editing in real time and saving the preset to edit the text file. In both cases the given parameters have confusing names or, worse, are actually missing. Even after updating the .slang and .glsl shaders (I use .slang personally), I cannot find the parameters for gamma_boost_[r][g][b], sat_[r][g][b], etc.
Thanks! I really love the look you’re going for, as someone who mostly goes for composite or s-video signal shaders for consoles.
I haven’t pushed the per-channel gamma adjustments to any of the repos.
Ah, I just found the code further up the thread. D’oh!
So does that mean it’s simply not available for .slang yet? Or can it be converted relatively easily?
I just pushed the versions with per-channel gamma up to the repos, so they should make it to the online updater soonish.
@guest.r really amazing work there, with a ton of cool features to play around with! I moved a few things around to play a little nicer with the repo structure (e.g., I renamed the readmes to show up in the github web interface and moved the NTSC preset to the ‘presets’ dir) and will start working on getting it all slang-ified ASAP.
slang versions are up.
I went ahead and manually transposed the arrays and made them constants in the color-profile pass. It’s probably a negligible optimization, but whatever
Thanks again! For me it would sure be a tough nut, since I’m not coding in slang currently.