Unfortunately this doesn't fix the issue.
Edit: it happens with the regular, hires and NTSC versions.
UPDATE:
A version with lut fixes.
https://mega.nz/file/Fg5EnRSL#AE08sZnngrfRDGMuhHjXh-J2weDuoFN00i49mP_lwkQ
You should try the fixed version, but preferably also add an extra LUT in the code to better resolve the issue.
Something like this:
if (int(TNTC) == 0)
{
    color.rgb = color.rgb; // LUTs off, pass the color through untouched
}
else
{
    // Map the input color to 2D LUT coordinates
    // (texture layout: LUT_Size * LUT_Size wide, LUT_Size tall)
    float red   = ( imgColor.r * (LUT_Size - 1.0) + 0.499999 ) / (LUT_Size * LUT_Size);
    float green = ( imgColor.g * (LUT_Size - 1.0) + 0.499999 ) / LUT_Size;
    float blue1 = (floor( imgColor.b * (LUT_Size - 1.0) ) / LUT_Size) + red;
    float blue2 = (ceil( imgColor.b * (LUT_Size - 1.0) ) / LUT_Size) + red;
    float mixer = clamp(max((imgColor.b - blue1) / (blue2 - blue1), 0.0), 0.0, 32.0);

    vec4 color1, color2, res;

    if (int(TNTC) == 1)
    {
        color1 = COMPAT_TEXTURE( SamplerLUT1, vec2( blue1, green ));
        color2 = COMPAT_TEXTURE( SamplerLUT1, vec2( blue2, green ));
        res = mixfix(color1, color2, mixer);
        float mx = max(res.r, max(res.g, res.b));
        float l  = mix(length(imgColor.rgb), length(res.rgb), max(mx - 0.5, 0.0));
        res.rgb  = mix(imgColor.rgb, res.rgb, clamp(25.0*(mx - 0.02), 0.0, 1.0));
        res.rgb  = normalize(res.rgb + 1e-10)*l;
    }
    else if (int(TNTC) == 2)
    {
        color1 = COMPAT_TEXTURE( SamplerLUT2, vec2( blue1, green ));
        color2 = COMPAT_TEXTURE( SamplerLUT2, vec2( blue2, green ));
        res = mixfix(color1, color2, mixer);
        res.rgb = lift(color, res.rgb);
    }
    else if (int(TNTC) == 3)
    {
        color1 = COMPAT_TEXTURE( SamplerLUT3, vec2( blue1, green ));
        color2 = COMPAT_TEXTURE( SamplerLUT3, vec2( blue2, green ));
        res = mixfix(color1, color2, mixer);
        res.rgb = pow(res.rgb, vec3(1.0/1.20));
        float mx = max(res.r, max(res.g, res.b));
        res.rgb = mix(imgColor.rgb, res.rgb, clamp(25.0*(mx - 0.05), 0.0, 1.0));
        float l = length(imgColor.rgb);
        res.rgb = normalize(res.rgb + 1e-10)*l;
    }
    else if (int(TNTC) == 4) // the extra LUT slot
    {
        color1 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue1, green ));
        color2 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue2, green ));
        res = mixfix(color1, color2, mixer);
    }

    color = mix(imgColor.rgb, res.rgb, min(TNTC, 1.0));
}
And in the preset, of course:
textures = "SamplerLUT1;SamplerLUT2;SamplerLUT3;SamplerLUT4"
SamplerLUT1 = shaders/guest/lut/sony_trinitron1.png
SamplerLUT1_linear = true
SamplerLUT2 = shaders/guest/lut/sony_trinitron2.png
SamplerLUT2_linear = true
SamplerLUT3 = shaders/guest/lut/other1.png
SamplerLUT3_linear = true
SamplerLUT4 = shaders/guest/lut/displayCAL-2.2.png
SamplerLUT4_linear = true
…and increase the parameter range:
#pragma parameter TNTC "LUT Colors" 0.0 0.0 4.0 1.0
#define TNTC params.TNTC
But you should try the updated version first.
You can count the LUT's 'squares' to determine its size; that should work.
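For example, with the tiled layout the code above assumes (texture width = LUT_Size × LUT_Size, height = LUT_Size), a size-64 LUT is a 4096×64 image made up of 64 squares of 64×64 pixels, so the number of squares along the width gives the LUT size directly.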
I still think this is the most convincing photo I’ve ever taken, and now I’m just trying to remember how I did this. This was way back when guest-dr-venom was released. Maybe this is guest-sm? Part of it is the display; I’m using an LCD with standard subpixels in this photo. I really don’t like all these new displays with weird subpixel layouts. Anyway, now I’m trying to recreate this with new guest-dr-venom.
What are your suggestions for keeping the phosphors as well-defined as possible, like in the above photo?
Neither resolves the issue.
I copied the LUT below, could you give it a try? Just load up the SNES Super Aleste start screen with the blue bar gradient and you'll see the blue bar is missing. The size is 64. (By the way, this LUT worked with your venom2 versions from before today, when only changing the path to LUT2 and changing the LUT size to 64.0 in the prepass.)
Here is the default Venom 2 (no grade) with LUT 1 for the color bars and LUT 3 on Mortal Kombat. If I switch my GPU to integrated, I get no errors with LUT 3, but LUT 1 is completely black. I don't get these errors with LUTs from other shaders. LUT 2 plays nice with everything.
What are your suggestions for keeping the phosphors as well-defined as possible, like in the above photo?
I changed the Trinitron 5 & 6 masks in the shader, so you can get something like this. Just a reminder that there are two control parameters for the Trinitron masks.
I copied the LUT below, could you give it a try? Just load up the SNES Super Aleste start screen with the blue bar gradient and you'll see the blue bar is missing. The size is 64. (By the way, this LUT worked with your venom2 versions from before today, when only changing the path to LUT2 and changing the LUT size to 64.0 in the prepass.)
A size of 64 is correct for this LUT. For the old behaviour you can copy this over the LUT 4 code:
color1 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue1, green ));
color2 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue2, green ));
res = mixfix(color1, color2, mixer);
float l = mix(length(imgColor.rgb), length(res.rgb), 0.4);
res.rgb = normalize(res.rgb + 1e-10)*l;
The last two lines are the changes from before… But strangely enough, it works with my adapter. Probably rounding issues.
Here is the default Venom 2 (no grade) with LUT 1 for the color bars and LUT 3 on Mortal Kombat. If I switch my GPU to integrated, I get no errors with LUT 3, but LUT 1 is completely black. I don't get these errors with LUTs from other shaders. LUT 2 plays nice with everything.
Thanks, the issue should be resolved now. It seemed OK in my simple testing, since it seems the drivers caught the exception and covered it up.
A size of 64 is correct for this LUT. For the old behaviour you can copy this over the LUT 4 code:
color1 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue1, green ));
color2 = COMPAT_TEXTURE( SamplerLUT4, vec2( blue2, green ));
res = mixfix(color1, color2, mixer);
float l = mix(length(imgColor.rgb), length(res.rgb), 0.4);
res.rgb = normalize(res.rgb + 1e-10)*l;
The last two lines are the changes from before… But strangely enough, it works with my adapter. Probably rounding issues.
Unfortunately still the same issue.
Any chance you could revert the whole LUT change, or possibly adapt the LUT code from the grade shader? See here:
https://github.com/libretro/slang-shaders/blob/master/misc/grade.slang
I’m assuming the issue is with the LUT code. Or could it possibly be with these changes:
layout(set = 0, binding = 2) uniform sampler2D Source;
layout(set = 0, binding = 3) uniform sampler2D OriginalHistory0;
vec4 imgColor = COMPAT_TEXTURE(OriginalHistory0, vTexCoord.xy);
vec3 aftglow = COMPAT_TEXTURE(Source, vTexCoord.xy).rgb;
float w = 1.0;
if ((imgColor.r + imgColor.g + imgColor.b) > 3.0/255.0) w = 0.0;
float l = length(aftglow);
aftglow = AS*w*normalize(pow(aftglow + 0.001, vec3(sat)))*l;
imgColor.rgb = imgColor.rgb + aftglow;
versus the old:
layout(set = 0, binding = 2) uniform sampler2D StockPass;
layout(set = 0, binding = 3) uniform sampler2D AfterglowPass;
vec4 imgColor = COMPAT_TEXTURE(StockPass, vTexCoord.xy);
vec4 aftglow = COMPAT_TEXTURE(AfterglowPass, vTexCoord.xy);
float w = 1.0-aftglow.w;
float l = length(aftglow.rgb);
aftglow.rgb = AS*w*normalize(pow(aftglow.rgb + 0.01, vec3(sat)))*l;
imgColor.rgb = imgColor.rgb + aftglow.rgb;
OK, last version for today; I also found a squishy bug. Afterglow could have interfered with LUT reading, so I moved it to the end of the shader. Otherwise there isn't much more I can do today.
UPDATE:
https://mega.nz/file/Fg5EnRSL#AE08sZnngrfRDGMuhHjXh-J2weDuoFN00i49mP_lwkQ
Thanks, mask 8 looks nice and the gamma is OK, maybe a little bit dark in my case, although I may need to adjust my monitor. I'm currently testing guest's NTSC shader on NES and Genesis systems and it looks nice, especially with the new mask.
LUT 1 is fixed but LUT 3 is still causing artifacts. (Again, only relevant to us sad 1650 mobile users.) My favorite masks are 3 and 7 for a 4K panel. Mask 8 is a bit chunky at size 2 and very similar to mask 7 at size 1. I'm curious whether it's possible to have a mask size of 1.5 for 1440p resolutions?
Gamma on my display is a bit weird, around 2.0 or so; it's just the best gamma for this display. In theory you should be able to correct this easily with the gamma controls in the shader, but yes, it could also be your monitor.
How do the Trinitron mask controls work? Is it the same as the Lottes mask low/high?
This is something that’s been bothering me for a while: It seems like the phosphors are bleeding into each other way too much with all the aperture grille masks. IMO this looks very unnatural and is difficult to look at after a while. Is there a way to keep only the vertical phosphor bloom and eliminate/adjust the horizontal phosphor bloom? On a real aperture grille CRT the phosphors only bloom vertically.
BTW, this is not specific to guest-dr-venom by any means; it's something I've noticed in all shaders when you use a three-color (RGB) mask.
Maybe this is a good reason to stick to two-color masks (like mask 0 in GDV), where the effect I'm describing is less noticeable/annoying.
How do the Trinitron mask controls work? Is it the same as the Lottes mask low/high?
You can pick two mask strengths, one for darker colors and one for brighter colors, and the colors in between get a custom mask strength. If you set both strengths equal, you get something similar to mask 0, etc. Lottes mask low/high only compensates for brightness loss, and each pixel gets the same mask.
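A minimal sketch of the idea, not the actual shader code (the names maskLow, maskHigh, mask and the brightness measure are placeholders):

// Hypothetical illustration: blend between two mask strengths based on pixel brightness.
float brightness = max(color.r, max(color.g, color.b)); // how bright the pixel is
float strength = mix(maskLow, maskHigh, brightness);    // dark colors -> maskLow, bright colors -> maskHigh
color = mix(color, color * mask, strength);             // apply the Trinitron mask pattern with that strength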
This is something that’s been bothering me for a while: It seems like the phosphors are bleeding into each other way too much with all the aperture grille masks.
There could be three reasons for this. One is display technology: the bleeding should be mitigated on a stiff VA panel and is much more present on a plasma or a poorer-quality IPS, so it's tricky to tell in general. The second reason could be that masks aren't luma compliant, so using the same mask weight on white, for example, results in a different perceptive experience for each color component. That's hard to fix for masks in general, and a reason 2-size masks are more visually pleasing in many cases. The third reason is probably gamma related: with low gamma, masks are stiffer over brighter colors.
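To put rough numbers on the luma point (Rec. 709 weights, purely as an illustration): green carries about 0.72 of perceived luminance, red about 0.21 and blue about 0.07, so an equal-weight mask that dims every subpixel by the same factor removes a very different share of perceived brightness from each color component.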
Is there a way to keep only the vertical phosphor bloom and eliminate/adjust the horizontal phosphor bloom? On a real aperture grille CRT the phosphors only bloom vertically.
You can try an increased mask strength and then compensate with brightboost or bloom. Scanline saturation should also play a role in vertical bloom, but it's more visible on higher-resolution displays. With size 4-5 beams it can only develop partially; the same goes for deconvergence.
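As a rough starting point in a preset, something like the values below (a sketch only; mask_strength is a placeholder name, and all parameter names and ranges should be checked against the shader's #pragma parameter list before using):

mask_strength = "0.90"
brightboost = "1.40"
bloom = "0.20"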
OK, last version for today; I also found a squishy bug. Afterglow could have interfered with LUT reading, so I moved it to the end of the shader. Otherwise there isn't much more I can do today.
UPDATE:
https://mega.nz/file/Fg5EnRSL#AE08sZnngrfRDGMuhHjXh-J2weDuoFN00i49mP_lwkQ
Unfortunately, the LUT issue I described is still not fixed.
It seems the issue may be a result of the changed color temperature code.
Maybe the issue is partially related to your LUT, depending on the method you used to make it?
No, the LUT is fine. It's a standard DisplayCAL "Video 3D LUT for ReShade". It worked with all versions of guest-venom2 until yesterday's update, and it also works fine with grade.slang. I think the issue is with the updated color temperature code in guest-venom2 from yesterday.
EDIT: If I replace "pre-shaders-ntsc.slang" in the current update with the version from last Saturday, the issue is fixed. Since the changes in this file seem to be mainly a rewrite of the color temperature code, I suspect that's what is causing the issue as described.
Hard to tell; it's reading the color from the stock pass and applying the LUT. If the stock pass is replaced, this could be something to consider. The color temperature code is calculated afterwards, and afterglow is applied last. And your LUT is working fine here. But you can really use the older shaders if you don't mind the old desaturation and color temperatures.
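In other words, the intended order in that pass is roughly the following (a conceptual sketch of the description above, not the literal source; applyLUT and colorTemperature are placeholder names, and StockPass is taken from the older snippet):

vec3 c = COMPAT_TEXTURE(StockPass, vTexCoord.xy).rgb; // 1. read the color from the stock pass
c = applyLUT(c);                                      // 2. LUT lookup
c = colorTemperature(c);                              // 3. color temperature / desaturation
c = c + aftglow;                                      // 4. afterglow applied last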
Hmm… This is rather disappointing; can't we get to the bottom of this? I'm on Nvidia, if that helps.
I'd rather not be stuck on a partial update of your shader.
If I do a diff on both files, I see a manageable number of changes. Could you possibly make two or three versions of "pre-shaders-ntsc.slang" that implement separate sections of the changes? That way we could narrow down what's breaking it on Nvidia. Mind you, this is all working fine in grade.slang as well (which also has the D50-D93 type of whitepoint code).
OK, I've nailed it down to the following change.
If I delete this:
c = pow(color, vec3(2.2));
color = c;
And replace this:
mat3 m_out;
if (CS == 0.0) { m_out = ToSRGB; } else
if (CS == 1.0) { m_out = ToDCI; } else
if (CS == 2.0) { m_out = ToAdobe;} else
if (CS == 3.0) { m_out = ToREC; }
with this:
float p;
mat3 m_out;
if (CS == 0.0) { p = 2.4; m_out = ToSRGB; } else
if (CS == 1.0) { p = 2.6; m_out = ToDCI; } else
if (CS == 2.0) { p = 2.2; m_out = ToAdobe;} else
if (CS == 3.0) { p = 2.4; m_out = ToREC; }
color = pow(c, vec3(p));
Then the issue is resolved. Does that help?
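For reference, the reverted section then reads roughly like this, assembled from the snippets above (the surrounding declarations of color, c, CS and the To* matrices are assumed from context):

float p;
mat3 m_out;
if (CS == 0.0) { p = 2.4; m_out = ToSRGB; } else
if (CS == 1.0) { p = 2.6; m_out = ToDCI; } else
if (CS == 2.0) { p = 2.2; m_out = ToAdobe;} else
if (CS == 3.0) { p = 2.4; m_out = ToREC; }
color = pow(c, vec3(p)); // gamma-expand per colorspace here, instead of the earlier fixed pow(color, vec3(2.2))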