There really isn’t a need when there are so many NTSC Shaders and presets already.
The Blargg Video filters always tend to alter the colours slightly.
You’re using my first generation video filter presets. Why don’t you grab my CyberLab Custom Blargg NTSC Video Presets from the first post of this thread and try my third gen stuff?
Do note that Blargg video filters do not work with all cores and all games. It has to do with colour formats: the filters only support certain pixel formats, so cores or games that output a different format won't apply them.
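On that note, video filters are normally loaded from the Video settings menu, but they can also be pointed to directly in retroarch.cfg. A minimal sketch; the path and filename below are illustrative, not an actual file from my pack:

```
# retroarch.cfg -- point video_filter at a .filt preset
video_filter = "C:/RetroArch/filters/video/CyberLab_Example_Composite.filt"
```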
I personally have not tested them much with PSX Games but others have and have demonstrated that they work.
They didn’t work with the games I tested using Beetle PSX but @RetroCrisis showed them running well in a video in which he used Beetle PSX-HW.
Neutrality in terms of Colour and Brightness was one of my goals for my third gen video filter presets, so you can try them out and let me know how it goes.
A picture or screenshot would help me understand the issue you're experiencing.
Are you combining these with shaders? If so there are many shader presets which already have NTSC filtering built in so there’s no need to use an additional video filter.
Of course I have the latest version of CRT filters, I forgot to mention it. By the way, I think they are great, very good job.
So far, I use CRT filters for games on the Sega Genesis, SNES, and NES.
I noticed that a preset like:
Blargg_SNES_Pseudo_PCE_PSX_SNES_COMPOSITE.filt
Also works with PSX games (Swanstation core)
However, this yellow tint is quite visible in many games.
Sometimes the difference is very small, for example in Castlevania Chronicle:
But other times, like here in Metal Gear Solid, it's very visible. Moreover, this yellow tint makes the dithering even more visible.
Yes, I'm the one person out of 100 who likes dithering on PSX, as long as it's not too noticeable.
As for shaders, it's hard to find something for me. Too many of them have very aggressive scanlines, etc. So far, one of the few that I like is from koko-aio (e.g. tv_pal_my_old).
Although that preset has an option to enable NTSC, it does not look the same as the Blargg filters.
Edit:
I also tested Beetle PSX HW. Both games ran on the glcore and Vulkan video drivers. Whether the video filters were turned on or not, I didn't notice any difference. They just don't work with Beetle PSX.
I use Windows 11.
I don’t doubt you. Perhaps @RetroCrisis might be able to verify at some point.
He showed off some PSX blending in a video and when I asked which core he used, he said he always uses Beetle PSX-HW.
Anyway, you might be able to mitigate that colour shift to the warm side by adjusting the white point in your shader parameters more to the cool side, i.e. by increasing it.
You can probably use some test patterns to assist. 240p Test Suite might come in handy.
I tried adjusting the Hue setting in the Blargg Filter parameters but it didn't do what I wanted, so I just left that alone.
You can load almost any shader preset and easily lower the scanline intensity though. You can lower the Sharpness and NTSC/GTU resolution as well if you prefer things to be less sharp.
Also, choosing a finer Mask size, all else being equal, usually makes things softer. See the sketch below for one way to save such tweaks.
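If you want tweaks like these to survive restarts, you can save them as a simple preset with parameter overrides. Here's a minimal sketch assuming crt-guest-advanced-ntsc as the base; the parameter names (wp_temperature, beam_min, masksize) are taken from crt-guest-advanced and the values are purely illustrative, so check the Shader Parameters menu for the names and ranges your preset actually exposes:

```
#reference "shaders_slang/crt/crt-guest-advanced-ntsc.slangp"
# nudge the white point towards the cool side (higher Kelvin)
wp_temperature = "7200.0"
# soften the scanlines a touch (illustrative value)
beam_min = "1.10"
# use a finer mask size
masksize = "1.0"
```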
Have you seen my Le’Sarsh_4K_Optimized presets or my latest Near Field presets though?
This may be a rookie question, but it looks like with the Cyber shaders, lower line resolutions (224, 240) result in much larger scanlines than higher line resolutions (440, 480). The scanlines essentially take up half the screen real estate, which is distracting on a large screen sitting close.
Is that accurate to how CRTs used to display 240p content? Was it inherently darker than 480 content because of that? Or is there a setting I may be missing in the shader config? Or am I just sitting too close to my monitor?
Shader presets, rather; I wouldn't want to take any credit for the excellent work of the shader developers.
To a large extent, but remember that screens as large as the ones we're accustomed to today were not as common back in the heyday of the CRT.
Plus many parents would have told their kids not to sit too close to the screen.
In general, the lower the resolution of the CRT, the less pronounced the gaps between the scanlines, and likewise, the smaller the screen, the less noticeable the gaps between the scanlines would appear.
On large-screen TVs displaying 224/240p content, if you sat close enough you could definitely notice the scanlines and the coarseness/extra jaggies compared to a smaller screen.
No, because the makers of the consoles and computers which would have output this content would have known these things and calibrated the output of their machines accordingly.
Have you tried my new CyberLab Megatron NX Near Field presets?
This is exactly what I’ve done to create them.
You see, when I created the vast majority of my presets, I did so while sitting at a far viewing distance using a 55" TV. A lot can happen to the image by the time it reaches your eyes at that distance.
Things like edges that might seem harsh and jaggy might tend to blend by the time your eyes resolve what you’re looking at.
Colours and brightness that might seem retina-searing and loud might actually appear a little less intense, yet almost perfectly saturated, at that distance as well.
Lastly, the effects that sell the CRT illusion, which for some might be the scanlines and the mask, might be just visible enough to notice at that distance.
Then there’s the whole idea of an OLED TV’s infinite contrast, which might allow one to get away with a less bright image overall while still providing a stellar viewing experience.
The final and probably most determining factor in why my presets generally looked the way they did was this thing they call focus. Anything less sharp than what you see in those presets looked like a blurry mess to me. Maybe it's my eyes.
I've recently changed my viewing setup, so I can't really enjoy most of my previous presets in the same way, and that's why I've created these new Near Field Presets.
I did attempt to make some less harsh presets which I called Le’Sarsh_4K_Optimized. Have you tried them?
I also followed that with my “Fine” presets.
All of them are quite customizable though, so if you need assistance adjusting any of them that you might otherwise like, feel free to post an example and we can discuss what might need to be done.
The last thing I’ll say is that one of the trademarks of my presets is being able to go close to the screen and see the phosphors like one might have remembered on a real CRT.
Besides the loss of brightness due to the scanlines, this further darkens the image considerably.
Traditional CRT shaders have tricks to get things brighter, but they always compromise the accuracy of the Mask or the Scanlines to achieve those results; or should I say, they compromise the integrity of the Mask and Scanline effect, since what I'm doing is not geared towards any type of reference-level accuracy.
I don't use too many of those tricks, but there are limits to how high I can push the Post CRT Brightness or Gamma Correct, because sooner or later the image will end up clipping.
The same goes for Saturation.
The good news is that HDR or pushing a bright SDR display to its brightness limits can mitigate some of these issues and this is the main reason why I have fully embraced the Sony Megatron Color Video Monitor shader.
However, RetroArch has HDR support built in, and it's a global effect that can be used with almost any other shader, so it can be used to enhance my presets, or other older, inherently darker shader presets, provided your display hardware is up to the task and you configure it properly. A sketch of the relevant settings is below.
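For reference, the relevant switches live in the Video settings and map to retroarch.cfg entries like the ones below; the nit values are illustrative starting points rather than recommendations for any particular display:

```
# retroarch.cfg -- RetroArch's global HDR pipeline
video_hdr_enable = "true"
# peak brightness your panel can actually hit (illustrative)
video_hdr_max_nits = "1000.0"
# brightness of "paper white" (illustrative)
video_hdr_paper_white_nits = "200.0"
video_hdr_expand_gamut = "true"
```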
I’m working on a 42" HDR OLED (LG C2), on my desk, so about 12"-18" away.
I have your near-field on my box, but haven’t gotten them to load yet. Config issue I’ll work on tonight.
I’m working on SH1 on PSX. It’s a challenge because the geometry and textures are both potato, so scaling up breaks the immersion. But the art is also amazing, so trying to extract the most out of it at native res is a fun challenge.
In-game is 224 lines, while menu and map screens are both 480 lines.
Here are a couple of examples of the scanline difference at 224 and 480 (I can't recall the specific preset I was using here).
As an aside, I took one of your Megatron presets, disabled scanlines, and then rotated the phosphors 90 degrees. The net effect is very mild horizontal scanlines (like the damper wires on a Trinitron?), which helps with the prominent horizontal aliasing at 224 without having to upscale. Lighting is too dark, and it loses maybe too much detail, but the blend is phenomenal and it works great at near-field.
I’m curious as to why? How are you sure that this is the “correct” brightness?
Don’t you think that the devs of the game would have expected some brightness to be lost via scanlines or rather tested their game on a CRT with actual scanlines and adjusted the output of the game to suit?
Also, what preset is this you’re using? Have you delved into my CyberLab Neo-GX stuff?
There are many ways to achieve a brighter image using the shader parameters but since you’re using an HDR capable display, one of the easiest and best methods would be for you to simply enable HDR and follow the steps in one of my previous posts.
Thanks for your reply and suggestions.
When it comes to shaders, there are many interesting ones and there is always something to choose from. However, what I care most about are image filters that can simulate the specific "interference", "artifacts" and "noise" generated by a composite cable connected to the TV.
Unfortunately, although I have tested various CRT shaders, I have never found one that generated these as well as the Blargg filters.
Here’s an example of what I’m talking about:
A CRT composite image in the Snes9x emulator with the Blargg filter enabled. The image is very faithful to a real CRT TV.
I know it's probably not possible, but it would be great to see such filters for PSX, C64 and Amiga one day.
CRT-GUEST-ADVANCED-NTSC can do all of this arguably just as well as Blargg NTSC Video filters.
Makes me wonder what exactly you've tested, how long ago you tested it and how many shader presets you tried. It also makes me wonder if you adjusted any of the "knobs" before coming to this conclusion.
I’ve seen this video before. It’s not really what I would consider reference quality probably due to the compression artifacts and low resolution.
Just remember things are always in motion in this field and there have been great strides in NTSC emulation over the past 12 months.
One of my few remaining issues with the CRT-Guest-Advanced-NTSC solution is some strange combing-like horizontal artifacts that I notice when things like the rainbow effect in Sega Genesis games are also present.
I can't remember if that happens on real hardware. I'm also not sure if it would still be visible if I enabled merge fields, but then that wouldn't be accurate to the Sega Genesis.
I don't think I've noticed it with the Blargg Genesis Composite video filter, but the default setting used in Genesis Plus GX, as well as in Nestopia and Mesen, has too much dot crawl for my taste.
I have so many NTSC presets, including my entire line of CyberLab Neo-GX, CyberLab Megatron and Megatron NX presets, and to me the type of look you've described isn't as exclusive and elusive as you make it seem.
Also, there isn't any single look or image that is representative of what a CRT looks like, as they have a wide variety of characteristics depending on the make, model, type of tube, mask, resolution, calibration, age and more.
No preset for this screenshot, just raw pixels from the emulator, with 32-bit rendering and no dithering.
Great question about developer brightness intent. It’s difficult to get a perfect gauge of brightness. In addition to raw pixels, I’ve been looking at (really rough) CRT captures to try to get a ballpark reference. I played this game a lot as a young man, so I’ve also got some (fuzzy) memory reference.
@keylimesoda Just use CyberLab Megatron NX 4K HDR Game PSX Composite Slot Mask Smooth Near Field.slangp or one of my PSX NTSC, Neo-GX PSX or CRT-Royale PSX presets and call it a day.
Be sure to calibrate your display brightness using the in game Options menu.
```
[ERROR] [slang]: Texture name 'PrePass' not found in semantic map, Probably the texture name or pass alias is not defined in the preset (Non-semantic textures not supported yet)
```
This is very likely because your preset was saved as a full preset, but the shaders have since been updated and now expect different things to be in the preset.
For the future, I would recommend saving your presets as simple presets, which would avoid this issue.
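For anyone unfamiliar with the distinction: a full preset copies every pass, texture and alias into your file, so it breaks when the upstream shader chain changes, while a simple preset just points at the upstream .slangp and records your overrides. A saved simple preset looks something like this (the referenced path and parameter are illustrative):

```
#reference "shaders_slang/crt/crt-guest-advanced.slangp"
# only your changed parameters are stored,
# so upstream shader updates keep working
glow = "0.05"
```

In the menu this corresponds to leaving the "Simple Presets" option enabled when saving, if I recall the label correctly.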
I don’t think that’s how developers did things.
It would be impossible to account for all sorts of CRTs that way.
SDR was the standard so they mastered for 80-120 nits.
Most CRTs can hit this brightness level.
And they obviously made their games with SD sets in mind, considering their games were typically around 320 pixels wide at best, not with higher-resolution/higher-TVL tubes, because what most consumers had at home were sets of between 200 and 400 lines.
When retro devs say they accounted for scanlines, they are referring to the active scanlines, and they are talking about stuff like color bleed (also known as the half-dot technique by Japanese devs), used for example in arcade games like Mortal Kombat or Metal Slug, where the devs would sometimes use particular CRTs to get the image they wanted.
Nobody actually wanted the blank scanlines in their games; they were considered an artifact. At bigger screen sizes and on sharper screens the blank scanlines were a lot more prominent, but that was the trade-off for making, say, an arcade game look more impressive via screen size.
Thanks, I used to do that, but then I wanted my newer presets to be more or less standalone and in a steady state.
What I think actually happened in this case is that when I started building my latest preset pack, I started from a preset that referenced the old folder structure rather than the new one.
I knew I had encountered that and rectified it in past preset packs so I didn’t expect it to rear its head again.
I think it should work well now for users with new installations since I updated all of my presets using the new folder structure.
My hope is to one day include self-contained shaders along with my presets, as you do with the Mega Bezel. That way the output would remain the same once a user is on a particular version of my preset pack.
I think it all boils down to your usage scenario and preference. My old OLED eventually started to get some burn-in of scanlines.
I don't know if I would go OLED again, although so many great strides have been made in burn-in prevention that the new OLEDs may not be at as much risk, relatively speaking.
Then there is the brightness advantage: shaders and presets like the Sony Megatron Color Video Monitor really shine when given as much brightness as possible.
Then, when it comes to everything else, movies for example, this particular TV is supposed to provide an "OLED-like" experience in terms of micro contrast while beating any OLED in the highlights.
I think it seems good enough to at least give it a try.
At least the Sony, that is, due to its next-generation dimming processing, which seems to provide a high level of granular control.
More than any other Mini-LED I’ve seen or read about to date.
Then as an added bonus, it might even have an RGB subpixel layout.
So the bottom line is that it should be better in certain areas and worse in certain areas compared to OLED but in areas where the previous generations of LED/LCD tech have fallen flat against OLED, this TV should at least be somewhat competitive.
Remember, not all CRT shaders are the same. A shader like Sony Megatron Color Video Monitor (or any other shader that is set up to use maximum mask strength and scanline opacity) is going to end up with an inherently darker image.
High brightness, aided by HDR or otherwise is the current brute force method to counter that and the results are quite impressive.
Mini-LED TVs, while not being as precise as OLED TVs, can also achieve true blacks, within the limits of their dimming zones.
I don’t think the latency factor will be much of an issue unless the screen display is really horrible at it.
An OLED TV's near-instant pixel response can actually be a disadvantage in some scenarios, leading to a more jittery/juddery experience due to frames not transitioning and blending smoothly into one another.
This is noticeable on things like slow scrolling end credits in movies for example.
While the Mini-LED won't have per-pixel perfect black levels, the contrast ratio should be just as impressive, if not more so, due to having true blacks in the areas which should be black while at the same time being able to push higher peak and sustained brightness in the highlights.
So theoretically, both should be quite impressive, but it all depends on whether the local dimming algorithm can keep up with the action and just works, without being noticeably late to kick in or making a mess of things.
Sony has shown that their algorithm/processing is a cut above the rest by demonstrating a highly detailed greyscale map of the image using their Mini-LED backlighting system, with good granularity from light to dark per dimming zone.
Remember something only has to be good enough by a certain threshold before we reach the point of diminishing returns.
Now this is all hype and speculation until someone or some people actually go out and try the TVs themselves in these scenarios.
High brightness headroom is also extremely important with CRT shaders if you want to use BFI (black frame insertion), since every inserted black frame cuts the overall brightness of the image.
Besides the overall motion clarity benefit, there's the fact that the mask tends to smear into a blur on sample-and-hold display technology when things start scrolling and moving fast on the screen.
Higher levels of BFI reduce this effect while improving motion clarity, so that's a possible win for the Sony Bravia 9 Mini-LED in theory, but we will have to see who is brave enough to put it into practice.
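If anyone wants to experiment, BFI is a regular RetroArch video setting (under the Video Synchronization options), and in retroarch.cfg it looks something like the line below. It needs a display refresh rate that's a multiple of the content's (e.g. 120Hz for 60fps content), and the exact name/values can vary between RetroArch versions:

```
# retroarch.cfg -- insert one black frame per rendered frame (120Hz display)
video_black_frame_insertion = "1"
```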
@Nesguy bought a 1,000-nit Mini-LED monitor and was not impressed by the local dimming with CRT shaders, but the high brightness was still sufficient to provide him with a wonderful experience overall, to the point where he no longer complains err…posts often about negative or unsatisfactory experiences.
This Sony Bravia 9 is supposed to represent the next level or generation of this local dimming processing but it has been hyped and marketed quite a lot so we’ll see.
Then there’s the no burn-in and possibly more accurate subpixel layout.
I would definitely give it a try if I was in the market for a new TV but either way, you should be satisfied.
Just remember this: with HDR and high brightness coupled with highly intense scanline gaps, you introduce an almost worst-case scenario for uneven wear of OLED subpixels, which is what I think eventually took place with my TV.
Modern OLED TVs are supposedly more robust but who is to say it can’t happen with them as well?
@hunterk provided me with a shader that shifts the entire screen up and down by 0.5 pixels over time. In theory this should help the otherwise unused subpixels get some use, but it might need some fine tuning: if used in timer mode the pixel shift can be jarring and noticeable during gameplay, and unfortunately it doesn't work with Sony Megatron Color Video Monitor out of the box at the moment.
Someone who knows more about these things than myself might have to figure that one out. A rough sketch of the idea is below.
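For anyone curious, here's a minimal sketch of what such a pass could look like as a standalone slang shader. This is my own illustrative reconstruction of the idea, not @hunterk's actual shader, and the period constant is an arbitrary choice:

```
#version 450

// Illustrative anti-burn-in pass: alternates a half-texel vertical offset
// every `period` frames so scanline gaps don't always cover the same rows.

layout(push_constant) uniform Push
{
    vec4 SourceSize;   // (width, height, 1/width, 1/height)
    uint FrameCount;
} params;

layout(std140, set = 0, binding = 0) uniform UBO
{
    mat4 MVP;
} global;

#pragma stage vertex
layout(location = 0) in vec4 Position;
layout(location = 1) in vec2 TexCoord;
layout(location = 0) out vec2 vTexCoord;

void main()
{
    gl_Position = global.MVP * Position;
    vTexCoord = TexCoord;
}

#pragma stage fragment
layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 FragColor;
layout(set = 0, binding = 2) uniform sampler2D Source;

void main()
{
    float period = 3600.0; // ~1 minute at 60 fps; arbitrary
    // dir flips between -1.0 and +1.0 once per period
    float dir = mod(floor(float(params.FrameCount) / period), 2.0) * 2.0 - 1.0;
    vec2 shifted = vTexCoord + vec2(0.0, dir * 0.5 * params.SourceSize.w);
    FragColor = texture(Source, shifted);
}
```

As a final pass this would nudge the whole output up or down by half a texel of the previous pass; getting it to cooperate with Sony Megatron Color Video Monitor presumably means doing the shift after its HDR output stage, which may be where the out-of-the-box incompatibility comes from.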