Sony Megatron Colour Video Monitor

DCI-P3 isn’t the correct colour gamut for this scenario: we’re using HDR10, which uses the Rec. 2020 colour gamut. I’m not sure why DCI-P3 is being used — probably just as an intermediate colour gamut to do the fake lighting in. You’d then have to tonemap that back out to Rec. 709, which is the opposite of what we’re doing.
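For reference, widening the gamut the way an HDR10 pipeline expects is just a 3x3 matrix applied in linear light. Here's a minimal sketch in C using the standard BT.2087 conversion coefficients — the function and variable names are illustrative, not taken from the actual shader:

```c
#include <stdio.h>

/* Convert a linear-light Rec. 709 RGB triple to Rec. 2020 primaries.
 * Coefficients are the standard ones from ITU-R BT.2087. Inputs must
 * be linear (no gamma/PQ encoding applied yet). */
static void rec709_to_rec2020(const double in[3], double out[3])
{
    static const double m[3][3] = {
        { 0.6274, 0.3293, 0.0433 },
        { 0.0691, 0.9195, 0.0114 },
        { 0.0164, 0.0880, 0.8956 },
    };
    for (int r = 0; r < 3; r++)
        out[r] = m[r][0] * in[0] + m[r][1] * in[1] + m[r][2] * in[2];
}

int main(void)
{
    /* Pure Rec. 709 red lands inside Rec. 2020, not on its red primary. */
    const double red709[3] = { 1.0, 0.0, 0.0 };
    double red2020[3];
    rec709_to_rec2020(red709, red2020);
    printf("%.4f %.4f %.4f\n", red2020[0], red2020[1], red2020[2]);
    return 0;
}
```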

You might want to grab the latest version, as I changed the default contrast last night to be slightly lower — it was producing oversaturated results.

Tbh this is a per-core/per-arcade thing, as a SNES needs a different contrast ratio to, say, the SFA3 arcade.


There’s an issue here: if I try using Rec. 2020, it looks wrong. What I know is that HDR movies are mastered in Rec. 2020 and, as an example, my TV’s panel is DCI-P3 — it always converts Rec. 2020 down to that gamut. Rec. 2020 is a standard that no TV, AFAIK, has ever been able to completely display. I understand the logic, but there is some nuance there.

Most of the artifacts and crosstalk need to happen during the de/modulation, so yeah, they fall under the purview of the NTSC/PAL shaders, as opposed to a general “wire simulation”.
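To illustrate where that crosstalk comes from, here's a minimal sketch assuming a textbook YIQ quadrature model — the constants and names are illustrative, not taken from any particular RetroArch shader. Luma and chroma share one wire, so an imperfect demodulation filter lets each leak into the other:

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define SUBCARRIER_HZ 3579545.0  /* NTSC colour subcarrier, ~3.58 MHz */

/* Modulate one YIQ sample onto a composite signal at time t (seconds).
 * Chroma is quadrature-modulated onto the subcarrier and simply summed
 * with luma -- that sharing of one signal is the source of crosstalk. */
static double composite_encode(double y, double i, double q, double t)
{
    const double w = 2.0 * M_PI * SUBCARRIER_HZ * t;
    return y + i * cos(w) + q * sin(w);
}

int main(void)
{
    /* A sharp luma edge contains energy near the subcarrier frequency,
     * so a demodulator can misread it as chroma (rainbow artifacts),
     * while chroma leaking into the luma path shows up as dot crawl. */
    for (int n = 0; n < 8; n++) {
        double t = n / (4.0 * SUBCARRIER_HZ);  /* 4 samples per cycle */
        printf("%f\n", composite_encode(0.5, 0.2, -0.1, t));
    }
    return 0;
}
```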


Yes, your TV may be internally converting it to DCI-P3, but we’re upstream of that here: the GPU output for HDR is definitely Rec. 2020.

The parameters of the shaders allow for some flexibility (e.g. adjustable sharpness), but results in reality depend not only on the connection but also on the system. E.g. connecting a Wii or a PlayStation 2 via composite should look better than a Genesis/Mega Drive or NES/Famicom, and some systems (PC CGA, Apple II) generate colours via NTSC composite artifacting. You see this reflected to some extent in the various CPU-based video filters that Blargg did (which you can activate in addition to shaders; core-specific ones can differ from the standard video filter options you get in RetroArch).


This makes sense, and I suppose it ties in with what I was saying earlier about having this kind of thing before the inverse tonemapper etc.
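For anyone following along, the ordering being discussed is roughly: run the signal/artifact passes first, then expand to HDR brightness, then PQ-encode for scanout. Here's a minimal sketch of that final PQ (ST 2084) encode step using the standard spec constants — the scaling step standing in for the inverse tonemapper is a placeholder, not the Megatron's actual maths:

```c
#include <math.h>
#include <stdio.h>

/* ST 2084 (PQ) encode: linear luminance in nits -> 0..1 signal value.
 * Constants are the standard ones from the SMPTE ST 2084 spec. */
static double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double y  = nits / 10000.0;        /* PQ is normalised to 10,000 nits */
    double yp = pow(y, m1);
    return pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main(void)
{
    /* Placeholder "inverse tonemap": scale SDR white up to a chosen
     * peak; the real shader does something more considered. */
    const double peak_nits = 700.0;    /* e.g. the Eve Spectrum mentioned below */
    const double sdr       = 1.0;      /* fully bright SDR pixel, linear */
    printf("PQ code value: %f\n", pq_encode(sdr * peak_nits));
    return 0;
}
```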


Yeah, I don’t think it would provide any benefit to run it through the HDR pipeline.

For what it’s worth, I have a couple of shader presets using 100% GDV Mask Type 4 and Scanline Type 2, and they look fine to me even without HDR. They might look even better with HDR though.

Many of the blurry shader presets I see remind me of VHS more than they do of composite or S-Video straight from a console to a TV/monitor, particularly when I see excessive fringing and lower saturation. The slower the recording speed, and the more times you recorded over the same tape, the more those artifacts were amplified.


@MajorPainTheCactus

Maybe you could post some close-up photos showing the RGB subpixels? I’m hoping the bloom obscures the pixel grid to the point where it’s not visible.

I feel that until @guest.r, @HyperspaceMadness, @hunterk, and @Duimon have access to an HDR display, we’ll not be able to make much progress on this question. 🤷 I can provide my views on issues, but nothing can compete with first-hand experience.


OTOH, the HDR thing means there aren’t a lot of problems to solve. You just draw the stuff and it works. All of the complexity and problem-solving was to make up for insufficient brightness!


So what do you expect them to do better on an ultra-bright display? Add more blur? 🙂


I’m going to try it with my monitor. It seems like it supports HDR10, but max brightness is 400 nits, so I’m not sure what it will look like compared to a 600-nit or brighter display.


So it’s using one of the standard pixel grids — currently it defaults to the 4-pixel one: red, yellow, cyan, blue. You can also use the simple 3-pixel one, and I put in a whole load of others to use. I haven’t implemented any slot or shadow mask patterns (because it’s a PVM shader).
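For the curious, an aperture-grille mask like that is essentially a repeating per-column channel weight. A minimal sketch, assuming the 4-pixel red/yellow/cyan/blue pattern described above — the function name and layout are illustrative, not the shader's actual code:

```c
#include <stdio.h>

/* Per-channel weights for one horizontal screen pixel, given the
 * 4-pixel repeating mask: red, yellow, cyan, blue. Each display
 * column only passes the channels its "phosphor" represents. */
static void mask_weights(int screen_x, double rgb_out[3])
{
    static const double pattern[4][3] = {
        { 1.0, 0.0, 0.0 },  /* red    */
        { 1.0, 1.0, 0.0 },  /* yellow */
        { 0.0, 1.0, 1.0 },  /* cyan   */
        { 0.0, 0.0, 1.0 },  /* blue   */
    };
    const double *w = pattern[screen_x % 4];
    for (int c = 0; c < 3; c++)
        rgb_out[c] = w[c];
}

int main(void)
{
    /* Multiply the source pixel's colour by these weights per column;
     * in HDR the brightness lost to the dark subpixels is bought back
     * with raw nits rather than a bloom pass. */
    for (int x = 0; x < 4; x++) {
        double w[3];
        mask_weights(x, w);
        printf("x=%d -> %.0f %.0f %.0f\n", x, w[0], w[1], w[2]);
    }
    return 0;
}
```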

However, I’m not quite sure what you’re after, as on my actual PVM the pixel grid is visible — it just depends on your viewing distance, the size of the screen, and the colour/brightness of the pixel. In fact, as I’m sure you know, it’s what gives the CRT (or at least PVM) look.

What I will say is that at about 3 feet the screen looks about the same as my PVM, in that on both it starts to get difficult to distinguish the phosphor grid.

Using 4 pixels per triad at 4K gives a lower horizontal resolution — 540TVL, as I think you pointed out — so this is going to show the phosphor grid more than my 600TVL PVM.
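For anyone wondering where the 540TVL figure comes from: TVL counts resolvable lines across a width equal to the picture height, so it falls straight out of the panel's vertical resolution and the mask width. A quick worked example, assuming that definition (variable names are illustrative):

```c
#include <stdio.h>

int main(void)
{
    /* TVL = lines resolvable across a width equal to the picture
     * height, so divide the panel height by the mask triad width. */
    const int panel_height_px = 2160;  /* 4K UHD vertical resolution  */
    const int triad_width_px  = 4;     /* red/yellow/cyan/blue mask   */
    printf("%d TVL\n", panel_height_px / triad_width_px);  /* 540 */
    return 0;
}
```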

Also, if you want higher-end PVMs like the 20L4, or BVMs, which have more black between scanlines, then I think you’re going to need more brightness. Currently, on my Eve Spectrum, which is about 700 nits, this shader is much duller than my actual 2730 (and I’d assume those sets were as bright as or brighter than that much older model).


First of all, it might make sense to present your issues, and then we can see if some progress can be made?

I mean no offense. I mean that, from an end-user point of view, once we get some of the most prevalent shader authors involved in the latest technology, we can probably gain access to better shaders.

Look, we can always improve shaders, for sure — this is never going to be a ‘done’ thing. Someone somewhere is going to like one look better than another, and so there is no shader that will satisfy all tastes.

What would be helpful is to describe what you mean by ‘from a user point of view’ — are you after: slot/shadow masks, curved geometry, convergence problems, NTSC/PAL signal artifacts, RF/composite/S-video artifacts, phosphor afterglow, more colour control, or more control of what is already there?

You have to bear in mind this is a very different way of doing things than before, and there isn’t necessarily going to be a mapping from all the things done in SDR shaders to HDR ones.

You can of course implement all the blooms and noise generators you like in HDR as well, but other than for replicating a poor signal they probably aren’t needed as much. To put it another way: what part of the CRT screen are you trying to replicate that you aren’t already doing with brightness?


You already have many options; I don’t really get what you’re trying to say here, or whether you’re just starting some flame about “prevalent authors and better shaders”, because that’s a bit disrespectful, you know, and looks like bait for a flame war. After all, you could always use Guest.r-Dr.Venom-advanced, disable what you don’t need, and go on. If you expect it to look exactly like a BVM monitor just because it’s 4K and bright, you’re probably out of luck lol.

PS: if you want a BVM, just buy a 1280x1024 CRT PC monitor. It will cost you $50. Then apply an interlacing shader 😉


At a slight tangent: has anybody seen a BVM in person (I haven’t)? I can imagine the scanlines are super bright, as the gaps between them look massive — I wouldn’t be surprised if it’s a 25% (scanline) / 75% (blank) ratio, and that’s not taking into account the scanline being brighter at the centre than at the edges.
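A quick back-of-the-envelope check of what a 25/75 duty cycle like that would imply for brightness, assuming simple area-averaging (the numbers are illustrative, not measurements):

```c
#include <stdio.h>

int main(void)
{
    /* If only 25% of the raster is lit, matching a full-raster image's
     * average brightness needs 1/0.25 = 4x the peak luminance. */
    const double duty        = 0.25;   /* lit fraction of the raster */
    const double sdr_average = 100.0;  /* nits for a full SDR raster */
    printf("required peak: %.0f nits\n", sdr_average / duty);  /* 400 */
    return 0;
}
```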

You sadly can’t get this across in photos, at least ones taken with camera phones and displayed back on phone screens.