Integer scale on or off?

Yeah, it’s really subtle. Here’s a zoom-in of your pic with the brightness and contrast increased:

Ideally, they should all look like the first dark line. That is, one dark line in the middle with equally lighter lines above and below. Skip down 3 scanlines and it’s just 2 mid-gray lines next to each other. Skip down a couple more and we’re back to normal. Rinse/repeat all the way down the screen.

It’s an effect caused by averaging the pixels into a non-integer multiple of the source (that is, 4.5x at 1080p). The remainder comes out as uneven gradients.
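
If it helps to see the numbers, here’s a small illustrative Python sketch (not part of RetroArch, just napkin math) of how 240 source lines map onto 1080 screen lines at 4.5x:

# Each source line covers either 4 or 5 screen pixels at a 4.5x scale.
scale = 1080 / 240  # 4.5
thicknesses = []
for line in range(240):
    start = int(line * scale)
    end = int((line + 1) * scale)
    thicknesses.append(end - start)

print(thicknesses[:8])  # [4, 5, 4, 5, 4, 5, 4, 5] -- alternating line heights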

EDIT: here’s the same image with the green-tinted lines taken out to make it easier to see the effect:

Now I see the difference perfectly. :sunglasses:

Very interesting. Thanks for making it clear.

I tried your settings with integer scaling turned on, using a 1080p screen, but the aspect ratio looks 16:9 rather than 4:3. How do I make it 4:3?

Hunterk, please see if I understood correctly.

NES output is 256 x 240. On a 1080p screen with integer scale on, the maximum resolution will be 1792 (256 x 7) x 960 (240 x 4). To avoid black borders, I have to increase the values beyond 1920 x 1080, resulting in 2048 (256 x 8) x 1200 (240 x 5). But that fills the screen at 16:9. How do I maintain a 4:3 aspect?
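
To put the arithmetic in code form, here’s an illustrative Python sketch (just the napkin math, nothing RetroArch-specific) of those integer-scale limits:

# Largest integer scale per axis that still fits on a 1920x1080 screen.
src_w, src_h = 256, 240
scr_w, scr_h = 1920, 1080

max_x = scr_w // src_w  # 7 -> 1792 pixels wide
max_y = scr_h // src_h  # 4 -> 960 pixels tall

print(max_x * src_w, max_y * src_h)  # 1792 960
# The next steps up (8x = 2048 wide, 5x = 1200 tall) overflow the screen,
# which removes the borders but crops the image.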

Thanks!

Lol! I think we just asked the same question.

This is what I was describing (that is, still some pillarboxing but no letterboxing, and even scanlines):

There’s no way to fill a 16:9 screen while keeping a ~4:3 aspect ratio without cutting off a lot of the picture at the top and bottom. However, we do have support for overlays, which you can use to add images around/on top of the screen, and border shaders that can apply moving designs to the unused area.

Yes, ndavis82. :laughing:

Makes sense. I still can’t really tell the difference with integer scaling off; the scanlines still look crisp and nothing looks warped, and at least with scaling off the screen always maintains a 4:3 ratio. I think I’ll just keep it that way. Thanks for taking the time to explain.

ndavis82, I decided to keep integer scale “on” with a custom aspect ratio: 1536 x 1120.

Better than the fully stretched image.

:smiley:

Yes, but it doesn’t work well for every system. I tried it on Neo Geo and it was cutting off too much of the top and bottom.

I only tried it on the NES.

It works well for anything that was designed around consoles and consumer TVs. Neo Geo games were designed around arcade monitors and would have that same area (sometimes more!) cut off if connected to a consumer TV.

Good point. I suppose I could do a core override for Neo Geo and leave the others at Custom. Thanks.

I play Neo Geo at 5x on a 1080p screen without having anything important cut off. The trick is to actually deactivate integer scale and enter the values manually. If you raise the Y position 20 pixels above where it would normally fall, the information at the bottom (credits, level) reappears entirely without cutting anything important off the top of the screen.

Try these settings:

Aspect Ratio: Custom
Integer Scale: Off
X Pos.: 160
Y Pos.: -40
Width: 1600 (5x)
Height: 1120 (5x)
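
Those values are just arithmetic; here’s an illustrative Python sketch (the variable names are made up) of how they’re derived:

# Deriving the custom viewport values for Neo Geo (320x224) at 5x on 1080p.
src_w, src_h = 320, 224
scale = 5
scr_w, scr_h = 1920, 1080

width, height = src_w * scale, src_h * scale  # 1600 x 1120
x_pos = (scr_w - width) // 2                  # 160, centered horizontally
overflow = height - scr_h                     # 40 lines don't fit vertically
y_pos = -overflow                             # -40: shifts the image up 20 px
                                              # past centered (-20), keeping
                                              # the bottom HUD fully visible

print(x_pos, y_pos, width, height)            # 160 -40 1600 1120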

Why do CRT shaders look bad without integer scaling to begin with? Aren’t they aware of the target size of the image? Why would they scale their scanlines instead of rendering them at the target resolution?

In other words, isn’t it the shader that does the upscaling? Or is it RA that does it?

They don’t look bad in general. They are aware of the target size of the image, but without integer scaling one line ends up 4 pixels thick, another 5 pixels, and so on, which can lead to an uneven look. With integer scaling, every line is guaranteed the same thickness, which gives an optimal look, if you don’t mind some black bars at the top/bottom of the screen.

Using a display with greater vertical resolution, e.g. 1440p, usually solves the problem. Even when using integer scaling, the amount the image has to shrink becomes very small.

@RealNC The shaders must draw their magic taking into account the pixels of the original image. For example, if you have a 240p image in a NES game, the shader will have to create scanlines between each and every one of those 240 lines. But later on, the final image has to be drawn to your screen, be it 1080p, 4K, or whatever. If your screen’s output resolution (1080p for example) is not a multiple of the original image’s, the scaling the shader is doing will never be perfect, because not every scanline will be able to use the same number of pixels on your TV.

Sorry if I’m not explaining myself well, but it’s a simple concept: your screen can only output a single color per pixel. So, to draw the NES image with all of the shader’s additions (scanlines and other effects) evenly across all of the lines, it needs to use the same number of real pixels on your TV for each line. If you are not using integer scaling, it can’t do that, so it has to use some of the pixels (on your TV) to draw a blend, like hunterk showed in a previous post.
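
To show the blend numerically, here’s a hypothetical Python sketch that draws a crisp 240-line scanline pattern and box-averages it down to 1080 output lines; the in-between values are the gray blends:

# A crisp pattern: the bottom half of each source line is dark (0.3),
# the top half bright (1.0). Averaging it into 1080 output lines produces
# intermediate grays wherever an output line straddles a boundary.
SUB = 9  # subsamples per output line
src_lines, out_lines = 240, 1080

values = []
for y_out in range(out_lines):
    total = 0.0
    for s in range(SUB):
        y_src = (y_out + (s + 0.5) / SUB) * src_lines / out_lines
        total += 0.3 if (y_src % 1.0) >= 0.5 else 1.0
    values.append(round(total / SUB, 2))

# roughly [1.0, 1.0, 0.46, 0.3, 0.69, 1.0, 0.84, 0.3, 0.3]:
# uneven grays, plus two adjacent dark lines, just like the zoomed screenshot
print(values[:9])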

Recently I upgraded my gaming TV to a 4K screen, and the artifacts caused by non-integer scales are less visible (practically invisible on low-res ~240p content).

Of course, if you have a low-res TV or monitor and you don’t like the integer-scale option or the artifacts on the scanlines, you could still use an overlay instead of a shader. You can make an image (PNG) in Photoshop or GIMP with black lines between transparent lines, and apply this image as an overlay in RetroArch. It will make the scanlines perfectly even, but they will not be synced with the lines that separate vertical pixels in your games (some black lines will fall in the middle of a pixel, for example).
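
If anyone wants to try that without opening an image editor, here’s an illustrative Python/Pillow sketch (the file name and line spacing are arbitrary choices) that generates such an overlay:

# Generate a 1920x1080 scanline overlay: transparent everywhere except
# for one opaque black row every PERIOD screen pixels.
from PIL import Image

WIDTH, HEIGHT = 1920, 1080
PERIOD = 4  # arbitrary: one black line every 4 screen pixels

overlay = Image.new("RGBA", (WIDTH, HEIGHT), (0, 0, 0, 0))
pixels = overlay.load()
for y in range(0, HEIGHT, PERIOD):
    for x in range(WIDTH):
        pixels[x, y] = (0, 0, 0, 255)

overlay.save("scanline_overlay.png")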

Greetings.

I do use a 1440p display, but emulating systems like the GameCube results in huge black bars with integer scaling. And when cropping the “garbage pixel” areas on the NES or SNES, you don’t have 240p anymore. It’s impossible to get a usable screen size. It seems that the most logical thing to do in this case isn’t possible: supersample/downscale, then feed that to the shaders.

DOSBox ECE does this, although only on the CPU (there are no shaders), so it’s slow. But basically, you get results that look very close to integer scaling. Example:

Integer scaling: https://i.imgur.com/eWNFvUG.png
Supersampling: https://i.imgur.com/dIQOuwf.png

If RA could do this (on the GPU though), then you could supersample a 256x224 SNES source (cropped) to 320x240 and feed that to the shader, which would then produce the final 1440p integer scaled result. That would give you a fullscreen 4:3 image.

So basically take 256x224, integer scale it to 1024x896 (or whatever works best), then use a suitable downscaling filter (something with good sharpness) to reduce it down to 320x240, and then give that to the shader. The resulting image quality would obviously not be exactly the same as integer scaling, but probably much better than the current plain non-integer scaling.
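
As a rough illustration of that idea outside RetroArch (the file names and filter choices are just assumptions), Pillow can mimic the pipeline:

# Supersample a 256x224 frame: integer-upscale with nearest-neighbor,
# then downscale to 320x240 with a reasonably sharp filter.
from PIL import Image

frame = Image.open("snes_frame.png")  # assumed to be a 256x224 capture
supersampled = frame.resize((1024, 896), Image.NEAREST)      # 4x integer scale
downscaled = supersampled.resize((320, 240), Image.LANCZOS)  # sharp downscale
downscaled.save("snes_frame_320x240.png")
# A shader could then integer-scale 320x240 by 6x to 1920x1440,
# filling the full height of a 1440p screen at 4:3.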

I understand what you mean, but it’s still a compromise. It can cause some native pixels/lines to get bigger, but not all of them. Sometimes it’s interesting to use another form of supersampling, which I found gives good results with WinUAE. The idea is to increase the vertical target resolution and then downsample. Due to the discrete nature of displays, there is no general perfect solution.

Here’s an example of how to modify the preset:

# Pass 0 renders at full viewport width but twice the viewport height;
# the final scale back to the screen then downsamples it vertically.
scale_type0 = viewport
scale_x0 = 1.0
scale_y0 = 2.0

It should play well with most masks, like in crt-geom, hyllian, aperture… The drawback is that speed gets worse as we increase the supersampling resolution.