Correct Geometry - Aspect Ratio for different systems

[QUOTE=Dogway;21286]There are many things off with that approach.

Setting integer scaling ON means upscaling stops at the largest integer multiple that fits; in other words, you will be left with black margins. On top of that, by not cropping “overscan” (technically NAB) you are thickening those borders. In the end you are playing on a 50" display, at best, with the experience of a 32" one.

Now, to fix that, you say you go to the TV’s service menu (you should never touch these, tbh) and stretch the image with whatever the TV’s internal scaling method is (nothing that RetroArch’s scaling shaders can’t do, by the way). And since game resolutions (integer scaling) and overscan (borders) vary on a per-system or even per-game basis, you end up having to tune your TV for each one if you don’t want to either crop game content or have black borders…

Personally, I prefer to touch the configuration files only once for each system and stick with that, with no messing with the TV. Besides, not everybody has multiple HDMI inputs; I only have one, for instance.[/QUOTE] I respect your preference for having the maximum gaming area possible for your games, especially if the monitor/TV you’re playing on isn’t that big to begin with.

If you want the true best picture when playing games, you’ll want even more specific configs per game, taking overscan into consideration. Some early NES games actually don’t suffer at all from showing the complete 256x240 frame. Super Mario Bros. has no obvious graphical issues, for instance. In that case, the playing field is actually expanded, much like adding true widescreen (i.e., not cropping) to an older game. The vast majority of games, though, will have some sort of graphical issue on those scanlines. In the case that a game exhibits junk sprites in the top or bottom 16 scanlines, I actually scale the game 5x on my 1080p screen. Several NES games do this, like Contra, Mega Man 2, and Batman: The Video Game. Doing that gives you a (240 - 32) * 5 = 1040-pixel-tall area to play with.
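
A quick sanity check of that math:

```python
# Vertical size after cropping 16 junk scanlines from both the top and bottom,
# then integer-scaling 5x for a 1080p display.
native_height = 240
cropped = native_height - 2 * 16   # 208 visible lines
scaled = cropped * 5               # 1040 pixels tall, which fits inside 1080
print(cropped, scaled)             # -> 208 1040
```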

I’m not kidding when I say that I shuffle between around 20 different configs just for NES games. Luckily no other system, except possibly the PlayStation, is as bothersome in this regard. I use Advanced Launcher to run each game, so I don’t have to bother with loading each specific config (they’re actually just append configs) when I launch a game; I just set up the loading parameters once and am done with it. The way RetroArch handles loading them now might make things easier for others with fewer configs, but I don’t think it’ll quite work for my level of extravagance! (A sketch of what one of these looks like is below.)
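
If anyone is curious, here is a rough sketch of one of those append configs; the file name and viewport numbers are made-up examples for the 16-scanline crop case, not my exact files:

```
# nes-crop16.cfg (hypothetical append config for games with junk in the
# top/bottom 16 scanlines, scaled 5x on a 1080p screen)
aspect_ratio_index = "22"        # "Custom" (the index varies between RA versions)
custom_viewport_width = "1280"   # 256 * 5
custom_viewport_height = "1040"  # (240 - 32) * 5
custom_viewport_x = "320"        # (1920 - 1280) / 2
custom_viewport_y = "20"         # (1080 - 1040) / 2
video_scale_integer = "false"    # the viewport above already handles the scaling

# launched along the lines of:
# retroarch -L nestopia_libretro.so "Contra.nes" --appendconfig nes-crop16.cfg
```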

I don’t trust scalers with the low-res 2D games I play, so I personally always keep integer scaling ON. Any extra on-screen space I cover with a CRT TV border, which I scale to several different sizes based on how much cropping is needed for the specific game. That way at least it seems like I’m seeing the frame of a TV as opposed to extra black pixels. The TV image I use is of a silver, flat-screen Philips CRT, which IIRC is decently close to 4K native resolution. I can’t seem to find it on Google Images anymore, but I think I still have the source image on my PC.
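
For what it’s worth, the border itself is just a RetroArch overlay; something along these lines in the config, with the path and values purely illustrative:

```
input_overlay = "~/.config/retroarch/overlays/borders/crt-tv-silver.cfg"
input_overlay_enable = "true"
input_overlay_opacity = "1.0"
input_overlay_scale = "1.0"
```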

And yeah, having to mess with the TV’s actual settings for emulating different games or systems is a bit too much IMO.

Setting integer scaling ON means upscaling stops at the largest integer multiple that fits; in other words, you will be left with black margins. On top of that, by not cropping “overscan” (technically NAB) you are thickening those borders. In the end you are playing on a 50" display, at best, with the experience of a 32" one.

NES frame is 256x240.

240 x 3 = 720, a perfect integer multiple. There will be no black border at the top/bottom unless this is included as part of the NES output frame. (in which case, you’re supposed to see it, e.g., Castlevania)

Now, to fix that, you say you go to the TV’s service menu (you should never touch these, tbh) and stretch the image with whatever the TV’s internal scaling method is (nothing that RetroArch’s scaling shaders can’t do, by the way). And since game resolutions (integer scaling) and overscan (borders) vary on a per-system or even per-game basis, you end up having to tune your TV for each one if you don’t want to either crop game content or have black borders…

Except you don’t have to go to the service menu at all. On my 60" Panasonic Plasma the option is called “overscan” and it’s in the user menu under picture settings. On my 24" Vizio the option is called “wide” and it’s in the user menu under picture settings. And this setting is saved per input so that I only have to set it one time and then never touch it again. Even if you don’t have these options you can achieve the same thing with the “horizontal size” and “vertical size” options on all TVs, and you just have to set it once and then forget about it. Besides, you only have to do this when playing NES because the NES displayed junk pixels in the overscan area. If you’re cropping overscan on other systems, you are not only butchering the image, you might be losing important information.

There is NO black border, by default, with the suggested settings I posted. If you have a black border it’s because you have set your monitor to 1080p by mistake. The entire NES output frame, including overscan, scales up perfectly to 720p. Almost ALL console systems output 240 lines, so you can use this for almost all console systems. VERY rarely you will encounter a game that uses a different res (Colony Wars on PSX, for example), but this happens so infrequently that it’s not a hassle at all to create a rom-specific config, and it’s easy to do this via EmulationStation on the RetroPie.

Really not getting this “game resolutions and overscan varies on a system or even game by game basis.” Setting up a system-specific config is extremely easy to do and you have to do it anyway if you want to use different controllers for different systems, so I never even gave it a second thought. Games on a particular system vary so rarely that it’s not a hassle to set up game-specific configs when necessary.

Personally, I prefer to touch the configuration files only once for each system and stick with that, with no messing with the TV. Besides, not everybody has multiple HDMI inputs; I only have one, for instance.

I’m only touching config files once as well as only touching my TV’s controls once. You’re making some pretty strange statements, here. I think it’s because you’re going for some weird 8:7 thing, while my suggested settings are for getting a perfect CRT-like picture, i.e., getting a 4:3 display on a 16:9 monitor with minimal scaling artifacts.

[QUOTE=bleakassassin;21301]

And yeah, having to mess with the TV’s actual settings for emulating different games or systems is a bit too much IMO.[/QUOTE]

Here’s what I originally said:

“you can use your TV’s controls to stretch the image to fill the screen, though this is somewhat cumbersome. However, most TVs save settings for different inputs, so if you dedicate one of your HDMIs to 720p and then use the TV’s crop overscan or stretch controls you don’t have to set it every time.”

You can see that taking part of the quote out of context greatly changes its meaning. I never suggested messing with the TV for different games/systems.

First of all, one should understand that literally the ONLY system where cropping overscan is even necessary is the NES. This is because the NES displayed junk pixels in that area. No other system that I know of does this, so cropping overscan isn’t even necessary elsewhere, and it actually butchers the image.

So basically, the only time you would ever need to touch the TV’s controls is when playing NES, and it’s a snap on most TVs to just select “wide” or “crop overscan” or whatever via the user menus. And this is only if you care about junk pixels - the picture ALREADY fills the screen with the suggested settings with crop overscan left ON. If you aren’t getting fullscreen by default with the suggested settings, it’s because you didn’t set your monitor’s output resolution AND retroarch’s output resolution to 720p.

You should set your output resolution to 720p, anyways. There is nothing to be gained by playing low res games in a higher resolution except additional scaling artifacts and a greater load on your GPU.
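
In retroarch.cfg terms, the setup I’m suggesting looks roughly like this (key names from memory, so double-check them against your RetroArch version):

```
video_fullscreen = "true"
video_fullscreen_x = "1280"    # output/render at 1280x720
video_fullscreen_y = "720"
video_scale_integer = "true"   # 240 * 3 = 720, so no vertical borders
aspect_ratio_index = "0"       # 4:3 in most builds
```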

[QUOTE=Nesguy;21306]Here’s what I originally said:

“you can use your TV’s controls to stretch the image to fill the screen, though this is somewhat cumbersome. However, most TVs save settings for different inputs, so if you dedicate one of your HDMIs to 720p and then use the TV’s crop overscan or stretch controls you don’t have to set it every time.”

You can see that taking part of the quote out of context greatly changes its meaning. I never suggested messing with the TV for different games/systems.

First of all, one should understand that literally the ONLY system where cropping overscan is even necessary is the NES. This is because the NES displayed junk pixels in that area. No other system that I know of does this, so cropping overscan isn’t even necessary elsewhere, and it actually butchers the image.

So basically, the only time you would ever need to touch the TV’s controls is when playing NES, and it’s a snap on most TVs to just select “wide” or “crop overscan” or whatever via the user menus. And this is only if you care about junk pixels - the picture ALREADY fills the screen with the suggested settings with crop overscan left ON. If you aren’t getting fullscreen by default with the suggested settings, it’s because you didn’t set your monitor’s output resolution AND retroarch’s output resolution to 720p.

You should set your output resolution to 720p, anyways. There is nothing to be gained by playing low res games in a higher resolution except additional scaling artifacts and a greater load on your GPU.[/QUOTE] Perhaps my statement wasn’t clear.

The way I phrased my last statement wasn’t meant to be aimed directly at you. It was a general statement that I stand by. I don’t use my standard TV remote because of the combination of other remotes I need to use, most significantly my DVR remote. It doesn’t have a button mapped for messing with overscan, and using the actual TV menus with it is next to impossible. I’m fine with using custom borders for various levels of overscan in NES games instead of changing a setting on my TV to do that. People who don’t want to configure that many files for RetroArch will be fine with their TV’s overscan feature, provided it can be conveniently adjusted in their setup.

And I use my TV and PC for more than just RetroArch. There’s no way I’m setting it to 720p for that. If you want to use the minimum required resolution for these TVs, you may as well set the resolution to 480p. Heck, the only reason you can’t go even further and use 240p is that most modern TVs don’t support resolutions that low. Minimum-resolution talk just becomes a slippery slope once you go down it. I use integer ratios, so there are never any scaling artifacts with the games I play (I don’t emulate N64 games, and I haven’t spent enough time with PPSSPP yet to decide whether to incorporate it into my setup).

Sure, those who use 1440p monitors, 4K TVs, or even 720p TVs will be fine, but I assure you that there are many people who are using 1080p screens and want to keep their resolutions at 1080p. If anything, using a 720p resolution will increase artifacts because of it not scaling perfectly to that resolution. Also, come on. These are 2D systems (and 3D systems ran at native res with no anti-aliasing). GPUs aren’t going to have much trouble at all rendering the games. Maybe if they go crazy with internal resolutions for emulators for 3D systems that support changing the internal res, but not so otherwise.

I’m wholly satisfied with how the picture is now. In fact, there’s an argument for wanting to use a resolution higher than that. In the case that you want/need a proper 4:3 output resolution for NES games, you need to scale to 4x the standard size. Then you’ll be able to scale the pixels by a 5:4 ratio evenly and get a 1280x960 picture. I do that for Mike Tyson’s Punch-Out!!
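
Spelled out as a custom viewport, that works out to roughly the following (the numbers are just the arithmetic above; the index for “Custom” varies between RetroArch versions):

```
aspect_ratio_index = "22"        # "Custom"
custom_viewport_width = "1280"   # 256 * 4 = 1024, then * 5/4 = 1280
custom_viewport_height = "960"   # 240 * 4
custom_viewport_x = "320"        # (1920 - 1280) / 2
custom_viewport_y = "60"         # (1080 - 960) / 2
```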

[QUOTE=bleakassassin;21310]Perhaps my statement wasn’t clear.

The way I phrased my last statement wasn’t meant to be aimed directly at you. It was a general statement that I stand by. I don’t use my standard TV remote because of the combination of other remotes I need to use, most significantly my DVR remote. It doesn’t have a button mapped for messing with overscan, and using the actual TV menus with it is next to impossible. I’m fine with using custom borders for various levels of overscan in NES games instead of changing a setting on my TV to do that. People who don’t want to configure that many files for RetroArch will be fine with their TV’s overscan feature, provided it can be conveniently adjusted in their setup.[/quote]

Okay, understood - sounds like you are in a unique situation though. An easy workaround would be to set the input you use for retroarch to 720p, and if you are playing NES games to find the crop overscan adjustment (or whatever it’s called on your TV). Use your normal remote for this, the TV will save the settings for that input.

The DVR remote most likely has an “input” button for scrolling through different inputs. So you can still use your DVR remote for everything.

And I use my TV and PC for more than just RetroArch. There’s no way I’m setting it to 720p for that. If you want to use the minimum required resolution for these TVs, you may as well set the resolution to 480p. Heck, the only reason you can’t go even further and use 240p is that most modern TVs don’t support resolutions that low. Minimum-resolution talk just becomes a slippery slope once you go down it. I use integer ratios, so there are never any scaling artifacts with the games I play (I don’t emulate N64 games, and I haven’t spent enough time with PPSSPP yet to decide whether to incorporate it into my setup).

I use my TV for RA (with a Raspberry Pi) and for watching TV and movies. It saves the picture settings independently for each input. In fact, all the TVs I’ve ever used do this, so it isn’t even an issue.

There are likely to be issues going to 480p that you won’t encounter with 720p. Most TVs now output 720p and 1080p natively; setting the output to 480p might introduce input lag. In addition, many shaders need an HD resolution to work right.

Using integer ratios at 1080p means you will get large black borders on the top and bottom of all 240p systems. If you want to use an ugly overlay to deal with that, that’s one way of doing it. My solution avoids this. I personally think overlays are a horrible solution and kind of silly.

Sure, those who use 1440p monitors, 4K TVs, or even 720p TVs will be fine, but I assure you that there are many people who are using 1080p screens and want to keep their resolutions at 1080p. If anything, using a 720p resolution will increase artifacts because of it not scaling perfectly to that resolution. Also, come on. These are 2D systems (and 3D systems ran at native res with no anti-aliasing). GPUs aren’t going to have much trouble at all rendering the games. Maybe if they go crazy with internal resolutions for emulators for 3D systems that support changing the internal res, but not so otherwise.

It’s a pretty rare situation to have only one input on a 1080p TV. Almost all TVs these days have multiple HDMI inputs that also save picture settings per input, so it’s not as big of an issue as you’re making it out to be. The sentence “720p will increase artifacts because it won’t scale perfectly to that resolution” doesn’t make sense: 240p consoles put out 240 lines, which scale perfectly to 720p, since 720 is an exact integer multiple (3x) of 240.

And yes, GPU usage is very much a factor when talking about the raspberry pi/retropie, one of the primary uses of retroarch and probably (I’m guessing) the single largest user base. If you want to use shaders at all on the retropie you have to set the resolution to 720p as 1080p will cause stuttering with even the simplest shaders.

I’m wholly satisfied with how the picture is now. In fact, there’s an argument for wanting to use a resolution higher than that. In the case that you want/need a proper 4:3 output resolution for NES games, you need to scale to 4x the standard size. Then you’ll be able to scale the pixels by a 5:4 ratio evenly and get a 1280x960 picture. I do that for Mike Tyson’s Punch-Out!!

The entire purpose of the suggested settings I posted was to get a proper 4:3 output resolution on a 16:9 monitor.

256x240 is the size of the NES frame. This scales perfectly to 720p. The x axis will be off, but this doesn’t matter because with nearest neighbor as the filter you will always get artifacts on at least one axis. With 720p you get perfect integer scaling on the y axis and you then use an unobtrusive shader to do some slight blurring on the x axis and you’re good.
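
To put rough numbers on that, assuming a 4:3 viewport at 720p (i.e. 960x720):

```python
# Scale factors for a 256x240 NES frame inside a 960x720 (4:3) viewport at 720p
y_scale = 720 / 240   # 3.0  -> exact integer, no vertical artifacts
x_scale = 960 / 256   # 3.75 -> fractional, so some columns end up a pixel wider
print(y_scale, x_scale)
```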

If you’re scaling the pixels by 5:4 then you are essentially butchering the image IMO. The pixels are meant to be in a 4:3 ratio, same as the display ratio because on a CRT the shape of the pixels was determined by the shape of the display - 4:3. CRTs only ever altered this in order to compensate for flaws in the CRT itself. On an ideal CRT like a high-end production or broadcast monitor you could display the entire overscan area and get perfect 4:3 pixels.

[QUOTE=Nesguy;21311]Okay, understood - sounds like you are in a unique situation though. An easy workaround would be to set the input you use for retroarch to 720p, and if you are playing NES games to find the crop overscan adjustment (or whatever it’s called on your TV). Use your normal remote for this, the TV will save the settings for that input.

The DVR remote most likely has an “input” button for scrolling through different inputs. So you can still use your DVR remote for everything. [/QUOTE] In my experience, DVR remotes tend to have inconsistent success with the Input button. Sometimes it works as intended. Sometimes I can’t use the directional keys to switch inputs. Even worse is when I also can’t use the numbers. Then I can only switch inputs by repeatedly pressing the Input button. It works, but it’s irritating.

My ideal long-term solution will be to stop subscribing to cable, of course. :stuck_out_tongue:

I use my TV for RA (with a Raspberry Pi) and for watching TV and movies. It saves the picture settings independently for each input. In fact, all the TVs I’ve ever used do this, so it isn’t even an issue.

There are likely to be issues going to 480p that you won’t encounter with 720p. Most TVs now output 720p and 1080p natively; setting the output to 480p might introduce input lag. In addition, many shaders need an HD resolution to work right.

Using integer ratios at 1080p means you will get large black borders on the top and bottom of all 240p systems. If you want to use an ugly overlay to deal with that, that’s one way of doing it. My solution avoids this. I personally think overlays are a horrible solution and kind of silly.

I had no idea that 480p could introduce input lag. I always thought that was something that could be altered via a “Game Mode” option on the TV, not something tied to the resolution itself. Not knowing much about the technical aspects surrounding it, it really seems like HDTV manufacturers just don’t give a damn about properly supporting legacy hardware.

I can see entirely what you’re saying with the 720p thing. I’d just rather not mess with the TV’s actual resolution unless I’m playing a game system connected to my composite/component splitter box (my N64 and OG Xbox, mostly), and in those cases the resolution adjusts itself automatically. My PC is used for PC games at 1080p as well as for playing Blu-ray rips. More importantly, I launch my games via XBMC/Kodi, so keeping the TV’s resolution the same as often as possible is ideal. Of course, using a 1080p resolution means either an unevenly scaled image or an image with black bars on all sides. I guess I’m just fine with the latter and using a TV border overlay to cover the rest. It looks pretty fancy to me, if I do say so myself! I’ll agree that most borders look silly, but I’ve found in my personal experience that simulating a realistic feel via a border (as in, using a TV bezel and some background material like actual wallpaper and whatnot) doesn’t actually detract from the game’s image.

It’s a pretty rare situation to have only one input on a 1080p TV. Almost all TVs these days have multiple HDMI inputs that also save picture settings per input, so it’s not as big of an issue as you’re making it out to be. The sentence “720p will increase artifacts because it won’t scale perfectly to that resolution” doesn’t make sense: 240p consoles put out 240 lines, which scale perfectly to 720p, since 720 is an exact integer multiple (3x) of 240.

I know that different inputs have different settings; it’s just that I’m using an all-in-one PC. I’m pretty sure I never said that scaling a 240p image to 720p would cause artifacts. Of course integer scaling will always work fine. If I did say that, it wasn’t what I meant. What I was trying to say was that taking the 240p image already scaled evenly to 720p and scaling it again to 1080p would lead to a less-than-ideal image. But I guess the not-as-clear image wouldn’t technically have artifacts; it just wouldn’t be crisp. Not that I play my retro games unfiltered, mind you.

And yes, GPU usage is very much a factor when talking about the raspberry pi/retropie, one of the primary uses of retroarch and probably (I’m guessing) the single largest user base. If you want to use shaders at all on the retropie you have to set the resolution to 720p as 1080p will cause stuttering with even the simplest shaders.

I could have sworn that the shaders were CPU dependent. I guess not? Then yeah, those’ll add to the stress. I also was just talking about this with respect to more traditional PCs, particularly those that are multi-purpose. Do you think the Raspberry Pi 2 would be able to run emulated games with those settings without issues? Because that’d be the way to go for those seeking a low-powered route, then.

If I was using a Raspberry Pi strictly for RetroArch and similar programs, then I’d be completely fine using a 720p resolution for it.

The entire purpose of the suggested settings I posted was to get a proper 4:3 output resolution on a 16:9 monitor.

256x240 is the size of the NES frame. This scales perfectly to 720p. The x axis will be off, but this doesn’t matter because with nearest neighbor as the filter you will always get artifacts on at least one axis. With 720p you get perfect integer scaling on the y axis and you then use an unobtrusive shader to do some slight blurring on the x axis and you’re good.

If you’re scaling the pixels by 5:4 then you are essentially butchering the image IMO. The pixels are meant to be in a 4:3 ratio, same as the display ratio because on a CRT the shape of the pixels was determined by the shape of the display - 4:3. CRTs only ever altered this in order to compensate for flaws in the CRT itself. On an ideal CRT like a high-end production or broadcast monitor you could display the entire overscan area and get perfect 4:3 pixels.

My idea with the 5:4 pixels is that it would transform the 1024 x 960 image into a 1280 x 960 image, thus displaying a picture that has a 4:3 aspect ratio while performing the exact same 5:4 scaling to all pixels. I always thought that 4:3 was generally used to describe the aspect of the displayed image, not the aspect ratio of the pixels themselves. It’d be like me saying that a 640 x 480 image is a 4:3 ratio, but each pixel would still be 1:1 if the image kept to that ratio.

I’ll always slap on a good composite filter to finish the look of older console games, though, regardless of resolution and aspect ratio.

[QUOTE=bleakassassin;21323]In my experience, DVR remotes tend to have inconsistent success with the Input button. Sometimes it works as intended. Sometimes I can’t use the directional keys to switch inputs. Even worse is when I also can’t use the numbers. Then I can only switch inputs by repeatedly pressing the Input button. It works, but it’s irritating.

My ideal long-term solution will be to stop subscribing to cable, of course. :P[/quote]

I see. Sounds like a bad remote. You may not even have to switch inputs though - see below.

I had no idea that 480p could introduce input lag. I always thought that was something that could be altered via a “Game Mode” option on the TV, not something tied to the resolution itself. Not knowing much about the technical aspects surrounding it, it really seems like HDTV manufacturers just don’t give a damn about properly supporting legacy hardware.

If it doesn’t output 480p natively, then the TV has to downscale, and that will introduce additional lag. Some newer TVs can do 480p over HDMI, some can’t. My 60" Plasma upscales anything less than 720p to 720p. My 24" Vizio can do 480p, though. HDTV manufacturers in general do not care about picture quality or supporting old hardware; they only care about selling TVs.

I can see entirely what you’re saying with the 720p thing. I’d just rather not mess with the TV’s actual resolution unless I’m playing a game system connected to my composite/component splitter box (my N64 and OG Xbox, mostly), and in those cases the resolution adjusts itself automatically. My PC is used for PC games at 1080p as well as for playing Blu-ray rips. More importantly, I launch my games via XBMC/Kodi, so keeping the TV’s resolution the same as often as possible is ideal. Of course, using a 1080p resolution means either an unevenly scaled image or an image with black bars on all sides. I guess I’m just fine with the latter and using a TV border overlay to cover the rest. It looks pretty fancy to me, if I do say so myself! I’ll agree that most borders look silly, but I’ve found in my personal experience that simulating a realistic feel via a border (as in, using a TV bezel and some background material like actual wallpaper and whatnot) doesn’t actually detract from the game’s image.

Personal preference - I can’t stand overlays because I know I’m not taking full advantage of my screen size, and the TV border thing just looks fake. But if it works for you, go for it. :slight_smile: If I was going to insist on playing at 1080p and I had a dedicated input for RA I would still leave integer scaling on and turn off crop overscan, and I would use the TV’s horizontal and vertical size controls to adjust the image until it filled a 4:3 area and had a pixel aspect ratio of 4:3. But this makes less sense IMO unless you’re using some really fancy shader.

Also, you can leave your monitor at 1080p and have Retroarch automatically switch resolutions to 720p when you are playing a game, and automatically switch your TV/monitor resolution as well. I actually only have one HDMI input on my 24" Vizio and that’s what I’ve been doing. This only has to be configured one time. It switches back to 1080p automatically when I exit RA. On the Retropie, this is accomplished by holding down “x” as a game launches, which brings up a menu where you can select preferred video output resolution and retroarch render resolution.

I know that different inputs have different settings; it’s just that I’m using an all-in-one PC. I’m pretty sure I never said that scaling a 240p image to 720p would cause artifacts. Of course integer scaling will always work fine. If I did say that, it wasn’t what I meant. What I was trying to say was that taking the 240p image already scaled evenly to 720p and scaling it again to 1080p would lead to a less-than-ideal image. But I guess the not-as-clear image wouldn’t technically have artifacts; it just wouldn’t be crisp. Not that I play my retro games unfiltered, mind you.

Yeah, you wouldn’t get more artifacts that way; you’d get fewer artifacts that way. If you scaled directly from 240 to 1080, then you’d get more artifacts because 1080 is not a perfect integer multiple of 240. The image is only slightly less crisp, and this doesn’t really affect the picture quality.

I could have sworn that the shaders were CPU dependent. I guess not? Then yeah, those’ll add to the stress. I also was just talking about this with respect to more traditional PCs, particularly those that are multi-purpose. Do you think the Raspberry Pi 2 would be able to run emulated games with those settings without issues? Because that’d be the way to go for those seeking a low-powered route, then.

If I was using a Raspberry Pi strictly for RetroArch and similar programs, then I’d be completely fine using a 720p resolution for it.

Yeah, that’s primarily what I’ve been thinking of but I think my solution has advantages even for more powerful setups in that you get perfect integer scaling on the y axis along with a perfect 4:3 pixel aspect ratio and a perfect 4:3 display aspect ratio (slightly wider than 4:3 display aspect if you use the TV’s crop overscan/wide option), while filling the entire vertical area of the screen.

My idea with the 5:4 pixels is that it would transform the 1024 x 960 image into a 1280 x 960 image, thus displaying a picture that has a 4:3 aspect ratio while performing the exact same 5:4 scaling to all pixels. I always thought that 4:3 was generally used to describe the aspect of the displayed image, not the aspect ratio of the pixels themselves. It’d be like me saying that a 640 x 480 image is a 4:3 ratio, but each pixel would still be 1:1 if the image kept to that ratio.

I’ll always slap on a good composite filter to finish the look of older console games, though, regardless of resolution and aspect ratio.

On a CRT, the pixel aspect ratio is determined by the display aspect ratio (along with whatever calibration has been done at the factory to correct for flaws in the CRT). A block in Super Mario Bros. is 16x16 pixels. On a “perfect” CRT with perfect geometry that displays the entire 256x240 NES frame, a block in Super Mario Bros. will measure 4 units wide by 3 units tall. The pixels themselves are 4 units wide by 3 units tall.

The switching inputs would strictly be for using different devices hooked up to my TV, including two other game systems hooked up via HDMI. Also, my cable company is Charter. It always seems they get the short end of the stick when it comes to the functionality of their technology. Input/Menu handlings have almost never been ideal for me on that front.

Personal preference - I can’t stand overlays because I know I’m not taking full advantage of my screen size, and the TV border thing just looks fake. But if it works for you, go for it. :slight_smile: If I was going to insist on playing at 1080p and I had a dedicated input for RA I would still leave integer scaling on and turn off crop overscan, and I would use the TV’s horizontal and vertical size controls to adjust the image until it filled a 4:3 area and had a pixel aspect ratio of 4:3. But this makes less sense IMO unless you’re using some really fancy shader.

Also, you can leave your monitor at 1080p and have Retroarch automatically switch resolutions to 720p when you are playing a game, and automatically switch your TV/monitor resolution as well. I actually only have one HDMI input on my 24" Vizio and that’s what I’ve been doing. This only has to be configured one time. It switches back to 1080p automatically when I exit RA. On the Retropie, this is accomplished by holding down “x” as a game launches, which brings up a menu where you can select preferred video output resolution and retroarch render resolution.

I might warm up to the idea of doing this someday. I’ll look into it and see what I think of 720p if I ever get the urge to ditch my borders.

Of course, I bet by the time I get a hankering to do that, I’ll have a 4K TV in my possession. Then I won’t even need to think about scaling for most systems!

On a CRT, the pixel aspect ratio is determined by the display aspect ratio (along with whatever calibration has been done at the factory to correct for flaws in the CRT). A block in Super Mario Bros. is 16x16 pixels. On a “perfect” CRT with perfect geometry that displays the entire 256x240 NES frame, a block in Super Mario Bros. will measure 4 units wide by 3 units tall. The pixels themselves are 4 units wide by 3 units tall.

I’m not really that well versed in the technical workings of CRT TVs. I won’t dispute your knowledge of the subject. However, I’m getting lost when it comes to the math of things. If I take a 256 x 240 image, scale it 3x, then make each pixel have a 4:3 ratio, I’m left with an image that has a 1024 x 720 resolution. The ratio of the whole image isn’t 4:3, but the pixels are 4:3. Is this how it actually is?

At this point I feel good enough with my “use what looks right” resolutions and cropping per game. It’s really only in a few circumstances where I don’t just scale the internal resolution.

[QUOTE=bleakassassin;21326]

I’m not really that well versed in the technical workings of CRT TVs. I won’t dispute your knowledge of the subject. However, I’m getting lost when it comes to the math of things. If I take a 256 x 240 image, scale it 3x, then make each pixel have a 4:3 ratio, I’m left with an image that has a 1024 x 720 resolution. The ratio of the whole image isn’t 4:3, but the pixels are 4:3. Is this how it actually is?

At this point I feel good enough with my “use what looks right” resolutions and cropping per game. It’s really only in a few circumstances where I don’t just scale the internal resolution.[/QUOTE]

Actually, you are correct. You start with a 16:15 frame with square pixels and scale it to 4:3, which means each pixel gets multiplied by 5:4, thus the final ratio of the pixels is 5:4 when in a 4:3 display aspect with the entire frame displayed, as you said.
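
A quick check of that arithmetic:

```python
from fractions import Fraction

frame = Fraction(256, 240)   # 16:15, the full NES frame with square pixels
target = Fraction(4, 3)      # display aspect ratio
par = target / frame         # pixel aspect ratio needed to reach 4:3
print(par)                   # -> 5/4
```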

However, with 1280x960 you are still getting those ugly black bars at the top and bottom of the screen. :stuck_out_tongue: And you are still going to get pixel warping on the x axis using nearest neighbor filter. I prefer to fill the vertical area of the screen at 720p and then use a shader to deal with the scaling artifacts on the x axis, since you have to do that anyway.

Sorry about that - I made that post pretty late last night and forgot that the NES output frame was 16:15, not perfectly square. Carry on, sir.

[QUOTE=Nesguy;21344]Actually, you are correct. You start with a 16:15 frame with square pixels and scale it to 4:3, which means each pixel gets multiplied by 5:4, thus the final ratio of the pixels is 5:4 when in a 4:3 display aspect with the entire frame displayed, as you said.

However, with 1280x960 you are still getting those ugly black bars at the top and bottom of the screen. :stuck_out_tongue: And you are still going to get pixel warping on the x axis using nearest neighbor filter. I prefer to fill the vertical area of the screen at 720p and then use a shader to deal with the scaling artifacts on the x axis, since you have to do that anyway.

Sorry about that - I made that post pretty late last night and forgot that the NES output frame was 16:15, not perfectly square. Carry on, sir.[/QUOTE] Huh. Alright then. I’ll be honest, some of the details of how the NES interfaced with TVs still baffle me today, like the video decoding being done TV-side. Upon finding that out, my expectations of how it worked were just completely shattered.

And don’t worry about it. Just so long as RetroArch keeps the image options at least as versatile as they are now, I’m sure everyone will find something that works for them!

It amazes me how this thread attracts people who want to talk about how they reproduce the “flawed” old TV look and then tell me to do as they do.

This thread is not for that; read the OP carefully. It’s not about CRT, 4:3 DAR or nostalgic looks. There are hundreds of threads for that, but I didn’t see any thread discussing “correct” (as in geometrically correct) game graphics reproduction. If you won’t take the time to learn what the “weird” 8:7 means, this surely is not your thread. I have no issue with people playing with their nostalgic idea of games. But please don’t come here to tell me I got my stuff wrong, especially when you have no idea what you are talking about.

@bleakassassin: yes, I basically defaulted to common-denominator standards with some exceptions, like 8:7 games on SNES or Genesis, which have two types of games AR-wise. I haven’t done anything similar (game-by-game distinctions) for NES due to lack of time and knowledge of the system. Saturn seems to be a similar case to the SNES; I haven’t checked PSX deeply yet, although so far I haven’t noticed anything off at 1:1 PAR. Also, the NES borders don’t affect aspect ratio, so it’s not something I worried about too much. There are too many games there, so it would be an arduous job as well. I consider the extra playable space a feature rather than an (AR) fix. Here I have an HD Ready TV, so integer scaling is a no-go, and it’s the same for people using 1024px-high monitors (like me): games will scale up to 896 (224*4), leaving you with 64 pixels top and bottom, which isn’t too bad; on a FullHD display integer scaling leaves you with 92px top and bottom ((1080-896)/2), which is a bit harder to swallow. This is one of those rare cases where having a 4K display has practical meaning (I don’t buy the marketing BS about needing a 4K display). I launch my games with HyperSpin; I customized the code to allow loading per-game cfg files and made a setting to easily switch the custom AR in the OP to integer if I wanted, and part of the script is on Page 2.
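
Spelling out that border arithmetic:

```python
# Black border left over by integer-scaling a 224-line image on various displays
content = 224
for display_h in (664, 768, 1024, 1080, 2160):
    scale = display_h // content                # largest integer factor that fits
    border = (display_h - content * scale) // 2
    print(display_h, scale, border)
# e.g. 1024 -> 4x with 64px borders, 1080 -> 4x with 92px, 2160 -> 9x with 72px
```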

@Nesguy: You are so wrong I don’t know where to start… 720p: outside of smartphones and tablets, list me devices with native 720p resolutions. The closest you will find are HD Ready TVs at 664p* (* read below); 224 x 2 = 448, which leaves you borders of 108px top and bottom, and that translates to A LOT of space on such low-resolution displays. Or maybe you mean setting your 1080p display to 720p? How do you do that: by scaling (degrading) 720p up to the display’s native 1080p resolution, or by leaving nice thick borders around it? Do you know that your display has a fixed number of phosphors (its native resolution)? You can’t have multiple native resolutions fill the screen, only add borders (FYI http://en.wikipedia.org/wiki/Plasma_display#Native_plasma_television_resolutions); what signal (i.e. 480p, 768p*) the cable is allowed to send is a different topic. You are trying to convince people that cfg files aren’t enough and that they should fiddle with remote controls as well. People don’t have 60" Panasonic Plasmas with an overscan option in the menus, nor multiple HDMI inputs, nor do they use RetroArch on a Raspberry Pi. Do you realize that “horizontal size” and “vertical size” are scaling (degrading) your content? Do you realize that people play more than NES games? Do you understand what NAB is and what it is for? It’s meant to be hidden, it’s meant to be cropped, and it has no meaning on current LED displays because it belongs to the CRT realm, and every game system has a different padding size (read the OP carefully and note the differences in active areas).

One needs to be conscious of what this means: you place your display according to your viewing distance, assuming TV content plays at full size. Make the content 4:3 and you will already start to feel it is small; add overscan borders, or the borders left unfilled by integer scaling, and you will seriously consider moving your sofa one meter closer. I don’t find this practical; you bought a 50" display, so use it. There are even formulas (measured in viewing angles) for the distance you need to sit from your screen for immersion, or for certain resolutions; really, google it.

*1176x664 is the minimum resolution the TV will report to the PC over the HDMI interface. HD Ready displays have a native resolution of 1366x768, but they won’t work at that resolution outside of PC mode (analog VGA). So, as I said, if you are on this kind of TV (my case; hopefully a new TV in a few months), it’s meaningless to do integer scaling. It’s too bothersome to care about, especially when at one point or another you will end up with one dimension (width or height) not scaled by an integer anyway, or you will be adding blur and analog-filter shaders for that matter. Integer scaling should only be used, IMO, when you have issues with the look of scanlines.

[QUOTE=Nesguy;21325] On a CRT, the pixel aspect ratio is determined by the display aspect ratio[/QUOTE] So wrong: a “display” aspect ratio is a displayed aspect ratio. A display aspect ratio has no value; on a CRT what matters is the pixel aspect ratio (what drives anamorphism), and it abides by a standard if you read the OP (again, here is the link for the lazy folks): it is 4752/4739. What TV brands did was change (calibrate?) the signal modulation and hence the PAR. What you see on a TV is fixed; you can’t change the “displayed” width or height.

@NESguy LOL I warned you that this topic is hilariously contentious.

Re: CRT pixel aspect ratio, it’s probably better not to even bring it up in this case, since CRTs don’t actually have a concept of pixels, just signal bandwidth and sync rates. The wayback link covers converting from an analog signal to digital through sampling and that’s not really what we’re doing here. The only way a CRT is really involved is because of the canonical (but not really set-in-stone) 4:3 aspect ratio of the glass tube, which, as has been abundantly noted, is of little consequence outside of personal preference.

[QUOTE=hunterk;21378] Re: CRT pixel aspect ratio, it’s probably better not to even bring it up in this case, since CRTs don’t actually have a concept of pixels, just signal bandwidth and sync rates.[/QUOTE]

OP:

[QUOTE=Dogway;15087] “Although not 100% correct, here I will talk in pixels and resolutions instead of frequencies, for the sake of comprehension.”[/QUOTE]

[QUOTE=hunterk;21378] The wayback link covers converting from an analog signal to digital through sampling and that’s not really what we’re doing here.[/QUOTE]

That’s ALL we are doing here. I wonder if you have even glanced at the operations in the OP, e.g. (12.272727 MHz / 2) / 5.37 MHz ≈ 1.14 ≈ 8:7.
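
Spelled out (with 12.272727 MHz being the NTSC square-pixel sampling rate and 5.37 MHz the console’s dot clock, as the OP uses them):

```python
# PAR as the ratio between the square-pixel sampling rate and the console dot clock
square_pixel_rate = 12.272727e6 / 2    # Hz
dot_clock = 5.37e6                     # Hz
par = square_pixel_rate / dot_clock
print(round(par, 3), round(8 / 7, 3))  # -> 1.143 1.143
```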

I barely talk about DAR in the OP because it has next to no bearing on content geometry. For example, DAR won’t tell me the PAR. Discussions about it, like those on the last page, add more and more confusion, and I made this thread precisely to avoid that. Now, if you don’t care about content geometry (a random, undefined nostalgic look), then this clearly isn’t your thread, and talking about it is off-topic. I am surprised to find there are probably more people who prefer to play with flattened graphics than with the correct, originally intended ones… can’t be helped, I guess.

I know the part I bolded is rather minuscule in significance, but the “extra playing space” isn’t always great. Or true, for that matter. There are actually a lot of times where it is deceptive. For example, pits in the Ninja Gaiden games will have you die before you even reach the bottom of them. It actually doesn’t look right unless you crop the resolution to 256 x 208, but it looks and feels even worse at 256 x 240 than it does at 256 x 224. It makes you think you have more room to work with when in reality you don’t.

I always see PC monitors in a different light from TVs when it comes to playing games. Most PC games (then and now) that use higher resolutions run in 3D, so having sprites/textures at the right resolution is not as important as running the game at as high a resolution as possible so the overall graphics look their best. I can’t say for sure what the case is for modern 2D games; I think people just don’t care about integer scaling in those cases. I know I don’t. It seems contradictory, since I do care for these old console games; I guess I just reach some cutoff point where I decide to stop caring about it. I don’t seriously play emulated games on a monitor, but if I did, I’d probably not care as much about integer scaling just because of how small the monitor is to begin with.

PlayStation games don’t always look correct at 1:1 PAR. Or at least I don’t think they would. I’d imagine a lot of the SNES -> PS1 ports ran internally at 256 x 240 and were just stretched to a 4:3 aspect ratio for TVs. Don’t quote me on that. There are also some games made specifically for that generation of systems that might not look ideal using the standard display mode. One example I can think of is Castlevania: Symphony of the Night. The game itself runs at 256 x 240, and most of the game’s graphics are designed to look right at that resolution. Of course, it’s displayed on CRTs at 4:3, making things slightly wider than they should be. Things get more complicated when considering the cutscenes and prologue text. The cutscenes are meant to be viewed in a 4:3 aspect ratio. Not only is the prologue text also supposed to have this ratio, but (I think) it runs at a 640 x 480 resolution. All in all, it makes setting the right ratio for this game a pain, because no single setting will ever be right for everything.

Most of the time 1:1 PAR does work, though. One game I’m still having troubles with regarding aspect ratio is Wipeout 3: Special Edition. Not only is it a PAL release, making vertical resolutions different from what I’m used to, but I’m also running the game with its widescreen mode. It makes it really hard to determine what settings I need to make the image look accurate.

I tested wipeout 3 and I couldn’t see anything out of place at 1:1. It’s true there aren’t many graphics to evaluate but everything looks in place. I’m mainly focusing on the red dots in the center panel, maybe you can upload some screenshots you think are wrong. If set to widescreen it will look stretched, so most likely you would need to stretch the image horizontally by 1.33

[QUOTE=Dogway;21352]It amazes me how this thread attracts people who want to talk about how they reproduce the “flawed” old TV look and then tell me to do as they do.

This thread is not for that; read the OP carefully. It’s not about CRT, 4:3 DAR or nostalgic looks. There are hundreds of threads for that, but I didn’t see any thread discussing “correct” (as in geometrically correct) game graphics reproduction. If you won’t take the time to learn what the “weird” 8:7 means, this surely is not your thread. I have no issue with people playing with their nostalgic idea of games. But please don’t come here to tell me I got my stuff wrong, especially when you have no idea what you are talking about.

@bleakassassin: yes, I basically defaulted to common-denominator standards with some exceptions, like 8:7 games on SNES or Genesis, which have two types of games AR-wise. I haven’t done anything similar (game-by-game distinctions) for NES due to lack of time and knowledge of the system. Saturn seems to be a similar case to the SNES; I haven’t checked PSX deeply yet, although so far I haven’t noticed anything off at 1:1 PAR. Also, the NES borders don’t affect aspect ratio, so it’s not something I worried about too much. There are too many games there, so it would be an arduous job as well. I consider the extra playable space a feature rather than an (AR) fix. Here I have an HD Ready TV, so integer scaling is a no-go, and it’s the same for people using 1024px-high monitors (like me): games will scale up to 896 (224*4), leaving you with 64 pixels top and bottom, which isn’t too bad; on a FullHD display integer scaling leaves you with 92px top and bottom ((1080-896)/2), which is a bit harder to swallow. This is one of those rare cases where having a 4K display has practical meaning (I don’t buy the marketing BS about needing a 4K display). I launch my games with HyperSpin; I customized the code to allow loading per-game cfg files and made a setting to easily switch the custom AR in the OP to integer if I wanted, and part of the script is on Page 2.

@Nesguy: You are so wrong I don’t know where to start… 720p: outside of smartphones and tablets, list me devices with native 720p resolutions. The closest you will find are HD Ready TVs at 664p* (* read below); 224 x 2 = 448, which leaves you borders of 108px top and bottom, and that translates to A LOT of space on such low-resolution displays. Or maybe you mean setting your 1080p display to 720p? How do you do that: by scaling (degrading) 720p up to the display’s native 1080p resolution, or by leaving nice thick borders around it? Do you know that your display has a fixed number of phosphors (its native resolution)? You can’t have multiple native resolutions fill the screen, only add borders (FYI http://en.wikipedia.org/wiki/Plasma_display#Native_plasma_television_resolutions); what signal (i.e. 480p, 768p*) the cable is allowed to send is a different topic. You are trying to convince people that cfg files aren’t enough and that they should fiddle with remote controls as well. People don’t have 60" Panasonic Plasmas with an overscan option in the menus, nor multiple HDMI inputs, nor do they use RetroArch on a Raspberry Pi. Do you realize that “horizontal size” and “vertical size” are scaling (degrading) your content? Do you realize that people play more than NES games? Do you understand what NAB is and what it is for? It’s meant to be hidden, it’s meant to be cropped, and it has no meaning on current LED displays because it belongs to the CRT realm, and every game system has a different padding size (read the OP carefully and note the differences in active areas).

One needs to be conscious of what this means: you place your display according to your viewing distance, assuming TV content plays at full size. Make the content 4:3 and you will already start to feel it is small; add overscan borders, or the borders left unfilled by integer scaling, and you will seriously consider moving your sofa one meter closer. I don’t find this practical; you bought a 50" display, so use it. There are even formulas (measured in viewing angles) for the distance you need to sit from your screen for immersion, or for certain resolutions; really, google it.

*1176x664 is the minimum resolution the TV will report to the PC over the HDMI interface. HD Ready displays have a native resolution of 1366x768, but they won’t work at that resolution outside of PC mode (analog VGA). So, as I said, if you are on this kind of TV (my case; hopefully a new TV in a few months), it’s meaningless to do integer scaling. It’s too bothersome to care about, especially when at one point or another you will end up with one dimension (width or height) not scaled by an integer anyway, or you will be adding blur and analog-filter shaders for that matter. Integer scaling should only be used, IMO, when you have issues with the look of scanlines.

So wrong: a “display” aspect ratio is a displayed aspect ratio. A display aspect ratio has no value; on a CRT what matters is the pixel aspect ratio (what drives anamorphism), and it abides by a standard if you read the OP (again, here is the link for the lazy folks): it is 4752/4739. What TV brands did was change (calibrate?) the signal modulation and hence the PAR. What you see on a TV is fixed; you can’t change the “displayed” width or height.[/QUOTE]

No need to get defensive. I just don’t get the 8:7 thing because the bottom line is that all these games wound up corrected to 4:3 and the developers of the graphics were almost certainly aware of this fact when they designed the graphics for these games. To each his own, though. Do whatever feels right.

“720p: outside of smartphones and tablets, list me devices with native 720p resolutions”

Clearly I wasn’t talking about smartphones or pads…

“Or maybe you mean setting your 1080p display to 720p? How do you do that: by scaling (degrading) 720p up to the display’s native 1080p resolution, or by leaving nice thick borders around it? Do you know that your display has a fixed number of phosphors (its native resolution)? You can’t have multiple native resolutions fill the screen, only add borders”

I have no idea what you’re talking about, as both of my 1080p TVs can display 720p with no degradation of the image quality and no black borders at the top/bottom. I can see why you would lose quality if you were UPscaling from 720 to 1080, but not if you’re downscaling.

“People don’t have 60" Panasonic Plasmas with an overscan option in the menus, nor multiple HDMI inputs, nor do they use RetroArch on a Raspberry Pi”

??? You don’t need a 60" Plasma. As I mentioned, almost all new TVs (i.e., made in the last 5 years) have this option somewhere in the regular user menu. And almost all TVs in the last 5 years have multiple HDMI inputs. And a TON of people use RA on the raspberry - I honestly don’t know where you’re getting that, because it’s probably one of the largest user groups.

“Do you realize that “horizontal size” and “vertical size” are scaling (degrading) your content?”

No it isn’t. If it were, it would be readily apparent on a 60" display. I have the evidence sitting in front of me.

And with “nearest neighbor,” you will always get scaling artifacts on at least one axis (learned that recently), so some kind of shader or CRT filter is necessary no matter what if you want to hide scaling artifacts.

“Do you realize that people play more than NES games?”

Of course I do. You would only ever need to mess with overscan options when playing NES, since it is the only system that displayed junk pixels in that area. So it’s really not a big deal at all. At any other time you’d just leave the crop overscan option alone and not worry about it.

“It’s meant to be hidden, it’s meant to be cropped,”

On a CRT it is. CRTs had overscan because differences in voltage could make the picture shrink, revealing the black edges. There’s literally no reason to crop anything on a modern display unless one doesn’t like the look of junk pixels (i.e., when playing NES games).

“So wrong: a “display” aspect ratio is a displayed aspect ratio. A display aspect ratio has no value; on a CRT what matters is the pixel aspect ratio (what drives anamorphism), and it abides by a standard if you read the OP (again, here is the link for the lazy folks): it is 4752/4739. What TV brands did was change (calibrate?) the signal modulation and hence the PAR. What you see on a TV is fixed; you can’t change the “displayed” width or height.”

What I should have said is, there are no pixels on a CRT. Therefore whatever shape the pixels are output by the NES is going to just be stretched according to how the TV was calibrated. The final look of the pixels is determined by the individual CRT and how it’s calibrated. On a “perfect” CRT the pixels wind up in a 5:4 ratio (talking about NES, here).

My suggestions are for getting a perfect 4:3 display aspect ratio on a 16:9 display (or if you’re cropping overscan on an HDTV while playing NES games, slightly wider than 4:3 since this preserves the PAR). The crop overscan stuff is just there for when you’re playing NES and doesn’t need to be messed with otherwise.

1080 w/integer scaling leaves you with black borders at the top/bottom

1080 w/o integer scaling leaves you with scaling artifacts on both x and y axes

720 w/integer scaling leaves you with scaling artifacts on the x axis, which you correct with a nice CRT filter or shader.

Btw, shaders won’t even work on raspberry pi unless you set resolution to 720p, otherwise you get stuttering. So that itself is sufficient reason to switch resolutions if you’re using Raspberry Pi / Retropie (which a lot of people are).

Also, I think you are committing an error when you talk about the “intended look of the graphics.” First of all, actual planning sheets from Nintendo from the NES era show 5:4 blocks, not 8:7 (see link below). Second of all, almost all the developers would have understood that the graphics would wind up being stretched on a 4:3 display, and would have taken that into account when designing the game graphics. Those are just my own thoughts on the matter, but it’s really not that important. It’s your thread and your party, so I won’t muck it up anymore :slight_smile:

http://www.wiinintendo.net/wp-content/uploads/2010/01/original-zelda-design4.jpg

[QUOTE=Dogway;21452]I tested wipeout 3 and I couldn’t see anything out of place at 1:1. It’s true there aren’t many graphics to evaluate but everything looks in place. I’m mainly focusing on the red dots in the center panel, maybe you can upload some screenshots you think are wrong. If set to widescreen it will look stretched, so most likely you would need to stretch the image horizontally by 1.33

[/QUOTE] I’d assume that the standard ratio would be fine for the game, particularly if playing an NTSC version. But yeah, I meant the in-game widescreen mode. I only use it for games that don’t exclusively crop the top and bottom of the screen to get to 16:9. This game kind of does a half-and-half between Hor+ and Vert-.

Also, I’m talking specifically about the PAL-exclusive Wipeout 3: Special Edition. Since PAL games tend to have a vertical resolution of 256 or 512 pixels, I don’t know if just using standard stretching would preserve the proper aspect ratio. Then again, when using that mode there appear to be two different ratios in use: one for the menus, another for the gameplay itself.

@Nesguy: You come here to trash-talk the very concept of the thread and then tell me not to get defensive. I put it very clearly at the bottom of the OP: “Be polite, and stay on topic”; you have done neither. The least I could do from my non-moderator position is to counter all the false statements you threw out without backing them up with any real proof. All you do is put your faith in the work a handful of nearly teenage developers did 25 years ago.

Downscaling DOES degrade your image, to an even greater degree than upscaling when using scanline-based shaders.

As for the picture, a sheet of paper pulled out of nowhere without any context proves nothing. In any case, you can’t know what you’re talking about if you use PAR and DAR values interchangeably: a 5:4 DAR applied to 256x240 gives 300x240, while an 8:7 PAR applied to 256x240 gives 293x240.
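
Worked out:

```python
# The same 256x240 frame under a 5:4 DAR vs an 8:7 PAR
height, width = 240, 256
dar_width = round(height * 5 / 4)   # 5:4 display aspect ratio -> 300x240
par_width = round(width * 8 / 7)    # 8:7 pixel aspect ratio   -> 293x240
print(dar_width, par_width)
```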

I’m not going to keep talking, among other things because nothing you said made sense. If you don’t agree with what this thread’s topic stands for, it would be better for you to start your own thread. Many people share your views, so you can discuss them there; I will be glad to keep this thread clean of off-topic and especially of misleading talk.

@bleakassassin: I used the PAL game. Try applying a PAR of 1.33 to the image to compensate for the widescreen stretching. I’m not sure what resolution the game has, but multiply the width by 1.333 (4:3 * 1.333 = 1.777 = 16:9), then set a custom resolution. Or, if you prefer, divide the resulting number by the height to get the DAR value and find one among the DAR presets in RA that matches it or comes close.
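
As a rough sketch of that calculation (the width below is only a placeholder; use the game’s actual output width):

```python
# Widen the frame by 4/3 to undo the in-game anamorphic (widescreen) squeeze
width = 512                        # placeholder, not the game's real resolution
new_width = round(width * 4 / 3)   # use this as a custom width in RA,
print(new_width)                   # or divide it by the height to get a DAR value
```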

Thanks to this thread, I found out that option also messes with the display of the Mednafen PSX core. Turning it off for that core fixes the display getting slightly cut off on the right, and it gives you the same aspect ratio as standalone Mednafen, which is supposed to match a real PSX’s output.

Yeah, honestly I think it’s better to just leave crop overscan off in RA all the time. The only system where you need to worry about overscan is the NES, and you can adjust for that easily using the TV controls, or just set up a custom config for Nestopia or whatever. I don’t understand why you would crop overscan in any other situation; to me, it amounts to needlessly butchering the image.