Correct Geometry - Aspect Ratio for different systems

Hello. :slight_smile: I'm using Libretro (Lakka) on my good old Raspberry Pi 1 to play Sega Megadrive games on Picodrive. My Raspberry is connected to a CRT TV via composite (not the best quality, I know, but hey, the real console used composite too!).

I'd like to find out, with your help, the best resolution and aspect-ratio settings for my setup. So:

  • I'm using composite out on the RPi

  • In my config.txt (read at boot) I have:


sdtv_mode=0
sdtv_aspect=1
scaling_kernel=8

Now the RPi should output 720x480@60Hz (the standard NTSC resolution; is that correct?). Since a "native" Sega Genesis / Megadrive resolution is impossible over composite out, what's the nearest result I could get?

  1. Should I add

framebuffer_width=320
framebuffer_height=224

…in config.txt?

  2. And how do I combine these settings with the viewport and aspect-ratio options in Libretro (Lakka), considering that CRT pixels are not square?

Thank you very much!

I think your best bet would be to set the RPi to 640x480, which will avoid black borders on the sides when it stretches to fill your TV, just like it would with the real console. However, over the composite connection it will always output 480i (that is, 480 interlaced vertical lines). This should look perfect on interlaced content, such as Sonic 2's multiplayer splitscreen, but everything else will have unnecessary jitter. Unfortunately, this is unavoidable via the Pi's composite output.
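If you go that route, something like this in config.txt should do it (just a sketch based on the settings you already posted; whatever framebuffer you pick, the firmware still scales it to the fixed 480i composite signal):

# NTSC composite out to a 4:3 TV
sdtv_mode=0
sdtv_aspect=1
# 640x480 framebuffer; the GPU scales this to the 720x480i composite signal
framebuffer_width=640
framebuffer_height=480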

Hi and thank you very much for your feedback.

The real console outputs (mostly) 256x224, so would a framebuffer with the same values in config.txt be better? Since that is an 8:7 aspect ratio, should I set the 8:7 option in Lakka/Libretro, or would it be "redundant" with the framebuffer resolution? And should integer scaling be on or off? :S

Sorry, but I'm a bit confused by all these values and the way they interact.

RPi's composite output is hardwired to only output 480i and there's nothing you can do to change that. Whatever you change outside of that will always get up/down-scaled to that same 480i resolution. You'll just want the aspect ratio to fill the screen (at 640x480, this would be 4:3) and it will look right on your TV.
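On the RetroArch/Lakka side, that roughly corresponds to settings like these in retroarch.cfg (a sketch; the same options are under Settings > Video in the menu, and depending on the version the Aspect Ratio option may also need to be set to "Config" for video_aspect_ratio to take effect):

# Stretch the core's output to a 4:3 viewport that fills the screen
video_force_aspect = "true"
video_aspect_ratio = "1.3333"
# Leave integer scaling off so nothing gets shrunk to the nearest multiple
video_scale_integer = "false"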

This thread looks good; is it worth reading the whole thing? :slight_smile:

Dogway, I don't understand why you are using the horizontal resolutions that you are for the 4:3 DAR. For example, for 4:3 DAR with NAB, why aren't you multiplying 256 by (4/3), since NAB means the overscan is shown? Similarly, why are you multiplying 224 by (4/3) when using the 4:3 DAR without NAB? For 4:3 DAR without NAB, wouldn't you be multiplying 240 by (4/3) instead of 224, because there are 8 lines of overscan cut off on each side, which would be 16, so 256-16 = 240?

Hello shatterhand, yes, in hindsight I think I overcomplicated the writing. For PAR you multiply the numerator (width). For DAR I chose to stay consistent with the PAR height and kept the height value fixed, so to calculate the DAR width I multiply the height resolution (the denominator) by 4/3; that's why I multiply by 240 and not 256. I could divide 256 by 4/3 as well, but that gets me a different height across all the tests.

SNES

256*(8/7)x240 = 293x240 (8:7 PAR with NAB)
240*(4/3)x240 = 320x240 (4:3 DAR with NAB, old TV look)

or also written as:

256*(8/7)x240 = 293x240 (8:7 PAR with NAB)
256x240*(4/3) = 320x240 (4:3 DAR with NAB, old TV look)

When multiplying without NAB I use 224 because 240-16=224. NAB in this case is important only in height, since we have plenty of lateral room on widescreen TVs.
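If it helps, here is the same arithmetic as a small Python sketch (not Dogway's tool, just the calculations from this post, with 224 being the height after NAB is cropped):

from fractions import Fraction

def viewport(src_w, src_h, ratio, mode):
    # PAR: multiply the source width by the ratio.
    # DAR: keep the height fixed and derive the width as height * ratio.
    r = Fraction(ratio)
    w = src_w * r if mode == "PAR" else src_h * r
    return round(w), src_h

print(viewport(256, 240, "8/7", "PAR"))  # (293, 240)  8:7 PAR with NAB
print(viewport(256, 240, "4/3", "DAR"))  # (320, 240)  4:3 DAR with NAB
print(viewport(256, 224, "8/7", "PAR"))  # (293, 224)  8:7 PAR without NAB
print(viewport(256, 224, "4/3", "DAR"))  # (299, 224)  4:3 DAR without NAB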

Important: note that the last portion of my OP assumes crop_overscan is set to true. We want to maximize the vertical size on our display, since later on an integer height resolution won't take advantage of the whole vertical area (it falls some pixels short of 1080). The reason I ignore the integer_scaling option is that it enforces integer scaling on both height and width, which makes it difficult to set the (geometry-correct) PAR we found. My only concession is to enforce an integer height, to accommodate pixel shaders.
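To make the integer-height idea concrete, a quick sketch in the same vein (assuming a 1080-line display and the 224 visible SNES lines left after cropping overscan):

screen_h = 1080
src_h = 224                   # SNES active lines with crop_overscan on
scale = screen_h // src_h     # 4: snap only the height to an integer multiple
vp_h = src_h * scale          # 896
vp_w = round(vp_h * 4 / 3)    # 1195 for the 4:3 DAR case; the width is not integer-scaled
print(scale, vp_w, vp_h)      # 4 1195 896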

I might soon release a small AutoHotkey app where you input your screen resolution and get the resolution and offsets for every system in the OP. This is only for those willing to play as intended, with correct geometry (round circles and square blocks), not as remembered (simply use overscan without cropping and a 4:3 DAR).

araker, I think the PAL MD is 6.65 MHz (13.3 MHz / 2), very close to NTSC's 6.71 MHz; check here. This is H40.

I'm not sure if this is the right thread to post this, but I'm unable to make a new thread (new user restriction I'm assuming):

In both the Beetle PSX and Beetle PSX HW cores, none of the core options related to crop and overscan have any effect on the displayed image. This includes Initial/Last Scanline, Crop Overscan, Additional Cropping, and Offset Cropped Image.

I am trying to get rid of the garbage pixels that appear in the upper-left corner of the PSX game Ogre Battle, as seen here.

No matter which adjustments I make to any of those settings, the displayed image is exactly the same in both cores.

Is this a bug with the cores, with RetroArch, or am I missing some other setting?

Hi. I've installed Lakka on a PC I have here and I want it to work with my VGA TV. The resolution is 1280x1024 and I can't get it to work. It just keeps saying "out of range" but it works through HDMI on my 4K TV. I tried editing the config but it still says it's out of range anyway. Do I have to do anything special to get it to work?

I have an i3 2120 and an ATI HD 6950.

I wanted to make my own post but for some reason I'm not allowed to do that.

What a wonderful thread!

A few things need to be clarified.

  1. PAR = Pixel Aspect Ratio.
  2. DAR = Display Aspect Ratio.
  3. Why we want to play the games as the developers actually saw them, and what our old TVs did to the image that distorted it.

This one is more of a question:
So the developers viewed the image on a 1:1 pixel ratio monitor (NOT a TV) through the system with its corresponding clock rate, so they saw perfect geometric shapes while developing… but why didn't they use an actual TV (not 1:1)?

Same question but a little differently:
The devs saw the console image output on a 1:1 pixel aspect monitor like a computer CRT, while we saw the final game on a 4:3 pixel aspect consumer TV… correct?

Yes, and some developers (the few) accounted for the distortion, and others (most) didn't. Back then people weren't as anal as today… well, probably we are still the same, seeing how many console gamers plainly ignore (and/or don't give a damn about) game resolutions: 900p? fake 4K? 720p on XONE? sub-30fps "cinematic experience"? Even people on this forum don't care and just set everything to 4:3 DAR, whatever that leads to (not necessarily how they played it on their CRT).

Also, some games back then mixed assets that did and did not account for the distortion, probably made by different developers within the team.

Thanks.

Two questions:

  1. What kind of monitor did they use to develop on? How did they connect the console development kit to this particular 1:1 pixel aspect monitor?
    In other words, what RGB monitor handled 15kHz 240p and had a 1:1 pixel aspect, like the devs had in the early 80s in Japan?

  2. Did our TVs stretch all horizontal resolutions like 256 and 320 right to the edges of the screen using the H-sync pulse? If so, wouldn't it be more correct to actually view the games in 4:3 like we did back then?

Different studios used various processes and equipment during development, but frequent testing on consumer gear was undoubtedly part of the process, if not the main way of testing. This is evident in a number of effects that abuse composite-signal and CRT display characteristics, along with frequent reliance on undocumented and/or obscure console hardware quirks and edge-case behavior that would not be evident on non-consumer gear.

There are also shots of Nintendo's SMB development docs, which show grid paper with pixels significantly wider than they were tall, so fat circles definitely weren't a product of laziness or lack of awareness of what hardware end-users would be using to consume the content.

Also, while consumer CRT TVs had a nominal aspect ratio of 4:3, they varied wildly in actual geometry and overscan, not to mention that many had hardware dials for adjusting horizontal and vertical size, so there was no way to ensure that everyone would see the exact same thing.

Yet another issue is that studios didn't always want to create entirely new assets for PAL releases (PAL has a significantly different resolution vs NTSC but the same nominal 4:3 aspect on the physical displays), so rather than have everything look correct on NTSC displays but excessively tall on PAL sets, they would sometimes purposely make things slightly fat on NTSC so they would only be slightly tall in the subsequent PAL release.

Yes, that is all perfectly clear on the consumer side.
The real question I'm trying to ask is how can we really know on what display the devs drew their 'perfect circles'…
What display was that in the early 80s?

According to this guy who claims to have done some NES development, it was all bog-standard PC hardware that they did the development on, but that was all coding and they never actually saw the output on their PC display. With that in mind, it was probably tested on either a regular consumer TV (pretty likely, I think), a PVM-style broadcast monitor, or a composite-compatible PC monitor like the one that came with Commodore 64s.

You can see some development shots here: http://www.chrismcovell.com/secret/weekly/Stars_of_the_Family_Computer_p2.html

"How did they connect the console development kit to this particular 1:1 pixel aspect monitor?"

It's worth noting that CRTs don't really have a concept of pixels, so the pixel aspect ratio doesn't really exist for them.

Thanks, but the question still remains.
The artists drew perfect circles at 256x240 resolution; on what display did they see these perfect circles back in the early 80s?

Isn't the whole point of this thread to see the game exactly as the devs/artists saw it, or in 'correct geometry'?

Can we really be sure the devs EVER saw the true 1:1 PAR image from the console?
They may very well have drawn perfect geometric shapes mathematically without ever seeing them as intended, as we can now.

Nope. That's what I was getting at with my earlier post.

CRTs don't have a concept of pixels, only Hblank and Vblank, and everything is stretched to match the screen. 4:3 is as legitimate as any other reasonably close aspect. Lately, I've seen a number of people throwing around 64:49, which is about 2% less wide than regular 4:3, as a good "universal" aspect.

However, Dogway et al. can do whatever they like, and whatever aspect they come up with is equally legitimate and could feasibly have been achieved via adjustments to the H/V pots.

I would assume that most characters and sprites of the 2D era were hand drawn on grid paper with a 1:1 square ratio, and only then programmed and viewed on a typical 4:3 CRT monitor.
So the devs actually created and saw the characters in a 1:1 pixel aspect not on a monitor, but rather on grid paper.

There is a possibility they may have actually seen the 256x240 image on a 1:1 pixel aspect computer monitor, but it was less likely a full RGB/VGA monitor if we are talking about the early-NES 80s era.
The assumption that the artists never saw the image in its original 1:1 aspect on a CRT display was too bold on my part; as said, it was probably a PC monitor, but not full RGB.

To really answer these questions we would have to ask a game developer from the late 80s or early 90s how they created the graphics for a 2D game and what the original aspect ratio of the art was, even if it was grid paper and pencils.

With all that said and asked, I only ever saw the games in their 4:3 display aspect ratio, so that's how I shall continue to play them, even though the devs arguably saw them in 1:1 on an unknown monitor.

The devs were indeed drawing 1:1 square pixels on an NEC PC-98 monitor, with a consumer CRT next to it as a secondary monitor.
For the NES I would assume the development process also happened on some older revision of the Japanese NEC PC-98.