[QUOTE=Dogway;21634]@Nesguy: You come here, trash talk the very concept of the thread, and then tell me not to get defensive. I made it very clear at the bottom of the OP: “Be polite, and stay on topic.” You have done neither. The least I can do from my non-moderator position is counter every false statement you threw out without backing it up with any real proof. All you do is put your faith in the work that a handful of developers, barely out of their teens, did 25 years ago.
Downscaling DOES degrade your image, to an even greater degree than upscaling when using scanline-based shaders.
As for the picture: a paper pulled out of nowhere, with no context, proves nothing. In any case, you can’t know what you’re talking about if you use PAR and DAR values interchangeably:
5:4 DAR applied to 256x240 = 300x240
8:7 PAR applied to 256x240 = 293x240
I’m not going to keep arguing, among other reasons because nothing you said made sense. If you don’t agree with what this thread stands for, it would be better to start your own; many people share your beliefs, so you can discuss them there. I will be glad to keep this thread clean of off-topic and especially misleading talk.
@bleakassassin: I used the PAL game. Try applying a PAR of 1.33 to the image to compensate for the widescreen stretching. I’m not sure what resolution the game runs at, but multiply the width by 1.333 (4:3 * 1.333 = 1.777 = 16:9) and set that as a custom resolution. Or, if you prefer, divide the resulting width by the height to get the DAR value, then pick whichever of RA’s DAR presets matches it or comes closest.[/QUOTE]
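(Before I reply: for @bleakassassin, the arithmetic Dogway describes works out like this. A quick Python sketch of my own; the 640x480 source resolution is just a placeholder, since the actual game resolution isn’t known.)

[code]
from fractions import Fraction

def widescreen_compensate(width, height, stretch=Fraction(4, 3)):
    # Stretch the width by ~1.333 to undo a 16:9 display stretch,
    # then derive the custom resolution and the resulting DAR.
    new_width = round(width * stretch)
    return new_width, height, Fraction(new_width, height)

w, h, dar = widescreen_compensate(640, 480)  # placeholder resolution
print(w, h, float(dar))  # 853 480 1.777... -- about 16:9; pick the nearest RA preset
[/code]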
I don’t know if there’s a language barrier here, or what. Simply expressing my opinion that the 8:7 thing is weird does not constitute trash talking, and if you think it does, then you are being needlessly defensive.
This “downscaling degrades the image” talk is just… I don’t even know. Maybe you could provide an example of what you’re talking about, because I see no scaling artifacts on a 60-inch display at 720p vs. 1080p when using a CRT shader. One major difference, though, is that the game actually fills the vertical area of the screen at 720p, and it doesn’t at 1080p. Maybe you’re referring to a situation where one isn’t using shaders? In that case you will get scaling artifacts on at least one axis no matter what if you use nearest neighbor as a filter. Or are you blurring the games with a bilinear filter instead?
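To make concrete what I mean by scaling artifacts, here’s a quick sketch (my own illustration, nothing from RA’s source) counting how tall each source scanline ends up at a given output height with nearest neighbor:

[code]
from collections import Counter

def scanline_heights(src_lines, dst_lines):
    # Map each output row to its nearest-neighbor source row, then
    # count how many output rows each source row occupies.
    owners = [y * src_lines // dst_lines for y in range(dst_lines)]
    return Counter(Counter(owners).values())

print(scanline_heights(240, 720))   # Counter({3: 240}): every line 3 px tall (clean 3x)
print(scanline_heights(240, 1080))  # Counter({5: 120, 4: 120}): lines alternate 4/5 px
[/code]

240 to 720 is an exact 3x, so every scanline renders identically; 240 to 1080 is 4.5x, so lines alternate between 4 and 5 pixels tall, which is the unevenness you see without shaders.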
I tested this against a native 720p TV I have, and the ONLY difference I can see when downscaling from 1080p to 720p is that the downscaled image looks slightly “softer” than the native 720p image: transitions (such as from black to white) aren’t as sharp, which IMO actually improves the look of the image. But this is a level of detail that 99% of people won’t even be able to discern; I had to get very close to see these differences at all.
The paper “taken out of nowhere” is an actual planning sheet from The Legend of Zelda. It shows the display area is 4:3 (duh, it was going to be displayed on a CRT) and the pixels are 5:4. We know they’re 5:4 because the blocks on the sheet are drawn 5:4, and blocks measure 16x16 pixels in LoZ. A 5:4 PAR also makes the NES’s 256x240 frame come out to exactly 4:3 (256 * 5/4 = 320, and 320:240 = 4:3), which matches the sheet’s display area; see the check below. This indicates the “intended” PAR on NES should be 5:4, not 8:7. Of course, there was no standard dev kit used by all NES developers, so any speculation as to the intended look of NES graphics is ultimately just that: speculation.
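A worked check of that arithmetic (mine, not from the planning sheet):

[code]
from fractions import Fraction

def dar(width, height, par):
    # Display aspect ratio = (storage width : height) * pixel aspect ratio
    return Fraction(width, height) * par

w, h = 256, 240  # NES framebuffer
print(dar(w, h, Fraction(5, 4)))  # 4/3 -- exactly the sheet's 4:3 display area
print(dar(w, h, Fraction(8, 7)))  # 128/105 (~1.22) -- noticeably narrower than 4:3
[/code]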
It’s your thread, though. I won’t post here any more.