The scaling up in the preset is there for two reasons: ScaleFX and GTU. ScaleFX requires a 3X scale, and GTU normally uses a 2X scale so that each full pixel does not have to be blurred. Both of these effects, along with all the MDAPT passes, are most effective at a core resolution of roughly 640x480 or below.
For golden-age video games this isn’t much of a problem because the core resolution is low, e.g. 256x224. This is the stuff I tend to enjoy the most, so that’s why the package evolved that way. (Although it’s strange that you were having a problem with the TG16, as it has an arcade-level resolution.)
For high-res core output I don’t think we want/need the same abilities, and at a higher core resolution you probably don’t need the extra upscale to use GTU properly.
Because of this I think we probably need different presets for these different use cases. This requires separate presets because changing the resolution of passes, and which passes are included, must be done in the preset; the shader can’t change this on the fly.
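To illustrate why this lives in the preset rather than the shader: in a `.slangp` file each pass declares its own scale, so something like the sketch below is baked in at load time (the shader paths here are hypothetical, but `shaderN`, `scale_typeN`, and `scaleN` are standard slang preset keys):

```
shaders = 2

// Hypothetical ScaleFX pass: needs a fixed 3X scale relative to the source
shader0 = shaders/scalefx/scalefx.slang
scale_type0 = source
scale0 = 3.0

// Hypothetical GTU pass: 2X so whole pixels don't get blurred together
shader1 = shaders/gtu/gtu-pass.slang
scale_type1 = source
scale1 = 2.0
```

Since these values are read once when the preset loads, switching to different pass scales (or dropping a pass entirely) means loading a different preset.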
Although if you are saving simple presets for your per-core/per-game use (which you should be), you can swap the reference path in the simple preset and all your settings will load on top of the other preset.
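A simple preset is just a `#reference` line pointing at a base preset, followed by your parameter overrides, so the swap is a one-line edit. A minimal sketch (the paths and the parameter name are hypothetical placeholders, but the `#reference` directive is the standard RetroArch simple-preset syntax):

```
// Point this at the HIRES-CORE variant instead of the original base preset
#reference "shaders/Base_Preset.slangp"

// Your saved parameter overrides load on top of whichever base is referenced
some_parameter = 1.0
```

Changing only the `#reference` path keeps all your tweaked values intact while loading them over the other preset's pass chain.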
I have some HIRES-CORE presets coming in the next update. They will be in an -Experimental- folder, meaning they are new and may change significantly in the near future (names may change, individual presets may disappear, passes inside may be adjusted). They run significantly faster on high-resolution core output. Hopefully some people can test them out and give feedback so we can get the best result, and they can mature into a stable part of the package.
Can you tell me what resolution is being output when these cores are in 2X resolution?
EDIT:
Also note that the BASIC-BORDER and BASIC-BORDER-WITH-REFLECT presets do not have any increased resolution in them.