Yes, this is a known issue with CRT emulation: the most common approach is a filtering scheme that works properly only with low-resolution input.
The main problem is that CRT emulation isn’t done at a low level, but relies on HW features of the graphics adapter (like point or bilinear filtering), a basic filtering approach (like adapted bilinear, Quilez…), a standard filtering approach (like Lanczos, spline, sinc, Gaussian…), or something more advanced.
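For reference, here is a minimal sketch of what a ‘standard’ kernel means in this context, a plain Lanczos weight function in Python (my own illustration, not code from any particular shader):

```python
import math

def lanczos_weight(x, a=2.0):
    """Lanczos kernel weight at sample offset x, with support a (a=2 or a=3 is typical)."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    # sinc(x) * sinc(x/a), windowed to |x| < a
    return a * math.sin(px) * math.sin(px / a) / (px * px)
```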
Currently I don’t know of many CRT shaders that implement advanced filtering, because special kernels have to be written that account for doubled (or tripled) pixels and thus don’t skip details.
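A minimal sketch of the idea behind such a special kernel (again my own illustration, not the actual guest-advanced-HD code): widen the kernel support by the pixel-duplication factor `dup`, so the filter works at the logical (pre-doubling) pixel pitch instead of treating the repeated pixels as real detail. It reuses `lanczos_weight` from the sketch above:

```python
def resample_row(row, out_width, dup=2, a=2.0):
    """Resample one row of pixel-duplicated input with a Lanczos kernel
    whose support is widened by the duplication factor `dup`."""
    in_width = len(row)
    out = []
    for i in range(out_width):
        # Center of output pixel i, in input pixel coordinates.
        center = (i + 0.5) * in_width / out_width - 0.5
        lo = int(math.floor(center - a * dup))
        hi = int(math.ceil(center + a * dup))
        acc = wsum = 0.0
        for j in range(max(lo, 0), min(hi, in_width - 1) + 1):
            w = lanczos_weight((j - center) / dup, a)  # dup=1 gives the plain kernel
            acc += w * row[j]
            wsum += w
        out.append(acc / wsum if wsum else 0.0)
    return out
```

With `dup=1` this degenerates to a normal Lanczos resampler; with `dup=2` it filters 512-wide doubled content as if it were 256-wide, which is why the duplication factor has to be known (or configured) per resolution.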
From my RetroArch repertoire the only shader capable of doing this properly is the guest-advanced-HD version, where it’s simple to change the internal resolution parameter, but it isn’t automated per input resolution for various reasons.
On the WinUAE side I had to write special kernels to handle the hires mode properly (only hires, not superhires…), but interlacing support, for example, is missing due to the limited capabilities of the emulator’s shader environment.
There is a ReShade version of the ‘HD’ shader too, but you have to enter the resolution manually every time it changes, etc.
What I’m saying is that these problems aren’t addressed in general, and they’re also hard to address.
A good and persistent example is the SNES game Trials of Mana (Seiken Densetsu 3), which switches to a ‘hires’ mode for dialogue and menus. The most common workaround is to ‘derez’ the output to 256x224, but then details are lost.
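In principle a shader or frontend could check whether a hires frame is really just pixel-doubled lores content before derezzing it. A rough heuristic sketch (hypothetical, my own illustration; `frame` here is just a 2D list of pixel values):

```python
def is_pixel_doubled(frame):
    """Return True if every even column equals the column to its right,
    i.e. a 512-wide frame that is really pixel-doubled 256-wide content."""
    for row in frame:
        for x in range(0, len(row) - 1, 2):
            if row[x] != row[x + 1]:
                return False
    return True
```

A dialogue frame from Trials of Mana would fail this test, since the hires text actually uses the full 512-pixel grid; derezzing such a frame to 256x224 is exactly where the detail goes missing.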
So, AFAIK a single preset or a single configured shader that covers every resolution situation properly hasn’t been developed yet, and the trend suggests it’s far more likely that any shader will require manual tuning for special cases.
Unless something is written from scratch specifically for these cases.