AGC is tricky here, since RF signal strength deviations are fairly predictable with home devices (consoles, home computers), so there is little for it to do. Some sources state that it matters much more for broadcast input sources, where signal strength actually varies.
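To make the idea concrete, here is a minimal feedback-AGC sketch, not any particular TV's circuit: an envelope follower tracks the output level and the gain is nudged toward a target, so a weak input gets boosted while a stable one is left mostly alone. The `target` and `attack` parameters are hypothetical.

```python
import math

def agc(samples, target=1.0, attack=0.01):
    """Toy feedback AGC (illustrative only): adjust gain so the
    running envelope of the output tracks a target level."""
    gain = 1.0
    envelope = 0.0
    out = []
    for x in samples:
        y = x * gain
        # One-pole envelope follower on the rectified output.
        envelope += attack * (abs(y) - envelope)
        # Nudge the gain up when the envelope is below target,
        # down when it is above.
        if envelope > 1e-6:
            gain += attack * (target - envelope) * gain
        out.append(y)
    return out

# A weak sine input (amplitude 0.3) gets leveled up over time.
weak = [0.3 * math.sin(2 * math.pi * t / 50) for t in range(5000)]
leveled = agc(weak)
```

With a console's steady RF output the gain settles once and stays put; with a fluctuating broadcast signal the loop would keep correcting, which is the case the sources above call out.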
I also read about circuits that attenuate luma if the input signal is too boosted, for example in the better Sony monitors. And I found some old documents stating that safe voltage levels were regulated in Philips TVs, plus some related ITU material.
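The luma-limiting behavior can be sketched as a soft knee, purely as an assumption about how such a circuit might respond, not a model of any specific Sony or Philips design: linear below a knee point, with the overshoot compressed so a hot input asymptotically approaches peak white instead of hard-clipping.

```python
def soft_limit(luma, knee=0.8):
    """Toy soft limiter (hypothetical): pass luma through
    unchanged below the knee, compress the excess above it so
    the output never exceeds 1.0 (normalized, 0=black, 1=white)."""
    if luma <= knee:
        return luma
    headroom = 1.0 - knee
    excess = luma - knee
    # Rational compression curve: approaches 1.0 as luma grows,
    # with slope 1 at the knee so there is no visible kink.
    return knee + headroom * excess / (excess + headroom)
```

A nominal-level picture is untouched, while an over-boosted one loses highlight contrast gracefully instead of blooming, which matches the described purpose of such circuits.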
The other issue is that today's "standard CRT TV circuit solutions" didn't exist for vintage engineers: they had the freedom to solve problems by different means, and different companies did things differently. Some techniques were also patented (the SECAM delay line, Trinitron, ...), so there was real motivation to innovate around them.

Handling all of these per-manufacturer variations would take a lot of resources.