I am far from an expert on the subject, but I think it’s that the Rec. 601/709 transfer function in question was an encoding standard rather than a decoding/display standard? And old CRTs had an inherently power-law response, so the decoding/display gamma was a pure power curve by default?
And the “Why power 2.2?” part is, basically, because 2.2 was the default answer to “what is CRT gamma?” when sRGB was standardized, so it ended up being the target gamma as LCDs took over the market, and power 2.2 kind of became the standard in the PC space by default?
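For anyone who hasn’t dealt with this before, here’s a minimal C sketch of what “power 2.2 display gamma” actually means in practice: decoding is just raising the encoded signal to the 2.2 power to get linear light (the function name is mine, purely for illustration):

```c
/* Minimal sketch: decoding with a pure power-law "display gamma",
 * as a CRT (or a display emulating one) effectively does.
 * Encoded and decoded values are normalized to [0, 1]. */
#include <math.h>
#include <stdio.h>

double power_eotf(double encoded, double display_gamma)
{
    return pow(encoded, display_gamma);
}

int main(void)
{
    /* Encoded mid-grey 0.5 comes out around 0.218 linear at gamma 2.2. */
    printf("power 2.2: %.3f\n", power_eotf(0.5, 2.2));
    return 0;
}
```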
And for why other gamma transfer function options may be desirable:
Actual CRTs supposedly ranged from about power 2.1 to 2.5 depending on brightness settings and such, with power 2.35-2.5 being the range generally accepted as “correct” in retrospect. Within that range, 2.4 is the one modern displays generally include an option for, because…
From 2011 on, BT.1886 has defined the reference decoding/display curve for Rec. 709 content, which reduces to pure power 2.4 on OLEDs and any other display capable of true black. Before that you would see… uh… “spirited” debate as to whether the correct decoding gamma for Rec. 709 was power 2.4 or power 2.2.
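For reference, here’s a rough sketch of the BT.1886 formula as I understand it (check the actual ITU-R BT.1886 document for the authoritative version); the point is that when the black level Lb is zero, the a/b terms drop out and it collapses to a pure 2.4 power curve:

```c
/* Sketch of the BT.1886 EOTF. V is the normalized signal in [0, 1];
 * Lw and Lb are the display's white and black luminance in cd/m^2,
 * with Lw > Lb assumed. When Lb = 0, b becomes 0 and a becomes Lw,
 * so the whole thing collapses to L = Lw * pow(V, 2.4), i.e. a pure
 * power 2.4 curve. Function name is mine. */
#include <math.h>
#include <stdio.h>

double bt1886_eotf(double V, double Lw, double Lb)
{
    const double gamma = 2.4;
    double a = pow(pow(Lw, 1.0 / gamma) - pow(Lb, 1.0 / gamma), gamma);
    double b = pow(Lb, 1.0 / gamma) /
               (pow(Lw, 1.0 / gamma) - pow(Lb, 1.0 / gamma));
    double v = V + b;
    return a * pow(v > 0.0 ? v : 0.0, gamma);
}

int main(void)
{
    /* True-black display: identical to 100 * pow(0.5, 2.4). */
    printf("Lb = 0:   %.3f\n", bt1886_eotf(0.5, 100.0, 0.0));
    /* Raised black level: the curve lifts and flattens near black. */
    printf("Lb = 0.1: %.3f\n", bt1886_eotf(0.5, 100.0, 0.1));
    return 0;
}
```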
Some people (including whoever was in charge of implementing SDR-in-HDR tonemapping at Microsoft) have decided that the inverse of the sRGB piecewise encoding curve should also be the SDR decoding/display gamma on PC, so some games look most correct with that piecewise curve.
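And for comparison, a sketch of that sRGB piecewise decoding curve (the inverse of the sRGB encoding function) next to a pure power 2.2; they mostly agree, but near black the piecewise curve goes linear instead of crushing toward zero, which is where the visible difference shows up (function names are mine):

```c
/* The sRGB piecewise decoding curve, compared against a pure 2.2 power
 * at a few sample points. */
#include <math.h>
#include <stdio.h>

double srgb_piecewise_eotf(double V)
{
    return (V <= 0.04045) ? V / 12.92
                          : pow((V + 0.055) / 1.055, 2.4);
}

int main(void)
{
    double samples[] = { 0.02, 0.1, 0.5, 0.9 };
    for (int i = 0; i < 4; i++) {
        printf("V = %.2f  piecewise = %.5f  power 2.2 = %.5f\n",
               samples[i], srgb_piecewise_eotf(samples[i]),
               pow(samples[i], 2.2));
    }
    return 0;
}
```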
As mentioned, early LCDs had rather low gamma (as did some plasmas, I think?). Power 1.9 seems to be the shorthand/compatibility option for that nowadays, but it was all over the place IIRC.
Mac OS X defaulted to power 1.8 until Snow Leopard in mid-2009, when Apple switched the default to power 2.2.
The iPhone 1-3 were also 1.8, then the iPhone 4 was 2.68, the iPhone 5 was 2.36, and from the iPhone 6 on they have all been 2.2(ish). Not necessarily the most relevant for a CRT shader, but I know some people like to use CRT shaders for pixel art/low-res games even if they were designed for LCDs.