I'm afraid I'm still confused. As I understand it, proper termination, meaning constant impedance, would depend on the specific application, just as you explained for the various analog connections. I was asking because I'm not aware of a 3-pin XLR having an impedance spec, yet it seems to be the standard AES/EBU connector. I mentioned CATV as an example of connector impedance being non-critical because many years ago I did not quite trust the experts, so I investigated it myself. It turns out that with F connectors, any mismatch in that analog situation simply results in a minuscule signal level drop.
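To put a number on "minuscule": the level drop from a mismatch follows from the standard transmission-line reflection coefficient. Here's a rough sketch, using a hypothetical worst case of a 50 ohm connector spliced into a 75 ohm run (the specific impedance values are just illustrative assumptions):

```python
import math

def reflection_coefficient(z_load, z0):
    """Voltage reflection coefficient at an impedance discontinuity."""
    return (z_load - z0) / (z_load + z0)

def mismatch_loss_db(z_load, z0):
    """Level drop (dB) from the power reflected at the mismatch."""
    gamma = reflection_coefficient(z_load, z0)
    return -10 * math.log10(1 - gamma ** 2)

# Hypothetical worst case: a 50 ohm connector in a 75 ohm CATV line
print(reflection_coefficient(50, 75))   # -0.2
print(round(mismatch_loss_db(50, 75), 3))   # 0.177 (dB)
```

Even that fairly gross mismatch only costs about 0.18 dB of analog level, which is why nobody notices it on CATV.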
Since we're now talking about digital, are there pitfalls to avoid, or do I just get a piece of 110 ohm cable and solder on the XLRs?
It's all a little bit of a crapshoot... Digital cares about the characteristic impedance (not the same thing as ordinary impedance) of both the cable and the connector; otherwise you potentially get reflections within the line. Whether that creates a perceivable effect in the audio band is not something I wish to debate here. It is also worth noting that characteristic impedance is set by the physical construction and manufacturing of the cable and connectors themselves. I do not know of any boutique or hi-fi cable manufacturer who can create true 75 ohm cabling on their own; it is not something you can accomplish with mods, braiding, etc.
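A quick back-of-the-envelope shows why those reflections matter at AES3 speeds even though they're harmless in analog: the round-trip time of a reflection on a typical run is shorter than one bit cell, so the reflected edge lands inside the bit and jitters the transitions. A rough sketch, assuming a velocity factor of 0.66 (typical for foamed-PE cable) and the standard AES3 rate at 48 kHz (64 bits per frame, so 3.072 Mbit/s, with biphase-mark making the shortest pulse half a unit interval):

```python
C = 3.0e8                # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66   # assumed typical foamed-PE cable

def round_trip_ns(length_m, vf=VELOCITY_FACTOR):
    """Time for a reflection from the far end to come back, in ns."""
    return 2 * length_m / (vf * C) * 1e9

# AES3 at 48 kHz: 64 bits/frame -> 3.072 Mbit/s
unit_interval_ns = 1e9 / (48_000 * 64)   # ~325.5 ns per bit
half_cell_ns = unit_interval_ns / 2      # ~162.8 ns shortest biphase pulse

for length in (1, 5, 10):
    print(f"{length} m: reflection returns in {round_trip_ns(length):.1f} ns")
```

For runs of 1-10 m the reflection returns in roughly 10-100 ns, well inside the ~163 ns shortest pulse, so it smears the edges the receiver is trying to clock from rather than just dropping the level a fraction of a dB.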
With coaxial cable (and properly crimped RCA or BNC termination; most of your boutique connectors do not fit the bill), you can maintain a proper 75 ohm characteristic impedance all the way through.
With AES, you can have 110 ohm cable, but the connectors do not maintain the characteristic impedance. However, you do get balanced transmission lines and transformer isolation. Whether that creates a perceivable effect or improvement vs coax is another debate.