With my 98dB Rethm Saadhanas, I can run CD-direct into my 2-watt Yamamoto SET and remain below 50 on the CDP's display (99 is max, i.e. the full 2V output). I'm likely listening at about 1V source max. Any gain a preamp would introduce would be flat wasted, period, end of story.
With my 26dB Supratek Cabernet Dual preamp, I can drive 85dB speakers without any amplifier voltage gain, i.e. by way of Nelson Pass' F4, whose voltage gain is actually minus 0.5dB.
On my 91dB DeVore Nines, I need no amplifier voltage gain on most recordings with the 12dB ModWright LS-36.5, but on certain classical recordings with low median level, I can run out of gain by a few dB. By the time I switch to the 20dB Raysonic C200 preamp, I'm in fat city SPL-wise and categorically need no voltage gain in the amplifier.
My room is essentially 14 x 20 feet, with an adjoining open space nearly twice that size. This by way of some real-world figures.
As your speaker sensitivity goes up, your gain requirements go down, and once you're approaching 100dB of speaker sensitivity, excess gain tends to equate to noise (audible surf-like hiss or hum with no signal playing).
The first question you need to answer is speaker sensitivity, then desired SPLs at your listening position. From there you can figure out how much voltage gain your system has to provide. Then you can decide whether to get a low-gain preamp with a high-gain amp or the reverse, whether you even need an active preamp and so on. You can definitely have too much gain, and most people in fact throw away gain they paid for by the bucket, as shown by where their volume control sits when they listen...
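If you want to put rough numbers on that, here's a back-of-envelope sketch. The function and the example figures are my own illustration, not gospel: it assumes a nominal 8-ohm load and free-field distance loss (real rooms give a few dB back via boundary reinforcement), and works out how much total voltage gain a source of a given output needs to hit a target SPL.

```python
import math

def required_gain_db(sensitivity_db, target_spl_db, distance_m, source_v_rms,
                     impedance_ohms=8.0):
    """Rough total system voltage gain needed, in dB (illustrative only)."""
    # Free-field loss relative to the 1m sensitivity spec; rooms are kinder
    distance_loss_db = 20 * math.log10(distance_m)
    # Amplifier power needed, in dB re: 1 watt
    power_db = target_spl_db + distance_loss_db - sensitivity_db
    watts = 10 ** (power_db / 10)
    # Voltage at the speaker terminals for that power into the nominal load
    v_speaker = math.sqrt(watts * impedance_ohms)
    # Gain required to get from source output voltage to speaker voltage
    return 20 * math.log10(v_speaker / source_v_rms)
```

With 90dB speakers, 95dB peaks at 3 meters and a 2V source, this lands around 17-18dB of required system gain, meaning a standard 26dB power amp alone already has gain to spare.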
Standard amplifier gain is 26dB, standard preamp gain somewhere between 16dB and 20dB. Add digital sources with 2V outputs (or more) and it's clear that for normal-sized rooms and 90dB speakers, there's ridiculous excess gain in the system...
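To illustrate with made-up but typical numbers: stack a 2V source, an 18dB preamp and a 26dB amp and see what the volume control is left to throw away.

```python
import math

# Hypothetical but typical stack: 2V source, 18dB preamp, 26dB power amp
source_v, preamp_db, amp_db = 2.0, 18.0, 26.0
system_gain_db = preamp_db + amp_db                    # 44dB of voltage gain on tap
v_wide_open = source_v * 10 ** (system_gain_db / 20)   # ~317V if nothing ever clipped

# A 100W/8-ohm amp clips at sqrt(100 * 8) ≈ 28.3V at its output terminals,
# so just to stay at the clipping point the volume control must discard:
v_clip = math.sqrt(100 * 8)
wasted_db = 20 * math.log10(v_wide_open / v_clip)      # ≈ 21dB thrown away
```

And that's before you account for actually wanting sane listening levels rather than full amplifier power, which throws away another 20dB or more.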