If you know the maximum output of your amplifier in watts, and the input voltage needed to drive it to that power (the input sensitivity), you can calculate the voltage gain of the amplifier.
For example, take an amplifier rated at 125W into 8 ohms. Multiply the power rating (125W) by the load impedance (8 ohms, assuming the power is rated into 8 ohms) and take the square root of the result; that gives the RMS output voltage at full power, in this case about 31.6V. Multiplying that by the square root of 2 (about 1.414) gives the peak voltage swing of this amp at clipping, about 45 volts.
Then divide the RMS output voltage (31.6V) by the input sensitivity (in this case 1.9V, also an RMS figure); the result is about 16.6. This means that 1 volt in will provide 16.6V out, and 1.9V in drives the amp to clipping (31.6V RMS, about 45V at the peaks). Note that you must compare like with like: dividing the peak output voltage by an RMS input sensitivity overstates the gain.
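The arithmetic above can be sketched in a few lines of Python. The 125W, 8 ohm, and 1.9V figures are just the example numbers from this discussion; substitute your own amplifier's specs:

```python
import math

# Example figures from the text above (substitute your amp's specs).
power_watts = 125.0   # rated output power into the load
load_ohms = 8.0       # load impedance the power is rated into
sensitivity_v = 1.9   # RMS input voltage needed for full rated power

# RMS output voltage at full power: V = sqrt(P * R)
v_rms = math.sqrt(power_watts * load_ohms)   # about 31.6 V

# Peak voltage swing at clipping: V_peak = V_rms * sqrt(2)
v_peak = v_rms * math.sqrt(2)                # about 45 V

# Voltage gain: RMS output divided by RMS input sensitivity
gain = v_rms / sensitivity_v                 # about 16.6

print(f"RMS out: {v_rms:.1f} V, peak swing: {v_peak:.1f} V, gain: {gain:.1f}")
```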
Of course you can convert voltage gain to dB with the formula dB = 20 × log10(gain), or use any one of the several voltage-to-dB calculators you can find easily on the web.
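Rather than an online converter, the standard voltage-to-dB formula is easy to apply directly. A minimal sketch:

```python
import math

def gain_to_db(voltage_gain):
    """Convert a voltage gain ratio to decibels: dB = 20 * log10(gain)."""
    return 20.0 * math.log10(voltage_gain)

print(round(gain_to_db(16.6), 1))  # a gain of 16.6 is about 24.4 dB
print(round(gain_to_db(24), 1))    # a gain of 24 is about 27.6 dB
```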
Typical voltage gain of a normal power amplifier is about 24 (a ratio, not volts), or 27.6dB.
Of course, two amplifiers with the same input sensitivity can have wildly different power ratings, depending upon the overall voltage gain of each.