Very interesting discussion.

It was always my understanding that the available wattage was
what determined the SPL a pair of speakers could deliver
(assuming the speakers were capable of playing at the volumes
the amp could deliver). So while current is no doubt very
important to amp performance, can it also substitute for output
in watts to some extent?

For example, let's say you have a pair of main speakers that you
want to use for watching movies on DVD, and you want to watch
the movies, at least sometimes, at full Dolby Digital reference
volume. If I recall correctly, DD reference level specifies a
max SPL of 105 dB (not counting low-frequency effects) for sound
during the movie. So, if you have speakers of a given
sensitivity and you want full DD reference volume at your
listening position, you will need a certain number of watts to
get there. I've been operating under the assumption that this
watt figure was non-negotiable for achieving a given SPL at a
given distance with speakers of a given sensitivity. But perhaps
current comes into play here as well?
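
Just to put rough numbers on the "certain number of watts" part,
here is a quick sketch of the standard formula as I understand
it. It assumes simple free-field inverse-square falloff and
ignores room gain and speaker compression, and the 90 dB
sensitivity and 3 m listening distance are made-up values:

```python
import math

def watts_required(target_spl_db, sensitivity_db_1w_1m, distance_m):
    """Watts needed to reach target_spl_db at distance_m, assuming
    free-field inverse-square falloff and ignoring room gain.
    Sensitivity is the speaker's output in dB SPL at 1 W / 1 m."""
    gain_needed_db = (target_spl_db - sensitivity_db_1w_1m
                      + 20 * math.log10(distance_m))
    return 10 ** (gain_needed_db / 10)

# Hypothetical numbers: 90 dB-sensitive speakers, 3 m away,
# 105 dB DD reference peaks.
print(round(watts_required(105, 90, 3)))  # ~285 W per channel
```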

Now, let's suppose you have an amp that cannot meet the watt
requirement for reaching this reference SPL at the required
distance with your speakers, but that has superb current
capabilities, something like the DM Stratos, for example. Let's
further suppose the amp falls, say, 75 watts short of what on
paper would be required to reach that target reference SPL,
according to the standard formula. Could the amp's tremendous
current prowess overcome that 75-watt shortfall and still
deliver reference SPL at the listening position? (Let's assume,
for the sake of the discussion, that the power needed at
reference SPL would be demanded for longer than the amp's
capacitors could cover with a temporary surge of peak output.)
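
For scale, here is what a hypothetical 75-watt shortfall looks
like in dB terms; the 200 W requirement is invented purely for
illustration:

```python
import math

# Hypothetical: the formula calls for 200 W, but the amp can only
# sustain 125 W continuously, i.e. 75 W short.
required_w, available_w = 200, 125
shortfall_db = 10 * math.log10(required_w / available_w)
print(f"{shortfall_db:.1f} dB below reference")  # ~2.0 dB
```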