Thanks for chiming in Roger.
My bad then. They are ceramic, and I always thought ceramic was the same as high breaking.
Can all other output tubes be placed the same as KT-120s? That is, a pair in either the front or back row, with bias set to dim glow.
Ceramic fuses are often high breaking, but one particular example that is not is TUNING FUSES, which are ceramic but not sand filled. They are actually the lowest breaking (worst) ceramics ever made. The idiots put the silver wire in a Teflon tube, which then contains the plasma of the vaporizing element. In a proper high breaking fuse the sand prevents the plasma from forming. I find it amusing that the people who make the most expensive fuses know so little about fuses.
Anyone tuning in late to my thoughts on fuses can check the original post which I have made sticky.
For the 2 tube/channel configuration any tube with a dissipation rating of 35 watts or more is suitable. That eliminates the EL-34, which is rated at 25 watts. The LED comes on at 120 mA, which makes the total dissipation 55 watts per channel for a MK-I and 62 watts for a MK-II, as it has a somewhat higher B+. Divide that number by the number of tubes you are using (either 2 or 4). The RM-9 gets unusually long (10,000 hour) tube life because with 4 EL-34s they are run at about 14 watts each.
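For anyone who wants to check the arithmetic, here is a minimal sketch in Python. The 55 W and 62 W channel totals and the 35 W / 25 W tube ratings are the figures above; the function name and the even-split assumption are mine:

```python
# Split the stated per-channel dissipation evenly across the output tubes
# and compare against each tube's rated maximum dissipation.

def per_tube_watts(channel_watts, n_tubes):
    """Dissipation per tube, assuming an even split across the channel."""
    return channel_watts / n_tubes

# MK-I: 55 W per channel; MK-II: 62 W per channel (figures from the post)
for amp, total in (("MK-I", 55.0), ("MK-II", 62.0)):
    for n in (2, 4):
        w = per_tube_watts(total, n)
        print(f"{amp} with {n} tubes/channel: {w:.1f} W per tube")

# With 4 tubes a 25 W EL-34 runs at 55/4 ~ 13.8 W (about half its max),
# which fits the ~14 W and 10,000-hour figures above. With only 2 tubes
# each one must carry 27.5 W (MK-I) or 31 W (MK-II), hence the 35 W
# minimum rating for the 2 tube/channel configuration.
```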
For comparison purposes, many makers such as Rogue, ARC, CJ, etc. run their tubes close to maximum dissipation, which I think is a foolish thing. The owners of these amplifiers seem to be OK with the factory recommendation of replacing output tubes every 1,500 to 2,000 hours. I got it from the plant manager at Sylvania in Altoona, PA that the maximum dissipation of a tube is just that: a maximum. We don't run our tires at their maximum inflation pressure, and neither should we run our tubes at their maximum dissipation. He told me that, used properly, 10,000 hours was their goal, and we who own Music Reference amplifiers have found this to be so. This is achieved by running tubes at about half of their maximum dissipation.
It is also interesting to note that in the mid 20th century the rating of tubes changed from "design maximum" to "design center" and many of the numbers (dissipation, maximum voltage, maximum current) went down. Some tubes are still rated "design maximum" and some "design center". This new rating system came in because engineers were using the maximum numbers without allowing for high line voltage or higher-than-normal bias currents in circuits that were usually not adjustable by the user. The earlier "smart" engineers would account for high line voltage and build that margin into their designs. The not-so-smart ones pretended that the line voltage was the same everywhere and that all tubes biased up the same, so the tube makers did the "de-rating" for them.
The danger of running tubes too hot can be seen in the Dynaco ST-70 and some early CJ amps, where as the line voltage rises 5% the output tube current also rises at about the same or a higher rate. Coupled with the fact that the B+ is also rising 5%, the dissipation has now risen about 10%. If it's already at the limit, this is enough to push things over the edge. In my amplifiers the tube current actually goes down with rising B+, keeping the dissipation constant.
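The line-voltage arithmetic in that last paragraph can be sketched like this. It is just a toy illustration of P = V x I with the 5% figure from above, not a model of any particular amplifier:

```python
def dissipation_change(line_rise, current_rise):
    """Fractional change in plate dissipation P = V * I when B+ rises
    by `line_rise` and the bias current changes by `current_rise`."""
    return (1 + line_rise) * (1 + current_rise) - 1

# ST-70 / early CJ case: current tracks the 5% line rise, so
# dissipation rises by (1.05 * 1.05) - 1, about 10%.
print(dissipation_change(0.05, 0.05))

# Regulated-current case: current held flat, only the 5% from B+.
print(dissipation_change(0.05, 0.0))

# The behavior described for Music Reference amps: current drops just
# enough to offset the B+ rise, keeping dissipation constant (zero change).
print(dissipation_change(0.05, 1 / 1.05 - 1))
```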
Sorry for the long answer, I hope you find this interesting.