
I assume we are dealing with a *sick* amp to begin with; the idea is first to avoid (if possible) making it sicker.

The dreaded: "amp turned on, apparently fine... and 2 or 3 minutes later a resistor started to smoke - a capacitor popped - a transistor became incredibly hot - a transistor/chip exploded or cracked - etc." Notice nothing I mention here is related to power amp output by any means, but to internal parts dissipation/overheating.

Sorry, should have suggested some actual values.

Yes, 40W is right for a wide range of home or guitar amps, from, say, 20+20W Hi-fi to 100/120W MI or PA, which covers *a lot*.

60W would be right for 100 to 250W stuff, remembering we test in principle at idle, so power consumption is very low.

25W only for very low powered stuff, say a 15 to 25W (total) amplifier, or some preamp level stuff such as preamps, tuners, CD or Digital players, etc. If used with a larger one, just transformer magnetization current (even with no amp board connected, go figure) will make it noticeably orange, and some amps will not even *start*, or "start stupid" and latch the output against some rail.

100W in my book (others may differ of course) is already too large, the problem being that many parts will happily cook: say an overbiased transistor, a driver trying to drive a load directly (because of final transistor failure or misconnection), etc. In case I do not make myself clear: a 100W bulb will easily let through, say, 25W without showing too much brightness, so not alerting the Tech, and that can quickly cook many parts.
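To put rough numbers on the sizing rules above, here is a minimal back-of-envelope sketch (Python, not from the original post). It assumes 120V mains, the common ~10:1 hot-to-cold filament resistance ratio, and the post's own rule of thumb that a bulb can quietly pass roughly a quarter of its rating without glowing enough to alert anyone; the function name and constants are illustrative, not a standard.

```python
# Dim-bulb limiter sizing helper -- a minimal sketch under stated assumptions.
# Model: the bulb sits in series with the amp's mains input. At full glow its
# resistance is about V^2 / P_rated; cold (barely glowing) it is roughly 1/10
# of that, so at idle it drops little voltage and the amp runs near-normally.

MAINS_V = 120.0             # assumed mains voltage; use 230.0 where applicable
COLD_FRACTION = 0.1         # typical cold/hot filament resistance ratio
QUIET_PASS_FRACTION = 0.25  # from the post's "100W bulb lets ~25W through"

def bulb_figures(p_rated_w: float) -> dict:
    """Back-of-envelope numbers for a series incandescent limiter."""
    r_hot = MAINS_V ** 2 / p_rated_w  # filament resistance at full brightness
    r_cold = COLD_FRACTION * r_hot    # approximate resistance at idle draw
    i_limit = MAINS_V / r_hot         # worst-case current into a dead short
    return {
        "hot_resistance_ohm": round(r_hot, 1),
        "cold_resistance_ohm": round(r_cold, 1),
        "short_circuit_current_a": round(i_limit, 2),
        "quietly_passed_w": round(QUIET_PASS_FRACTION * p_rated_w, 1),
    }

if __name__ == "__main__":
    for rating in (25, 40, 60, 100):
        print(f"{rating:>4}W bulb:", bulb_figures(rating))
```

Run as-is, it shows the trap described above: a 100W bulb limits a dead short to only about 0.8A, yet can quietly feed on the order of 25W into a partly faulty amp, which is plenty to cook an overbiased transistor before anything visibly glows.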
