As you know, most class A amps are quite inefficient and constantly dissipate a fixed maximum amount of power. There are some topologies that regulate the dissipation, but nothing drastic. Well, I've been thinking: wouldn't it be better for the bias point to be set by the signal level? Basically, the bias level constantly tracks the output to sense the needed current. This is pretty preliminary, I wouldn't even consider building it yet, but the basic idea is what counts.

So basically: with no input signal the amp draws some negligible amount of current (a few mA). When an input signal is detected, an AGC loop starts increasing the bias level of the amp until it reaches a given maximum (around 45 mA in this case). The interesting part is that the bias needn't be at maximum all the time. A lower input signal won't make the AGC pull the bias as hard, which keeps the power dissipation lower. The harder you push the amp, the harder the AGC pulls the bias, up to that maximum. When no input signal is detected, the amp enters standby and the current consumption falls back to a few mA.
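Just to make the control idea concrete, here's a rough behavioral sketch (not a circuit, obviously): a peak detector with fast attack and slow release follows the input envelope, and the bias current is mapped from that envelope between an idle floor and the 45 mA maximum. All the numbers here (idle bias, attack/release constants) are made up for illustration.

```python
IDLE_BIAS_MA = 2.0   # standby bias: a few mA, illustrative value
MAX_BIAS_MA = 45.0   # full class A bias from the post
ATTACK = 0.5         # fast attack: bias ramps up quickly when signal appears
RELEASE = 0.999      # slow release: bias decays gently back toward standby

def track_bias(samples, envelope=0.0):
    """Yield a bias current (mA) for each input sample (full scale = 1.0)."""
    for x in samples:
        level = abs(x)                              # rectified signal level
        if level > envelope:
            envelope += ATTACK * (level - envelope)  # fast attack
        else:
            envelope *= RELEASE                      # slow release
        # Map the envelope onto the bias range, clamped at full scale.
        yield IDLE_BIAS_MA + min(envelope, 1.0) * (MAX_BIAS_MA - IDLE_BIAS_MA)
```

With silence at the input the bias sits at the idle floor, and with a sustained full-scale signal it settles near the 45 mA maximum; anything in between lands proportionally lower, which is exactly the point of the scheme. The slow release matters: if the bias dropped as fast as it rose, the output stage could starve on signal decays and distort.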