I've been thinking about an approach to a tube amplifier output stage that I haven't seen done before. A cathode follower output stage is a rather uncommon technique because of the heavy demands it places on the driver stage: since the follower has no voltage gain at all, the driver must swing the output tube's grid through roughly twice the supply voltage. This complicates the power supply design, which then needs two rails, with the driver stage's B+ at twice the B+ of the output stage.
With the technique proposed here, you can squeeze some voltage gain out of the output tube while still keeping the benefits of this kind of output stage: broader bandwidth, less Miller capacitance, less distortion, and lower output impedance. Using an output transformer made for an ultralinear amp, you can choose one with 33% taps and get a voltage gain of around 3, which drastically lowers the requirements on the driver stage. In this example I've used a center-tapped output transformer, which, after the internal resistance of the output tube and all the other losses, gives a voltage amplification factor of a little less than 2. The only thing I don't like about it is the need for a 1:1 audio transformer at the input to galvanically isolate the source so it can be grounded, but audio transformers at line level usually don't distort much, since they carry almost no magnetizing current; they simply act as an isolator.
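To sanity-check those gain figures, here is a minimal sketch of the usual small-signal model for cathode feedback through a transformer tap: with the cathode tied to a tap at fraction k of the primary, the grid-to-primary gain is A = mu*RL / (rp + (1 + mu*k)*RL), which tends toward 1/k for large mu. The tube parameters and reflected load below are my assumptions for illustration (roughly pentode-class values), not measurements from the actual build.

```python
# Grid-to-primary voltage gain of an output stage whose cathode is tied
# to a tap at fraction k of the output transformer primary
# (k = 0: plain common cathode, k = 1: pure cathode follower).
def stage_gain(mu: float, rp: float, rl: float, k: float) -> float:
    # Simplified small-signal loop: mu*(vg - k*vout) = i*(rp + RL),
    # with vout = i*RL  =>  A = mu*RL / (rp + (1 + mu*k)*RL)
    return mu * rl / (rp + (1 + mu * k) * rl)

# Assumed, illustrative values (roughly EL34-class pentode, 5k primary):
MU = 165.0   # amplification factor = gm * rp, assumed
RP = 15e3    # plate resistance in ohms, assumed
RL = 5e3     # reflected primary impedance in ohms, assumed

for k in (0.0, 0.33, 0.5, 1.0):
    print(f"cathode tap at {k:4.0%}: gain ~ {stage_gain(MU, RP, RL, k):.2f}")
```

With these numbers the 33% tap lands near a gain of 3 and the center tap just under 2, matching the figures above; since the gain is dominated by the 1/k term, the tap choice matters more than the exact tube.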