This circuit takes a 3.3 V PWM signal, switches a MOSFET to convert it to a 12 V PWM signal, then low-pass filters it into an approximately DC value. Here is the chain of events:
The source is a 0 to 3.3 V, 1 kHz PWM voltage, like an STM32 MCU digital output. Call this V_1.
The 100 kOhm pull-up resistor ensures that the intermediate node (connected to the opamp input; call it V_2) is pulled up to 12 V _unless_ the NMOS transistor is on, which pulls it down to 0 V. The transistor is on when its gate is high; therefore, when V_1 = 0 V, V_2 = 12 V, and when V_1 = 3.3 V, V_2 = 0 V. So this part of the circuit inverts the signal, turning a 0 V to 3.3 V digital signal into a 12 V to 0 V signal. Note that it also inverts the duty cycle of the PWM signal (e.g., a 25% duty cycle on V_1 looks like a 75% duty cycle on V_2).
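Since V_2 sits at 12 V for a fraction (1 - D) of each period, where D is the duty cycle of V_1, its average value is 12 V * (1 - D). A quick numeric check in Python (the function name and duty-cycle values are just for illustration):

```python
V_SUPPLY = 12.0  # pull-up rail, volts

def v2_average(duty_v1: float) -> float:
    """Average of V_2 given the duty cycle D of V_1: 12 V * (1 - D)."""
    return V_SUPPLY * (1.0 - duty_v1)

for d in (0.0, 0.25, 0.50, 0.75, 1.0):
    print(f"V_1 duty = {d:4.0%}  ->  mean V_2 = {v2_average(d):5.2f} V")
```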
There is a unity-gain opamp (a buffer) whose output duplicates V_2. An RC low-pass filter then extracts the approximate DC (average) value of the signal; its time constant is set to 20 ms, giving a response time to changes in duty cycle of about 100 ms (5*tau). Try playing with the time constant to see the tradeoff between output voltage ripple and response time to changes in duty cycle; the sketch below makes that experiment easy.
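Here is a minimal time-stepped sketch of the buffered PWM driving the RC filter. The 1 kHz frequency and 20 ms time constant come from the text above; the forward-Euler update, time step, and helper names are assumptions of this sketch:

```python
F_PWM = 1_000.0   # PWM frequency, Hz (from the text)
V_HIGH = 12.0     # buffered V_2 high level, volts
TAU = 20e-3       # RC time constant, seconds (from the text)
DT = 1e-5         # simulation time step (sketch assumption)

def simulate(duty_v2: float, t_end: float = 0.2) -> list[float]:
    """Forward-Euler integration of dv_out/dt = (v_in - v_out) / TAU."""
    v_out, t, samples = 0.0, 0.0, []
    while t < t_end:
        phase = (t * F_PWM) % 1.0
        v_in = V_HIGH if phase < duty_v2 else 0.0  # ideal buffered PWM
        v_out += (v_in - v_out) * DT / TAU
        samples.append(v_out)
        t += DT
    return samples

out = simulate(duty_v2=0.75)            # 75% duty on V_2 (25% on V_1)
settled = out[int(0.15 / DT):]          # look only after ~7.5 * TAU
ripple_mv = (max(settled) - min(settled)) * 1e3
print(f"mean = {sum(settled) / len(settled):.2f} V, ripple = {ripple_mv:.0f} mVpp")
```

Halving TAU roughly doubles the ripple but halves the settling time, and vice versa, which is exactly the tradeoff described above.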
Finally, note that the opamp buffer is present so that the capacitor is isolated from the pull-up resistor while V_2 is being pulled high, ensuring that the RC filter has the same charging and discharging time constants. What would happen if the time constants were different? (The sketch below explores this.) Can you think of a way to eliminate the opamp while keeping the time constants equal?
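To put the first question in numbers, here is the same kind of sketch with the opamp removed, assuming the filter resistor R_F connects directly to V_2. The capacitor then charges toward 12 V through R_PULLUP + R_F but discharges toward 0 V through R_F alone, so the two time constants differ. R_F and C_F are illustrative values, not from the original circuit:

```python
R_PULLUP = 100e3   # pull-up resistor, ohms (from the text)
R_F = 20e3         # filter resistor (illustrative assumption), ohms
C_F = 1e-6         # filter capacitor (illustrative assumption), farads
F_PWM, V_SUPPLY, DT = 1_000.0, 12.0, 1e-5

def unbuffered_dc(duty_v1: float, t_end: float = 1.0) -> float:
    """Final v_out with no buffer: the charging and discharging paths
    have different resistances, hence different time constants."""
    v_out, t = 0.0, 0.0
    while t < t_end:
        nmos_on = (t * F_PWM) % 1.0 < duty_v1  # gate high pulls V_2 low
        if nmos_on:   # fast discharge through R_F only
            v_out += (0.0 - v_out) * DT / (R_F * C_F)
        else:         # slow charge through R_PULLUP + R_F
            v_out += (V_SUPPLY - v_out) * DT / ((R_PULLUP + R_F) * C_F)
        t += DT
    return v_out

for d in (0.25, 0.50, 0.75):
    print(f"V_1 duty = {d:.0%}: ideal = {12 * (1 - d):5.2f} V, "
          f"unbuffered ~= {unbuffered_dc(d):5.2f} V")
```

Because discharging is about six times faster than charging with these example values, the output settles well below 12 V * (1 - D), and the duty-to-voltage relationship becomes nonlinear and component-dependent; the buffer restores the clean linear mapping.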