Design considerations for a synchronous DC/DC converter
In the field of DC/DC conversion, one can build a synchronous DC/DC converter (the technique is also referred to as synchronous rectification). This involves replacing the rectifying diode with a MOSFET switch.
As far as I understand, this is beneficial because the MOSFET dissipates much less power than a diode. It comes, of course, at the cost of more complicated control circuitry. However, power management ICs are very advanced these days, and synchronous rectification is hardly a rare feature among them.
Are there any other design considerations to be made with synchronous converters, besides timing of the switches? Are there any downsides to using one in your system?
2 answers
First, let's clarify something:
replacing a diode with a MOSFET switch
I have never seen the diode actually replaced. The intent is to have the FET on only when the diode would conduct, but no such timing is perfect. Having an actual diode there allows falling back to the classic case when the FET turns on too late or off too early.
Second, the main issue with synchronous rectification is that the timing is tricky. You want the FET to be on only when the diode would have been on. If the FET is on too little, the real diode takes over, and you lose some efficiency. However, having the FET on too much causes shoot-through currents, which decrease efficiency rapidly.
The leading edge of turning on the FET is particularly critical. That's when the diode would conduct the largest current, so being late has the most effect on efficiency. However, a little too early and you are essentially shorting the input supply to ground. Not only does that kill efficiency quickly, but it beats up both the main switch and the synchronous rectifier switch.
You might say that's easy: just turn on the synchronous switch at the same time you turn off the main switch. However, everything has a lag time, particularly turning off the main switch, which could be conducting substantial current at the time. Then consider that the turn-off time of the main switch (including its driver) probably varies with current, temperature, and applied voltage, and there will be some variation part to part. It's not so easy.
The simplest answer is to err on the side of being late, so that there is no chance of ever being too early under all combinations of current, voltage, temperature, and what your dog had for breakfast. That is still better than letting the diode conduct all the time. I have seen one design where the dead time from main switch off to synchronous switch on was constantly jittered just a little. The micro would measure the efficiency at the different times, then center itself on whatever was best.
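As a rough illustration of that last idea, here is a minimal sketch of a perturb-and-observe dead-time tuner. The functions set_dead_time_ns() and measure_efficiency() are hypothetical stand-ins for whatever the real PWM peripheral and power measurement provide, and the nanosecond limits are assumed example values, not recommendations:

```c
/* Minimal sketch of a perturb-and-observe dead-time tuner.
 * set_dead_time_ns() and measure_efficiency() are hypothetical stand-ins
 * for the real PWM peripheral and input/output power measurement. */
#include <stdint.h>

extern void  set_dead_time_ns(uint32_t ns);   /* program the PWM dead time    */
extern float measure_efficiency(void);        /* e.g. Pout/Pin from ADC reads */

#define DT_MIN_NS   40u   /* assumed worst-case main-switch turn-off margin   */
#define DT_MAX_NS  200u
#define DT_STEP_NS   5u   /* small jitter applied around the current setting  */

void tune_dead_time(void)
{
    uint32_t dt = DT_MAX_NS;   /* start conservative: err on the late side    */

    for (;;) {
        /* Measure efficiency at the current dead time...                     */
        set_dead_time_ns(dt);
        float eta_mid = measure_efficiency();

        /* ...at one step longer...                                           */
        uint32_t dt_long = (dt + DT_STEP_NS <= DT_MAX_NS) ? dt + DT_STEP_NS : dt;
        set_dead_time_ns(dt_long);
        float eta_long = measure_efficiency();

        /* ...and at one step shorter, but never below the safe minimum.      */
        uint32_t dt_short = (dt >= DT_MIN_NS + DT_STEP_NS) ? dt - DT_STEP_NS : dt;
        set_dead_time_ns(dt_short);
        float eta_short = measure_efficiency();

        /* Re-center the dead time on whichever setting measured best.        */
        if (eta_short > eta_mid && eta_short >= eta_long)
            dt = dt_short;
        else if (eta_long > eta_mid)
            dt = dt_long;
        /* otherwise keep the current value                                   */
    }
}
```

The dithering step is kept small so the converter never strays far from its best-known timing while it keeps adapting to temperature and load drift.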
At some point the MOSFET approach will lose to the diode if the current is high enough:
The drain-source on-resistance makes the MOSFET's forward voltage drop rise linearly with current, while a diode's drop stays roughly constant. So picking a MOSFET with a small Rds(on) is one way to keep conduction losses down. Checking what gate voltage you must drive the MOSFET with to turn it fully on is another important point.
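To see roughly where that crossover lands, here is a back-of-the-envelope comparison of conduction losses. The Rds(on) and diode forward-drop numbers are assumed example values, not figures from the answer above:

```c
/* Back-of-the-envelope conduction-loss comparison with assumed example
 * values: Rds(on) = 10 milliohms for the FET, Vf = 0.45 V for a Schottky. */
#include <stdio.h>

int main(void)
{
    const double rds_on = 0.010;   /* ohms, assumed example value  */
    const double vf     = 0.45;    /* volts, assumed Schottky drop */

    /* The FET drop I*Rds(on) rises linearly with current, so it exceeds
     * the roughly constant diode drop above I = Vf / Rds(on).             */
    double crossover = vf / rds_on;
    printf("FET conducts with lower loss below about %.0f A\n", crossover);

    for (double i = 5.0; i <= 50.0; i += 15.0) {
        double p_fet   = i * i * rds_on;  /* I^2 * R conduction loss */
        double p_diode = vf * i;          /* approximately Vf * I    */
        printf("I = %4.1f A: FET %.2f W, diode %.2f W\n", i, p_fet, p_diode);
    }
    return 0;
}
```

With these example numbers the FET is far ahead at a few amps and only loses out somewhere around 45 A, which is why synchronous rectification pays off in most low-voltage, moderate-current converters.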