Driving an ADC from an opamp with large supply rails
In the image below is an opamp buffer with ±15 V rails connected to a 3.3 V ADC. The input to the opamp is limited to 0-3 V. There is no protection circuitry for the ADC.
Is the risk of the opamp overdriving the ADC too great for no input protection to be used?
3 answers
Is the risk too great?
That depends on parameters you haven't told us. How cost-sensitive is this product? What is the reliability expectation of the users? How mission-critical is its usage?
For example, if this were a toy, I'd say screw it and look for a way to not even buffer the signal in the first place. If this is sensing the temperature to control cooling pumps in a nuclear power plant, I'd not only follow every datasheet spec to the letter, but also add some margin, and then add redundant units anyway.
Let's say this is an industrial product where a few cents extra cost doesn't matter much, and you want reasonable robustness to protect your reputation. As others have said, put a resistor between the opamp output and the A/D input. Then take the feedback from the A/D side of the resistor. That makes the opamp produce whatever voltage is necessary on its end so that you get the desired voltage on the A/D end.
However, there is more to it than that. Just adding the resistor can lead to instability. To address that, add a feedback capacitor directly around the opamp:
The value of R1 was chosen to limit the A/D input current to 5 mA. The worst case is when the opamp drives its output all the way to -15 V. (15 V)/(5 mA) = 3 kΩ.
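Written out in the same style as the equation in the other answer, neglecting the A/D's clamp voltage (which only adds margin):

$$R_1 = \dfrac{15\text{ volts}}{5\text{ mA}} = 3\text{ k}\Omega$$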
I picked the 47 pF value out of the air. This is hard to predict. Perhaps no additional capacitance is needed at all. I would put the pads on the board, then find the appropriate value by experimentation. I'd probably use twice whatever the largest value was at which I saw oscillation.
The drawback of making C1 too large is that the response slows down. If you don't need particularly fast response, you can be more aggressive with C1 to err on the side of stability.
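For a rough feel of the speed penalty, using the values above (R1 = 3 kΩ, C1 = 47 pF) and assuming C1 works against a feedback resistance on the order of R1, the corner frequency comes out around

$$f \approx \dfrac{1}{2\pi R_1 C_1} = \dfrac{1}{2\pi \times 3\text{ k}\Omega \times 47\text{ pF}} \approx 1.1\text{ MHz}$$

Doubling C1 halves that corner, which is the slower response being traded for stability.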
1. Why might the modest resistor inside the loop lead to instability? And 2. (probably related) could you explain how the cap solves this problem?
The resistor together with whatever load may be on its right end can cause a phase shift. Even a little parasitic capacitance is enough to cause a measurable phase lag. That extra phase lag in the feedback path can cause the system to oscillate, depending on how close to the stability limit the opamp was trimmed for in the first place.
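To put a number on it: suppose the A/D input and trace present 20 pF (an illustrative guess, not a figure from the question) at the far end of the 3 kΩ resistor. That puts a pole in the feedback path at

$$f_p = \dfrac{1}{2\pi \times 3\text{ k}\Omega \times 20\text{ pF}} \approx 2.7\text{ MHz}$$

which sits inside the unity-gain bandwidth of many general-purpose opamps, so the extra lag arrives right where the loop can least afford it.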
C1 adds feedback at high frequencies without any additional phase shift. It therefore increases stability at the expense of overall step response time. Another way to think about it is that since it directly feeds back high frequencies, it reduces the overall loop gain at those high frequencies. With less gain, the feedback, with its lag, matters less.
C1 is sometimes referred to as a compensation capacitor. The name comes from compensating the overall feedback loop; "compensation" is the control-theory term for shaping a loop so that the system is stable.
Here is a solution that protects the ADC very well from over-voltage spikes, avoids any simulation and design considerations, and also protects the ADC from a possible negative spike at the op-amp output at startup.
For high frequencies, it may be necessary to add a "compensation capacitor", as explained in Olin's answer.
Or just Schottky diodes to ground and the A/D supply. The Schottkys should kick in before the protection circuitry of the A/D. With a normal silicon junction drop, the internal protection circuitry might take most of the load.
The problem is that we know nothing about this ADC device, so I consider it as a device sensitive to over-voltage that may or may not have protection circuitry against negative input voltage. Now, if it has no protection, the zener, with its 0.6 V forward voltage, should be sufficient in most cases (otherwise a Schottky may be connected in parallel, as suggested by Olin). But if the ADC device has negative-input-voltage protection, then even if it takes most of the load, it will dissipate at most 0.6 × 15/300 = 0.03 W. I can hardly imagine a protection circuit that is not able to dissipate this power.
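Writing that worst case out (the 300 Ω presumably being the series resistor in the schematic, with the op-amp output stuck at -15 V):

$$P = 0.6\text{ V} \times \dfrac{15\text{ V}}{300\ \Omega} = 0.6\text{ V} \times 50\text{ mA} = 0.03\text{ W}$$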
Is the risk of the opamp overdriving the ADC too great for no input protection to be used?
I would certainly say yes, but this can usually be solved easily by using a current-limit resistor in the feed line to the ADC. Most ADCs specify a maximum current that their inputs can take. This is an overdrive level and doesn't normally apply when input signals are correctly bounded.

But, in power-up situations the op-amp output may be able to deliver more than 25 mA at a voltage much larger than 3 volts, so a resistor is a simple and effective counter-measure.

If the maximum ADC input current spec is 5 mA, then what resistor value will prevent more than 5 mA when 15 volts is applied to the ADC input? You can assume that the ADC's input protection diodes will try to clamp to a little above 3.3 volts, so assume 3.5 volts. It then boils down to: -
$$\dfrac{15\text{ volts} - 3.5\text{ volts}}{R} = 5 \text{ mA}$$
So, R = 2.3 kΩ.
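As a quick sanity check of nearby standard values, here is a minimal Python sketch (the 15 V fault level, 3.5 V clamp estimate, and 5 mA limit are the figures assumed above):

```python
# Sanity check for the current-limit resistor value.
# Assumed figures from the answer above: the op-amp output can reach
# 15 V, the ADC's protection diodes clamp near 3.5 V, and the input
# current must stay below 5 mA.

V_FAULT = 15.0   # V, worst-case op-amp output
V_CLAMP = 3.5    # V, assumed ADC clamp voltage
I_MAX = 5e-3     # A, maximum allowed ADC input current

r_min = (V_FAULT - V_CLAMP) / I_MAX
print(f"minimum resistance: {r_min:.0f} ohms")   # 2300 ohms

# Worst-case current through nearby standard E24 values:
for r_ohms in (2200, 2400):
    i_ma = (V_FAULT - V_CLAMP) / r_ohms * 1000
    print(f"R = {r_ohms} ohms -> {i_ma:.2f} mA")
# 2200 ohms allows about 5.23 mA (just over the limit), while
# 2400 ohms allows about 4.79 mA, so round up rather than down.
```

In other words, the next standard value above 2.3 kΩ keeps the worst-case current inside the 5 mA spec, while rounding down to 2.2 kΩ would nudge it slightly over.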