What you linked to is more of a user's manual than a datasheet. Particularly for more consumer-oriented meters, this spec may not be explicitly available. However, you can easily measure it: use an ohmmeter to measure the resistance of the ammeter. You should see the ammeter report a small current, and the resistance should go down as you switch to less sensitive scales on the ammeter.

A good guess is that the voltage developed across the current-sense resistor at full scale is the same as the full-scale voltage of the most sensitive voltage scale. That's usually how the meter works internally: the different current scales basically switch different resistors across this internal voltmeter.

For example, assume the most sensitive voltage setting goes to 200 mV full scale. If the lowest current scale goes to 2 mA, then the resistance of that current scale is probably (200 mV)/(2 mA) = 100 Ω. For the 20 mA scale it would be 10 Ω, for 200 mA 1 Ω, etc.
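The estimate above is just Ohm's law applied per range. A minimal sketch, assuming (as guessed above, not from the meter's documentation) that the full-scale burden voltage on every current range equals the most sensitive voltage range:

```python
def shunt_resistance(v_full_scale, i_full_scale):
    """Estimate the current-sense (shunt) resistance for one ammeter range.

    Assumes the full-scale voltage across the shunt equals the meter's
    most sensitive voltage range (the guess made above), so R = V / I.
    """
    return v_full_scale / i_full_scale

# Example from the text: 200 mV full-scale voltage range
for i_fs in (2e-3, 20e-3, 200e-3):  # 2 mA, 20 mA, 200 mA ranges
    r = shunt_resistance(0.200, i_fs)
    print(f"{i_fs * 1000:g} mA range: about {r:g} ohms")
```

Comparing these predicted values against what your ohmmeter actually reads on each range is a quick sanity check of the guess.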