On pages 26 and 27 of that manual, accuracy and resolution are listed separately.
The accuracy is how far away you should expect the measured value to be from the actual value.
The resolution is the smallest change in the measured value that can be relied upon to reflect an actual change, rather than being indistinguishable from the previous reading.
It would be possible, for instance, to construct an instrument with a bias so that the accuracy is terrible -- never closer to the real value than the bias -- but the resolution is extremely good: a tiny change in the measured value always reflects a similar change in the real value. In the other direction, an instrument could be extremely accurate but have poor resolution: it always displays the number closest to the real value, but those numbers are spaced at large intervals.
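To make the distinction concrete, here is a toy sketch in Python (the bias and step sizes are made-up numbers, not from any real instrument) modeling the two hypothetical instruments described above:

```python
def biased_fine(true_value, bias=5.0):
    # Poor accuracy: every reading is off by a fixed bias.
    # Good resolution: any change in the true value changes the reading.
    return true_value + bias

def accurate_coarse(true_value, step=1.0):
    # Good accuracy: the reading is never more than step/2 from the truth.
    # Poor resolution: changes smaller than the step may not register.
    return round(true_value / step) * step

# The biased instrument is never within 5.0 of the real value...
assert abs(biased_fine(10.0) - 10.0) == 5.0
# ...yet it distinguishes values only 0.01 apart.
assert biased_fine(10.01) != biased_fine(10.00)

# The coarse instrument is always within 0.5 of the real value...
assert abs(accurate_coarse(10.4) - 10.4) <= 0.5
# ...but it cannot tell 10.0 apart from 10.4.
assert accurate_coarse(10.0) == accurate_coarse(10.4)
```

The assertions mirror the prose: the first instrument fails every accuracy check by exactly its bias yet resolves a 0.01 change, while the second stays within half a step of the truth but collapses nearby values onto the same reading.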