by Tony » Thu Apr 05, 2007 6:45 pm
In practice you will usually find that right down at the limit of resolution there will always be some random fluctuation of the last bit or two. This can be due to electrical noise and/or the analog-to-digital conversion process itself, as well as small real fluctuations in what is being measured.
The solution is to average multiple readings in software to smooth out these small fluctuations. If this is done properly, statistical averaging can actually result in greater resolution than is theoretically possible from the bit resolution.
For instance, if you take a whole bunch of readings and come up with some numbers like 504, 503, 501, 503, 500, 504, 503, 502... You could generate a continuous running average of the whole lot and end up with a rock-steady displayed number like 502.9.
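A minimal sketch of that running average, using the eight sample readings from above (a real program would pull fresh readings from the ADC in a loop rather than from a list):

```python
# Running average of noisy ADC readings.
# These sample values are illustrative, taken from the post above.
readings = [504, 503, 501, 503, 500, 504, 503, 502]

total = 0
count = 0
for r in readings:
    total += r
    count += 1
    average = total / count  # running average of everything so far

print(round(average, 1))  # mean of these eight readings: 502.5
```

Note that the average carries more resolution (a decimal place) than any single integer reading, which is exactly the effect described above.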
The beauty of electronics is that it is quite often possible to take thousands of readings of a sensor every second, average the results, and obtain both high resolution and an acceptably fast response. It is trading off speed for extra resolution, but as there is usually plenty of excess speed available to trade, this is usually a very good deal for our type of application.
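One common way to make that speed-for-resolution trade tunable is an exponential moving average, where a single smoothing factor sets how much smoothing you buy at the cost of response time. The sketch below is illustrative; the factor of 0.1 is an arbitrary assumption, not a recommended value:

```python
def ema_filter(alpha):
    """Exponential moving average filter.

    Smaller alpha -> smoother (more effective resolution) but
    slower to respond; larger alpha -> faster but noisier.
    """
    state = None

    def update(sample):
        nonlocal state
        if state is None:
            state = float(sample)  # seed with the first reading
        else:
            state += alpha * (sample - state)
        return state

    return update

# Feed it the noisy readings; the output settles near their mean.
smooth = ema_filter(0.1)
for r in [504, 503, 501, 503, 500, 504, 503, 502]:
    display = smooth(r)
```

The same few lines run happily on a small microcontroller, which is why this kind of software smoothing costs almost nothing in hardware terms.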
It is not just a case of connecting up all the hardware. Some sort of software signal processing, and a bit of thought, can usually vastly improve the performance of the system.
Just don't confuse very high resolution with high accuracy. Other factors such as temperature drift, hysteresis effects, and linearity come into it as well.
There is obviously no point in having a digital readout display to 6.0004 inches, if the known accuracy and repeatability is only +/- half an inch.
So for those who believe that fitting digital readouts will improve bench accuracy: it may, or it may not, depending on what you have.
Also known as the infamous "Warpspeed" on some other Forums.