When we report the resolution of a sensor, we are really reporting the smallest step size that a given interface can resolve from that sensor without seeing bit noise. This is not a measure of sensor accuracy.

Different interfaces report at different bit resolutions because they use different analog-to-digital converter (ADC) chips/boards. The number of bits the chip has determines how many steps it can divide a voltage range into.

For example:

SensorDAQ uses 13 bits = 2^13, or 8192 steps
LabQuest interfaces (all models), TI-Nspire Lab Cradle, and LabPro are 12 bit = 2^12, or 4096 steps
CBL and CBL 2 are 10 bit = 2^10, or 1024 steps

This really means that each interface can divide a given voltage range into the number of steps shown above.
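
As a quick illustration, here is a minimal Python sketch that computes the step count for each interface from its bit depth (the names and bit depths are taken from the list above):

```python
# The number of steps an ADC can resolve is 2 raised to its bit depth.
interfaces = [("SensorDAQ", 13), ("LabQuest / LabPro", 12), ("CBL / CBL 2", 10)]

for name, bits in interfaces:
    print(f"{name}: {bits}-bit ADC -> {2 ** bits} steps")
```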

All of our interfaces have an input range of 0-5 volts. If a sensor used the full 0-5 volt input range, we could simply divide the range of the sensor by the number of steps to get the resolution.
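
For example, here is a minimal sketch of that full-range case. The sensor range and units are hypothetical, chosen only to make the arithmetic concrete:

```python
# Hypothetical sensor that spans the full 0-5 V input range,
# reporting 0-100 kPa on a 12-bit (4096-step) interface.
sensor_range = 100.0    # full-scale range of the sensor, in kPa
steps = 2 ** 12         # 4096 steps on a 12-bit interface

resolution = sensor_range / steps
print(f"Resolution: {resolution:.4f} kPa per step")  # ~0.0244 kPa
```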

Many sensors, however, do not use the full 5 V of the input range. So you need to find the fraction of the input range the sensor uses (this is typically listed in the sensor's documentation). From that you can find the total number of steps available for that sensor. You will also need to know the full-scale reporting range of the sensor.

Calculate the fraction of the 5 V input range that the sensor uses. Multiply this fraction by the number of steps of the A/D board on the interface; this gives the number of steps available to the sensor. Finally, divide the sensor's full-scale range by that number of available steps. The result is the resolution of the sensor.
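
Putting the whole procedure together, here is a minimal Python sketch. The function name and the example numbers (a sensor spanning 3 V of the input range and reporting 0-50 units) are hypothetical illustrations, not values for any particular sensor:

```python
def sensor_resolution(sensor_range, volts_used, bits, input_range=5.0):
    """Smallest step a sensor/interface pair can report.

    sensor_range -- full-scale reporting range of the sensor (sensor units)
    volts_used   -- portion of the interface's input range the sensor spans (V)
    bits         -- bit depth of the interface's A/D board
    input_range  -- interface input range in volts (0-5 V for all of these)
    """
    fraction = volts_used / input_range      # fraction of the input range used
    available_steps = fraction * 2 ** bits   # steps available to this sensor
    return sensor_range / available_steps

# Hypothetical example: a sensor using 3 V of the 0-5 V range,
# reporting 0-50 units, on a 12-bit interface.
print(sensor_resolution(sensor_range=50.0, volts_used=3.0, bits=12))
# -> about 0.0203 units per step
```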

What is the uncertainty of my sensor?