Each instrument has an inherent amount of uncertainty in its measurement. Even the most precise measuring device cannot give the actual value, because to do so would require an infinitely precise instrument. The uncertainty of an instrument is therefore a measure of its precision. As a good rule of thumb, the uncertainty of a measuring device is 20% of its least count. Recall that the least count is the smallest subdivision marked on the measuring device. The uncertainty should be reported along with the measurement itself, for example, 41.64 ± 0.02 cm.
Here are some typical uncertainties of various laboratory instruments:
Meter stick: ± 0.02 cm
Vernier caliper: ± 0.01 cm
Triple-beam balance: ± 0.02 g
Graduated cylinder: 20% of the least count
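As a minimal sketch of the rule of thumb, the short Python snippet below simply multiplies an instrument's least count by 20%. The meter-stick least count of 0.1 cm comes from the example that follows; the 1 mL graduated-cylinder least count is an assumed value for illustration only.

```python
# Sketch of the 20%-of-least-count rule of thumb.
def uncertainty_from_least_count(least_count):
    """Return the estimated measurement uncertainty: 20% of the least count."""
    return 0.20 * least_count

# Meter stick: least count of 1 mm = 0.1 cm (as in the example below).
print(uncertainty_from_least_count(0.1))   # 0.02 (cm)

# Graduated cylinder: assuming a least count of 1 mL (illustrative value).
print(uncertainty_from_least_count(1.0))   # 0.2 (mL)
```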
Here's an example. For a meter stick whose smallest division (or least count) is one millimeter, the uncertainty of every measurement is 20% of 1 mm, or 0.02 cm. Say you use that meter stick to measure a metal rod and find that the rod is between 10.2 cm and 10.3 cm long. You judge that the rod is closer to 10.2 cm than to 10.3 cm, so your best estimate of its length is 10.23 cm. Since the uncertainty in the measurement is 0.02 cm, you would report the length of the metal rod as 10.23 ± 0.02 cm (0.1023 ± 0.0002 m).
Text provided by Clemson University
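The same worked example can be reproduced in a few lines of Python, assuming the best-guess reading and meter-stick least count quoted above:

```python
# Worked example: meter stick with a 1 mm (0.1 cm) least count.
least_count_cm = 0.1
uncertainty_cm = 0.20 * least_count_cm   # 20% of the least count -> 0.02 cm
best_guess_cm = 10.23                    # best estimate between 10.2 cm and 10.3 cm

print(f"Length: {best_guess_cm:.2f} ± {uncertainty_cm:.2f} cm")
# Length: 10.23 ± 0.02 cm

# The same result expressed in meters (1 cm = 0.01 m).
print(f"Length: {best_guess_cm / 100:.4f} ± {uncertainty_cm / 100:.4f} m")
# Length: 0.1023 ± 0.0002 m
```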
When a result depends on several measurements, each with its own uncertainty, it is important to determine how much a single measurement can affect the overall outcome; determining the overall error is covered under total percent uncertainty.
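As a hedged preview of that calculation, the sketch below computes the percent uncertainty of a single measurement (absolute uncertainty divided by the measured value, times 100). The rule of adding percent uncertainties when quantities are multiplied is a common intro-lab convention assumed here, not something defined in this text, and the 2.50 ± 0.02 cm width is an illustrative value.

```python
# Sketch: percent uncertainty of a single measurement
# (absolute uncertainty divided by the measured value, times 100).
def percent_uncertainty(value, uncertainty):
    return (uncertainty / value) * 100.0

length_pct = percent_uncertainty(10.23, 0.02)   # ~0.20% for the rod length above
print(f"length: {length_pct:.2f}%")

# Common convention (assumed, not defined in this text): when quantities are
# multiplied or divided, their percent uncertainties add to give the total
# percent uncertainty of the result.
width_pct = percent_uncertainty(2.50, 0.02)     # illustrative second measurement
print(f"total percent uncertainty of length x width: {length_pct + width_pct:.2f}%")
```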
