First, we need to define terms:

Accuracy: This is the ability to ‘hit the target’; exactness, correctness.
Precision: Deviating only slightly or within acceptable limits from a standard. Precision states the amount of deviation from a standard that is present; it is also a way of expressing the range of uncertainty for a measurement.
From the American Heritage dictionary: “Capable of, resulting from, or designating an action, performance, or process executed or successively repeated within close or specified limits: A precise measurement with a precise instrument, yet wholly inaccurate.”
In common usage, precision is often confused with accuracy. It is a part of accuracy, but during a calibration process it is useful to understand that precision is separate from accuracy.
Precision gives rise to terms like ppm (parts per million) or tolerance, and is used to indicate the limits to which a measurement can be made with a particular instrument.
An example: the precision of the K2 dial indication is 10 Hz. The digits shown on the dial limit the K2 frequency indication to 10 Hz. This says nothing about the accuracy of the frequency that is actually displayed, only that the displayed numbers can be judged no closer than the nearest 10 Hz.
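To make that concrete, here is a small Python sketch (the frequencies are made-up values for illustration, not actual K2 behavior) showing that a readout limited to 10 Hz steps maps many different true frequencies to the same displayed number:

# Sketch: a frequency readout limited to 10 Hz steps (hypothetical values).
def displayed(freq_hz, resolution_hz=10):
    """Round a true frequency to the nearest displayable step."""
    return round(freq_hz / resolution_hz) * resolution_hz

for f in (7_040_003.2, 7_040_004.9, 7_039_996.0):
    print(f"true {f:,.1f} Hz -> display {displayed(f):,} Hz")
# All three read 7,040,000 Hz: the display is precise to 10 Hz,
# which says nothing about whether that reading is accurate.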
Perhaps a different example would be helpful. Consider the value 1/3. We can also represent this value as a decimal, but at the loss of some precision. When we use the rounded decimal 0.333 for the value 1/3, we can (and usually do) say that it is ‘accurate to 3 decimal places’. The same thing could be stated as: ‘the precision is 0.0005 (about 0.15% of the value)’.
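That claim is easy to check in a few lines of Python (purely illustrative, nothing instrument-specific):

from fractions import Fraction

true_value = float(Fraction(1, 3))   # 0.3333...
rounded = round(true_value, 3)       # 0.333
error = abs(true_value - rounded)    # actual error, about 0.00033
bound = 0.0005                       # worst case when rounding to 3 places

print(f"rounded = {rounded}, error = {error:.5f}, bound = {bound}")
assert error <= bound                # 0.333 is precise to +/- 0.0005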
One way I like to think about accuracy and precision is to picture a ‘bulls-eye’ on an archery target. The dead-center position of the bulls-eye of one particular target represents accuracy, and the distance of the rings from the center is the precision. Let’s say a precision value of 3 is assigned for shots that will always land within the 3rd ring. Any shot with a precision of 3 could be directed at any one of many similar targets and hit those targets anywhere within the 3rd ring, but since we have defined only one target to represent accuracy, the shots at any other target would not be accurate, although they would have equal precision. Precision is how closely you can place the shots, while accuracy depends on shooting at the correct target.
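The analogy translates directly into a quick Monte Carlo sketch (the aim points and spreads below are arbitrary): a tight group far from the bulls-eye is precise but not accurate, while a centered group with more scatter is accurate but less precise.

import math
import random

def shoot(n, aim_x, aim_y, spread):
    """Simulate n shots aimed at (aim_x, aim_y) with Gaussian scatter."""
    return [(random.gauss(aim_x, spread), random.gauss(aim_y, spread))
            for _ in range(n)]

def report(label, shots):
    # Accuracy: how far the group's center lies from the bulls-eye at (0, 0).
    cx = sum(x for x, _ in shots) / len(shots)
    cy = sum(y for _, y in shots) / len(shots)
    # Precision: average scatter of the shots around their own center.
    scatter = sum(math.hypot(x - cx, y - cy) for x, y in shots) / len(shots)
    print(f"{label}: center offset {math.hypot(cx, cy):.2f}, scatter {scatter:.2f}")

report("precise but inaccurate", shoot(100, aim_x=3.0, aim_y=0.0, spread=0.2))
report("accurate but imprecise", shoot(100, aim_x=0.0, aim_y=0.0, spread=1.5))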
The precision of an instrument will limit the resulting accuracy when we use that instrument to calibrate another device (e.g., if you calibrate the K2 reference oscillator with a counter that has only 10 Hz precision, the accuracy of the result is +/- 10 Hz, because you cannot read the counter with greater precision). The stability of the device used for calibration is another factor to consider in the results obtained.
For calibration work, a standard can be any instrument that is used to calibrate another. To do a meaningful calibration, both the accuracy and the precision of the standard must be known. The accuracy of the result will be no better than the accuracy of the standard, plus or minus its precision, plus or minus the accuracy of any other measuring instrument involved. The means of calibrating the standard must also be considered in judging the accuracy of your results.
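A small Python sketch of that bookkeeping (all the figures are invented for illustration): a worst-case budget simply adds the contributions together.

# Worst-case uncertainty budget for a frequency calibration (made-up figures).
standard_accuracy_hz = 1.0    # how far the standard itself may be off
standard_precision_hz = 2.0   # how finely the standard can be read
counter_accuracy_hz = 5.0     # any other measuring instrument in the chain

# The result can be no better than the sum of the contributions.
result_uncertainty_hz = (standard_accuracy_hz
                         + standard_precision_hz
                         + counter_accuracy_hz)
print(f"result accuracy: +/- {result_uncertainty_hz} Hz")   # +/- 8.0 Hz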
Many standards have calibration traceable to some very accurate standard such as an atomic clock (which is a ‘primary’ standard). That is fine, but to complete the picture one should know how many generations removed from that primary standard are involved when judging the accuracy of any secondary standard we might have available. A calibration lab may have a secondary standard that has been carried to that atomic clock for its calibration, so it would be one generation removed from that standard. Other secondary standards can then be calibrated using this secondary standard, and so on; all of these secondary standards are traceable to the atomic standard. The term ‘traceable’ alone does not indicate any high degree of accuracy, but it does state that the precision (and accuracy) of the secondary standard can be determined (and should be known).
Calibration
When a device is to be calibrated, the level of difficulty involved depends on the accuracy to be achieved. The accuracy and precision of all instruments used in the calibration process can affect the accuracy of the results. This may even include the ears and eyes of the person performing the calibration steps – for instance, the ability to discern the pitch of a tone, or to determine a particular value on an oscilloscope trace.
As a general rule, the accuracy of a standard used in a calibration process should be at least 10 times better than the accuracy that you are trying to achieve.
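That rule of thumb is easy to encode (a sketch; the numbers are hypothetical):

def standard_is_adequate(standard_uncertainty, target_uncertainty, ratio=10):
    """Apply the 10:1 rule of thumb: the standard should be at least
    'ratio' times better than the accuracy you are trying to achieve."""
    return standard_uncertainty * ratio <= target_uncertainty

# Calibrating to +/- 100 Hz calls for a standard good to +/- 10 Hz or better.
print(standard_is_adequate(10, 100))   # True
print(standard_is_adequate(25, 100))   # False: only 4 times better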
In an amateur radio workshop, we often ‘make do’ with an instrument that is marginal. Be careful to know the limits of measurement accuracy that can be expected from your instruments. Expectations that are higher than the limits imposed by the instruments at hand can lead to incorrect conclusions and result in frustration. The really good technician not only knows how to use his measuring tools, but also is aware of the limits of those tools and doesn’t try to obtain any more information from them than they are capable of providing. Obviously, you would not use a ruler marked in centimeters if you needed to measure a length to the nearest 0.1 millimeter; neither would you reasonably use a frequency counter accurate only to the nearest 200 Hz in an attempt to determine a frequency to the nearest 100 Hz.
A seldom-recognized point about digital displays is that they can show any number of digits the designer wants. The number of digits is often not an indication of the precision of the device, and it is definitely not an indication of its accuracy. A digital counter itself will have an uncertainty of +/- 1 digit unless the counter chain includes an additional low-order stage that is not displayed. This display uncertainty must be added to the precision of the instrument itself. Again I have to state, “Know your tools as well as their limitations”.
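To sketch that ‘+/- 1 digit’ bookkeeping in Python (the numbers are illustrative, not those of any particular counter):

# The +/- 1 count of the last displayed digit is added to the
# instrument's own precision (illustrative figures only).
last_digit_step_hz = 10.0                        # display resolves 10 Hz steps
display_uncertainty_hz = 1 * last_digit_step_hz  # the +/- 1 digit term
instrument_precision_hz = 2.0                    # the counter's own precision

total_uncertainty_hz = display_uncertainty_hz + instrument_precision_hz
print(f"total reading uncertainty: +/- {total_uncertainty_hz} Hz")  # 12.0 Hz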
A Laboratory Grade Frequency Standard
If you would like additional information about frequency standards and calibration, or if you would like to build your own standard, there is an excellent article in QEX for May/June 2002, p. 13, by Randy Evans KJ6PO, titled “A Laboratory Grade 10 MHz Frequency Standard”. He provides techniques for calibrating the standard, and much of that information is applicable to calibrating any other oscillator.