[time-nuts] What is "accuracy"? (newbie timenut, hi folks!)

Bill Hawkins bill.iaxs at pobox.com
Thu May 5 23:31:26 EDT 2016


Well, I'll take a crack at this, although I'm no expert.
I hope it provides a base for others to build on.

First the basics. Accuracy is how close a measurement comes to the true
value of the thing being measured. Precision is a property of the
measuring instrument, namely how finely it can resolve a reading.
A digital voltmeter may have a precision of one millivolt and an
accuracy of a tenth of a volt.
You know what the meter reads to one millivolt, but you only know the
voltage to an accuracy of 0.1 volt.

Time and frequency are mathematically related. If you know one, you know
the other.
They can be measured to an accuracy that is very near the precision of
the instrument, because there is no analog-to-digital conversion as is
required for most other physical quantities.
The accuracy is somewhat degraded by the zero-crossing detector.

Otherwise, measuring frequency and time is simply a matter of counting
cycles of an oscillator.
A clock is a cycle counter with a fixed period of repetition.
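
To put rough numbers on that, here is a little Python sketch of the
counting idea. All of the values (the 1 second gate, the 10 MHz
oscillator, the counted cycles) are invented for illustration:

    # Frequency measurement as cycle counting (made-up numbers).
    GATE_TIME_S = 1.0            # gate interval from a reference standard
    counted_cycles = 10_000_123  # cycles of the unknown oscillator seen
                                 # during the gate

    measured_hz = counted_cycles / GATE_TIME_S
    print(measured_hz)           # 10000123.0 -- the oscillator runs high

    # A clock is the same counter run continuously: divide the running
    # count by the NOMINAL frequency to display elapsed time.
    NOMINAL_HZ = 10_000_000
    displayed_seconds = counted_cycles / NOMINAL_HZ
    print(displayed_seconds)     # ~1.0000123 "seconds" shown per true second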

When you want to know the accuracy of a clock with respect to a
standard, you are really interested in how well they match over a period
of time. You can watch your wall clock slow down with respect to WWV or
some other national standard. Then you can say the clock is accurate to
some value of minutes per day or other counts per period of time.
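
For example (again a Python sketch, with made-up offsets), two
comparisons against a standard are enough to estimate the rate:

    offset_day_0 = 0.0    # seconds fast when first compared to WWV
    offset_day_7 = 2.1    # seconds fast one week later
    elapsed_days = 7

    rate = (offset_day_7 - offset_day_0) / elapsed_days
    print(rate)                 # 0.3 seconds per day

    frac_freq_offset = rate / 86400
    print(frac_freq_offset)     # ~3.5e-6, i.e. about 3.5 ppm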

I've never really looked at Allan Deviation, but it seems to be a
statistical method for characterizing how an oscillator's frequency
varies with averaging time, which is its stability rather than its
accuracy.

Perhaps ADEV is what you need.
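
From what I gather, the simplest (non-overlapping) form of the
calculation is roughly this. A bare-bones Python sketch with invented
readings; real tools (the allantools Python package, Stable32, etc.)
do far more:

    import math

    def allan_deviation(y):
        # Non-overlapping Allan deviation of fractional-frequency
        # samples y, each averaged over the same interval tau.
        diffs = [(b - a) ** 2 for a, b in zip(y, y[1:])]
        return math.sqrt(sum(diffs) / (2 * len(diffs)))

    # hypothetical 1 s fractional-frequency readings
    y = [1.2e-11, 0.9e-11, 1.4e-11, 1.1e-11, 1.0e-11]
    print(allan_deviation(y))   # ADEV at tau = 1 s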

Regards,
Bill Hawkins
 

-----Original Message-----
From: time-nuts [mailto:time-nuts-bounces at febo.com] On Behalf Of BJ
Sent: Thursday, May 05, 2016 8:34 PM
To: time-nuts at febo.com
Subject: [time-nuts] What is "accuracy"? (newbie timenut, hi folks!)

Hi Time Nuts,
I'm fairly new to the fascinating world of time and frequency, so I
apologise profusely in advance for my blatant ignorance.

When I ask "what is accuracy" (in relation to oscillators), I am not
asking for the textbook definition - I have already done extensive
reading on accuracy, stability and precision and I think I understand
the basics fairly well - although, after you read the rest of this, you
may well (rightly) think I am deluding myself. It doesn't help matters
when some textbooks, papers and web articles use the words precision,
accuracy and uncertainty interchangeably. (Incidentally, examples of my
light reading include the 'Vig tutorial' on oscillators, HP's Science of
Timekeeping Application note, various NIST documents including the
tutorial introduction on frequency standards and clocks, Michael
Lombardi's chapter on Time and Frequency in the Mechatronics Handbook
and many other documents including PTTI and other conference
proceedings). Anyway, you can safely assume I understand the difference
between accuracy and precision in the confused musings that follow
below.

What I am trying to understand is: what does it REALLY mean when the
manufacturer's specs for a frequency standard or 'clock' claim a certain
accuracy? For argument's sake, let us assume that the accuracy is given
as 100 ppm, or 1e-4.

As per the textbook approach, I know I can therefore expect my 'clock'
to have an error of up to 86400 x 1e-4 = 8.64 s per day.
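
(Just to spell out that arithmetic as a tiny Python snippet, using the
hypothetical 100 ppm figure:)

    accuracy = 1e-4           # the hypothetical 100 ppm spec
    seconds_per_day = 86400

    worst_case = accuracy * seconds_per_day
    print(worst_case)         # 8.64 -- seconds gained or lost per day,
                              # if the frequency sits right at the limit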

But does that mean that, say, after one day I can be certain that my
clock will be fast/slow by no more than 8.64 seconds, or could the error
potentially be greater than that? In other words, is the accuracy a hard
limit, or is it a statistical quantity, so that there is a high
probability my clock will behave this way but still a very small chance
(say at the 3-sigma level) that the error is larger and the clock ends
up fast/slow by, say, 10 seconds? Is it something inherent, due to the
nature of the type of oscillator (e.g. a characteristic of the crystal
or atom, etc.), or does it vary so that it needs to be measured? If so,
how is that measurement made to produce the accuracy figure? Are
environmental conditions taken into account when making these
measurements (I am assuming so)? In other words, how is the accuracy of
a clock determined?


Note that I am conscious of the fact that I am being somewhat ambiguous
with the definitions myself. It is my understanding that the accuracy
(as given in an oscillator's specs) relates to frequency, i.e. how close
the (measured?) frequency of the oscillator is to its nominal frequency,
rather than time, i.e. how well the clock keeps time in comparison to an
official UTC source... but I am assuming it is fair to say they are two
sides of the same coin.

Does accuracy also take stability into account (since, clearly, if an
oscillator experiences drift, that will affect the accuracy - or does
it?) or do these two 'performance indicators' need to be considered
independently? 
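
(To make the question concrete, here is a small Python sketch with
invented numbers, comparing a constant 100 ppm offset against an
oscillator whose fractional frequency also drifts; the drift rate is
purely hypothetical:)

    seconds_per_day = 86400.0
    y0 = 1e-4       # constant fractional frequency offset (100 ppm)
    drift = 1e-9    # hypothetical drift of that offset, per second

    # accumulated time error after one day
    error_constant = y0 * seconds_per_day                   # 8.64 s
    error_drifting = (y0 * seconds_per_day
                      + 0.5 * drift * seconds_per_day ** 2)  # ~12.4 s
    print(error_constant, error_drifting)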

I am guessing that the accuracy value is provided as a general indicator
of oscillator performance (i.e. the accuracy REALLY does just mean one
can expect an error of up to, or close to, a certain amount) and that
stability (as indicated by the ADEV) is probably more
significant/relevant.

It is also entirely possible I am asking all the wrong questions. As you
can see, confusion reigns. I am hoping things will become clearer to me
as I start playing around with hardware (fingers and toes crossed on
that one).

In the meantime, if anyone could provide some clarity on this topic or
set my crooked thinking straight, my gratitude will be bountiful. 

Thanks.

Belinda



