[time-nuts] GPS message jitter (preliminary results)
Tom Van Baak
tvb at LeapSecond.com
Mon Jul 18 18:10:32 EDT 2016
> Most definitely... if I turn on Lady Heather's display filter (which does a sliding average
> of "n" readings) the standard deviation and jitter values drop dramatically.
Be careful here. We're using standard deviation as a measure of the instability, or noise, or jitter, of the NMEA edge. Adding a sliding average filter to the raw data is an (accidental) way to remove that noise -- the very thing you were trying to measure. So the claim that "jitter values drop dramatically" is not exactly a surprise. Effectively, what's happening is that you are changing the "tau" of the std dev.
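To illustrate the effect (a hypothetical sketch with made-up numbers, not data from the actual receivers): for white phase noise, an n-sample sliding (boxcar) average lowers the measured standard deviation by roughly sqrt(n), even though the underlying jitter is unchanged.

```python
import random
import statistics

# Simulated NMEA-edge jitter: white phase noise, sigma = 10 us.
# (Illustrative numbers only; not measurements from the original post.)
random.seed(1)
raw = [random.gauss(0, 10.0) for _ in range(10000)]

# A 60-sample sliding (boxcar) average, like a 60-second display filter.
n = 60
filtered = [sum(raw[i:i + n]) / n for i in range(len(raw) - n + 1)]

sd_raw = statistics.stdev(raw)
sd_filt = statistics.stdev(filtered)
print(f"raw std dev:      {sd_raw:.2f} us")
print(f"filtered std dev: {sd_filt:.2f} us")  # about sd_raw / sqrt(60)
```

The jitter didn't go anywhere; the filter just averaged it out before the statistic was taken.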
At this point you're much better off not to use raw or filtered std dev at all -- and just go with ADEV on the raw data, where the tau of the sigma is explicit.
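Overlapping ADEV is easy to compute directly from phase samples. A minimal sketch (the function and the simulated data are mine, not from the post):

```python
import math
import random

def adev(phase, tau0, m):
    """Overlapping Allan deviation at tau = m * tau0.

    phase : list of phase (time-error) samples in seconds
    tau0  : sample interval in seconds
    m     : averaging factor
    """
    count = len(phase) - 2 * m
    if count < 1:
        raise ValueError("not enough samples for this m")
    s = 0.0
    for i in range(count):
        d = phase[i + 2 * m] - 2.0 * phase[i + m] + phase[i]
        s += d * d
    return math.sqrt(s / (2.0 * (m * tau0) ** 2 * count))

# Simulated 1 us white phase noise, one sample per second.
random.seed(2)
x = [random.gauss(0, 1e-6) for _ in range(4096)]
for m in (1, 10, 100):
    print(f"tau = {m:4d} s : ADEV = {adev(x, 1.0, m):.2e}")
```

For white phase noise the ADEV falls as 1/tau, and here the tau of each sigma value is stated explicitly -- the information a filtered std dev silently mixes together.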
----- Original Message -----
From: "Mark Sims" <holrum at hotmail.com>
To: <time-nuts at febo.com>
Sent: Monday, July 18, 2016 1:49 PM
Subject: [time-nuts] GPS message jitter (preliminary results)
> Most definitely... if I turn on Lady Heather's display filter (which does a sliding average of "n" readings) the standard deviation and jitter values drop dramatically. With a 60 second filter the deviation was down to around 0.15 msec and the peak-peak jitter below a millisecond. Average and rms values of the jitter were 2.4 microseconds. I saw similar improvements with the cheap NMEA receivers.
>> Looking at the data by eyeball (maybe not the best way): A 10 to 30 sample finite impulse average (boxcar) would do a lot for the result.