[time-nuts] Framework for simulation of oscillators

Tom Van Baak tvb at LeapSecond.com
Sun Mar 27 22:25:30 EDT 2016


> BTW: I discovered that TimeLab stops processing after 10'000'000 datapoints,
> which is kind of inconvenient when doing a long-term measurement...
> 
> Attila Kinali

I've collected a day of TimeLab/TimePod data at tau 0.001 s, which is 86'400'000 datapoints (86'400 s / 0.001 s). Should be no problem.

Note that Stable32 has a user-configurable limit (Conf -> Data -> Max_Data_File_Size). Alternatively, you can decimate the data while reading it in.
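If you go the decimation route, note that for phase data decimation is just keeping every k-th point (the sample interval becomes k*tau0). A minimal Python sketch, assuming numpy; the file names and k = 10 are made-up examples:

import numpy as np

# Keep every k-th phase sample; the effective tau0 becomes k*tau0.
# "phase.dat" and k = 10 are hypothetical example values.
x = np.loadtxt("phase.dat")
k = 10
np.savetxt("phase_dec.dat", x[::k])

(Frequency data is different: there you should average blocks of k points rather than drop points.)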

My command line tools have no sample limit (well, just malloc() limited) and can be orders of magnitude faster than Stable32 or TimeLab, since they run in batch mode rather than through a GUI.

But this raises the question -- what are you doing with 10M datapoints? Once you get beyond a couple of decades of data it's often better to compute statistics in segments and display all the segments of the whole run as a time series.

So, for example, don't compute a single stddev or ADEV number from the entire 10M point data set. Averaging over that many samples makes the result look "more precise", but it also hides key features like trends, periodicity, spectral components, outliers, and glitches.
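As a rough illustration (not my actual tools), here is how per-segment overlapping ADEV at a single tau might look in Python with numpy; the file name, segment size, and averaging factor m are made-up examples:

import numpy as np

def adev(x, tau0, m):
    """Overlapping Allan deviation of phase data x (seconds) at tau = m*tau0."""
    d = x[2*m:] - 2*x[m:-m] + x[:-2*m]   # second differences of phase
    return np.sqrt(np.mean(d**2) / (2.0 * (m * tau0)**2))

tau0 = 0.001                              # 1 ms sample interval
x = np.loadtxt("phase.dat")               # hypothetical phase record
seg = 1_000_000                           # ~17 minutes per segment at 1 ms
for i in range(len(x) // seg):
    s = x[i*seg:(i+1)*seg]
    print(i, adev(s, tau0, 100))          # ADEV at tau = 0.1 s, per segment

Plot those per-segment numbers against segment index and the trends, glitches, and periodicities show up immediately, instead of being averaged away.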

I'm not sure if you got an answer on synthetic data, but Stable32 has a noise generator where you specify alpha from -2 to +2. I created 1M samples of each of the 5 noise types and use those cached files as a noise reference.
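If you'd rather generate your own, one common approach (not necessarily what Stable32 does internally) is to shape the spectrum of white Gaussian noise in the frequency domain so the generated sequence has PSD ~ f^alpha. A minimal numpy sketch; n and the output file names are arbitrary:

import numpy as np

def powerlaw_noise(n, alpha, rng=np.random.default_rng()):
    """Generate n samples of noise with power spectral density ~ f^alpha."""
    w = rng.standard_normal(n)        # white Gaussian seed
    W = np.fft.rfft(w)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                       # avoid divide-by-zero at DC for alpha < 0
    W *= f**(alpha / 2.0)             # amplitude ~ f^(alpha/2) gives PSD ~ f^alpha
    y = np.fft.irfft(W, n)
    return y / y.std()                # normalize to unit variance

for alpha in (-2, -1, 0, 1, 2):
    np.savetxt(f"noise_a{alpha:+d}.dat", powerlaw_noise(1_000_000, alpha))

Here alpha is simply the exponent of the generated sequence's PSD; whether that sequence represents phase or fractional frequency is up to your convention.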

See also the 5 noise types in high-res here:

http://www.leapsecond.com/pages/allan/Exploring_Allan_Deviation_v2.pdf
http://www.leapsecond.com/pages/allan/

/tvb