[time-nuts] Archiving Timing Data
lists at rtty.us
Mon Jan 10 22:55:56 UTC 2011
In a normal industrial setting, one would make a few decisions and start
pushing data in a direction.
In this case, I'd like to look at *everybody's* data. That makes for a rather
more complex problem. If it's an "archive and import" approach, the process has
to work on autopilot. Customizing the import parameters for each data set
isn't going to work out.
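One way to keep such a process on autopilot is to guess each file's delimiter
and header instead of hand-tuning import parameters per data set. A minimal
sketch using Python's stdlib csv.Sniffer (the tab-separated sample and field
names below are hypothetical, not from anyone's actual logs):

```python
import csv
import io

def autopilot_load(text):
    """Guess delimiter and header from a sample, then parse rows generically."""
    sample = text[:1024]
    sniffer = csv.Sniffer()
    dialect = sniffer.sniff(sample)
    has_header = sniffer.has_header(sample)
    rows = list(csv.reader(io.StringIO(text), dialect))
    header = rows[0] if has_header else None
    data = rows[1:] if has_header else rows
    return header, data

# hypothetical tab-separated phase log
raw = "mjd\tphase_ns\n55571.5\t12.3\n55571.6\t12.4\n"
header, data = autopilot_load(raw)
```

Sniffing is a heuristic, so a real pipeline would still want to log and
quarantine files the sniffer gets wrong rather than importing them blindly.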
If it's a dump-and-dupe onto whateverSQL, we had better be ready for the
"5.1.51 goes to 5.1.52 and the format changes just a bit" situation. That
process keeps on going: I dump it under 5.1.51, you later try an import under 5.1.2357 -
Not an easy problem to solve. I've done it many ways, and none of them are
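One way to sidestep the version-skew trap is to never archive the server's own
dump format at all: pull the rows out through a query and write plain delimited
text with the column names embedded, so the archive depends on no particular
SQL release. A sketch using Python's stdlib sqlite3 as a stand-in database (the
'readings' table and its columns are made up for illustration):

```python
import csv
import io
import sqlite3

def dump_portable(conn, table, out):
    """Dump a table as delimited text with a header row, so the archive
    does not depend on any particular SQL server or version."""
    cur = conn.execute(f"SELECT * FROM {table}")
    writer = csv.writer(out)
    writer.writerow([col[0] for col in cur.description])  # column names first
    writer.writerows(cur)                                  # then the data rows

# demo with an in-memory database and a hypothetical 'readings' table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (mjd REAL, phase_ns REAL)")
conn.execute("INSERT INTO readings VALUES (55571.5, 12.3)")
buf = io.StringIO()
dump_portable(conn, "readings", buf)
```

The resulting file reads back on any system with a text editor, which is the
property a dump in a server-native format can't promise across versions.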
From: time-nuts-bounces at febo.com [mailto:time-nuts-bounces at febo.com] On
Behalf Of Chris Albertson
Sent: Monday, January 10, 2011 5:41 PM
To: Discussion of precise time and frequency measurement
Cc: scmcgrath at gmail.com
Subject: Re: [time-nuts] Archiving Timing Data
On Mon, Jan 10, 2011 at 2:08 PM, Bob Bownes <bownes at gmail.com> wrote:
> There is a difference between archival format and database format. If you
> are looking for an archival format that is portable, then a CSV (or other
> delimiter of your choice) is ideal. They are easy to import to a real
If it needs to be "imported" it will never be used. No one will
ever bother to take the time to import gigabytes of compressed data.
Don't worry about the space; 1TB disks now sell for $99 and the price
trend is downward.
CSV is easy to import, just like bricks are easy to move, easy until
you have a billion or two of them. We have data archives here and
from experience, you have to really, really want to look at the data
if it needs to be imported into a tool. The other thing is that over
time, if the data is not in a usable format, people forget they have it or
what it is.
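The point about data staying usable in place can be made concrete: a gzipped
CSV can be scanned row by row straight off the disk, with no bulk import step
and constant memory even for gigabyte files. A sketch with Python's stdlib
gzip and csv modules (the file name and the phase-spike filter are hypothetical):

```python
import csv
import gzip

def scan_gzipped_csv(path, predicate):
    """Stream rows straight out of a compressed archive -- no import,
    no database, constant memory regardless of file size."""
    with gzip.open(path, "rt", newline="") as f:
        for row in csv.reader(f):
            if predicate(row):
                yield row

# demo: write a small gzipped archive, then scan it in place
with gzip.open("demo.csv.gz", "wt", newline="") as f:
    csv.writer(f).writerows([["55571.5", "12.3"], ["55571.6", "99.9"]])

hits = list(scan_gzipped_csv("demo.csv.gz", lambda r: float(r[1]) > 50))
```

Because the archive is readable as-is, nobody has to "really, really want" the
data before they can look at it.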
Redondo Beach, California
time-nuts mailing list -- time-nuts at febo.com