[volt-nuts] Agilent calibration

Daniel Mendes dmendesf at gmail.com
Wed Aug 14 21:17:41 EDT 2013


Sorry if I'm being naive, but what's the difficulty of making a digital 
instrument with a memory that stores the offset of each scale and 
subtracts it before sending the reading to the display, no pot trimming 
involved? Why aren't all of the ones made after, let's say, 1995, like this?
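
Conceptually I mean something like the sketch below (in Python just for 
brevity; the range names and constants are made up, and real firmware 
would typically store a gain correction per range as well as an offset):

# Per-range correction constants, as they might be written to
# nonvolatile memory at calibration time (all values invented).
CAL_CONSTANTS = {
    "100mV": {"offset": -0.000012, "gain": 1.0000021},
    "1V":    {"offset":  0.000003, "gain": 0.9999987},
    "10V":   {"offset":  0.000008, "gain": 1.0000000},
}

def corrected_reading(raw_volts, rng):
    """Undo the stored offset and gain error before the value is displayed."""
    c = CAL_CONSTANTS[rng]
    return (raw_volts - c["offset"]) / c["gain"]

# Example: a raw 10 V range reading of 10.000008 V displays as 10.000000 V.
print(f"{corrected_reading(10.000008, '10V'):.6f}")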

Daniel

On 14/08/2013 20:29, Dr. David Kirkby wrote:
> On 14 August 2013 06:41, Charles Steinmetz <csteinmetz at yandex.com> wrote:
>> Joe wrote:
>>
>>> The way I read this is that if I send them a DMM that is within spec,
>>> they won't adjust it or provide pre/post data. Is this the case? If I
>>> spend over $200 sending a DMM to them, I want it adjusted to the best
>>> possible specs and I want the data. I do not want someone just saying
>>> that it is good enough and sending it back to me. I can get that for
>>> $50 in El Paso.
>> The big difference is not between adjusting and not adjusting -- it is
>> between getting a calibration "with full data" and getting one without data.
>> /The true value of calibration is not the adjustment -- it is the data./
>>
>> Agilent doesn't just say it is good enough -- they tell you specifically how
>> far off it is and quantify the statistical uncertainty of their measurement.
>> That is everything you need (i) to correct readings you make with the
>> instrument and (ii) to be confident of the potential uncertainty of those
>> measurements.
>>
>> Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm) total at
>> 10 V.  If your calibration certificate says the meter reads dead on at
>> 10.000000 V, the reading shown on the display is your measurement result
>> (with an uncertainty of 1.5 ppm, or +/- 15 counts from the reading) when you
>> measure a 10 V source.  But the cal certificate could just as well say that
>> the meter reads 10.000008 V when measuring a 10.000000 V source.  In that
>> case, you know to subtract 0.000008 V from whatever the meter reads when you
>> measure a 10 V source to get your measurement result (again, with an
>> uncertainty of 1.5 ppm, or +/- 15 counts from the corrected reading).  Of
>> course, in the real world a voltage standard will have its own calibration
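
Applying such a certificate correction is simple arithmetic. A minimal 
Python sketch using the numbers from Charles's example above (the 
uncertainty handling is simplified and the names are made up; a real 
uncertainty budget would also fold in the standard's own calibration 
uncertainty):

# From the example above: the certificate says the meter reads
# 10.000008 V when measuring a true 10.000000 V source.
CERT_ERROR_AT_10V = 10.000008 - 10.000000   # meter error, +8 uV

METER_SPEC_PPM = 1.5                        # +/- 1.5 ppm at 10 V

def corrected(reading_volts):
    """Subtract the known error from a raw 10 V range reading."""
    return reading_volts - CERT_ERROR_AT_10V

reading = 10.000015                          # what the display shows
value = corrected(reading)                   # 10.000007 V
uncertainty = value * METER_SPEC_PPM * 1e-6  # ~15 uV, from the meter spec
print(f"{value:.6f} V +/- {uncertainty*1e6:.0f} uV")

That subtraction is exactly the per-range offset correction asked about 
at the top of this message, just done by hand instead of in firmware.
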
> One of the advantages of modern instruments over older ones is that
> measurements are often more convenient to make. This can reduce your
> measurement time and hence your cost. For many companies, a case can
> be made to upgrade if a newer instrument will save time and money.
>
> As a rough guess, I would assume 99.999% of instruments sold by
> Agilent are for commercial, non-metrology work. Those 99.999% of users
> do not want to remember to subtract 0.000008 V - they want the
> instrument to be as accurate as possible.
>
> Now if you take an instrument like the Agilent 3458A 8.5-digit DVM,
> the intended user base is going to include a lot of metrologists.
> Those people might prefer that their instruments not be adjusted, but
> I think 99.999% of test-equipment users would want their instruments
> adjusted. With so much now done in software, arguments about pots
> drifting once adjusted don't make any sense.
>
> By its very nature, the readership of volt-nuts will often fall into
> the 0.001% that might not want their instruments adjusted, but I think
> it is fair to say most users would still want it.
>
> Agilent must have thought about these arguments and come to a
> decision not to adjust. I'm a bit surprised myself, but they obviously
> have their reasons. Clearly, if an adjustment requires someone to go
> in with a screwdriver, then it takes time, carries some risk of
> accidental damage, and might well cause things to drift more in the
> short term.
>
> Dave


