Basic Calibration Terms

Guest
Hi,

I have a program which I use to conduct tests. One of these tests
requires that I calibrate the voltage supply at the following points:
0, 5, 12, 15, and 22 volts. These points mark boundaries above which
certain functionality must be available. The problem is that the
voltage I output from my DAC does not result in the exact
corresponding voltage from the voltage supply. Therefore, to get these
values I need to output the following voltages from the DAC: 0.2,
5.2, 12.2, 15.3, and 25.
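
To make that concrete, here is a rough sketch of the idea in Python
(illustrative only; the table values are the ones above, and
set_dac_output() is just a placeholder for whatever call actually
drives the DAC):

# Calibration table: target supply voltage -> DAC setpoint that produces it.
CAL_TABLE = {
    0.0: 0.2,
    5.0: 5.2,
    12.0: 12.2,
    15.0: 15.3,
    22.0: 25.0,
}

def request_supply_voltage(target_v, set_dac_output):
    # Look up the calibrated DAC setpoint for a target supply voltage
    # and ask the DAC to output it. Exact-match lookup only; real code
    # would validate the key or interpolate between points.
    dac_setpoint = CAL_TABLE[target_v]
    set_dac_output(dac_setpoint)
    return dac_setpoint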

I have written a GUI to allow the user to adjust these values, but I'm
not sure what terms I should use to explain this GUI. For example, I
have 15 volts (the desired value), 15.3 volts (the value my program
requests the supply to set itself to via the DAC), and finally the
actual value measured from the supply.

/Barry
 
bg_ie@yahoo.com wrote:
I have written a GUI to allow the user to adjust these values, but I'm
not sure what terms I should use to explain this GUI. [...]
I think "Target Value" and Measured Value" might be
understandable.
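
For instance, the three numbers could be grouped something like this
(Python just for illustration; the field names are only suggestions):

from dataclasses import dataclass

@dataclass
class CalibrationPoint:
    target_v: float     # value the test calls for, e.g. 15.0
    setpoint_v: float   # value the program asks the supply for via the DAC, e.g. 15.3
    measured_v: float   # value actually read back from the supply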

--
Regards,

John Popelish
 
I think "Target Value" and Measured Value" might be
understandable.
Perhaps add an "Offset" term too (Target - Measured)? It could even
be color coded to indicate the degree of offset. Unless *any* offset
has to be compensated for.
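
Something along these lines, say (the 0.1 V tolerance is a made-up
number; use whatever error the test can actually accept):

def offset_v(target_v, measured_v):
    # "Offset" in the sense above: Target minus Measured.
    return target_v - measured_v

def offset_color(target_v, measured_v, tol_v=0.1):
    # tol_v is an assumed tolerance, not anything from the original post.
    off = abs(offset_v(target_v, measured_v))
    if off <= tol_v:
        return "green"   # within tolerance
    if off <= 2 * tol_v:
        return "amber"   # marginal, worth a look
    return "red"         # needs recalibration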

John
 
<bg_ie@yahoo.com> wrote in message
news:462a72e7-d53f-40d4-9f95-a039cc95a301@m77g2000hsc.googlegroups.com...
The problem is that the voltage I output from my DAC does not result
in the exact corresponding voltage from the voltage supply. [...]
Maybe you should do some hardware work to make the DAC
output the proper voltages.



 
