Design and workaround

I had a company design a custom terminal board for me. It does numerous
things, but what I'm fighting with is the analog section: standard
0-20 mA inputs, run through a precision 250 ohm resistor to convert to
0-5 volts for an ADC card.
The ADC card is a 16-channel, single-ended, 12-bit analog card.

Problem is, no matter what the source is, the voltage always reads
approximately 50 mV lower than actual. I have an Omega mA simulator to
test this: if I send 4 mA, the ADC card reads 0.950 volts, and at 20 mA
it reads 4.950 volts.
The error appears to be pretty linear. I'm not sure what is causing it,
because I have not seen this on my other boards designed with 250 ohm
resistors.

Anyway, since it's linear, and we built a bunch of these boards, I need a
correction factor in my scaling to make the engineering units correct. I
would like to apply it on the voltage side, since we know the offset is
the same on all the channels, rather than offsetting the scaled value.

I have the following for each channel:
Factor and Offset, which work normally (when I don't need a voltage
correction):

Vin * Factor + Offset

I can't add the millivolt difference to Offset, because Offset is added
to the scaled value. But I could add it to Vin like this:
CrFactor = 0.050
((Vin + CrFactor) * Factor) + Offset

The above works for 4-20 mA sensors, because we are actually scaling 1-5
volts (reading 0.950 to 4.950), and adding the correction factor brings
it to 1.000 to 5.000.

Follow?
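
To make that concrete, here's a minimal sketch in C of the per-channel
scaling with the voltage-side correction. The struct and names are mine,
not from the actual board software, and the 0-100 engineering range is
just an example:

#include <stdio.h>

/* Per-channel scaling parameters (hypothetical struct; names are
   illustrative, not from the original board's software). */
typedef struct {
    double factor;   /* engineering units per volt            */
    double offset;   /* engineering-unit offset               */
    double cr;       /* voltage-side correction, 0.050 V here */
} chan_cal;

/* Scale a raw ADC voltage to engineering units, correcting the
   ~50 mV shortfall on the voltage side before the linear scaling. */
double scale(const chan_cal *c, double vin)
{
    return ((vin + c->cr) * c->factor) + c->offset;
}

int main(void)
{
    /* 4-20 mA sensor mapped to 0-100 engineering units over 1-5 V:
       factor = 100 / (5 - 1) = 25, offset = -25. */
    chan_cal ch = { 25.0, -25.0, 0.050 };
    printf("%f\n", scale(&ch, 0.950)); /* 4 mA reading  -> 0.0   */
    printf("%f\n", scale(&ch, 4.950)); /* 20 mA reading -> 100.0 */
    return 0;
}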

Okay, but now we actually have a sensor that is 0-20 mA (0-5 volts), and
the above formula and correction factor do not work, because zero volts
is already zero volts (there is no drop at 0 mA), so the actual voltage
shown is 0 to 4.950.

Using the above correction factor, we end up with 0.050 to 5.000, which
is wrong at the low end, while the uncorrected 0 to 4.950 is wrong at the
high end.
Would it be 0 to 4.900?

Not sure. Anyone have an idea how I can handle this voltage drop across
all my channels?
My software could look at MinVoltage and MaxVoltage, determine whether
the scaling starts at zero or negative, and use a different formula for
each case?
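
One hedged sketch of that two-case idea (names are hypothetical; it
assumes the zero-based ranges suffer a gain-style error, reading
0 to 4.950 instead of 0 to 5.000, while the offset ranges just need the
additive 50 mV):

#include <stdio.h>

#define CR_FACTOR 0.050   /* measured ~50 mV shortfall  */
#define V_FULL    5.000   /* nominal full-scale voltage */

/* Correct the raw voltage before the usual Vin*Factor+Offset step.
   If the channel's range starts above zero (e.g. 1-5 V for 4-20 mA),
   treat the error as a pure 50 mV offset; if the range starts at zero,
   the error vanishes at 0 V, so stretch 0..4.950 back out to 0..5.000. */
double correct(double vin, double min_voltage)
{
    if (min_voltage > 0.0)
        return vin + CR_FACTOR;                    /* 0.950 -> 1.000 */
    return vin * (V_FULL / (V_FULL - CR_FACTOR));  /* 4.950 -> 5.000 */
}

int main(void)
{
    printf("%f\n", correct(0.950, 1.0));  /* 4-20 mA channel */
    printf("%f\n", correct(4.950, 0.0));  /* 0-20 mA channel */
    return 0;
}

Whether the zero-based case really scales like that, rather than only
being wrong above some threshold, is worth checking with the simulator at
a midpoint such as 10 mA.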

Richard

On Sat, 20 Nov 2004 09:28:47 -0600, Richard wrote:

> I had a company design a custom terminal board for me. It does numerous
> things, but what I'm fighting with is the analog section: standard
> 0-20 mA inputs, run through a precision 250 ohm resistor to convert to
> 0-5 volts for an ADC card.
> The ADC card is a 16-channel, single-ended, 12-bit analog card.
The problem is, there is no such thing as "standard 0-20 mA". The
standard for current-loop control or sensing is 4-20 mA; that way, 0 mA
can be used to indicate a fault condition.

Please incorporate this into your spec, and come back if you're still
having problems. Let's eliminate as many variables as possible! :)

Good Luck!
Rich
 
On Sat, 20 Nov 2004 09:28:47 -0600, "Richard" <rwskinner ATawesomenet
Dot net> wrote:

> Problem is, no matter what the source is, the voltage always reads
> approximately 50 mV lower than actual. I have an Omega mA simulator to
> test this: if I send 4 mA, the ADC card reads 0.950 volts, and at 20 mA
> it reads 4.950 volts.
[snip]
> Anyone have an idea how I can handle this voltage drop across all my
> channels?
Maybe you have a ground-loop problem with the single-ended ADC. Do
you have a spare ADC channel? If so, run it to the common point on the
termination board, digitize (and maybe smooth) that, and subtract it
from all the other ADC channel readings.
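
For what it's worth, a minimal sketch of that subtraction idea in C. The
read_adc() function is a stand-in for whatever the real ADC driver
provides, and using channel 15 of the 16-channel card as the spare is an
assumption:

#include <stdio.h>

#define N_CHANS   15   /* signal channels 0..14                     */
#define GND_CHAN  15   /* spare channel wired to the board's common */

/* Stand-in for the real ADC driver: returns a channel's voltage.
   Here it just fakes a 50 mV common-mode level for demonstration. */
static double read_adc(int channel)
{
    return (channel == GND_CHAN) ? 0.050 : 1.000;
}

/* Read all channels and remove the common-mode (ground-loop) voltage
   by subtracting the digitized common-point reading from each one. */
static void read_all_corrected(double out[N_CHANS])
{
    double gnd = read_adc(GND_CHAN);  /* could also smooth/average this */
    for (int i = 0; i < N_CHANS; i++)
        out[i] = read_adc(i) - gnd;
}

int main(void)
{
    double v[N_CHANS];
    read_all_corrected(v);
    printf("%f\n", v[0]);  /* 1.000 - 0.050 = 0.950 */
    return 0;
}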

John



