Hi,
It's been years since I've done any basic circuit design, so I'm
looking for a couple of pointers in the right direction to get me
started.
I've got a heating system control and a hot water boiler that need
to talk.
The control system puts out a 0-10V DC voltage to indicate what
water temperature it needs.
The hot water boiler takes a 0-10V DC analog input signal to
indicate what water temperature it should produce.
However, the 0-10V DC scales for the two devices are not the same.
10V DC out from the controller indicates a "99°C" water temperature,
but the boiler will only produce 80°C at 10V. At the low end, 2.2V
from the controller indicates a desire for 22°C water, but the boiler
will produce 27°C. Both scales are basically linear within that
range. Details on the scale differences are available here:
http://www.tekmarcontrols.com/sb/sb046.pdf
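Just to make the arithmetic concrete, here's roughly how I've been
working out the correction. The endpoints below are the approximate
ones quoted above; the exact figures should really come from the
Tekmar bulletin linked. Since both scales are linear, the whole
correction collapses to a fixed gain and offset, Vout = A*Vin + B.

#include <stdio.h>

/* A two-point linear scale: temperature as a function of voltage. */
struct scale { double v1, t1, v2, t2; };

static double temp_for_volts(struct scale s, double volts)
{
    double slope = (s.t2 - s.t1) / (s.v2 - s.v1);   /* degC per volt */
    return s.t1 + (volts - s.v1) * slope;
}

static double volts_for_temp(struct scale s, double temp)
{
    double slope = (s.t2 - s.t1) / (s.v2 - s.v1);
    return s.v1 + (temp - s.t1) / slope;
}

int main(void)
{
    /* Approximate endpoints from above -- substitute the real tables. */
    struct scale controller = { 2.2, 22.0, 10.0, 99.0 };
    struct scale boiler     = { 2.2, 27.0, 10.0, 80.0 };

    for (double vin = 0.0; vin <= 10.0; vin += 1.0) {
        double wanted = temp_for_volts(controller, vin); /* temp the control asks for */
        double vout   = volts_for_temp(boiler, wanted);  /* voltage the boiler must see */
        printf("in %4.1f V (%5.1f degC wanted) -> out %5.2f V\n",
               vin, wanted, vout);
    }
    return 0;
}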
I'd like to put a simple circuit on the line to properly adjust the
voltages being sent to the hot water boiler. Currently, the scales
are close enough that the system works, but the water temperatures are
a bit off.
I've been scratching my head for a couple of days, and I can't come
up with anything simple. A small microcontroller with an ADC/DAC
combination would do the job, but I'd like to power the device off the
input signal if that's at all possible. This might be hard, since above
about 7V the output would need to be higher than the input.
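For reference, the micro version I have in mind is nothing fancier
than the loop below. read_input_volts() and write_output_volts() are
placeholders for whatever ADC and DAC (or filtered PWM) the real part
would provide, not a real API, and the gain/offset constants would be
filled in from the bulletin's tables. The snag is still that the part
and its 0-10V output stage want their own supply.

/* Sketch of the microcontroller approach: sample the 0-10V input,
 * remap it linearly, and drive the corrected 0-10V signal back out. */

static const double GAIN   = 1.0;  /* placeholder -- compute from the two scales */
static const double OFFSET = 0.0;  /* placeholder */

double read_input_volts(void);        /* ADC behind a divider, scaled to volts */
void   write_output_volts(double v);  /* DAC or PWM+filter, buffered to 0-10V */

void control_loop(void)
{
    for (;;) {
        double vin  = read_input_volts();
        double vout = GAIN * vin + OFFSET;

        /* Clamp to the range the boiler input accepts. */
        if (vout < 0.0)  vout = 0.0;
        if (vout > 10.0) vout = 10.0;

        write_output_volts(vout);
    }
}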
Can anyone point me in the right direction (or, if this is a well-known
problem, point me at the terminology I should be searching for)?
daniel