JohnT
I have a couple questions on powering a remote device:
3.3 V (+/- 0.1 V), 100 mA nominal, with surges to 0.75 A for 2 seconds every 10
seconds.
I read somewhere to keep wire losses below 10%; I'm guessing that's to keep heat
down. Running 12 V over 22 gauge (I have to use 22 AWG) allows only about 95 feet
of wire at that limit. I need to go more than 200 feet (400 feet of wire round
trip), which works out to roughly a 4.1 V loss (about 33%). Considering my ~25%
duty cycle, what loss is tolerable?
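For reference, here's the back-of-the-envelope I'm using (a rough sketch, assuming about 16.1 ohms per 1000 feet of conductor for 22 AWG; the exact figure depends on the wire table):

    # Rough IR-drop check for 22 AWG, assuming ~16.1 ohms per 1000 ft of conductor
    R_PER_FT = 16.1 / 1000.0   # ohms per foot

    def drop_volts(current_a, wire_feet):
        # Total IR drop over the full wire length (out and back)
        return current_a * R_PER_FT * wire_feet

    print(drop_volts(0.75, 95))    # ~1.15 V, about 10% of 12 V (the 95 ft limit)
    print(drop_volts(0.75, 400))   # ~4.8 V during a surge, same ballpark as my 4.1 V figure
    print(drop_volts(0.10, 400))   # ~0.6 V at the 100 mA nominal draw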
When the device activates it draws 0.75 A, and I found (by trial and error) that
I had to add a big 2200 uF cap at the remote device to keep the voltage from
dipping too much. This is with a 3.3 V LDO regulator at the remote end. What
causes this dip? Should the cap go on the device (output) side of the regulator
or the 'wire' (input) side... or both?
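For what it's worth, here's the rough cap math I've been staring at (a sketch; I may be modeling this wrong):

    # How fast a 2200 uF cap sags if it has to supply the extra surge current by itself
    C = 2200e-6                 # farads
    I_extra = 0.75 - 0.10       # extra amps during a surge (worst case: cap supplies all of it)

    droop_rate = I_extra / C    # volts per second
    print(droop_rate)           # ~295 V/s, so the cap only rides through milliseconds, not 2 s

    # Hold-up time per volt of allowed sag
    print(C * 1.0 / I_extra)    # ~3.4 ms per volt of allowed dip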
I could really cut the amps by going to a 90% efficient DC-DC converter... but
those are quite expensive. Big caps are cheaper, if they are a viable
solution(?).
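The "cut the amps" part is just this rough comparison (assuming a hypothetical 90% efficient buck from 12 V to 3.3 V; the real input current would creep up a bit as the line sags):

    # Input current during a surge: 90%-efficient buck vs. an LDO
    P_OUT = 3.3 * 0.75                  # ~2.5 W at the load during a surge
    i_in_buck = P_OUT / (0.90 * 12.0)   # ~0.23 A pulled from the 12 V line
    i_in_ldo = 0.75                     # an LDO passes the full load current

    print(i_in_buck, i_in_ldo)          # ~0.23 A vs 0.75 A, so roughly a third of the wire drop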
jt