Don Kelly
Guest
"qude" <qmdynamics@yahoo.com> wrote in message
news:1119216943.195356.8180@g44g2000cwa.googlegroups.com...
This means the 12 volt/18 ampere setup produces a larger
magnetic field magnitude (from the current flow) than
the 110 volt/2 ampere setup (noting that both deliver
about the same 220 watts of power), right?
Or does the larger 110 volts in the latter give more push
to the 2 amperes, resulting in a similar magnetic field
magnitude for both setups?
If the answer is that the 12 volt/18 ampere setup does
produce the larger magnetic field, do designers find it
necessary to use a larger voltage instead of a larger
current to prevent magnetic field interference?
If the answer is that both produce the same magnetic
field magnitude, then it means that a larger voltage
such as 110 volts pushes the 2 amperes harder, so the
charge moves faster than with the 18 amperes at 12
volts, which don't get that push. If not, how can
the magnetic field magnitude in both be the same
(assuming they are the same)?
Thanks.
qude
---------
"right" to. Please include at least some of the material you are responding
to.
The magnetic flux density due to a wire carrying current I is proportional
to the current and inversely proportional to the distance from the wire. The
presence of other current-carrying wires (such as the necessary return
path(s)) affects the flux density, so consideration of a single wire in space
is not a realistic option (effectively you would have a single-turn coil with
the return conductor infinitely far away).
The field due to a current of 2 A is 1/9 of that due to a current of 18 A at a
point external to the wire.
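As a quick sanity check on that 1/9 figure, here is a minimal Python sketch
using the long-straight-wire formula B = mu0*I/(2*pi*r); the 1 m distance is
just an arbitrary example, not a figure from the discussion above.

import math

MU0 = 4e-7 * math.pi  # permeability of free space, T*m/A

def flux_density(current_amps, distance_m):
    # Long straight wire approximation: B = mu0 * I / (2 * pi * r)
    return MU0 * current_amps / (2 * math.pi * distance_m)

r = 1.0  # arbitrary example distance: 1 m from the wire
b_2A = flux_density(2.0, r)    # ~4.0e-7 T
b_18A = flux_density(18.0, r)  # ~3.6e-6 T
print(b_2A / b_18A)            # 0.111... = 1/9, as stated above

The ratio depends only on the currents; the chosen distance cancels out.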
Is this of importance to decisions as to what voltage is to be used? Not
really.
There is an optimum voltage level depending on the power to be transferred
and the distance it is to be transferred (at a given frequency). Rule of
thumb: more power and/or longer distance, go to a higher voltage. This is not
a physical rule but an economic one. That is why AC transmission put
Edison's DC systems out of business. You could transmit 100 MW at 200 V for a
distance of 200 miles, but the cost would be horrendous compared to doing it
at 140,000 V. However, using 140,000 V to supply a home would be rather
dangerous, expensive, and ridiculous, so 120/240 V is a good practical and
economic balance. (Sue, you should know this - examples are all around.)
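To make the economics concrete, here is a minimal Python sketch comparing the
line current and I^2*R loss for the 100 MW example at the two voltages. The
10 ohm line resistance is an assumed, illustrative value for a long line, not
a figure from the post.

def line_loss_watts(power_w, voltage_v, line_resistance_ohms):
    current = power_w / voltage_v               # line current needed, I = P / V
    return current ** 2 * line_resistance_ohms  # resistive loss = I^2 * R

P = 100e6      # 100 MW to be delivered
R_LINE = 10.0  # assumed total conductor resistance, ohms (illustrative)

print(line_loss_watts(P, 140_000, R_LINE))  # ~5.1e6 W, roughly 5% of 100 MW
print(line_loss_watts(P, 200, R_LINE))      # ~2.5e12 W, far more than the
                                            # 100 MW being sent

The point is only that at 200 V the required current, and hence the conductor
size needed to keep losses sane, becomes absurd; at 140,000 V the same power
flows with a modest current and a tolerable loss.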
--
Don Kelly
dhky@peeshaw.ca
remove the urine to answer