I don't understand the importance of high voltage-low current

Guest
My problem is this: You are distributing 100 watts. You start out with
10 volts, 10 amps. You increase the volts by a factor of 2 through a
transformer. Now you must have 20 volts, 5 amps right? Using Ohm's
Law, 10 volts = 10 amps * 1 ohms, in the first case there is 1 ohm.
But, in the next case you have 20 volts = 5 amps * 4 ohms. So why did
the ohms change? When you use a transformer to increase voltage,
doesn't that mean ohms must increase if amps decrease? I am thoroughly
confused. I know I am thinking about it the wrong way, can someone
explain it to me? If you want me to clarify, let me know. Basically my
question is this: How do you create more voltage without creating more
current?
 
chesemonkyloma@gmail.com wrote:

[original question snipped]
I think you have the question a little mixed up in your head.

To put it simply, one of the biggest factors is conductor material: you
need less of it to carry the energy to the end point.

High-voltage circuits, 5 kV, 12 kV and so on, use very small conductors
to transfer the energy to the customer, because the current is reduced
by the same ratio that the voltage is raised. At the end point the
voltage is stepped back down, so the current goes up and the voltage
drops, and the conductors there have to be heavier, because the
resistance of the conductors causes proportionally more loss at low
voltage than at high voltage.

It's really all about the mass of material used in the end.

If you were to attempt to transfer high levels of power at low voltage
over great distances, as in supplying a great city with its electrical
needs, the amount of copper required would be unthinkable.
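To put a rough number on that, here is a sketch of the arithmetic in
Python. The power, distance, and allowed-loss figures below are assumed
purely for illustration, and copper resistivity is taken as roughly
1.68e-8 ohm-metres; the point is that the copper cross-section needed to
hold line loss to a fixed fraction scales as 1 over the voltage squared.

# Illustrative only: copper needed to deliver 1 MW over 10 km while
# keeping resistive line loss to 5%, at three different line voltages.
RHO_CU = 1.68e-8   # resistivity of copper, ohm*metre (approximate)

def copper_area_m2(power_w, volts, length_m, loss_fraction):
    # Smallest cross-section A such that I^2 * (rho*L/A) <= loss_fraction * power.
    current = power_w / volts                               # I = P / V
    max_resistance = loss_fraction * power_w / current**2   # R <= f*P / I^2
    return RHO_CU * length_m / max_resistance               # A = rho*L / R

for volts in (1000, 10000, 100000):
    area_mm2 = copper_area_m2(1e6, volts, 10000, 0.05) * 1e6
    print(f"{volts:>6} V: about {area_mm2:.1f} mm^2 of copper per conductor")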


--
http://webpages.charter.net/jamie_5
 
On Mon, 14 Apr 2008 16:55:17 -0700 (PDT), chesemonkyloma@gmail.com
wrote:

[original question snipped]
If the resistance in a circuit is constant, applying more voltage will
result in more current, in accordance with Ohm's Law.

However, when dealing with transformers, we are not dealing with a
constant resistance. Instead, a transformer passes a constant power -
ignoring losses, the power into a transformer will equal the power
out. The output/input voltage ratio of a transformer equals its
secondary-to-primary turns ratio, and the output/input current ratio
is the inverse of that, which keeps the input and output powers equal.
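Ignoring losses, that relationship is easy to check numerically. A
minimal sketch in Python, using the ideal-transformer assumption and the
numbers from the original example:

# Ideal transformer: volts scale with the turns ratio, amps scale
# inversely, so power in equals power out.
def ideal_transformer(v_in, i_in, turns_ratio):
    # turns_ratio = secondary turns / primary turns
    return v_in * turns_ratio, i_in / turns_ratio

v_in, i_in = 10.0, 10.0                              # 100 W in
v_out, i_out = ideal_transformer(v_in, i_in, 2.0)    # step up 1:2
print(v_out, "V,", i_out, "A,", v_out * i_out, "W")  # 20.0 V, 5.0 A, 100.0 W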


--
Peter Bennett, VE7CEI
peterbb4 (at) interchange.ubc.ca
GPS and NMEA info: http://vancouver-webpages.com/peter
Vancouver Power Squadron: http://vancouver.powersquadron.ca
 
On Apr 15, 11:55 am, chesemonkyl...@gmail.com wrote:
[original question snipped]
The key is to understand that losses go with I^2, so reducing I (by
raising V in proportion, since P = IV) is a good thing.
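
As a quick numeric sketch of that point, with an assumed 0.5 ohm of line
resistance (the value is only for illustration):

# Same 100 W delivered through an assumed 0.5-ohm line, at two voltages.
R_LINE = 0.5
for volts, amps in ((10, 10), (20, 5)):
    loss = amps**2 * R_LINE                 # P_loss = I^2 * R
    print(f"{volts} V / {amps} A: line loss = {loss} W")
# Halving the current quarters the I^2*R loss: 50 W versus 12.5 W.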

Cheers
 
<chesemonkyloma@gmail.com> wrote in message
news:808aa209-d4f8-4b27-a83b-dac3b61c8f2a@t54g2000hsg.googlegroups.com

OK you guys have helped me but what I still don't get is: Why does the
equation V^2/R produce a different power than I^2*R, and how do you
reduce I and increase V in the equation P=IV without changing the ohms
of resistance? I think that the *load*, for instance, the appliances
in the house, affect the current right? So that would be the ohms
changing? So in P=VI, P is the losses of the load you are powering,
and in P=I^2*R that is the resistance of the wire? How do you know
whether P is losses or what? I think I'm starting to understand.
There are two resistances involved. One is the resistance
of the wire bringing the power from the station to the
consumer. The other is the resistance of any load at the
consumer's location. Call them R1 and R2.

There are three voltages to be concerned with. The first
is the voltage at the station, the second the voltage
dropped along the wire, and the third is the voltage that
arrives at the consumer end and is applied across the
load resistance. Call them V1, V2, and V3. It is
evident that V3 = V1 - V2.

The wasted power is given by P1 = V2*I = V2^2/R1 = I^2*R1.

The power delivered to the customer is given by
P2 = V3*I = V3^2/R2 = I^2*R2.
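
Here is a small sketch of that split, with R1, R2, and V1 assumed purely
for illustration:

# Line resistance R1, load resistance R2, station voltage V1 (assumed).
R1, R2, V1 = 0.5, 9.5, 100.0
I  = V1 / (R1 + R2)        # series circuit, so one current everywhere
V2 = I * R1                # dropped along the wire
V3 = V1 - V2               # what the consumer actually sees
P1 = I**2 * R1             # wasted in the wire
P2 = I**2 * R2             # delivered to the load
print(f"I = {I} A, V2 = {V2} V, V3 = {V3} V")
print(f"wasted P1 = {P1} W, delivered P2 = {P2} W")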
 
On Tue, 15 Apr 2008, chesemonkyloma@gmail.com wrote:

On Apr 15, 6:32 am, Varactor <Morefl...@gmail.com> wrote:
On Apr 15, 11:55 am, chesemonkyl...@gmail.com wrote:

[original question snipped]

[Varactor's reply and the follow-up question snipped]

Without fully following the thread: your original use of Ohm's law
wasn't about the voltage drop; you were expecting a certain resistance
simply because you had a voltage and a current.

What you are actually dealing with is the voltage drop along the cable,
that is, the resistance of that cable, which causes a voltage drop.

The higher the current through a circuit, the more effect that resistance
has. Lower the current, and the resistance of the cable becomes less of a
factor.

But of course, lower the current and you can't supply as much power to the
load. Which is why they raise the voltage to compensate for the lower
current; the power in watts passed along the cable stays the same if both
are changed by the same factor.

Try a different angle. In the days of tubes, the voltages were all high
while the current levels were quite low. Except in circuits where really
high power was used, like transmitters, you rarely saw large-diameter wire
in the wiring, since it didn't need to pass much current, and the
resistance of that narrow wire was not a factor. For a lot of equipment,
the power supply would offer up 350 V, if that much, but the current drain
would never be more than a few hundred milliamps.

Then along came solid-state devices. They all ran at quite low voltage,
but the current drain was fairly high: 12 volts or even 5 volts, and it
was common to need an amp or so. Suddenly you had to be careful with the
wiring, because the resistance of the narrower-gauge wire became a factor.
Bad connectors too: if they didn't make good contact, their resistance
became more significant. The resistance of the #20 wire or whatever was
used did not change from when it was used in tube circuits, but if it had
a resistance of 1 ohm in the tube equipment, that same 1 ohm in solid-state
equipment might start being a problem because of the current that has to
pass through it.
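
As a rough sketch of that comparison, using the 1 ohm of wire from above
and assumed, typical-looking supply figures:

# Same 1-ohm run of hookup wire in two assumed circuits:
# a tube stage at 350 V / 0.2 A and a solid-state one at 5 V / 1 A.
R_WIRE = 1.0
for name, volts, amps in (("tube, 350 V", 350.0, 0.2),
                          ("solid state, 5 V", 5.0, 1.0)):
    drop = amps * R_WIRE                     # IR drop in the wire
    print(f"{name}: drops {drop} V, {100 * drop / volts:.1f}% of the supply")
# Roughly 0.1% of 350 V versus 20% of 5 V: the same wire, a very
# different problem.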

Michael
 
On Apr 15, 6:32 am, Varactor <Morefl...@gmail.com> wrote:
On Apr 15, 11:55 am, chesemonkyl...@gmail.com wrote:

[original question snipped]

The key is to understand that losses go with I^2, so reducing I (by
raising V in proportion, since P = IV) is a good thing.

Cheers
OK you guys have helped me but what I still don't get is: Why does the
equation V^2/R produce a different power than I^2*R, and how do you
reduce I and increase V in the equation P=IV without changing the ohms
of resistance? I think that the *load*, for instance, the appliances
in the house, affect the current right? So that would be the ohms
changing? So in P=VI, P is the losses of the load you are powering,
and in P=I^2*R that is the resistance of the wire? How do you know
whether P is losses or what? I think I'm starting to understand.
 
On Mon, 14 Apr 2008 16:55:17 -0700 (PDT), chesemonkyloma@gmail.com
wrote:

[original question snipped]
---
If you have a 10 volt source which can deliver 100 watts of power to a
load, then the circuit looks like this:


            I --->
    +----------------+
    |                |
[SOURCE]          [LOAD] R   (E across the load)
    |                |
    +----------------+


and, if the load _is_ dissipating 100 watts, then the current in it
must be:


I = P/E = 100W / 10V = 10 amperes


and its resistance must be:


R = E/I = 10V / 10A = 1 ohm


Then, we can write, for convenience:

         10A -->   10V
    +---------------+
    |               |
[SOURCE]          [1R]
    |               |
    +---------------+



Now, if we add a transformer in order to double the voltage into the
load, we'll have:

                       20A -->   20V
    +-------+   +---------------+
    |      P||S                 |
[SOURCE]   R||E               [1R]
    |      I||C                 |
    +-------+   +---------------+


Notice that, since we've increased the voltage into the load to 20V,
the current through the load will be:


I = E/R = 20V / 1 ohm = 20 amperes

and the power it'll want to dissipate is:


P = IE = 20V * 20A = 400 watts.


Now we have a problem, since the source can only supply 100 watts.

So what do we do?

Change the load resistance so that it'll be dissipating 100 watts.

Like this:

R = E^2/P = (20V)^2 / 100W = 4 ohms

So you can see that you were on the right track.
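
The same arithmetic as a short sketch, just following the numbers above:

# 100 W source; load starts at 1 ohm.
P_SOURCE = 100.0
V1, R_LOAD = 10.0, 1.0
print(V1**2 / R_LOAD, "W at 10 V into 1 ohm")            # 100.0, fine

V2 = 20.0                                                # voltage doubled
print(V2**2 / R_LOAD, "W demanded at 20 V into 1 ohm")   # 400.0, too much

R_NEW = V2**2 / P_SOURCE                                 # R = E^2 / P
print(R_NEW, "ohms keeps the load at", V2**2 / R_NEW, "W")  # 4.0 ohms, 100.0 W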

As a matter of fact, in the real world the problem becomes one of
changing the resistance of the load as power supply voltages change. For
example, in countries where 120VAC mains are standard, a 100 watt
incandescent lamp filament has a resistance of about 144 ohms when the
lamp is hot, while in countries with 240VAC mains a 100 watt
incandescent lamp has a hot filament resistance of about 576 ohms.
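
A quick check of those filament figures, hot resistance only, from R = E^2/P:

# Hot filament resistance of a 100 W incandescent lamp: R = E^2 / P.
for mains in (120, 240):
    print(mains, "V mains:", mains**2 / 100, "ohms")   # 144.0 and 576.0 ohms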

So, the answer to your question:

"How do you create more voltage without creating more current?"

is: "Increase the load resistance."

JF
 
On Tue, 15 Apr 2008 12:16:17 -0700 (PDT), chesemonkyloma@gmail.com
wrote:


OK you guys have helped me but what I still don't get is: Why does the
equation V^2/R produce a different power than I^2*R,
---
It doesn't.

Let's look at your earlier circuit again:


10V>--------+
            |
          [1R]
            |
GND>--------+


The current in the resistor will be:


I = E/R = 10V / 1 ohm = 10 amperes

and the power dissipated by the resistor will be either:

P = E^2/R = (10V)^2 / 1R = 100 watts, or

P = I^2*R = 10^2V * 1R = 100 watts


and how do you
reduce I and increase V in the equation P=IV without changing the ohms
of resistance?
---
You can't.
---

I think that the *load*, for instance, the appliances
in the house, affect the current right?
---
Right.
---

So that would be the ohms changing?
---
Yes
---

So in P=VI, P is the losses of the load you are powering,
and in P=I^2*R that is the resistance of the wire? How do you know
whether P is losses or what? I think I'm starting to understand.
---
The notation is largely immaterial. Watts is watts, so it just
depends on what you're talking about or what you're looking at.


Using your 10V source we can look at it like this:

10V>--[WIRING RESISTANCE]---+
                            |
                 [LOAD RESISTANCE]
                            |
GND>------------------------+

and if we assign an arbitrary resistance of, say, 0.1 ohm to the
wiring resistance and leave the load at 1 ohm, then we'll have:


         R1
10V>--[0.1R]--+
              |
            [1R] R2
              |
GND>----------+


Now the current in the circuit will be:


I = E / (R1 + R2) = 10V / 1.1 ohms = 9.09 amperes.



The voltage dropped across R1 will be:


E = IR = 9.09A * 0.1R = 0.91 volts


and the power the wiring will dissipate will be:


P = IE = 9.09A * 0.91V ~ 8.3 watts


or P = I^2*R = (9.09A)^2 * 0.1R ~ 8.3 watts

or P = E^2/R = (0.91V)^2 / 0.1R ~ 8.3 watts


Since the supply puts out 10 volts and the wiring drops 0.91V of it,
that means that the load has 9.09 volts across it, so it will
dissipate:


P = IE = 9.09A * 9.09V ~ 82.6 watts

Just for fun, work it out using P = I^2*R and P = E^2/R and you'll see
that it all comes out the same.
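
Here is that "work it out" step as a sketch, checking that all three
forms agree:

# Wiring R1 = 0.1 ohm, load R2 = 1 ohm, 10 V supply (the circuit above).
R1, R2, E = 0.1, 1.0, 10.0
I = E / (R1 + R2)                     # about 9.09 A around the loop
for name, r in (("wiring", R1), ("load", R2)):
    v = I * r                         # voltage dropped across that resistance
    print(f"{name}: IE = {I*v:.1f} W, I^2*R = {I**2*r:.1f} W, E^2/R = {v**2/r:.1f} W")
# wiring: 8.3 W by all three forms; load: 82.6 W by all three forms.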

The point is that one [general] notation doesn't stand for line losses
while another stands for load dissipation.

JF
 
On Tue, 15 Apr 2008 17:25:37 -0500, John Fields
<jfields@austininstruments.com> wrote:

[snip]
P = I^2*R = 10^2V * 1R = 100 watts
            ^^^^^
Oops... 10^2A

JF
 
Thank you so much! I haven't actually read all the posts yet, but
understanding that it is the voltage DROP clears things up for me. You
guys are so helpful! I'm pretty new to electronics/electricity, but
I'm starting to understand it much better. I'll probably have more
questions in the future.
 
Also, my problem was that I KNEW the equations work, and I KNEW you
couldn't change voltage and current without changing the resistance (I
had worked that out with the equations), but now I get it!
 
On Mon, 14 Apr 2008 16:55:17 -0700, chesemonkyloma wrote:

[original question snipped]
Ohms don't enter into it for this calculation, except for the resistance
of the transmission wires themselves.

Say you've got 1000 feet of AWG 10, resistance 0.9989 ohms/1000 ft,
which is close enough to 1 ohm for this discussion
(source: http://www.thelearningpit.com/elec/tools/tables/Wire_table.htm).

So, with a 10 volt, 10 A supply, the one ohm of wire will drop the whole
10 volts, leaving nothing for the load (actually, you'd get a voltage
divider consisting of the wire resistance and the actual resistance of
the load, but let's set that aside for now.)

With a 20 volt, 5A supply, the line drop is only 5 volts, leaving 15 volts
for the load. With a 100V, 1A supply, the line drop is 1V, leaving 99V
for the load, and so on.
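
The same comparison as a quick sketch, using the 1 ohm of line from the
wire table figure above:

# 1 ohm of line (about 1000 ft of AWG 10), same power at rising voltage.
R_LINE = 1.0
for volts, amps in ((10, 10), (20, 5), (100, 1)):
    drop = amps * R_LINE              # IR drop along the line
    print(f"{volts:>3} V / {amps:>2} A: {drop} V dropped, {volts - drop} V left for the load")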

The load itself has a resistance, which is, indeed, R = E/I; if you
apply Ohm's law to the supply, it's referred to as "impedance", but
that's a whole nother topic as well.

The point is, the higher-voltage, lower-current supply incurs a smaller
IR drop (E = IR), and therefore lower I^2*R losses, in the lines themselves.

Hope This Helps!
Rich
 
