Daniel
Guest
Hi, I have recently been trying to get 1.5 volts (or anything other than the supply's rated 3 volts, for that matter) out of a 3 volt wall-wart power supply. The wall-wart's output is 3 volts DC. Even when I connect a 300 ohm resistor in series between the wall-wart and the load (a 1.5 volt clock, old-style analog) and then measure the voltage with the voltmeter in place of the load (but remember, the resistor is still in series), I keep reading 3 volts DC. I even put a 20 ohm resistor in parallel with the 300 ohm resistor, and still no change in the measurement.

I also bought an older-style wall-wart with a 1.5 volt setting, and it measures 3 volts with the voltmeter across the output leads. I can't get 1.5 volts (or anything other than 3 volts, for that matter) no matter what I do. And when I measure that older wall-wart at its output leads with it set to 12 volts, etc., the measured voltage is very high (19 volts on the 12 volt setting).

Now, I know that without a load certain power sources don't measure the same as with a load. But when I measure the voltage across where the battery would normally go, it reads 3 volts! (I wish I hadn't had to do that: the measured voltage was 3 instead of the 1.5 the clock is rated for, so for all I know I did the clock some small damage.) So I had to have been taxing the clock, running it at 3 volts instead of 1.5!

Anyway, can someone shed some light on this? My electronics skills are at minimum above average, and this problem has been driving me bonkers. The math says that dropping 3 volts to 1.5 while allowing a current of 0.08 amps requires 18.75 ohms (R = 1.5 V / 0.08 A = 18.75 ohms), and I provided exactly that, as described above (300 ohms in parallel with 20 ohms equals 18.75 ohms), yet it still measures 3 volts DC! Any help appreciated before I tear my hair out... Thanks, I think!
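P.S. Here is my arithmetic written out as a quick Python check. The 0.08 amp clock current is the figure I used above, and the roughly 10 megohm meter input impedance is just a typical value I'm assuming; I haven't verified what my own meter actually is.

    # Quick sanity check of my resistor math (plain Python).

    V_SUPPLY = 3.0    # wall-wart output, volts
    V_CLOCK  = 1.5    # voltage the clock is rated for, volts
    I_CLOCK  = 0.08   # clock current in amps (the figure I used above)

    # Ohm's law: series resistance needed to drop 1.5 V at 0.08 A.
    r_needed = (V_SUPPLY - V_CLOCK) / I_CLOCK
    print("Required series resistance:", r_needed, "ohms")   # 18.75

    # My 300 ohm and 20 ohm resistors in parallel.
    r_parallel = (300.0 * 20.0) / (300.0 + 20.0)
    print("300 || 20 =", r_parallel, "ohms")                 # 18.75

    # Voltage divider: what the meter should read when it replaces the
    # clock, with the 18.75 ohm resistance still in series.
    R_METER = 10e6    # assumed ~10 megohm meter input impedance
    v_meter = V_SUPPLY * R_METER / (R_METER + r_parallel)
    print("Meter reading in place of load: %.5f V" % v_meter)  # ~3.0 V

If that divider math is right, then the 3 volt reading with the meter in place of the clock is what the numbers predict, since the resistor only drops voltage when real load current is flowing. Is that what's going on here?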