Commander Kinsey
Guest
Why do cheap PC ATX power supplies (expensive ones may be better) need current drawn from the 5V line to make the 12V line work correctly?
I have a PC with 3 graphics cards running scientific applications. I acquired three old graphics cards that take about 300W each, and have loads of cheap (CIT) PSUs that are rated at 650W on the 12V line, which is what those cards use. So I run each card off its own supply. But the 12V line at no load, or even at 300W, is only giving out 10 to 10.5V. If I attach a small dummy load of an amp or so to the 5V line, the 12V line suddenly becomes 12V.
Why are the two lines related in any way?
Sorry for the crosspost, I'm not sure which of these groups are active.
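(For anyone trying the same workaround: sizing that dummy load is just Ohm's law. A minimal sketch, assuming the ~1 A draw on the 5V rail mentioned above; the function name is mine, not anything standard.)

```python
# Sizing a resistive dummy load for the 5V rail.
# Assumption: roughly 1 A drawn from the 5V line, as described in the post.
def dummy_load(voltage, current):
    """Return (resistance_ohms, power_watts) for a resistive dummy load."""
    resistance = voltage / current  # R = V / I
    power = voltage * current       # P = V * I, heat the resistor must dissipate
    return resistance, power

r, p = dummy_load(5.0, 1.0)
print(f"Use a {r:.0f} ohm resistor rated comfortably above {p:.0f} W")
```

So a 5 ohm resistor dissipating 5 W continuously; a 10 W (or larger, heatsinked) part would run cooler.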