How does the GT 520 auto-detect the connected device?
It has 3 connectors: VGA, HDMI and DVI.
What I want to do is:
1. Always use VGA for the monitor.
2. Use HDMI only for the audio connection to the receiver.
The problem with this setup is that the GT 520 believes the receiver to be a monitor and switches to it automatically on boot, with different/weird results: either no screen or a bad screen.
Only pulling out the HDMI cable and rebooting will restore the screen to VGA.
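In case it helps anyone reproduce this: a quick way to see which outputs the driver currently reports, and which one it has made primary, is to enumerate the display devices. This is just a diagnostic sketch, assuming a Windows system (the driver installer mentioned below is the Windows one); it uses only plain WinAPI calls, and the device names/strings will differ per machine.

```c
/* Diagnostic sketch (assumption: Windows, plain WinAPI): list every
 * display output the driver exposes, so you can see whether the
 * HDMI-attached receiver is being reported as an active monitor.
 * Build e.g. with: gcc list_displays.c -o list_displays -luser32 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd = { .cb = sizeof(dd) };

    for (DWORD i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); i++) {
        printf("%lu: %s (%s)%s%s\n", i, dd.DeviceName, dd.DeviceString,
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? " [active]" : "",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? " [primary]" : "");
        dd.cb = sizeof(dd); /* reset the struct size before the next call */
    }
    return 0;
}
```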
I have not yet tried the following solution:
1. Use DVI for the monitor.
2. Use HDMI for the receiver.
What would happen in this scenario?
I could not perform this experiment because I didn't think of it at the time, and the NVIDIA driver installation failed, complaining after the reboot that some wizard was already running, which was kind of weird.
I had installed an older driver over a newer driver (the GT 1030 one).
I have now replaced the GT 520 with the GT 1030 and will attempt to re-install the latest GT 1030 driver.
Hopefully this time it will work.
I would still like to know how the GT 520 does its auto-detection, and whether it's somehow possible to force it to always use VGA or always use DVI... instead of having to resort to physical solutions like yanking/plugging high-powered HDMI cables and rebooting.
For now I assume this forcing/manual selection of the display device is not possible, but please enlighten me if I am wrong.
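If it does turn out to be possible from software, here is a minimal sketch of what forcing it could look like, again assuming Windows, and assuming the VGA monitor enumerates as \\.\DISPLAY1 (that name is a guess; check with the enumeration sketch above first). It pins that output as the primary display using only documented WinAPI calls.

```c
/* Sketch (assumptions: Windows; the VGA monitor is \\.\DISPLAY1,
 * which must be verified first): force that output to be the primary
 * display and persist the choice in the registry. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm = { .dmSize = sizeof(dm) };

    /* Read the output's current mode so only its role changes. */
    if (!EnumDisplaySettingsA("\\\\.\\DISPLAY1", ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "could not read current mode\n");
        return 1;
    }

    /* The primary display must sit at the desktop origin (0,0). */
    dm.dmPosition.x = 0;
    dm.dmPosition.y = 0;
    dm.dmFields |= DM_POSITION;

    LONG rc = ChangeDisplaySettingsExA("\\\\.\\DISPLAY1", &dm, NULL,
                                       CDS_SET_PRIMARY | CDS_UPDATEREGISTRY, NULL);
    printf("ChangeDisplaySettingsExA returned %ld\n", rc);
    return rc == DISP_CHANGE_SUCCESSFUL ? 0 : 1;
}
```

Note that this could only take effect once Windows is up; the boot display that shows the BIOS/POST screen is picked by the card's VBIOS on its own, so it would not stop the no-screen/bad-screen behavior during boot itself.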
(The system uses a Socket 939 WinFast motherboard which unfortunately only supports one graphics card in normal mode... I would have liked to use both cards for experimentation purposes and maybe even some CUDA work sometime in the future... perhaps it's the built-in SLI card-link thing that causes this, not sure.)
Bye,
Skybuck.
(Posted this on the NVIDIA forum too; perhaps this posting is clearer, shorter, and easier to find with Google.)