How should two dissimilar monitors be hooked up to two dissimilar video cards from an overall performance point of view?
Here's my specific setup (but a referral to a general article on the subject would be fine too):
Monitor 1: 15" LCD Flat Panel
Monitor 2: 19" CRT
Video Card 1: GeForce 3 AGP with 64 MB DDR
Video Card 2: ATI Radeon PCI with 32 MB SDR
Motherboard supports designating which monitor is primary.
Currently I'm running the lower-performance card (card 2) as the primary, both for boot purposes and within Windows XP (yes, that's my OS). Card 2 is presently hooked up to the 15" LCD flat panel.
I've set it up in several different ways, and it doesn't seem to make much difference; it works however it's configured (thank God). I'm just curious whether there's a preferred way to do this, and why.
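For what it's worth, here's a minimal sketch (not part of my setup description, just an assumption that Python with ctypes is available) of how to ask Windows which adapter it actually treats as primary, via the Win32 EnumDisplayDevices call. It prints each display adapter along with its attached/primary state flags:

```python
import ctypes
from ctypes import wintypes

# State flags from wingdi.h
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE      = 0x00000004

class DISPLAY_DEVICEW(ctypes.Structure):
    """Mirrors the Win32 DISPLAY_DEVICEW structure."""
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32

def list_adapters():
    """Enumerate display adapters and report which one is primary."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)  # must be set before the call
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break  # no more adapters
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print(f"{dev.DeviceName}: {dev.DeviceString} "
              f"(attached={attached}, primary={primary})")
        i += 1

if __name__ == "__main__":
    list_adapters()
```

Running it on a dual-card box like mine should show both the GeForce and the Radeon, with the primary flag set on whichever one the BIOS/Windows has designated, which makes it easy to confirm that the "primary" setting actually took.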