DVI vs Analog Monitor


RPhox

Thread Starter
Joined
Apr 21, 2001
Messages
140
I just bought a new PC with a flat digital monitor and a GeForce 5200 card. The guy at the store said I should connect both the regular VGA cable and the DVI cable (although I don't see why).

Here's the problem:

When I go into the Nvidia interface, it seems that there are more settings for the analog monitor than for the digital one. For instance, the analog settings allow for greater resolution. When I set the digital output to a resolution of 1280x1024, the screen has flickering spots. And the refresh rate on the digital monitor is fixed at 60 Hz, while the analog allows a choice of 60, 70, 72, or 75.

My question is: did I waste money on the more expensive DVI? Is there a noticeable difference between the two? Am I missing something in the setup? Why do the analog settings offer more options?

Thanks,

RFox
Pentium 4
Windows XP Home
512 MB RAM
GeForce 5200
AOC Flat DVI Monitor
 

JohnWill

Retired Moderator
Joined
Oct 19, 2002
Messages
106,425
The guy at the store is wrong. Connect ONE interface; the monitor doesn't need both connected.

For either interface, the normal refresh rate is 60 Hz on the panels I've seen, including the one I'm using here. There is no refresh flicker like a CRT, so the faster refresh rates really have no meaning. Truthfully, I see no real difference between the VGA and DVI connections; I'm using the DVI because I have it.
 
Joined
Aug 2, 2004
Messages
3,370
But John, everything digital is better... I saw it in an ad! :D

(and I like your new "Experience" blurb!!!)
 
Joined
Aug 17, 2003
Messages
17,584
I use one monitor on each here, and I can confirm that the difference with DVI was "just noticeable," since the extra A/D and D/A conversions are eliminated.
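
For anyone curious what those conversions can cost, here is a rough toy model (my own illustration with an assumed noise figure, not anything measured on this hardware): the VGA path converts each 8-bit pixel value to a voltage, picks up a little noise on the way, and the monitor's ADC quantizes it back, while the DVI path passes the digital value through untouched.

```python
# Toy model of the extra D/A + A/D step on a VGA connection (noise level is assumed).
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=100_000)         # 8-bit pixel values from the GPU

# VGA: DAC to the nominal 0.7 V video swing, a little cable noise, then the LCD's ADC.
volts = pixels / 255 * 0.7                           # graphics card DAC
volts = volts + rng.normal(0, 0.002, volts.shape)    # hypothetical noise pickup
recovered = np.clip(np.round(volts / 0.7 * 255), 0, 255)

print("VGA round trip, pixel values changed:",
      np.count_nonzero(recovered != pixels), "out of", pixels.size)
print("DVI path, pixel values changed: 0 (the value stays digital end to end)")
```

With a quiet cable the errors are a single step of gray here and there, which lines up with the difference being "just noticeable" rather than dramatic.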
 
Joined
Jul 10, 2002
Messages
326
I can only tell you of my experience with DVI vs. component (close to VGA) on my high-def TV. Quite simply, there is no logic to it. On some connections there is no difference, on others a big difference. Sometimes DVI is better, other times component. I always thought DVI should be as good or better, but when they finally activated the DVI on my cable box, I quickly hooked up the TV only to find the black levels are worse on DVI; the picture appeared washed out.

The bottom line is: sometimes digital is better, other times worse.

I agree with JohnWill: only connect one, VGA or DVI, not both.

Finally, an LCD is not like a CRT; you don't need high refresh rates, 60 Hz is usually fine.

Steve L
 

RPhox

Thread Starter
Joined
Apr 21, 2001
Messages
140
Thanks.

But I still don't understand why I can set the monitor to a greater resolution when it is defined as analog.
 

JohnWill

Retired Moderator
Joined
Oct 19, 2002
Messages
106,425
It's real simple: you can set a greater resolution, but when you look at the actual result, the picture quality sucks! The native resolution of your LCD panel is the only resolution that will give you really good quality video. It's the nature of the beast, since the pixel size and location are fixed for LCD panels.
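
To make the fixed-grid point concrete, here is a small sketch (it assumes the AOC panel's native grid is 1280x1024, which the thread suggests but never states outright): it checks which video modes map onto that grid by a whole number of physical pixels and which ones force the panel to interpolate.

```python
# Which modes land cleanly on an assumed 1280x1024 native LCD grid?
from fractions import Fraction

NATIVE_W, NATIVE_H = 1280, 1024    # assumed native resolution of the panel

for w, h in [(1280, 1024), (1024, 768), (800, 600), (640, 512)]:
    sx = Fraction(NATIVE_W, w)     # horizontal stretch the panel must apply
    sy = Fraction(NATIVE_H, h)     # vertical stretch the panel must apply
    clean = sx.denominator == 1 and sy.denominator == 1
    print(f"{w}x{h}: scale {float(sx):.3f} x {float(sy):.3f} -> "
          f"{'whole pixels' if clean else 'fractional, must interpolate'}")
```

Only the native mode (and exact integer divisions of it) avoids interpolation, which is why everything else ends up soft or blocky.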
 
Joined
Aug 2, 2004
Messages
3,370
John,

I always wondered HOW they accomplish a resolution that calls for a pixel size smaller than the native physical pixels?

Do you actually know?

Virtual pixels?
 
Joined
Aug 17, 2003
Messages
17,584
That is why anything that does not exactly scale to a physical pixel resolution looks terrible on an LCD.

Also, it is why an LCD is so much better when it is used at the correct resolution: a pixel is exactly a pixel, unlike a CRT, where it is only about as accurate as the beam can turn on and off.
 

JohnWill

Retired Moderator
Joined
Oct 19, 2002
Messages
106,425
winbob, the short answer is: they don't. They just do pretty much the same thing that would happen if you took a fixed-resolution picture and increased the resolution in a graphics application. The result is not nearly as good as the original, and the same thing happens on an LCD screen. :)
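
As a rough way to see that analogy in code (a synthetic test pattern and an assumed 1280x1024 native panel, not a claim about this particular monitor), the sketch below pretends the PC renders a 1024x768 frame, lets the panel stretch it back onto its fixed grid, and measures how far the result drifts from a frame drawn natively.

```python
# Simulate running an assumed 1280x1024 LCD at a non-native 1024x768 mode.
from PIL import Image
import numpy as np

NATIVE = (1280, 1024)   # assumed native resolution (width, height)
MODE   = (1024, 768)    # non-native mode the PC is set to

# Worst-case sharp detail: a one-pixel checkerboard rendered at the native size.
yy, xx = np.indices((NATIVE[1], NATIVE[0]))
original = Image.fromarray(((xx + yy) % 2 * 255).astype(np.uint8))

# The GPU renders at the lower mode; the panel then stretches it to its own grid.
shown = original.resize(MODE, Image.BILINEAR).resize(NATIVE, Image.BILINEAR)

diff = np.abs(np.asarray(original, dtype=int) - np.asarray(shown, dtype=int))
print(f"Average per-pixel error after the round trip: {diff.mean():.1f} / 255")
```

A one-pixel checkerboard is an extreme case, but fine text and sharp edges suffer exactly this kind of smearing, which is the blur you see when an LCD runs off its native resolution.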
 

RPhox

Thread Starter
Joined
Apr 21, 2001
Messages
140
Methinks you guys went off on a tangent...

I still don't understand why, if I set the monitor as digital, the best resolution I can get is 1024x768. Anything over that causes strange color pixels all over the place. But as analog, I can set it to the max of 1280x1024 without any problems.

After all, isn't digital supposed to be better?

I'm starting to feel like I wasted the $$ on the DVI.
 