Compatible Support Forums
Vasco

DVI and TFT / Advantages?


I bought an 18" Samsung TFT. I am thinking of a new video card as well. The monitor supports DVI. Now my question: is there an advantage of using DVI on both sides (video card and monitor)? Is the picture better or the monitor faster? Or does it change simply nothing? If not I don't have to buy a new video card with DVI.

With an analogue (sub-D type) connector, the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.

 

With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.

 

Most video cards produced in the last year or so come with DVI outputs, but it's safer to check before you buy.

 

Rgds

AndyF

I have bought an HP 15" TFT as well, and I'm using the analog VGA connector on my GF2MX. I would like to upgrade, though, and I was looking at the ATI Radeon 8500 (LE). The problem with Nvidia cards is that they support DVI only on their 128MB models (GF3&4), which pushes them out of my price range. The Radeon has one of the best, if not the best, display qualities in both 2D and 3D. I am seriously thinking of ditching the Nvidia train and catching the ATI bandwagon :)

 

On the other hand, I will ditch the AMD wagon and jump on the Intel train :)

Quote:

The problem with Nvidia cards is that they support DVI only on their 128MB models (GF3&4), which pushes them out of my price range.

That's odd. My 64MB Hercules GF2U has a DVI port, and I'm fairly sure my Leadtek GF3 (again 64MB) has one, but I'm not 100% certain on that.

My GF2 Ultra (Gainward) certainly had a DVI output on it.

My GF3 Ti200 (Hercules) 64MB certainly doesn't. I think it's down to the individual cards rather than a blanket "no Nvidia 64MB cards have DVI output" rule.

Yeah, OK, I did more research... the 64MB GF3 range does not have DVI, nor do the 64MB GF4 Tis. The 128MB models (both GF3 and GF4) do have DVI, and probably the GF2 Ultras as well.

Not to be awkward :) but my 64MB Creative GF3 Ti200 has a DVI port - they're under £200 sterling now, I think.

Thanks for the input. I think I will go for a GF4 with DVI then. About ATI's Radeon: they have good hardware, but as far as I know their drivers are not the best and can cause some problems. For image quality I would have to buy a Matrox, but then I could forget about UT2003 ;)

Quote:
but as far as I know their drivers are not the best and can cause some problems


I have an ATI 8500 and use it for gaming and dual-monitor support. As far as I am concerned, ATI drivers are great, and they have been for about six months or so now. I was totally pro-Nvidia, then, well... I worked at a PC repair shop, and I have seen a LOT of Nvidia-based computers with graphics problems. I am so happy with my 8500, and at $274 CDN I am saving a lot over the $400+ for a GF3 or GF4 (non-MX).

Well Vasco, you could wait until Tuesday the 14th, when Matrox announce their new card.

If the rumours are true, it's going to be an amazing piece of kit, with the usual Matrox image quality, and available to market soon after the announcement.

You've made me curious about the new Matrox card, Blade. I will wait and see what the magazines etc. say about it... ;)

Quote:

With an analogue (sub-D type) connector, the red, green and blue channels are combined and transmitted as a composite analogue signal which then has to be split back by the monitor into the individual R, G & B signals. This conversion process causes a degradation in the signal; analogue signals are also subject to noise and interference, which causes further degradation.

With DVI, the monitor signal is digital, and is split into distinct red, blue and green channels. This means that there is no degradation of the signal quality, which ultimately means that the monitor has a clearer picture - in practice, less blurry, with a more distinct boundary between areas of different colour.

Most video cards produced in the last year or so come with DVI outputs, but it's safer to check before you buy.

Rgds
AndyF


I'm fairly certain analogue VGA uses discrete red, green and blue signals...

I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal back to digital (after it's already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...
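
Just to put a rough picture on that double conversion (this is only my own back-of-the-envelope sketch, with made-up noise and voltage figures, not measurements from any real card or panel), here's a bit of Python simulating one scanline sent over the two paths:

Code:

import random

def analogue_path(pixels, noise_mv=5.0):
    # VGA-style path: the card's RAMDAC turns each 8-bit value into a 0..0.7 V
    # level, the cable adds a little noise, and the panel's ADC samples it back.
    received = []
    for p in pixels:
        volts = p / 255 * 0.7
        volts += random.gauss(0, noise_mv / 1000)   # assumed cable noise
        volts = max(0.0, min(0.7, volts))
        received.append(round(volts / 0.7 * 255))
    return received

def digital_path(pixels):
    # DVI-style path: the 8-bit values travel as numbers and arrive unchanged.
    return list(pixels)

scanline = [random.randrange(256) for _ in range(1280)]

analogue = analogue_path(scanline)
digital = digital_path(scanline)

print(sum(1 for a, b in zip(scanline, analogue) if a != b), "of 1280 pixels changed (analogue)")
print(sum(1 for a, b in zip(scanline, digital) if a != b), "of 1280 pixels changed (DVI)")

With just a few millivolts of assumed noise, most values come back a level or two off on the analogue path, while the digital path delivers them untouched - which is roughly the blurring/fringing difference people notice on a TFT.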

Quote:

I'm also fairly certain that, since flat panels are by their very nature digital devices, having to convert the incoming analogue signal back to digital (after it's already been converted from digital to analogue by the RAMDAC in the video card) is the source of much of the signal degradation...

That's why it is much better to get a video card and flat panel with DVI connectors - then it stays digital the whole way...
