Mr.Guvernment

Buying a new video card and.......


Right, first back to the driver support.

I shall start by not talking about the Marvel; I shall move on to that a little later.

Let's first post a link to the page containing Matrox's latest driver releases:

http://www.matrox.com/mga/support/drivers/latest/home.cfm

 

Let's take a look under the WinXP column and see which of Matrox's cards have had specific WinXP drivers released for them.

G550, G450 eTV, G450, G400, G400-TV, G400MAX, G200 MMS, G200 with TV Tuner, G200, Mystique G200, MGA G200, Productiva G100 Range, Millennium II, Mystique 220, Mystique & Millennium.

We are talking about a lot of old cards here having specific WinXP support; some of these cards are ancient. The Mystique, for example, was originally a 2MB PCI card.

Don't you think that the above link kind of puts to shame the following statement:

"For even the 450 and so on Matrox has officially said that those cards will never be supported by Windows XP"?

 

Now, the Marvel G400 was a different animal; Matrox admitted really early on that it was unlikely they would be able to get full driver support or video tools sorted out for this card.

Why?

I really do not know; I am not a driver writer and I don't work for Matrox, but there has to be a reason behind it.

Companies do not just decide to discontinue support of a device for no reason unless the card is coming towards the end of its life, something the Marvel G400 isn't doing.

Maybe that question would be best aimed at Matrox directly.

However, if I remember correctly, Matrox did offer a very generous rebate on the purchase of a Marvel G450 for Marvel G400 owners; I cannot remember the full amount, but it wasn't too bad at all.

 

A big reason I do not like Matrox is that they design the driver interface in Windows very thoughtlessly

 

It still amazes me when somebody clicks "No" when closing a Word document after being asked "Do you wish to save changes?" and then moans because they did want to save them.

It still amazes me when people find their PCs riddled with viruses because they love to open e-mails from an unknown source.

The point I'm trying to get to here is that there really is no accounting for how people will cope with things on a PC.

I personally have never spoken to a Matrox owner who has found the dual-head software difficult and awkward to use.

Sure, initially it can look confusing, as there are just so many more options available to a Matrox owner than there are to an ATI or NVidia one.

Maybe Matrox should release a "simple version" driver for their dual-head cards for those who find the current ones a little bit difficult.

 

I am absolutely shocked that you said getting a good monitor has nothing to do with image clarity...

 

I totally stand by this statement.

In fact it has totally shocked me to think that anybody feels that the monitor is what makes all the difference with respect to image quality.

Actually, the more I think about this, the more my head starts to hurt. The monitor making all the difference?

So why don't all these professional workstations have 8MB ATI cards costing about £25 each, paired with really expensive monitors?

Why? Because the thought that the monitor is what makes 99.9% of the difference is just crazy!!

Now, you seem to be questioning my testing of this, which is fair enough, your call; we live in a world where proof is always required.

At work I'm more or less exclusively using Iiyama monitors.

The Iiyama units offer great overall build quality, their support is excellent and, touch wood, I've hardly seen any problem units.

These range from 17" basic units, through the Pro models, then move on to 19" in both standard and Pro, and finish off with 22" units.

 

The cards that I've used to drive these monitors have ranged from onboard Intel, through cheap ATIs, then Matrox G400s, G450s & G550s.

Also NVidia GF2s, GF3s, one that originally had a GF4, and CAD stations with Quadro Pros.

I was first alerted to how poor the NVidia image quality is when some users were upgraded to new PCs.

They had P2 systems with G400s in, and they were upgraded to P3s with GF2 cards in.

A few support calls came in to me about how bland the screen was looking: slight flicker in the corners of the display, fuzzy lines and the like.

I went to see the displays and, to be honest, couldn't see any major problems.

To please the users we set things up in the labs: we put PCs next to each other, one system with a G400 in and another with a GF2, each connected to an identical-model 22" Iiyama monitor.

Once the two were next to each other there was no comparison; it was like night and day, like black and white.

The Matrox-based system was giving a clear, sharp image from corner to corner.

All graphics were sharp, nothing was blurred, no fuzzy lines; as near perfect as could be.

A quick swap of the two monitors showed that neither monitor was faulty at all; it was purely the GF2's lack of decent filters.

 

The issue was further shown with the purchase of a whole batch of PCs containing GF3 cards.

The non-Matrox users who were upgraded were happy; the Matrox users demanded that I take a look at things myself.

End result: we sent all the GF3s back and had them replaced with G550s. Users are now happy to look at their screens all day.

 

I am not in any way saying that the monitor itself does not make a difference; saying that would be as crazy as saying that the monitor makes up 99.9% of the image quality.

However, what I am saying is that in a monitor and graphics card combination, the card accounts for at least 50% of the overall quality.

Just do a search on your favourite search engine for "Image Quality" or "2D Image Quality" and things like that; you will get thousands of pages listed that will all tell you that Matrox are still unbeatable with respect to these issues.

The truth is, as you increase resolution on non-Matrox cards the image quality becomes noticeably worse; this is not the case on Matrox cards.

If you want to use 800x600 or even 1024x768 as the basis for this argument, then true, the differences are not as noticeable; but once we hit 1280x1024 and above, the Matrox, even one many years old, is untouchable.

 

Please list the features that the Matrox cards have SPECIFICALLY and the PRICE

 

Not sure what to tell you here; the G1000/Parhelia is not going to be announced until next week, and as Matrox like to keep things secret until the actual announcement I can give you neither specifications nor price.

Taking the G550: we have 32MB of DDR RAM, a 360MHz RAMDAC... and one of these cards would set you back about £89.
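For anyone wondering what a 360MHz RAMDAC actually buys you, here is a rough back-of-envelope sketch in Python. The ~1.32 blanking overhead factor is a typical CRT timing figure I'm assuming for illustration, not a Matrox spec:

[code]
# Rough check of which display modes a given RAMDAC can drive.
# Required pixel clock ~= h_res * v_res * refresh * blanking_overhead,
# where the overhead factor (~1.3 for CRTs) accounts for the horizontal
# and vertical blanking intervals. Figures are illustrative assumptions.

RAMDAC_MHZ = 360
BLANKING_OVERHEAD = 1.32  # typical CRT timing overhead (assumption)

def pixel_clock_mhz(h_res, v_res, refresh_hz):
    """Approximate pixel clock in MHz needed for a given mode."""
    return h_res * v_res * refresh_hz * BLANKING_OVERHEAD / 1e6

for h, v, hz in [(1280, 1024, 85), (1600, 1200, 85), (2048, 1536, 75)]:
    clk = pixel_clock_mhz(h, v, hz)
    verdict = "fits" if clk <= RAMDAC_MHZ else "exceeds"
    print(f"{h}x{v}@{hz}Hz needs ~{clk:.0f}MHz -> {verdict} {RAMDAC_MHZ}MHz")
[/code]

Even 2048x1536@75Hz comes in at roughly 310MHz under these assumptions, so the RAMDAC has headroom at exactly the high resolutions where the image quality argument matters.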

However, the point here is that £89 for the world's best image quality output is what I and a lot of computer people are willing to pay.

 

It would appear that the bottom line here is that you are a pissed-off Marvel G400 owner.

That is the whole basis for your thinking Matrox are a bunch of wankers: you bought a card that was quite capable of running under all currently available OSs yet wasn't able to run under a then-unannounced one.

At least the Marvel G400 can still be used as a basic graphics card under WinXP, something the ATI Rage Fury MAXX users were not able to do under Win2k.

 

You have obviously not done image quality tests, either professionally or with the naked eye; if you had, then you would be agreeing with 99% of all the professional testing sites / tech web sites, which will all tell you that the Matrox image quality is still unbeatable.


OK BladeRunner,

 

Good post. Good comments and a nice comeback. ;)

-------------------------------------------------------------------

Now, the Marvel G400 was a different animal; Matrox admitted really early on that it was unlikely they would be able to get full driver support or video tools sorted out for this card.

Why?

I really do not know; I am not a driver writer and I don't work for Matrox, but there has to be a reason behind it.

Companies do not just decide to discontinue support of a device for no reason unless the card is coming towards the end of its life, something the Marvel G400 isn't doing.

Maybe that question would be best aimed at Matrox directly.

However, if I remember correctly, Matrox did offer a very generous rebate on the purchase of a Marvel G450 for Marvel G400 owners; I cannot remember the full amount, but it wasn't too bad at all.

---------------------------------------------------------------------

 

This is what I am talking about ^^... The vid capture and all that jazz: the "Marvel" part of the Marvel G400 does not work.

 

^Not true either. There is a workaround, however: you can buy 3rd-party programs... So I am not really pissed off, just not impressed at all. They could easily fix it; it's a software thing, not hardware.

Next.------------------------Price-------------------------------

 

PART A)

 

My Card

 

http://www.matrox.com/mga/products/pricing/home.cfm

Please note that the Marvel G400 is still their top product and also the most expensive one to this day.

PART B)

 

http://www.matrox.com/mga/products/marv_g450_etv/home.cfm

- Matrox G450-eTV chip

- 32MB (DDR)

- AGP 4x

- "High-quality" DVD and video playback

- TV tuner with Personal Video Recorder

- capture video at 320x240 VCD resolution ONLY

- S-video and composite video input

- Timeshifting with picture-in-picture

- DX7

- UltraSharp 360MHz RAMDAC

- DVI output

- TV output

- for games (roughly equal to a GeForce1 MX)

PRICE ===============================>> 230

ATI RADEON 8500 DV

http://www.ati.com/products/pc/aiwradeon8500dv/index.html

 

- it has everything that the Matrox has above, and then some

- stereo TV tuner

- remote!!!

- FireWire ports

- time shifting

- integrated interactive program guide

- 64 to 128MB DDR

- HYDRAVISION (dual head)

- "highest quality" DVD playback in the "industry"!!

- capture digital video at 720x480 @ 30fps (DVD quality)!!!!!!!!!!!!!!! (raw data rates compared in the sketch below)

 

Awesome 3D gaming and graphics

Powered by the revolutionary RADEON™ 8500 GPU and 64MB DDR memory for the most advanced 3D graphics in its class

ATI's innovative TRUFORM™, SMARTSHADER™ and SMOOTHVISION™ technologies make 3D characters and objects more realistic

ATI's HYPER Z™ II technology conserves memory bandwidth for improved performance in demanding applications

ATI's latest 3D rendering technologies, CHARISMA ENGINE™ II and PIXEL TAPESTRY™ II, power incredible 3D processing capabilities leading to unbelievable graphics quality

(in short, equal to a GeForce3 Ti 500)

 

and more!!!

PRICE================================>>>$180
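Side note on those two capture specs: the gap between 320x240 VCD capture and 720x480 DVD-resolution capture is bigger than it looks. Here's a quick Python sketch of the raw (uncompressed) data rates, assuming 4:2:2 YUV at 2 bytes per pixel and 30fps for both (purely illustrative, since real capture pipelines compress):

[code]
# Raw data-rate comparison for the two capture modes listed above.
# Assumes 4:2:2 YUV at 2 bytes/pixel and 30fps for both modes; these
# are illustrative assumptions, real capture hardware compresses.

BYTES_PER_PIXEL = 2  # 4:2:2 YUV (assumption)

def raw_mb_per_sec(width, height, fps=30):
    return width * height * BYTES_PER_PIXEL * fps / 1e6

vcd = raw_mb_per_sec(320, 240)  # the G450 eTV's stated limit
dvd = raw_mb_per_sec(720, 480)  # the Radeon 8500 DV's stated mode

print(f"320x240@30fps: ~{vcd:.1f} MB/s raw")
print(f"720x480@30fps: ~{dvd:.1f} MB/s raw ({dvd / vcd:.1f}x the data)")
[/code]

That works out to roughly 4.6 MB/s versus 20.7 MB/s; a 4.5x difference in how much video data the card has to move per second of capture.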

 

 

 

 

hmmm.

 

 

I don't want to argue. I think we did a good job; people should be able to make a pretty good decision here based on our little battle. I guess Matrox cards could have a better image, but to be perfectly honest I personally can't see any difference. Some people are more sensitive than others, I guess, for these things.

Quote:
Originally posted by plato
Powered by the revolutionary RADEON™ 8500 GPU and 64MB DDR memory for the most advanced 3D graphics in its class
ATI's innovative TRUFORM™, SMARTSHADER™ and SMOOTHVISION™ technologies make 3D characters and objects more realistic
ATI's HYPER Z™ II technology conserves memory bandwidth for improved performance in demanding applications
ATI's latest 3D rendering technologies, CHARISMA ENGINE™ II and PIXEL TAPESTRY™ II, power incredible 3D processing capabilities leading to unbelievable graphics quality


More marketing babble than you can read in an evening. Incredible that people fall for it.

Any PC parts manufacturer's pages are stuffed with superlatives and SUPER MEGA™'s that do not really mean a thing. Independent testing is what you should follow when selecting parts.

As Dilbert said: "Marketing - two drinks minimum"

H.


^I agree

 

Sorry for that garbage; the truth is I got lazy and didn't want to do too much work.

:D :D ;)


Jeez... some excitement.

 

I don't know how you can possibly say that you cannot see the difference in 2D quality on a Matrox card vs almost any other card on the market.

 

If there wasn't any difference in quality, then why have so many magazines/websites/forums spoken so highly of Matrox cards?

 

Quote:
Matrox are good marketers and put on a good bullshit parade. They pay off the right people to make buzz, but they always disappoint and their products become substandard within a few months

 

OK... so let's assume Matrox is paying off some companies that own websites/magazines and whatnot. Heh... ain't no way that this is gonna fly. If I'm gonna go out and buy a card expecting to see visual quality in 2D applications, I sure as hell am gonna want to see the quality. And if my old ATI card is visually better, then I'm obviously gonna post this problem in a respectable forum like this one. Since I got interested in computers (when I was 13) I have known about Matrox cards being VERY impressive cards from the get-go.

 

Now... as far as you saying that the monitors make 99.9% of visual quality: you're fooling yourself, man. When I upgraded from an ATI Radeon 64MB DDR ViVo to an ATI Radeon 8500 64MB card, the improvement in visual quality was apparent to me. The monitor's whites were washed in Tide, and text was way sharper. Now, don't get me wrong, a monitor does play a large role in quality (tubes mostly). BUT, and don't argue with me because I know a lot about filters, filtering has a lot to do with the configuration of electronic parts. Some companies have ideas; some companies have differing ones. As far as filter quality goes, it's down to the math and to matching the components in a filter (depending on what TYPE of filter you are configuring) as to how well the filter works. If you want a filter to cut out certain frequencies, there is a math aspect to this as well as part quality.

 

As far as I am concerned, Matrox knows how to design great filters and implement them properly. Something as trivial as stray capacitance or minor phase differences can have a huge effect on something like detail on a CRT, especially with all the DAC and ADC conversion going on.
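To make the filter point concrete, here's a toy Python sketch of a first-order RC low-pass filter, whose cutoff is f_c = 1/(2πRC). The component values are illustrative assumptions, not from any actual card, but they show how a couple of picofarads of stray capacitance can noticeably shift the cutoff:

[code]
import math

# First-order RC low-pass: f_c = 1 / (2 * pi * R * C).
# Real video output filters are higher-order LC designs; this toy
# version just shows how sensitive the cutoff is to component values.
# All values are illustrative assumptions, not from any actual card.

def cutoff_hz(r_ohms, c_farads):
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

R = 75.0         # ohms; standard 75-ohm video line impedance
C = 6.8e-12      # farads; nominal filter capacitance (assumption)
STRAY = 2.0e-12  # farads of stray capacitance on the board

nominal = cutoff_hz(R, C)
drifted = cutoff_hz(R, C + STRAY)
print(f"nominal cutoff: {nominal / 1e6:.0f} MHz")
print(f"with stray C:   {drifted / 1e6:.0f} MHz "
      f"({100 * (1 - drifted / nominal):.0f}% lower)")
[/code]

With those numbers the cutoff drops from about 312MHz to about 241MHz, which is exactly the kind of thing that blurs a 1600x1200 desktop while leaving 800x600 looking fine.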

 

I have had the pleasure of working on a few workstations with Matrox cards in them, and they have performed admirably. I wasn't playing Q3 Arena at work, but I was using applications like Photoshop and the odd bit of AutoCAD. Even TextPad looked defined and sharp. Now, I was running this box on an old monitor and it looked all right; granted, anything looks better on a Sony 21", but I would buy a Matrox card if I had the money to build a real 'workstation' (for me, this constitutes no games and the like, just powerful 2D content creation applications).


OK

 

 

I am off to buy some orange juice and eggs. Also, I'll get a haircut. I'll consider your comments on the way.

 

:)


Check out the new Creative card

 

Anyways

 

I ended up getting the GF4 4600 Ultra; I could not find the ATI 128 All-In-Wonder ANYwhere!! And I only had a week to get it :(

 

 

The card is good so far; I will know for sure once I get it in my good computer back home.

 

As for ATI, I have used them all the time; this is my first GeForce card, and I've never used Matrox, same as my friend. He swears by them; we have never had any driver problems, they always worked.

ATI All-In-Wonder 128 Pro 32MB (4x AGP)

ATI All-In-Wonder 128 32MB (2x AGP)

Rage Fury MAXX 64MB (first 64MB card out)

TV Wonder

Radeon 7000 64MB x 2

Radeon 8500 64MB

Radeon 8500 DV (64MB)

ATI TV Tuner

 


My friend tends to buy new cards when they come out :); he already wants to grab the 128MB All-In-Wonder.

 

When I first got XP, my ATI All-In-Wonder Rage 128 did not support the TV tuner, but they do now, so all is good.

 

 

Anyways

 

 

I am more looking forward to Creative's new card, which no one has mentioned as of yet :)

 

Supposed to have like 20GB/sec memory bandwidth :)
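That 20GB/sec figure is easy to sanity-check, since peak memory bandwidth is just bus width times effective memory clock. A quick Python sketch (the clock numbers are my own ballpark assumptions for illustration, not confirmed specs):

[code]
# Peak memory bandwidth = (bus width in bytes) * (effective clock).
# Clock figures below are ballpark assumptions for illustration only.

def bandwidth_gb_per_s(bus_bits, effective_mhz):
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# A 256-bit DDR bus needs ~625MHz effective to hit 20GB/s:
print(bandwidth_gb_per_s(256, 625))  # ~20.0
# Compare a 128-bit bus at ~650MHz effective (GF4 Ti ballpark):
print(bandwidth_gb_per_s(128, 650))  # ~10.4
[/code]

So hitting 20GB/sec basically means a 256-bit memory interface, double the width of today's consumer cards.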

 

3Dlabs took the unusual step of pulling forward the announcement of its next generation graphics architecture, the P10 visual processing unit (VPU), a product we didn't expect to be talking about for another two weeks. While this architecture is going to find its way initially into the Oxygen line of workstation graphics cards from 3Dlabs, it also heralds some of what we can hope to see coming from Creative Labs this Christmas.

 

I have to say that there wasn't as much information to back up the P10 launch as you would normally expect with a major chip launch, and this is an announcement that is two years overdue from 3Dlabs, but like I said the announcement was extremely hurried. Nevertheless, it's interesting to see the direction 3Dlabs is taking and reflect on the influence of Longhorn on the P10's specs. The next killer app for 3D may be the operating system, which I think is a significant development. It also means that we can predict what to expect in next gen products from other graphics chip vendors, and do geek gossip over coffee.

 

Therefore, I surmise, 3Dlabs has done us all a big service rushing out their announcement. By making the P10 public the company is giving us a glimpse into the issues that are going to drive 3D graphics hardware architectures in the coming months for almost all the graphics industry, and it's damn good stuff.

 

Some of the key points of damn goodness came up at WinHEC 2002 this year, and the P10's feature list reflects the directions Microsoft was giving hardware developers at their conference. Things like:

 

Full programmability - While Nvidia and ATi have mature programmable graphics products on the market, it's worth noting that they still retain some level of support for the old fixed function pipeline with some form of integrated T&L circuitry. The next step for the graphics chip industry is to move to a fully programmable pipeline and to remove those pesky transistors for fixed function graphics. Graphics is going to need all the silicon real estate it can muster, but every chip will use those extra transistors differently.
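To illustrate that difference in plain terms, here's a toy Python sketch (a software analogy only, nothing like real GPU silicon): a fixed-function pipe hardwires one transform-and-light recipe, while a programmable pipe just runs whatever per-vertex program the application supplies:

[code]
# Toy software analogy for fixed-function vs programmable vertex
# processing. Purely illustrative; real hardware works very differently.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fixed_function_tnl(v, light_dir):
    # Fixed-function pipe: the one hardwired recipe every app gets
    # (simple diffuse lighting here; transform omitted for brevity).
    brightness = max(0.0, dot(v["normal"], light_dir))
    return {"pos": v["pos"], "color": [c * brightness for c in v["color"]]}

def programmable_pipe(vertices, vertex_shader):
    # Programmable pipe: the hardware just runs the app-supplied
    # program on every vertex, whatever that program computes.
    return [vertex_shader(v) for v in vertices]

verts = [{"pos": [0, 0, 1], "normal": [0, 0, 1], "color": [1.0, 0.5, 0.2]}]
print(programmable_pipe(verts, lambda v: fixed_function_tnl(v, [0, 0, 1])))
[/code]

The point of dropping the fixed-function transistors is that the second function is all you need; the old behaviour becomes just one more program you can run.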

 

Multi-tasking Graphics - Microsoft's next generation operating system, Longhorn, is pushing the industry to create graphics processors that will offload almost all of the typical functions of managing windowed displays. This means that every window on your desktop becomes a 3D texture, whether it is running a game, a digital video, or an Office application. The CPU has to handle all of Longhorn's open apps, videos, and games running in multiple windows, and Microsoft is working on determining how much graphics hardware it should ask for as a minimum to keep its OS humming. The graphics processor becomes a true partner processor for the CPU, but the question is, how low will Microsoft keep the bar on graphics performance and features? Will Microsoft open up the PC and graphics markets by demanding a significantly higher level of 3D graphics performance for base level Longhorn systems than what we are seeing today, or will it try and hedge its bets by staying a generation or two behind the curve?
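The "every window becomes a 3D texture" model is easy to picture with a toy sketch: a compositor renders each window's contents into a texture, then draws the desktop as a stack of textured quads. This is my own illustration of the idea, not anything from Microsoft:

[code]
# Toy sketch of a composited desktop: every window renders into its
# own texture, and the desktop is drawn as textured quads back-to-front.
# Illustration of the idea only; not based on any actual Longhorn code.

class Window:
    def __init__(self, title, z_order):
        self.title, self.z_order = title, z_order
        self.texture = None

    def render_contents(self):
        # In reality: the app draws into an offscreen surface on the GPU.
        self.texture = f"texture({self.title})"

def composite(windows):
    frame = []
    for w in sorted(windows, key=lambda w: w.z_order):
        w.render_contents()
        frame.append(w.texture)  # GPU draws a textured quad per window
    return frame

desktop = [Window("game", 2), Window("video", 1), Window("spreadsheet", 0)]
print(composite(desktop))  # drawn back-to-front by z-order
[/code]

Once windows are just textures, effects like scaling a live video window come for free, but it also means even a 2D desktop keeps the 3D pipeline busy.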

 

Bye bye VGA - We have to say bye bye to VGA, and the sooner the better. VGA is the last of the big legacy items remaining on the PC. It makes ISA look nimble and hip. With no VGA, graphics processors get to ditch the lowest common denominator.

 

Just in case you are unfamiliar with the nuances of the programmable 3D graphics pipeline, I suggest you give Tom's excellent review of the GeForce3's technologies a look:

 

High-Tech And Vertex Juggling - NVIDIA's New GeForce3 GPU

http://www.tomshardware.com/graphic/01q1/010227/index.html

 

The above article is a great place to get a good grounding on where the programmable 3D graphics pipe got its big start in the mainstream. And Tom does a good job of explaining terminology and how pixels flow through the pipeline. I could have cut and pasted the stuff, but I believe that's illegal.



I've read the now official specifications of the new Matrox card and all I can say is wow!

No wonder 3Dlabs were forced to bring the P10 announcement forward; at the current time that is the only card likely to rival the Matrox.

Matrox will have on the market a card every bit as good as the P10 will be, but a good few months ahead; there is a lot to be said for three-year product cycles over the six-month ones NVidia work on.

Matrox promised, and the specifications would appear to show they delivered too.

 

Apparently the new Matrox cards will be available en masse from June, and that is when they will be getting a huge wad of cash from me.

I cannot see myself buying the very top card; 256MB RAM would be impressive but not exactly needed, so I shall be looking at the 128MB version.

A card with those specifications will be good for a couple of years while the rest of the industry plays catch-up.

Triple-head display *drool*; seriously worth considering some cheap 17" monitors for that kind of setup.

Quote:
Triple-head display *drool*; seriously worth considering some cheap 17" monitors for that kind of setup.


I am with you on that one... dual is nice... but three... well... that's getting a little bit serious.


The specs and features look excellent :) but I wanna see some numbers to see how it compares to the competition :)

 

Just give it a few days for nVidia to take the delay loops outta the GeForce4 drivers though :D (honest).


It does sound good; however, it would need video capture as well for me to even consider it.


A couple of suggestions from a budding graphics programmer :)

 

I've been watching the Big Four compete with their different technologies for some time now. nVidia (who may also be using some technologies from 3dfx, whom they acquired in 2000) produce really fast cards with high clock speeds, and so on. ATI produces (from what I have seen) robust technologies which lack in speed and driver support (I've heard LOTS about their flaky drivers).

 

But there are two technologies that may yet overtake the above two in terms of superior graphics technology; the only problem is they are only now evolving into something that could eventually be "Big". And the reason they may pull this off is because they provide intelligent technologies.

 

Matrox's new Parhelia GPU is scary... lifelike graphics from a mediocre model, given a texture for color, a texture for bump, and a texture for polygon interpolation from the original. Take a look at http://matrox.com/mga/products/parhelia512/technology/disp_map.cfm; they have amazing demos there. The main reason why this card will rock? Less bandwidth usage for textures and polygon data (and less of a need for blazing-fast RAM when there's not as much data to push across the bus).
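The displacement mapping idea is simple enough to sketch in a few lines of Python. This is a toy 1D version of my own (real hardware tessellates triangles; this just shows the principle of interpolating new vertices from a coarse mesh and pushing them out by heights sampled from a texture):

[code]
# Toy 1D sketch of hardware displacement mapping: interpolate extra
# vertices between the coarse mesh's points, then offset each one by a
# height sampled from a displacement map. Illustration only.

def displace(coarse, disp_map, subdiv=4):
    fine = []
    for i in range(len(coarse) - 1):
        a, b = coarse[i], coarse[i + 1]
        for s in range(subdiv):
            t = s / subdiv
            base = a + t * (b - a)              # interpolated vertex
            u = (i + t) / (len(coarse) - 1)     # texture coordinate 0..1
            height = disp_map[int(u * (len(disp_map) - 1))]
            fine.append(base + height)          # displaced along "up"
    return fine

flat_mesh = [0.0, 0.0, 0.0]             # coarse, low-polygon base
height_map = [0.0, 0.3, 0.1, 0.4, 0.0]  # the displacement texture
print(displace(flat_mesh, height_map))
[/code]

The coarse mesh and a small texture are all that cross the bus; the detailed geometry is generated on the card, which is where the bandwidth saving comes from.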

 

And PowerVR produces probably the most intelligent cards out there; the problem is that their GPUs are not up to snuff in terms of capabilities and programmability. http://www.powervr.com dictates that the fastest card is the one that only renders what is seen... and of the four technologies listed here, they do the best job in terms of hidden surface removal.
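Here's a toy Python sketch of why "only render what is seen" wins, contrasting brute-force z-buffering (shade every fragment, keep the nearest) with PowerVR-style deferred shading (find the nearest fragment first, then shade only that one). Again, an illustration of the principle, not how the chips actually do it:

[code]
# Toy contrast: immediate-mode z-buffering shades every fragment and
# lets the depth test discard the losers; deferred hidden surface
# removal finds the visible fragment first and shades only that one.
# "shade" stands in for the expensive texturing/lighting work.

def immediate_mode(fragments, shade):
    shades_done, best = 0, None
    for depth, material in fragments:
        color = shade(material)  # work spent even on occluded fragments
        shades_done += 1
        if best is None or depth < best[0]:
            best = (depth, color)
    return best[1], shades_done

def deferred_hsr(fragments, shade):
    depth, material = min(fragments)  # hidden surface removal first
    return shade(material), 1         # exactly one shade() per pixel

frags = [(0.9, "wall"), (0.2, "player"), (0.5, "crate")]
shade = lambda m: f"lit({m})"
print(immediate_mode(frags, shade))  # ('lit(player)', 3)
print(deferred_hsr(frags, shade))    # ('lit(player)', 1)
[/code]

Same final pixel, a third of the shading work in this little example; with the overdraw in real scenes, that saving is what lets a slower chip keep up.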

 

Right now, the best graphics card to buy is one which (a) has a programmable GPU, because this means it's more than just a "purchase", it's more like an "investment": for months and years to come, more and more games will take advantage of its capabilities (currently few do; Doom 3, and that's about it); and (b) has crisp, clear rendering without a lot of washed-out image quality. Take any game, max out the graphics, and see which technology shows up best... and which one scrambles faster to make sure that the drivers work properly. This is your best choice.

 

(Of course you could be like me, and hold out with your Voodoo3 because you're too damn bullheaded to buy an nVidia card, and too damn poor to buy an ATI! :))

