I have a Matrox G550 Dual Head on an Asus P4T533C MoBo. I replaced my two old analog PC monitors with two LCD ones (Samsung SyncMaster 226BW). There is some incompatibility between the G550 and the new monitors (their native resolution seems to be more than the card can handle), so I would like to replace the G550 graphics card with another one meeting the following requirements:
- to be able to support 2 monitors with digital connection (not analog)
- to support a resolution of 1680x1050 (32bit colour) for both monitors
- preferably to have more than 64MB RAM (a quick sanity check on this follows the list)
- to be compatible with the Asus motherboard (I think it is AGP 4x)
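A rough check on the RAM point, sketched in Python just to show the arithmetic: a 32-bit framebuffer at 1680x1050 is small, even doubled for two screens.

    # Framebuffer size for a 32-bit desktop on two 1680x1050 screens.
    WIDTH, HEIGHT = 1680, 1050
    BYTES_PER_PIXEL = 4                    # 32-bit colour
    MONITORS = 2

    per_screen_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
    print(f"per screen : {per_screen_mb:.1f} MB")             # ~6.7 MB
    print(f"two screens: {per_screen_mb * MONITORS:.1f} MB")  # ~13.5 MB

So the 64MB is mostly headroom; the two desktops themselves need under 14MB.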
Do you have any Graphics Card in mind? What would you suggest?
Thanks very much in advance for your help!
"The one who asks, makes a fool of himself once.
The one who doesn't ask, remains always a fool."
If you still have an AGP port, then the Matrox Parhelia is a very good choice. I use it in triple head with three 19" LCD monitors and it's a pleasure to work with.
Maximum resolutions (per display)
• Digital, 1-2 monitors: 1920 x 1200
• Analog, 1-2 monitors: 2048 x 1536
• Analog, 3 monitors: 1280 x 1024
Otherwise, there is the P690 Plus LP PCIe x16, which supports up to 4 monitors:
Maximum resolutions (per display)
• Digital, 1-2 monitors: 1920 x 1200 (see the pixel-clock check after this list)
• Analog, 1-2 monitors: 2048 x 1536
• Analog, 3-4 monitors: 1920 x 1200
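That 1920x1200 digital ceiling on both cards lines up with single-link DVI, whose TMDS clock tops out at 165 MHz. A quick pixel-clock check, sketched in Python with CVT reduced-blanking totals as assumed figures:

    # Pixel clock needed for a mode = total pixels per frame x refresh rate.
    DVI_SINGLE_LINK_MHZ = 165.0

    modes = {
        "1680x1050@60 (RB)": (1840, 1080, 60),   # CVT-RB h/v totals
        "1920x1200@60 (RB)": (2080, 1235, 60),   # CVT-RB h/v totals
    }

    for name, (h_total, v_total, refresh) in modes.items():
        clock_mhz = h_total * v_total * refresh / 1e6
        verdict = "fits" if clock_mhz <= DVI_SINGLE_LINK_MHZ else "exceeds"
        print(f"{name}: {clock_mhz:.1f} MHz -> {verdict} single-link DVI")

1680x1050@60 needs roughly 119 MHz and 1920x1200@60 roughly 154 MHz, so both fit under the single-link limit; anything bigger would need dual-link DVI.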
For a workstation, I wouldn't even look at NVIDIAs or ATIs. They spit out polygons very fast, but for everyday work they are far behind Matrox.
Well, I don't know whether a Matrox QID supports that resolution, but I might be able to test that soon with mine (mine is the PCIe version, but the driver is the same). There is an AGP version too, as well as a PCI one. It supports 4 monitors max. Could be worth a look.
I was given a GF 7900GS with AGP and dual DVI at work, btw. My colleagues say these work with 1680x1050, at least under Linux, but they need a PSU with at least 20A on the 12V rail...
-greetings, markus-
PS: I've got the same screen here (the 226BW) and an FX 5200 right now, running under Linux at the mentioned resolution. Works fine.
--
I'm sorry, but my karma just ran over your dogma.
<Shroomz> wrote: My FX 5200 just decided to die on me last year. ... I'm using a passive Asus N6200 (same make as the mobo) in that machine now & it's been very good.
Mine is already showing signs of bubonic capacitors, and all of my co-workers' cards of that series have died already. Some got repaired by soldering in new capacitors, some didn't. I have to wait for a new PSU though...
--
I'm sorry, but my karma just ran over your dogma.
I have had lots of Asus motherboards and they have been fine; their video cards, on the other hand, have been atrocious. Three of them died. I mean, that stuff isn't supposed to die. I have PCI graphics cards from the dark ages that still work.
I have had several Asus graphics cards and never had a problem. Now I have an EVGA and my wife an MSI, both working very well.
There are so many variables in it all that I find it difficult to really know why something breaks.
Anyway, it is known that components sometimes come damaged from the factory because there was a problem somewhere along the line. You have the case of the Thomson TV sets that exploded all around the world; here is the account of somebody who actually saw it happen:
“I used to think exploding TV sets were an urban myth. Much like Candyman, Bigfoot and Greek broadband Internet.
NAH! TV sets do explode and I (much to my dismay) just witnessed such an occurrence. My bedroom TV set, of all things. All poor me did was plug it in. And suddenly an unearthly noise... something like a paper bag full of air bursting. And smoke coming out of the TV.
For a minute I thought it was the BBQ channel, or a war movie, but noooo - it was the TV circuits on fire. The thought of throwing water on it crossed my mind, and luckily it didn't stay there. So, I unplugged the damn thing and will be shipping it off to where worthless pieces of electronic equipment meet their makers. Umm… Japan?
What next?! If I get any e-mail from Bigfoot - I'll tell.”
I was told the following story by a friend, about a brand I'm not going to name: about 3,000 hard drives reached people's PCs from the shops apparently in perfect condition, but... The components had passed the quality-check procedures in perfect working order, yet afterwards, in a very hot place where the drives are assembled (in Asia, by the way), a lorry carrying them broke down. It spent the whole day on the highway waiting to be rescued and wasn't repaired until the next morning, so the electronic parts got a looooonnng sun bath... All the hard drives built with those components were bad.
This is one of the many variables possible.
*MUSIC* The most Powerful Language in the world! *INDEED*
I have a G550 as well and use it with an LCD via VGA.
I really don't see *any* difference in quality compared to DVI.
FYI:
The VGA output quality of graphics cards has changed dramatically for the worse in recent years. Not to promote DVI, but because of decisions by international bodies on radio-frequency emissions: manufacturers had to add filter capacitors to damp high-frequency emissions from the analog output, with the unpleasant side effect of lowering image quality.
The "old" G550 doesn't suffer this issue, btw
The 'old' G550 doesn't support the 1680x1050 resolution; if I remember correctly, not even under Win2K or XP - assuming my Matrox Dual Head was a G550...
BTW, I use that resolution over an analog KVM switch without a problem.
The picture is a tiny bit blurred, but IMHO that 'softening' is more eye-pleasing than the extremely sharp dots of a pure digital connection.
Well, it works here with this resolution. Not via DVI of course, but via VGA.
The unpleasant side effect was that I had to leave my beloved Win98, where everything was so fast.
A Matrox BIOS update and installation of the latest drivers for my G550 did the trick! I now run two monitors at 1680x1050 with my old and beloved G550. So, finally, no need for a new graphics card for my good old DAW.
"The one who asks, makes a fool of himself once.
The one who doesn't ask, remains always a fool."