The Ultimate Video Card Guide!
Mr D J
Actually I don't think RAM matters a lot, because I've got an XFX GeForce 6600 GT with 128 MB of RAM and I can run any game at max resolution (if my monitor supports it), although my A64 processor might give it a boost too
Techie
Why not try running Doom 3 at 1600x1200 and let me know. That resolution with medium-high settings needs a 512MB card to keep texture rendering at full speed.
Like I said, the higher the quality settings and the higher the resolution, the more GPU RAM you need.
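To make the resolution/VRAM point concrete, here's a rough back-of-envelope sketch (Python, illustrative only): it estimates frame buffer memory alone, assuming 32-bit color, a 32-bit depth/stencil buffer, double buffering, and uncompressed AA samples. Textures are stored on top of this, which is where the rest of a 128MB or 256MB card goes, so real numbers will differ.

```python
# Rough frame buffer VRAM estimate (textures not included).
# Assumptions: 32-bit color, 32-bit depth/stencil, double buffering,
# and one full color+depth sample per AA sample (real hardware compresses).

def framebuffer_mb(width, height, aa_samples=1, bytes_per_pixel=4, buffers=2):
    color = width * height * bytes_per_pixel * buffers * aa_samples
    depth = width * height * 4 * aa_samples
    return (color + depth) / (1024 * 1024)

for res in [(1024, 768), (1280, 1024), (1600, 1200)]:
    for aa in (1, 4):
        print(res, f"{aa}x AA:", f"{framebuffer_mb(*res, aa):.0f} MB")
```

At 1600x1200 with 4x AA the frame buffer alone approaches 90 MB under these assumptions, which is why high resolutions and high quality settings squeeze a 128MB card so hard.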
lord_shar
Quote:
Originally Posted by Joker The Owner
Actually I don't think RAM matters a lot, because I've got an XFX GeForce 6600 GT with 128 MB of RAM and I can run any game at max resolution (if my monitor supports it), although my A64 processor might give it a boost too
So yes, more video RAM never hurts...
swaaye
I've recently gotten myself a free GeForce FX5600 256MB. Immediately threw it into my Athlon XP 2.5GHz (zoom).
I'd place the GeForce FX5600 (overclocked to 350/250(500)) just around a Radeon 8500 (280/300(600)) in speed in this game. The 8500 may actually be faster, I don't remember exactly. Running around Lion's Arch at 1680x1050 shows about 25fps max. Not a fast experience by any means.
By the way, GeForce FX fully supports DX9 in every way. It actually has a feature set superior to DX9 Shader Model 2. The problem is the architecture sucks at actually performing these effects. ATI's midrange cards that were available at the launch of the FX5200/5600, the Radeon 9500 PRO/9600 cards, were significantly better cards.
Stay away from GeForce FX unless you can get an FX5900+ for really cheap.
Techie
Yes, but what settings are you running it at? Fastest or most visual?
EternalTempest
Quote:
Originally Posted by swaaye
I've recently gotten myself a free GeForce FX5600 256MB. Immediately threw it into my Athlon XP 2.5GHz (zoom).
I'd place the GeForce FX5600 (overclocked to 350/250(500)) just around a Radeon 8500 (280/300(600)) in speed in this game. Running around Lion's Arch at 1680x1050 shows about 25fps max. Not a fast experience by any means. Stay away from GeForce FX unless you can get an FX5900+ for really cheap.
I'm getting 40-50 FPS overall in GW at 1024x768, 75Hz refresh rate, with 4x AA, on my FX 5700 Ultra (I run with the benchmark stats on; I forgot to take the switch off the shortcut). I also have an AMD Athlon 64 3400+ (older version), running with 1 GB of memory and an older 120 GB ATA100 drive (not SATA).
The game runs pretty smooth overall, with a minimum of only around 30 FPS.
swaaye
Highest Quality. I ran through the whole spectrum and only Lowest Quality sped it up significantly, to around 35fps max or so. 4X AA brought it down to about 15-19fps.
Techie's guide at post #1 is pretty spot on with what you should buy today. I just thought it would be fun to see how a FX5600 actually runs it. I wouldn't trade in my 4 yr old Radeon 9700 on this FX5600, that's for certain.
BTW EternalTempest, I think the FX5700 uses a more powerful chip than the 5600. NV almost immediately refreshed their NV3x line after the launch of the 5800. You know how the companies refresh every 6 months? Well, NV30 was shelved for NV35 in about 3 months.
The 5900 at least has a lot more shader power than 5800 because they literally added some more math units to the core. It still wasn't enough to touch a 9700 or 9800 though. So your performance should definitely be a little better than a 5600.
EternalTempest
You jogged my memory. When I did my research on video cards, the 5700 was based on a different GPU, and that's why I bought it.
The 5700 Ultra is based on the NV36 chip, a better chip than the 5800's but slower than what the 5900 series used, and the 5800 was based on the older chip, which was the reason I didn't go with it. If I remember correctly; I may be slightly off.
Found this info:
NV30 = GeForce FX 5800
NV35 = GeForce FX 5900
NV38 = GeForce FX 5950
I'd almost bet 1 plat the 5700 Ultra came at the tail end of the 5x line as a revision / fill-the-gap card with a much better chip.
MaglorD
Quote:
Originally Posted by lord_shar
ATI's benchmarks with the x800's and x850's are a bit skewed due to their catalyst driver's dynamic texture filtering. Also referred to as "bri-linear filtering," ATI's x800 series dynamically shifts between bilinear + trilinear filtering modes to achieve the best benchmarks. However, the X800's/850's could not perform true trilinear filtering unless you turn off ATI's filtering optimizations, but once you disabled this feature, ATI's X800's/850's fell behind NVidia's 6800's. ATI caught a lot of flak for the above and finally conceded by adding an "off" switch to their optimized texture filtering.
Why does the above matter? Simple: ATI was compromising video quality for the sake of benchmarks. NVidia did this in the past as well with their FX5000 series, so they're not squeaky-clean either. However, neither company should be resorting to such driver tweaks given the speed of their current card lines. Drivers can always be updated, but you're stuck with the video card until you toss it, so you might as well get the best hardware possible until the next best thing comes out.
No card uses pure trilinear filtering; it's simply too resource-intensive to implement. See:
http://graphics.tomshardware.com/gra...i-x800-12.html
The benchmarks here are based on both ATI and Nvidia using their own optimisations.
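For readers wondering what this "optimisation" argument is actually about, here's a conceptual sketch (Python, not real driver code) of full trilinear filtering versus a "brilinear" shortcut: trilinear blends the two nearest mip levels for every pixel, while brilinear only blends inside a narrow band around the mip transition and falls back to a cheaper single-level (bilinear) fetch elsewhere. The sample_mip callable and the band width are illustrative assumptions.

```python
# Conceptual sketch only: how a "brilinear" tweak narrows the range of the
# level-of-detail fraction over which two mip levels are blended.
# sample_mip(level) stands in for a bilinear fetch from one mip level.

def trilinear(sample_mip, lod):
    lo, frac = int(lod), lod - int(lod)
    # Full trilinear: always blend the two nearest mip levels.
    return (1 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

def brilinear(sample_mip, lod, band=0.25):
    lo, frac = int(lod), lod - int(lod)
    # Blend only inside a narrow band around the mip transition;
    # elsewhere do a plain (cheaper) fetch from a single level.
    if frac < 0.5 - band:
        return sample_mip(lo)
    if frac > 0.5 + band:
        return sample_mip(lo + 1)
    t = (frac - (0.5 - band)) / (2 * band)  # remap to 0..1 inside the band
    return (1 - t) * sample_mip(lo) + t * sample_mip(lo + 1)

# Toy "texture": mip level n just returns a brightness value.
demo = lambda level: 1.0 / (1 + level)
print(trilinear(demo, 2.3), brilinear(demo, 2.3))
```

The narrower the blend band, the fewer blended texture fetches and the faster (but more visibly banded) the result, which is the trade-off both vendors' drivers make.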
swaaye
Both NV and ATI do exactly the same things. NV was FAR worse about it back in the FX days, but they are on about equal footing with ATI right now, perhaps slightly behind. I have problems in KOTOR with my 6800 and texture shimmering on the floor (fixed by jacking up the performance slider to HQ).
Actually ATI has been ahead of NV for a long time on image quality. ATI's antialiasing is far superior, especially compared to FX cards, which cannot do gamma-corrected AA (I'm not sure the 6x00 can do it either). And ATI can do 6X MSAA whereas NV can only do 4X MSAA. NV does have a questionable hybrid 8X AA mode, which is 4X MSAA + 2X SSAA and is horribly slower than the 4X MSAA mode. The xS modes are junk because even though they look decent, they perform awfully.
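To put rough numbers on why that hybrid 4X MSAA + 2X SSAA mode is so much slower than plain 4X MSAA, here's a toy cost model (Python, made-up weights rather than measurements of any real GPU): supersampling multiplies the full shading/texturing work, while multisampling mostly adds cheaper coverage/depth samples.

```python
# Toy cost model, illustrative only: MSAA shades each pixel once and adds cheap
# coverage/depth samples, while SSAA multiplies the full shading cost.

SHADE_COST = 1.0      # relative cost to shade/texture one fragment (assumed)
COVERAGE_COST = 0.15  # relative cost of one extra depth/coverage sample (assumed)

def frame_cost(pixels, msaa=1, ssaa=1):
    shaded = pixels * ssaa              # SSAA renders at a higher internal resolution
    samples = pixels * ssaa * msaa      # total coverage samples
    return shaded * SHADE_COST + samples * COVERAGE_COST

px = 1280 * 1024
base = frame_cost(px)                                   # no AA
print("4x MSAA cost:", frame_cost(px, msaa=4) / base)   # modest overhead
print("4x MSAA + 2x SSAA cost:", frame_cost(px, msaa=4, ssaa=2) / base)
```

Even with generous assumptions, the hybrid mode roughly doubles the shading work on top of the extra samples, which matches how badly those modes tank frame rates in practice.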
Bottom line in IQ is:
ATI >>>> NV FX
ATI >> NV 6x
ATI > NV 7x (still no 6X MSAA)
Bottom line also is that neither of them does trilinear or anisotropic the textbook way. I have no problem with tweaks until I can see them. Actually the GF4 does anisotropic very well, but it's also very slow at it for that reason.
NV has no ground to stand on with regards to image quality. They were caught red-handed hacking 3DMark03 back in the day to make FX look better than the crap it is/was. They rewrote shaders in games to make them run faster on FX, shaders of lower quality than the games' originals. And their aggressively optimized anisotropic and trilinear filtering creates bad shimmering in many games, and those settings are the default that you see benchmarked everywhere. In essence, NV cards are slower than they seem if you want to fix the shimmering.
I like my 6800 a lot, but my 9700 looks better most of the time. Obviously it's slower though lol. ATI also doesn't release drivers for a SLI/dual-core niche market and break support for big games like Guild Wars, among others.
Techie
The 9700 Pro is an AMAZING GPU. It was the most future-proof card of its time. It is still pulling off frame rates that some cards released after it cannot.
lord_shar
Quote:
Originally Posted by MaglorD
Actually, the 6800 Ultra still loses to the X800 XTPE when both cards are compared on even footing.
No card uses pure trilinear filtering; it's simply too resource-intensive to implement. See: http://graphics.tomshardware.com/gra...i-x800-12.html The benchmarks here are based on both ATI and Nvidia using their own optimisations.
Please check the date on the article you linked, then check the date on this one disclosing ATI's questionable optimizations:
http://graphics.tomshardware.com/gra...603/index.html
As you can tell, the optimizations were discovered after the X800's initial card review, once people started complaining about the X800's image quality drops during high-load situations.
And yes, NVidia's 6800 series does true trilinear filtering without any discernible image quality loss. Optimizations are fine, so long as users don't see any compromises in image quality, which is where ATI fell short with the X800/850's. At the current performance levels being achieved by today's card lines, neither NVidia nor ATI is doing us video enthusiasts any favors by tossing image quality out the window for a few extra frames over its rival.
swaaye
No, the 6800 does not do trilinear correctly by default. I have one, and I can load up several games right now that show you how they tweaked things and what it did. You must use HQ mode to get full trilinear. At least better trilinear; maybe still not perfect.
lord_shar
Quote:
Originally Posted by swaaye
No the 6800 does not do trilinear correctly by default. I have one and I can load up several games right now which show you how they tweaked things and what it did. You must use HQ mode to get full trilinear. At least better trilinear, maybe still not perfect.
MaglorD
Quote:
Originally Posted by lord_shar
Please check the date on the article you linked, then check the date on this one disclosing ATI's questionable optimizations:
http://graphics.tomshardware.com/gra...603/index.html As you can tell, the optimizations were discovered after the X800's initial card review, once people started complaining about the X800's image quality drops during high-load situations. And yes, NVidia's 6800 series does true trilinear filtering without any discernible image quality loss. Optimizations are fine, so long as users don't see any compromises in image quality, which is where ATI fell short with the X800/850's. At the current performance levels being achieved by today's card lines, neither NVidia nor ATI is doing us video enthusiasts any favors by tossing image quality out the window for a few extra frames over its rival.
Lord Shar, Tom's Hardware states in the article about ATI's disclosures that they did NOT disable Nvidia's optimisations when testing the X800 against Nvidia, yet the conclusion of the article rates the X800 XTPE favourably against the competition.
Here is what they said:
"In our X800 test NVIDIA's trilinear optimization was not disabled, so the comparable values continue to be valid and comparable"
And no, ATI's optimisations result in very good texture quality. Even the reviewers at Tom's Hardware thought so.
lord_shar
Quote:
Originally Posted by MaglorD
Lord Shar, Tom's Hardware states in the article about ATI's disclosures they did NOT disable Nvidia's optimisations when testing the X800 against Nvidia, yet the conclusion of the article rates the X800 XTPE favourably against the competition.
Here is what they said: "In our X800 test NVIDIA's trilinear optimization was not disabled, so the comparable values continue to be valid and comparable" And no, ATI's optimisations result in very good texture quality. Even the reviewers at Tom's Hardware thought so.
I agree that the optimizations ATI is performing are still acceptable, but they should not be detectable by the naked eye. This is why the second report about questionable optimizations surfaced. I'm just glad ATI finally yielded to public criticism and installed an optimization off-switch.
Either way, the 7800GTX is the current king of the hill... let's hope the X1800 can dethrone it or come close so that we'll see another appreciable price drop in the high-end video card arena.
MaglorD
Quote:
Originally Posted by lord_shar
Which article are you referring to, the initial X800 review or the post-mortem report?
Quote:
I agree that the optimizations ATI is performing are still acceptable, but they should not be detectable by the naked eye. This is why the second report about questionable optimizations surfaced. I'm just glad ATI finally yielded to public criticism and installed an optimization off-switch.
lol, but it's not detectable unless you do a lot of graphic manipulation. The same cannot be said of Nvidia's FX optimisations.
As for 7800, it will soon be dethroned :P
lord_shar
Quote:
Originally Posted by MaglorD
...<SNIP>...
lol, but it's not detectable unless you do a lot of graphic manipulation. The same cannot be said of Nvidia's FX optimisations.
Someone caught it with the naked eye... otherwise no one would be talking about it.
Quote:
Originally Posted by MaglorD
As for 7800, it will soon be dethroned :P
MaglorD
Quote:
Originally Posted by lord_shar
Someone caught it with the naked eye... otherwise no one would be talking about it.
This is what John Carmack says:
It is actually a pretty sensible performance enhancement, with minimal visual issues. However, having drivers analyze mip map uploads to hide the cheat is an unfortunate consequence.
So it was only spotted because of a driver issue and only because the X800 had a lower than expected frame rate, which can be fixed. Not that there was image degradation. I doubt very much the lower than expected frame rate was spotted with the "naked eye", but rather with FRAPS.
lord_shar
Quote:
Originally Posted by MaglorD
This is what John Carmack says:
It is actually a pretty sensible performance enhancement, with minimal visual issues. However, having drivers analyze mip map uploads to hide the cheat is an unfortunate consequence. So it was only spotted because of a driver issue and only because the X800 had a lower than expected frame rate, which can be fixed. Not that there was image degradation. I doubt very much the lower than expected frame rate was spotted with the "naked eye", but rather with FRAPS.
Either way, both the 6800's and X800's/850's have been relegated to mid-grade video card status, now that the 7800's and X1800's are out, or soon to be released. Something tells me that both sides will employ further brilinear filtering techniques to minimize GPU load and squeeze out every possible frame rate point.
Techie
The X1800 is out, at a higher price than the 7800GTX. It actually does do well in some benchmarks, beating it by a little. In some, though, it disappoints.
http://www.rage3d.com/reviews/video/...index.php?p=12
Old Dood
Quote:
Originally Posted by Techie
The X1800 is out, at a higher price than the 7800GTX. It actually does do well in some benchmarks, beating it by a little. In some, though, it disappoints.
http://www.rage3d.com/reviews/video/...index.php?p=12
Esprit
Nice post about the video cards.
Though your cut-off for "full eye candy at almost any resolution" for Guild Wars was the 6600GT, I run a 6600 non-GT PCI-E card and I can play GW at any resolution with maximum settings.
Xa Panda
Question! I am using a regular X800 128MB card but running on a 20" screen at 1680x1050. What are the best tweaks/settings I should run?
Right now I have everything on high at max res and I get very little lag... but just wondering... any tweaks...
Techie
Well, if you aren't lagging and you're still managing to pull off an unhitched framerate, then leave it as is; maybe even try boosting the settings to a higher quality.
If you are saying you want to reduce what little lag you have and increase framerate, then try lower quality settings and/or a lower resolution.
Xa Panda
thanks!
Techie
You're welcome, let me know if you have any other problems.
Xa Panda
Yeah... I just don't want it to look ugly, but I want the full res... it's just kinda glitchy at certain times....
Techie
That's the 128MB of VRAM (video RAM) in your card; it's just slower to load textures at that resolution, which doesn't make it a bad card whatsoever.
The x800 is still a great card.
swaaye
Well, I know that GeForce FX is what brought about the trilinear and anisotropic tweaks. The Radeon 9700 can't do as much as the later chips with regards to these optimizations. I think ATI may have been literally reacting to NV by adding the same tweaks.
The Radeon 9600 was the first ATI chip to support some serious brilinear tweaks. People in the know on the Beyond3D forums hinted at this, but they were really trying to see if we noticed it before it became public knowledge. I cannot tell the difference visually between my 9700 and 9600, but I know the 9600 is doing the tweaks. My notebook's 6800, on the other hand, will make textures swim/shimmer pretty terribly in KOTOR2 by default. HQ mode fixes it. Jedi Knight 2 shows similar behaviour with textures in "Quality" mode.
The gist of it is that I never noticed the tweaks these two companies were doing until I got a 6800 in my life. NV's "Quality" mode is anything but. I think I've also heard that 7800 may be even worse. More aggressive tweaks. Don't quote me on that though as I don't remember which review I saw that in.
Techie
Well, I have a 7800GT and the bilinear/trilinear filtering looks really good in any game.
lord_shar
Another thing to consider: once you start running in the 1600x1200 resolution range, the pixels are so fine that 4x AA becomes unnecessary (unless you play with your eyes a few inches off the display's surface). My two buddies use Dell UltraSharps with native 1600x1200 resolution, with one of them upgrading to Dell's latest wide-aspect screen. I get to see both side by side at this upcoming Saturday's LAN party.
Here's the odd part: the house hosting the LAN party is located on LANDA LANE in CA. My buddy wants to paint a "Y" on the end of the street sign so that it reads "LAN DAY."
Techie
ROFL I want to see a pic of that
lord_shar
OK, I'll post pics of the street sign + the party after Saturday
Techie
Ok awesome, what games and such are gonna be taking place?
I have been to a CS 1.6/CSS LAN and a UT2K4 LAN. Unfortunately, I haven't really been to any multi-game LANs.
lord_shar
We're probably doing BF2, Quake 4, and F.E.A.R. There are a few LAN gamers there who refuse to touch GW.
One of the rigs is a liquid-cooled 7800GTX-SLI.
Here's a pic from our last LAN day at the host's previous house:
Techie
Very nice, looks similar to mine, except mine is a bit messier because of the dry ice.
I might switch to liquid just for the sheer quiet factor and the fact that I could do mild to moderately heavy OCs if needed.
Old Dood
Quote:
Originally Posted by lord_shar
Another thing to consider: Once you start running in the 1600x1200 resolution ranges, the pixels are so fine that 4x AA becomes unnecessary (unless you play with your eyes a few inches off the display's surface ) My two buddies use Dell Ultrasharps with native 1600x1200 resolutions, with one of them upgrading to Dell's latest wide aspect screen. I get to see both side-by-side at this upcoming Saturday's LAN party.
Here's the odd part: the house hosting the LAN party is located on LANDA LANE in CA. My buddy wants to paint a "Y" on the end of the street sign so that it reads "LAN DAY."
Techie
What's your video card?
Old Dood
Mine? X800XL 256MB PCIe
Oh... I also like the pics above. Such neat and tidy rooms. I would be ashamed to show a pic of my desk.