Plain 6800GT vs. X800XL
Seron
I'm wondering if it's just the BFG 6800GT that is on par with ATI's X800XL, or if the other (non-pre-overclocked) 6800GTs are also on par with the X800XL and the BFG (pre-overclocked) 6800GT is better than the X800XL.
I'm asking because the plain, non-overclocked versions of the 6800GT are now cheaper than the X800XL, whereas the BFG one is still $350+.
AceSnyp3r
The 6800GT and x800XL perform just about the same in the benchmarks I've seen. Usually one or the other gets a few more frames in different games, but they're pretty close really. If you want to overclock though, you can do that yourself without paying BFG extra to do it for you, though overclocking voids the warranty I believe.
Sarus
A little off topic but ...
I'd go with the nvidia card. In my experience Nvidia cards are better supported and less buggy in games. I'm using an ATI x800xt and it's been nothing but problems with battlefield 2. My brother has a 6800GT and he has no problems at all. To be fair it could be something like my mobo but I seriously doubt it.
silentcid
I play battlefield 2 and guildwars and I don't have that problem while using my x800 XT.
Sarus
Quote:
I play battlefield 2 and guildwars and I don't have that problem while using my x800 XT. |
Algren Cole
Quote:
Originally Posted by AceSnyp3r
The 6800GT and x800XL perform just about the same in the benchmarks I've seen. Usually one or the other gets a few more frames in different games, but they're pretty close really. If you want to overclock though, you can do that yourself without paying BFG extra to do it for you, though overclocking voids the warranty I believe.
|
overclocking voids the warranty on video cards...if you buy a BFG card you get the overclock and the warranty. It's worth the extra money as the BFG cards generally pull noticeably higher frame rates....I've purchased BFG cards exclusively for about 2 years now.
<3 BFG
lord_shar
I have a 6800gt and 6800Ultra, both from BFG. If given a choice between the X800XL vs. the 6800GT, I'd probably go with the nVidia for several reasons:
1) The 6800GT supports full DX9.0c, while the ATI X800 only goes up to DX9.0b. This doesn't matter with today's games, but DX9.0c might become a requirement for some future titles.
2) ATI's drivers tend to perform dynamic texture quality adjustments during high-stress video movement, lowering image quality in order to maintain fps. While this is an intelligent way of managing hardware resources, it doesn't allow accurate measurement of the hardware's performance until the feature is turned off. When it is, more than a few titles bench lower on ATI boards vs. nVidia (see the sketch at the end of this post).
However, the nVidia 6800 series, especially the Ultra, are huge power hogs. If you don't have a sufficiently beefy power supply, you may encounter some system stability issues if you go with one of these power-hungry video cards.
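To make point 2 more concrete, here's a minimal sketch of the kind of adaptive-quality loop I mean. It's purely hypothetical (the quality levels and thresholds are invented, not ATI's actual driver logic); it just shows how a driver could trade filtering quality for frame rate on the fly:

```python
# Hypothetical sketch of a driver-style adaptive-quality loop.
# Not ATI's actual implementation; the names, levels and thresholds are invented.

TARGET_FRAME_MS = 16.7                                   # ~60 fps budget
QUALITY_LEVELS = ["bilinear", "brilinear", "trilinear"]  # low -> high quality

def adjust_quality(frame_ms, level):
    """Lower filtering quality under load, restore it when load eases."""
    if frame_ms > TARGET_FRAME_MS and level > 0:
        return level - 1          # scene is heavy: trade quality for fps
    if frame_ms < 0.8 * TARGET_FRAME_MS and level < len(QUALITY_LEVELS) - 1:
        return level + 1          # headroom available: bring quality back
    return level

# Example: a burst of heavy frames followed by lighter ones
level = 2  # start at trilinear
for frame_ms in [25.0, 24.0, 18.0, 12.0, 11.0]:
    level = adjust_quality(frame_ms, level)
    print(f"{frame_ms:5.1f} ms -> {QUALITY_LEVELS[level]}")
```

The benchmarking complaint follows directly: with a policy like this in the driver, the measured frame rate reflects the driver's quality decisions as much as the raw hardware, at least until the feature is switched off.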
MadeInChina
x800XL seems to have better performance on the benchmarks I've seen.
Algren Cole
Quote:
Originally Posted by MadeInChina
x800XL seems to have better performance on the benchmarks I've seen.
|
refer to the second point in the post made by lord_shar...that's the reason why.
MaglorD
Quote:
Originally Posted by lord_shar
2) ATI's drivers tend to perform dynamic texture quality adjustments during high stress video movement, lowering image quality in order to maintain fps. While this is an intelligent way of managing hardware resources, it doesn't allow accurate measurement of the hardware's performance until the feature is turned off. When it is, more than a few titles bench lower on ATI boards vs. nVidia. |
Do you have a link from a reliable source to back up what you're saying here?
cannonfodder
I have both, an XFX 6800GT and a Sapphire X800XL, and they are about equal in my experience. In benchmarks the 6800GT scores higher than the X800XL in 3DMark03, but in 3DMark05 the X800XL scores a little bit higher.
It really all comes down to preference. In the UK, where I am, the X800XL is about £50 or so cheaper than any of the 6800GTs, so for that reason I would choose the ATI card over the nVidia.
But both are excellent cards with performance on par with each other.
lord_shar
Quote:
Originally Posted by MaglorD
Do you have a link from a reliable source to back up what you're saying here?
|
Yep, here's the link from Tom's Hardware Guide:
http://graphics.tomshardware.com/gra...603/index.html
The article from June 2004 found that the X800's Catalyst drivers dropped the texture filtering mode from trilinear to bilinear during high-load situations, but ramped back up to trilinear once GPU load went down. Although ATI insists that the X800 is still performing trilinear filtering, it's actually "brilinear" since it's mixing the two filter modes. This filtering optimization could not be disabled until several driver revisions later. As far as I know, it is still present in the current ATI Catalyst driver set.
Although the technique does have its merits, it is completely software driven and doesn't paint an accurate portrait of the hardware's normal performance when benched against a "non-driver-optimized" video card.
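For anyone wondering what "brilinear" actually means in practice, here's a rough sketch of the difference (my own simplification for illustration, not either vendor's real algorithm): true trilinear blends two mip levels with a smooth weight across the whole LOD range, while the brilinear shortcut snaps to a single mip level over most of the range and only blends in a narrow band near the transition:

```python
# Rough illustration of trilinear vs. "brilinear" filtering blend weights.
# A simplification for discussion, not either vendor's actual algorithm.

def trilinear_weight(lod):
    """Blend weight between mip level floor(lod) and floor(lod) + 1."""
    return lod - int(lod)                  # smooth 0..1 ramp across the range

def brilinear_weight(lod, blend_band=0.3):
    """Blend only within a narrow band around the mip transition;
    elsewhere sample a single mip level (weight pinned to 0 or 1)."""
    frac = lod - int(lod)
    lo = 0.5 - blend_band / 2
    hi = 0.5 + blend_band / 2
    if frac <= lo:
        return 0.0                         # plateau: nearer mip level only
    if frac >= hi:
        return 1.0                         # plateau: next mip level only
    return (frac - lo) / (hi - lo)         # short ramp near the transition

for lod in [2.0, 2.2, 2.4, 2.5, 2.6, 2.8]:
    print(f"lod={lod:.1f}  trilinear={trilinear_weight(lod):.2f}  "
          f"brilinear={brilinear_weight(lod):.2f}")
```

The single-level plateaus need only half the texture samples, which is where the speed comes from, and also why the numbers move when the optimisation is toggled off.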
MasterQu
If you're looking at that much cash for a card, just go for the new 7800 from nVidia. One 7800 pushes what two 6800s or X800s in SLI mode can do. Want stats? Go to nVidia's website and get the specs and benchmarks.
lord_shar
Quote:
Originally Posted by MasterQu
If you're looking at that much cash for a card, just go for the new 7800 from nVidia. One 7800 pushes what two 6800s or X800s in SLI mode can do. Want stats? Go to nVidia's website and get the specs and benchmarks.
|
MasterQu
People are still using agp? jk
Algren Cole
Quote:
Originally Posted by MasterQu
People are still using agp? jk
|
I built my machine just a couple of months ago...and unfortunately my ASUS A8V Deluxe uses AGP ....I gotta upgrade already
Teklord
I'm just going to wait and build a Crossfire solution: out of the box support for any application without the need for special profiles!
I've been on both sides of the fence, but lately it's been ATI, and the features of Crossfire definitely have me leaning that way.
MasterQu
Couldn't catch me using ATI. Nvidia sold me when I had to call them for driver support. As I was talking to them they rebuilt the driver I was using on a (I think) 5700 card at the time that was having issues with UT2004, and emailed it to me before I hung up the phone. That's service, and it bought my loyalty. Never mind the fact that every time ATI comes out with something, Nvidia reciprocates and pwns ATI's a$$.
Algren Cole
Quote:
Originally Posted by Teklord
I'm just going to wait and build a Crossfire solution: out of the box support for any application without the need for special profiles!
I've been on both sides of the fence, but lately it's been ATI, and the features of Crossfire definitely have me leaning that way. |
I don't know how much research you have done into the Crossfire series...but a lot of websites that I consider reputable sources are stating that the nVidia series is going to maintain its crown even when the Crossfire series is released.
Here's an excerpt from slashdot.org....
Quote:
Kez writes "While at Computex in Taipei HEXUS.net grabbed some benchmarks of an ATi CrossFire powered system. They have since had the chance to reconstruct a similar system and perform the same benchmarks with other cards and configurations to give us an idea of how CrossFire will perform. Obviously, CrossFire's performance will almost certainly change before release time, but in the very least the article provides an idea of what to expect. Interestingly, from these tests it looks like Nvidia's SLI may remain top-dog for graphics performance." |
I haven't done a whole lot of research into this series aside from reading what other posters have to say about it....so I can't add much of my own valid input...but I do trust the people at slashdot.org and the numerous other sites where I've read that the crossfire series isn't going to dethrone nVidia.
but here are the benchmarks done at Computex on a DFI LanParty Board.
Dirkiess
Algren, I read something just recently regarding the Crossfire series. I'll have to see if I can find the article, but it was related to the same thing, but it also mentioned that the drivers were only in Beta and not the final release.
Whether this was true or not, I don't know, but it's either plausible or a possible sidestep for ATI until they can get the drivers up to speed.
To be honest though, I think there may well be pros and cons for both ATI and NVidia.
These could all be related to the systems that people have, and a prime example of this was a news article I read today that says NVidia's own nForce 3 motherboards cause problems with their own 6800 series of cards. Now how is that for compatibility?
I'll see if I can find either of the links to the above articles. I've read a lot of articles today.
lord_shar
I can see Crossfire producing good image quality, but I'm not yet sold on its concept. The problem with daisy-chaining 2 video cards is that one card always has to wait for output from the other card before being able to render a video frame. Compared to nVidia SLI's parallel video processing set-up, where video load is evenly distributed across both GPUs, ATI's Crossfire is essentially serial or sequential video processing, which doesn't sound as efficient speed-wise since one card is usually waiting for the other.
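To put the timing argument in rough numbers, here's a toy model of the two schemes as I've described them (the frame times and the serial hand-off are assumptions for illustration, not necessarily how either vendor's final product schedules work):

```python
# Toy timing model of the two dual-GPU schemes described above.
# Purely illustrative; real SLI/CrossFire scheduling is far more involved.

FRAME_WORK_MS = 20.0   # hypothetical time for one GPU to render a whole frame
HANDOFF_MS = 1.0       # hypothetical cost of moving data between the cards

def parallel_frame_time(work_ms):
    """Both GPUs render their half of the frame at the same time."""
    return work_ms / 2 + HANDOFF_MS                 # limited by the slower half

def serial_frame_time(work_ms):
    """Card B waits for card A's output before finishing the frame
    (the sequential hand-off described in the post)."""
    return work_ms / 2 + HANDOFF_MS + work_ms / 2   # the stages add up

print("parallel split: ", parallel_frame_time(FRAME_WORK_MS), "ms per frame")
print("serial hand-off:", serial_frame_time(FRAME_WORK_MS), "ms per frame")
```

In the serial model the two stages simply add up, so per-frame latency is barely better than a single card unless consecutive frames can be pipelined, which is exactly the efficiency concern raised above.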
I will reserve judgement until the finished product is released, just to give ATI a fair chance.
Zakarr
Quote:
Originally Posted by lord_shar
I have a 6800gt and 6800Ultra, both from BFG. If given a choice between the X800XL vs. the 6800GT, I'd probably go with the nVidia for several reasons:
1) the 6800gt supports full DX9.0c, while the ATI X800 only goes up to DX9.0b. This doesn't matter with today's games, but DX9.0c might become a requirement for some future titles. 2) ATI's drivers tend to perform dynamic texture quality adjustments during high stress video movement, lowering image quality in order to maintain fps. While this is an intelligent way of managing hardware resources, it doesn't allow accurate measurement of the hardware's performance until the feature is turned off. When it is, more than a few titles bench lower on ATI boards vs. nVidia. However, the nVidia 6800 series, especially the Ultra, are huge power hogs. If you don't have a sufficiently beefy power supply, you may encounter some system stability issues if you go with one of these power-hungry video cards. |
1) DirectX 9.0b and 9.0c are mainly bug fixes and don't define any new 3D hardware feature standards, so both ATI and NVIDIA support DirectX 9.0, but the X800 series supports pixel shader 2.0b and vertex shader 2.0 while the NVIDIA GF 6800 series supports pixel and vertex shader 3.0.
Higher version shaders are useless if games don't support them. You won't automatically gain any image quality with better shaders.
2) You can turn it off if you want.
Benchmarks are not the whole truth. Results don't represent all the games in the world. Some games work better with ATI while other games work better with NVIDIA. One might want his/her favorite games to work the best way rather than blindly looking at benchmark results from a few games.
ATI cards consume less power, which means less heat and less stress on the power supply. ATI cards usually have quieter stock coolers, at least from certain brands, and you can usually reduce the fan speed to cut cooler noise a lot further without overheating than you can on an NVIDIA GF 6800.
The ATI X800 has better quality anti-aliasing than the NVIDIA GF 6800, and better anti-aliasing performance at high resolutions like 1600x1200, or maybe even at 1280x960 or 1280x1024.
NVIDIA usually has better OpenGL performance, so Linux users and OpenGL-based games like Call of Duty, Doom 3, the Quake series, Linux 3D games and all games based on the Quake or Doom 3 engines are faster and/or less buggy.
NVIDIA often overclocks better, so an overclocker might want to choose it instead of ATI. They are most likely more future-proof cards because of the more advanced shaders, but no one knows how soon and how widely these shaders are going to be used in near-future games and how much gamers will benefit from them. The latest Splinter Cell and Far Cry support these shaders, but in Far Cry the gap between the X800 and GF 6800 is pretty minor. However, the latest Splinter Cell shows noticeable image quality differences with the GF 6800, but that is because for some reason the developers didn't add support for ATI's 2.0 shaders and ATI has to use the old 1.1 shaders. Maybe it was done on purpose by offering bribes. Who knows.
lord_shar
Quote:
Originally Posted by Zakarr
1) DirectX 9.0b and 9.0c are mainly bug fixes and don't define any new 3D hardware feature standards, so both ATI and NVIDIA support DirectX 9.0, but the X800 series supports pixel shader 2.0b and vertex shader 2.0 while the NVIDIA GF 6800 series supports pixel and vertex shader 3.0.
Higher version shaders are useless if games don't support them. You won't automatically gain any image quality with better shaders. |
Quote:
Originally Posted by Zakarr
2) You can turn it off if you want.
|
Quote:
Originally Posted by Zakarr
Benchmarks are not the whole truth. Results don't represent all the games in the world. Some games work better with ATI while other games work better with NVIDIA. One might want his/her favorite games to work the best way rather than blindly looking at benchmark results from a few games.
|
Quote:
Originally Posted by Zakarr
ATI cards consume less power, which means less heat and less stress on the power supply. ATI cards usually have quieter stock coolers, at least from certain brands, and you can usually reduce the fan speed to cut cooler noise a lot further without overheating than you can on an NVIDIA GF 6800.
The ATI X800 has better quality anti-aliasing than the NVIDIA GF 6800, and better anti-aliasing performance at high resolutions like 1600x1200, or maybe even at 1280x960 or 1280x1024. NVIDIA usually has better OpenGL performance, so Linux users and OpenGL-based games like Call of Duty, Doom 3, the Quake series, Linux 3D games and all games based on the Quake or Doom 3 engines are faster and/or less buggy. NVIDIA often overclocks better, so an overclocker might want to choose it instead of ATI. They are most likely more future-proof cards because of the more advanced shaders, but no one knows how soon and how widely these shaders are going to be used in near-future games and how much gamers will benefit from them. The latest Splinter Cell and Far Cry support these shaders, but in Far Cry the gap between the X800 and GF 6800 is pretty minor. However, the latest Splinter Cell shows noticeable image quality differences with the GF 6800, but that is because for some reason the developers didn't add support for ATI's 2.0 shaders and ATI has to use the old 1.1 shaders. Maybe it was done on purpose by offering bribes. Who knows. |
MaglorD
Quote:
Originally Posted by MasterQu
Couldn't catch me using ATI. Nvidia sold me when I had to call them for driver support. As I was talking to them they rebuilt the driver I was using on a (I think) 5700 card at the time that was having issues with UT2004, and emailed it to me before I hung up the phone. That's service, and it bought my loyalty. Never mind the fact that every time ATI comes out with something, Nvidia reciprocates and pwns ATI's a$$.
|
MaglorD
Quote:
Originally Posted by lord_shar
Yep, here's the link from Tom's Hardware Guide:
http://graphics.tomshardware.com/gra...603/index.html The article from June 2004 found that the X800's Catalyst drivers dropped the texture filtering mode from trilinear to bilinear during high-load situations, but ramped back up to trilinear once GPU load went down. Although ATI insists that the X800 is still performing trilinear filtering, it's actually "brilinear" since it's mixing the two filter modes. This filtering optimization could not be disabled until several driver revisions later. As far as I know, it is still present in the current ATI Catalyst driver set. Although the technique does have its merits, it is completely software driven and doesn't paint an accurate portrait of the hardware's normal performance when benched against a "non-driver-optimized" video card. |
Most review sites do not disable Nvidia's optimisations while leaving ATI's optimisations intact, so I don't see how the comparisons are unfair.
It is generally accepted the X800XL is comparable to the 6800GT in fair comparison, without disabling optimisations while running either card. So pick whichever is cheaper.
emirate xaaron
Quote:
Originally Posted by Dirkiess
Algren, I read something just recently regarding the Crossfire series. I'll have to see if I can find the article, but it was related to the same thing, but it also mentioned that the drivers were only in Beta and not the final release.
Whether this was true or not, I don't know, but it's either plausible or a possible sidestep for ATI until they can get the drivers up to speed. To be honest though, I think there may well be pros and cons for both ATI and NVidia. These could all be related to the systems that people have, and a prime example of this was a news article I read today that says NVidia's own nForce 3 motherboards cause problems with their own 6800 series of cards. Now how is that for compatibility? I'll see if I can find either of the links to the above articles. I've read a lot of articles today. |
lord_shar
Quote:
Originally Posted by MaglorD
From what I read in the article, both manufacturers heavily optimise their drivers to get the most out of the hardware. This is to be expected since delivering true filtering slows down the hardware considerably. The trick is not to sacrifice image quality too much while employing the optimisations and this ATI appears to have accomplished. Don't forget how horrible Nvidia's FX optimisations were.
Most review sites do not disable Nvidia's optimisations while leaving ATI's optimisations intact, so I don't see how the comparisons are unfair. It is generally accepted the X800XL is comparable to the 6800GT in fair comparison, without disabling optimisations while running either card. So pick whichever is cheaper. |
The reason nVidia's 6800-series didn't get slammed was because they maintained full anisotropic filtering and consistent image quality throughout the tests. ATI's X800s, on the other hand, dropped image quality for the sake of frame rates without informing its user-base until complaints came in. I'm just glad ATI responded positively to the criticism by adding an off-switch to disable bri-linear filtering. Unfortunately, this also revealed ATI's X800 hardware design starting to show its age vs. the 6800 series. I just hope they do better against the 7800GTX with their next solution sans "cheating."
MaglorD
Quote:
Originally Posted by lord_shar
The reason nVidia's 6800-series didn't get slammed was because they maintained full anisotropic filtering and consistent image quality throughout the tests. ATI's X800s, on the other hand, dropped image quality for the sake of frame rates without informing its user-base until complaints came in. I'm just glad ATI responded positively to the criticism by adding an off-switch to disable bri-linear filtering. Unfortunately, this also revealed ATI's X800 hardware design starting to show its age vs. the 6800 series. I just hope they do better against the 7800GTX with their next solution sans "cheating."
|
The reason Nvidia didn't get slammed is because ATI went and told everyone they were performing true trilinear filtering when they actually optimised by using brilinear. Even so, the image quality with ATI's brilinear filtering is superior to Nvidia's brilinear filtering and does approach true trilinear filtering.
Techie
To be honest it's any man's game. ATI and Nvidia have their ups and downs. Once Crossfire is used to its full potential, I bet it will blow SLI away. ATI is known for being late to release cards, but they never disappoint.
lord_shar
Quote:
Originally Posted by MaglorD
I truly doubt any card performs trilinear filtering even with bri-linear turned off. So it's pointless to compare without optimisations. The important thing is to ensure optimisations do not degrade quality compared to the competition.
|
Quote:
Originally Posted by MaglorD
The reason Nvidia didn't get slammed is because ATI went and told everyone they were performing true trilinear filtering when they actually optimised by using brilinear. Even so, the image quality with ATI's brilinear filtering is superior to Nvidia's brilinear filtering and does approach true trilinear filtering.
|
I have yet to read any articles mentioning any bri-linear filtering being done by NVidia. Can you please link the URL here if you have it?
Yes, I'm sure Nvidia does have some degree of video driver optimization. However, we users cannot detect these optimizations since they don't leave any visible image degradation, while their ATI counterparts do.
lord_shar
Quote:
Originally Posted by Techie
To be honest it's any man's game. ATI and Nvidia have their ups and downs. Once Crossfire is used to its full potential, I bet it will blow SLI away. ATI is known for being late to release cards, but they never disappoint.
|
EDIT: As of the WHQL-80 driver release, SLI no longer requires exact card brand matches (e.g., you can now SLI-link a BFG card with a Leadtek 7800GTX). GPU core models and hardware specs must still match, but otherwise, it's now more flexible.
MaglorD
Quote:
Originally Posted by lord_shar
Speculation is fine, but please provide some links backing up the above to at least support your statements and reasoning.
I have yet to read any articles mentioning any bri-linear filtering being done by NVidia. Can you please link the URL here if you have it? Yes, I'm sure Nvidia does have some degree of video driver optimization. However, we users cannot detect these optimizations since they don't leave any visible image degradation, while their ATI counterparts do. |
Quoted from the Tom's Hardware article.
This is indeed a "cheat" that both major vendors now do. Instead of always sampling the two adjacent mip map levels and doing a full blend between them, they have plateaus where only a single mip level is sampled, reducing the average samples from 8 to about 6. It is actually a pretty sensible performance enhancement, with minimal visual issues.
I'm not sure what visible image degradation you are referring to.
Here is what is quoted in the Tom's Hardware article:
The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no "right" or "wrong" way to implement the filtering.
We have added intelligence to our filtering algorithm to increase performance without affecting image quality. As some people have discovered, it is possible to show differences between our filtering implementations for the RADEON 9800XT and RADEON X800. However, these differences can only be seen by subtracting before and after screenshots and amplifying the result. No-one has claimed that the differences make one implementation "better" than another.
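As a quick back-of-the-envelope check on the "8 to about 6" figure above (my own numbers, not from the article): full trilinear takes 4 bilinear taps on each of two adjacent mip levels, so 8 samples per lookup, and if roughly half of the LOD range falls on single-level plateaus needing only 4 taps, the average lands right around 6:

```python
# Back-of-the-envelope check of the "8 samples down to about 6" claim.
# plateau_fraction is a guess at how much of the LOD range uses one mip level.

SAMPLES_TRILINEAR = 8    # 4 bilinear taps on each of two adjacent mip levels
SAMPLES_SINGLE_MIP = 4   # 4 taps on a single mip level (inside a plateau)

def average_samples(plateau_fraction):
    return (plateau_fraction * SAMPLES_SINGLE_MIP
            + (1 - plateau_fraction) * SAMPLES_TRILINEAR)

print(average_samples(0.5))   # -> 6.0, matching the rough figure in the quote
```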
lord_shar
Quote:
Originally Posted by MaglorD
Quoted from the Tom's Hardware article.
This is indeed a "cheat" that both major vendors now do. Instead of always sampling the two adjacent mip map levels and doing a full blend between them, they have plateaus where only a single mip level is sampled, reducing the average samples from 8 to about 6. It is actually a pretty sensible performance enhancement, with minimal visual issues. I'm not sure what visible image degradation you are referring to. Here is what is quoted in the Tom's Hardware article: The objective of trilinear filtering is to make transitions between mipmap levels as near to invisible as possible. As long as this is achieved, there is no "right" or "wrong" way to implement the filtering. We have added intelligence to our filtering algorithm to increase performance without affecting image quality. As some people have discovered, it is possible to show differences between our filtering implementations for the RADEON 9800XT and RADEON X800. However, these differences can only be seen by subtracting before and after screenshots and amplifying the result. No-one has claimed that the differences make one implementation "better" than another. |
Techie
But has anyone actually determined what the difference is? Or has a program yet to be created that shows this?
Old Dood
I like my X800XL over the 6800GT only because it was $100.00 cheaper when I bought it. The 6800GT has Pixel Shader 3...that is what makes it better in that respect. As for which one is faster? They are too evenly matched in speed to call.
How I look at it overall is that whenever I upgrade to a better vid card I am always happy. It is way better than what I had before. My 9800 Pro is still a solid card in my "second" system. I was amazed at how much better my newer X800XL was over that.