The Ultimate Video Card Guide!

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Well, since a lot of the posts here are about video cards, I figured I would make this neat little guide.

A video card, in simple terms, is what powers the graphics on your computer. It determines your screen resolution, refresh rates, basically whatever displays on your monitor. Now, there are 3 main types of video card slots (4 if you include integrated). They are the following:
  • PCI - The oldest of the video card slots. It is no longer made on motherboards, and only very old video cards such as the Nvidia Riva TNT and ATI Rage use it.
  • Integrated - Everyone's nightmare. This is what comes pre-installed on your motherboard, and CANNOT be removed. It usually uses shared memory, meaning its memory is carved out of your system RAM. Ever wonder why 512MB of RAM shows up in System Information as 448MB? Well, 512MB - 64MB shared memory = 448MB of free RAM.
  • AGP - The most widely used slot today, though it is getting old, having been created around 1999. There are 2 main modes, AGP 4x and AGP 8x, and between them there is practically no difference in framerates. Most cards today support this slot, including the 6600GT, 6800, 6800GT, x800, and even the x850XT.
  • PCI-Express - The newest and fastest video card slot available. It is a bit more expensive to upgrade, but you get what you pay for. PCI-E offers 2x the bandwidth of AGP slots, meaning a chance of a big FPS jump. Almost every single high-end card uses PCI-E, including the newer 7800GTX and the spankin' new x1800XT.
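
The shared-memory arithmetic above is simple enough to sketch; here's a quick illustrative check (the 64MB figure is just one common BIOS setting, not a fixed rule):

```python
# Integrated graphics borrow a chunk of system RAM (shared memory),
# so the OS sees less than what is physically installed.
def free_ram_mb(installed_mb: int, shared_mb: int) -> int:
    """RAM left for the system after integrated video takes its share."""
    return installed_mb - shared_mb

# 512MB installed with 64MB reserved for integrated video:
print(free_ram_mb(512, 64))  # -> 448
```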

What Is Video Card RAM?

You may be wondering why I refer to "video card RAM." Think of a video card as a small computer: it has its own core, its own memory speed, and its own dedicated RAM. This RAM allows for faster processing of textures and determines how much the card can texture per second. Cards come with the following RAM capacities, going from least to greatest:
  • 8MB
  • 16MB
  • 32MB
  • 64MB
  • 128MB
  • 256MB

    ^ Thanks to EternalTempest for this side note. If you have Windows Vista and want the full eye candy of the GUI, 256MB on your video card is a must-have.
  • 512MB

More RAM = better texture processing. The card can also render images and objects faster as well.

Core Speed and Memory Speed

Ah, what makes a video card: the core and memory speed. Think of the core speed as the processor, and the memory speed as standard RAM speed. For example, a 600MHz core and 900MHz memory will do some damage in games; you should get high FPS rates, unless you have hardly any pipelines (I will get to this later). Something like a 250MHz core and 200MHz memory will be mediocre, with lower FPS rates.

So in simple terms: Higher Clock + Memory = Higher FPS

Pipelines

Ah, pipelines. No, not what connects your toilet to the sewer; I mean video card pipelines. Pipelines are basically open transfer channels in your video card that allow the processing of textures and information. The more you have open and working, the better your FPS and quality will be.

For example, say 4 pipelines can render 1.1 billion pixels a second. Well, 16 pipes might render over 6 billion pixels a second. Why? Because there are more working pipes transferring more information.
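
As a rough rule of thumb (real cards vary with architecture and memory bandwidth), theoretical pixel fillrate is just pipelines times core clock, since each pipe can output roughly one pixel per clock. A sketch with era-plausible but made-up clocks:

```python
# Back-of-envelope fillrate: pipelines x core clock (MHz)
# gives megapixels per second.
def fillrate_mpixels(pipelines: int, core_mhz: int) -> int:
    """Theoretical pixel fillrate in megapixels per second."""
    return pipelines * core_mhz

# 4 pipes at 275MHz vs. 16 pipes at 400MHz:
print(fillrate_mpixels(4, 275))   # -> 1100 (about 1.1 billion pixels/s)
print(fillrate_mpixels(16, 400))  # -> 6400 (about 6.4 billion pixels/s)
```

This is why a 16-pipe card can quadruple-plus the throughput of a 4-pipe card even at a similar clock.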

This really comes into play with overclocking, which is covered further down.

Major Video Card Manufacturers

Now, there are two main corporations that produce video cards. ATI, the oldest company, and Nvidia, the fierce competitor.

Nvidia is known for accomplishments such as SLI, while ATI is known for its relatively new CrossFire (the equivalent of SLI) and for being among the first to embrace the PCI-Express architecture.

Really, no company is better, no company is worse. It is all who you like and don't like, pretty much fanboyism. Both have their ups and downs.

Here is a list of some cards that are newer and fairly inexpensive that will do great for GW.

ATI
  • x300 (PCI-E)
  • x600 (PCI-E)
  • x800 (PCI-E/AGP)
  • x800GTO (PCI-E/AGP)
  • x800GT (PCI-E/AGP)
  • x850XT (PCI-E/AGP) <-- Recommended
  • x700 Pro (PCI-E)
  • 9800 Pro (AGP) <-- Recommended
  • 9600 Pro/XT (AGP)

Nvidia
  • FX 5200 (AGP/PCI)
  • FX 5500 (AGP)
  • FX 5700 (AGP)
  • 6200 (PCI-E/AGP)
  • 6600 (AGP/PCI-E)
  • 6600GT (PCI-E/AGP) <-- The best card for your money right now. Will get great FPS on any game at almost any resolution. I recommend!
  • Vanilla 6800 (PCI-E/AGP)
  • 6800GT (PCI-E/AGP) <-- Great higher end card.
  • 6800 Ultra (PCI-E/AGP) <-- Is expensive, and I recommend getting the 7800GT or GTX. It comes with either 256MB or 512MB of RAM. The 512MB is ideal for rendering textures at higher resolutions, like 1600x1200 on Doom 3.
  • 7800GT (PCI-E)
  • 7800GTX (PCI-E)

*Note: Just because ATI sponsors GW does NOT mean their cards perform better in the game.

On With The Show!

Now, how to decide if your card will do OK in GW:

Minimum System Specs:

Windows XP/2000/ME/98
800 MHz Pentium III or equivalent
256 MB RAM
ATI Radeon 8500 or GeForce 3 or 4 MX with 32MB of video memory
2 GB available hard drive space
Internet connection
DirectX 8.0

Now, to be honest, that setup is the bare minimum: 800x600 resolution with a crappy FPS. If you just want to play the game with no eye candy, then fine, that will work. If you want a cheap upgrade from a 32MB card, go with the GeForce4 Ti 4200 with 128MB, or the GeForce MX 4000 with 128MB. Both are about $30-40.

Recommended System Specs:

Windows XP/2000/ME/98
Pentium III 1GHz or equivalent
512 MB RAM
ATI Radeon 9000 or GeForce 4 Ti Series with 64MB of video memory
2 GB available hard drive space
Internet connection

Now we are getting somewhere. If you want the full eye candy, some cards might be able to pull it off at the right resolution. I would recommend the ATI 9550, or the Nvidia 5200 or 5500. They can hold their own and are DirectX 9.0 compatible.

Now here is a list of cards that will get the full eye candy at almost any resolution:
  • 6600GT
  • x800
  • x800GTO
  • x800GT
  • x800XT
  • x850
  • x850XT
  • x850XT PE
  • 6800
  • 6800GT
  • 7800GT
  • 7800GTX
  • x1300
  • x1600
  • x1800
  • x1800XT
  • 9800 Pro
  • 9800XT
  • 9600XT

Pretty much any one of these cards will get you what you want out of the game.

Now for some cards that can pull off pretty good settings:
  • 9550
  • 9600 Pro
  • 9250
  • GeForce MX 4000
  • GeForce Ti 4200
  • FX 5200
  • FX 5500
  • x300
  • x600
  • x700

And then the cards that will pretty much be put on low settings:
  • GeForce 2
  • 9000
  • 8500
  • GeForce 3

Card Resellers/Manufacturers

Here are some of the resellers and manufacturers that produce ATI and Nvidia cards:

ATI
  • Powercolor - Basically the largest manufacturer of ATI cards; has high-quality cards.
  • ATI - The company that designs the cards also sells them directly.
  • Sapphire - Another major reseller; has great support and decent quality cards.

Nvidia
  • BFG - Great cards, most come pre-OC'ed. Great service, support, quality, can't ask for a better company!
  • XFX - Known to have performance cards, but is also known to come up short with service and support.
  • eVGA - My personal favorite company. Has awesome cards, great quality, awesome service and support. Rivals BFG
  • MSI - A great card company, has great PCI-Express cards, and great support to match.
  • Gigabyte - I've had bad experiences with their support, but they do make decent quality cards.
  • AOpen - Not a highly known manufacturer, but does create very nice cards, idle and load temps stay low.

Overclocking

Now the fun starts, OVERCLOCKING!

Overclocking - To push a piece of computer hardware past manufacturer standards, defaults, or settings.

Basically, making it vroom faster

Now, there are several guides on overclocking that go way more in-depth than I could ever go. Here are just a few links if you're interested:

http://www.devhardware.com/c/a/Video...d-Overclocking
http://www.pimprig.com/forums/showth...rerid=&t=16413
http://www.sysopt.com/features/graph...le.php/3549986

Conclusion

That pretty much covers it! If you are still unsure if your card can run GW ok, look around the forum, search for your card and see what others have said.

I like to also follow by this little guide:

32MB RAM - Low settings, 800x600/1024x768
64MB RAM - Low/Medium settings, 800x600/1024x768
128MB RAM - Medium/High settings, 800x600/1024x768/1280x960
256MB RAM - High settings, 800x600/1024x768/1280x960
512MB RAM - High settings, all resolutions
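
The little guide above can be restated as a lookup; this is just the author's tiers as code (the boundaries are this guide's rules of thumb, nothing official):

```python
# VRAM (MB) -> suggested quality settings, per the guide above.
SETTINGS_BY_VRAM_MB = {
    32: "Low",
    64: "Low/Medium",
    128: "Medium/High",
    256: "High",
    512: "High",
}

def suggested_settings(vram_mb: int) -> str:
    """Pick the highest tier at or below the card's VRAM."""
    best = min(SETTINGS_BY_VRAM_MB)  # fall back to the lowest tier
    for tier in sorted(SETTINGS_BY_VRAM_MB):
        if vram_mb >= tier:
            best = tier
    return SETTINGS_BY_VRAM_MB[best]

print(suggested_settings(128))  # -> Medium/High
```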

Note that these tiers also depend on how much system RAM you have and on your CPU.

I am not responsible for any damage caused to your card. This is solely a guide to help you find out whether your card can run Guild Wars or not.

Thanks, and hope this guide helped!

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Great post (rated 5 stars). I would like to add a small tidbit: if you plan on running Windows Vista in full glory (the new interface), a 256MB video card is the minimum requirement. From the looks of it so far, it will run fine on less, but you won't get all the cool new GUI stuff.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Thank you

Yeah I just got Vista the other day and realized that 256MB is a must have. It is very cool

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

You forgot the 6800-Ultra, one of the faster AGP solutions available if you have an older system. These also come in 256MB or 512MB versions.

If possible, try to get a BFG-brand card if you're going with NVidia. Their cards come pre-overclocked out of the box, have full 24/7 support, and lifetime warranties.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Yes, they have the best support, but the price does not account for the puny 25MHz OC they do; you will maybe get a 1-3 FPS boost for extra $$$. It is really not worth it. But if you like their service and support, BFG cannot be beaten.

And the Ultra is mainly sold in 512MB versions, PCI-E. And I could list every gfx card out there, but I picked the most popular that will do good for the $$$.

swaaye

Frost Gate Guardian

Join Date: May 2005

I think the X800GTO cards are the mid-range champs right now actually. They are 12 pixel / 6 vertex pipeline cards with 256MB 256-bit DDR3, running 400/980MHz. You can almost certainly clock them to over 500MHz, at which point they will rival a 6800GT for the price of a 6600GT.

I just upgraded a friend from a 6600GT to a X800GTO (550/1000) and he was rather awestruck at the improvement for the same price.

Whatever you do, stay away from the Geforce FX. Those cards are especially awful, and I cringe at seeing people buy them.

Ultra High End
AGP - ATI Radeon X850XT PE is the fastest there probably will ever be.
PCIe - Geforce 7800GT or GTX, or the Radeon X1800XL or XT.

Midrange
AGP - Radeon X800GTO
PCIe - Radeon X800GTO or GTO2 (can be unlocked to a full X850XT PE if you read up)

Entry Level
AGP - Geforce 6600 non-GT, Radeon X700, or a used Radeon 9700 or 9800 nonSE from eBay!
PCIe - Geforce 6600 or Radeon X700
PCI - If you really are stuck with PCI you should probably upgrade your entire system honestly. The rest of the system is probably slow too. If you really need a card though go with a Radeon 9250 or GeforceFX 5500. There aren't many choices, and none of them are fast.

------------------
Excellent Card Roundups
Tom's Hardware's VGA Charts Summer 2005 PCIe
Tom's Hardware's VGA Charts Summer 2005 AGP
------------------

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Yes, but they lack SM 3.0, and regardless of what people say, it is going to be crucial in future games. I would rather buy an SM 3.0-enabled card, which helps support HDR, already featured in games like Lost Coast and DoD: Source. You have to take into consideration not just performance, but shaders and models as well.

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Does the 6600GT support sm3.0 (where the x800GTO card doesn't)?

swaaye

Frost Gate Guardian

Join Date: May 2005

SM3.0 isn't going to matter one bit if the card's performance isn't there to begin with. The 6600 series arguably is inadequate for real pixel shading tasks as it does not have the massive pixel fillrate needed for things like HDR lighting, etc. And, considering the userbase of SM2a cards, developers will not leave you in the dust if you have a X800 on down.

http://techreport.com/reviews/2004q4.../index.x?pg=10

In current games an X800GTO will absolutely dust a 6600GT, though in Guild Wars they are probably equal, or the 6600GT may actually be faster; I noticed this in a review. But I still pulled 1600x1200 4X AA at 60-90fps on the X800GTO. The extra 128MB of RAM helps a lot with jerkiness too.

I have access to Radeon 7500, 8500, 9500PRO, 9600, 9700, X800GTO along with Geforce 6600GT, 6800Go, and 6800GT. The 8500 on up run Guild Wars extremely well. For antialiasing you should go for Radeon 9800 and Geforce 6600GT on up. 256MB video RAM helps smooth out the loading of textures in game which can create pauses and jitters when turning around for example. The Radeon X800GTO was across the room from my 6800Go and my friend's 6800GT and it was absolutely on par with the 6800GT. Incredible card for $200.

If you really take SM3.0 seriously you should not use anything below a 6800GT. Obviously the 7800 cards are fantastic for SM3.0, and the new Radeon X1800 is even faster. It starts to get a wee bit pricey up there though.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Techie
Yes they have the best support, but the price does not account for the puny 25mhz OC they do, you will maybe get a 1-3 FPS boost for extra $$$. It is really not worth it. But if you like their service and support, BFG cannot be beaten.

And the Ultra is mainly sold in 512MB versions, PCI-E. And I could list every gfx card out there, but I picked the most popular that will do good for the $$$.
BFG cards' memory clocks are also typically cranked +100MHz above spec, and benchmarks show a lot more than a +1-3 fps gain depending on resolution. BFG card heatsinks are usually optimal despite their reference design. Also, the GPU cores they purchase are well above average, since the cards come out of the factory pre-overclocked; this requires more quality control on their side to fish out the good GPU cores from the bad.

BFG's are priced about the same as their competitors according to www.pricewatch.com.

The main reason I replied was to note that your list was incomplete. Also, if you're buying at this performance range, price typically isn't a concern.

swaaye

Frost Gate Guardian

Join Date: May 2005

The OC in the name becomes far less useful as you go down the product line. For example, the Geforce FX5200 OC is only about 20MHz above stock on both RAM and GPU. Remember, though, that anything in the FX series should not be bought anymore. Anything below an FX5950 Ultra is absolutely pathetic for modern games. Even the 5950 Ultra has trouble competing with a lowly Radeon 9600 at times.

My friend's 6800GT OC is running 370MHz instead of 350MHz on core, not sure about RAM. If the prices are equal the decision is obvious, but considering that basically ALL cards can overclock to BFG's OC levels you should not pay more for them.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by swaaye
The OC in the name becomes far less useful as you go down. For example, the Geforce FX5200 OC is like only 20MHz above stock on both RAM and GPU. Remember though that anything in the FX series should not be bought anymore. Anything below a FX5950 Ultra is absolutely pathetic for modern games. Even the 5950Ultra has trouble competing with a lowly Radeon 9600 at times.

My friend's 6800GT OC is running 370MHz instead of 350MHz on core, not sure about RAM. If the prices are equal the decision is obvious, but considering that basically ALL cards can overclock to BFG's OC levels you should not pay more for them.
Most NVidia GPU's can handle a +20mhz overclock, but if you have a higher grade core, you can clock much higher. By going with a pre-screened GPU from manufacturers like BFG mounting better-than-average heat sinks, you do give yourself a little more head-room for higher overclocks. Even if you choose not to overclock, that lifetime warranty will still come in handy if Murphy's laws decide to assert themselves.

Check out pricewatch... I'm seeing similar prices between BFG and other card makers.

swaaye

Frost Gate Guardian

Join Date: May 2005

Lord Shar, I agree with you 100%. But if you're not an overclocker, you should probably just go with the cheapest card.

Also here are my 3dmark2005 scores from a few cards that I've tested lately. All default settings.

X800GTO @ 550/1000 = 5320
6800GT @ 370/1020 = 5123
6800Go @ 370/770 = 4355
6600GT @ 590/1200 = 3997
9700 @ 390/350(700) = 2572

Systems were not identical but they are quite close.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Oh, and the GT wins because of its higher clock speed; why not unlock some pipes in that 6800GT and let's see what happens.

Here are some scores I got with my friends' cards:

Vanilla 6800 450/1020 - 4500
6600GT 600/1180 - 5200
6800GT at 420/1100 with pipes unlocked - 5700

I don't know anyone with an x800GT so I can't compare. But it makes a big difference what pipes and OC'ing can do. That and my dry ice

Oh, and btw you are way off about the 6800GT coming 100+ mhz out of the box.

eVGA 6800GT: 350mhz core: http://www.newegg.com/Product/Produc...82E16814130215

BFG 6800GT OC: 370mhz core: http://www.newegg.com/Product/Produc...82E16814143025

Take a look at the price difference, about $25 for a 20mhz increase.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Techie
Oh and the GT wins because of higher clock speed, why not unlock some pipes in that 6800GT and lets see what happens.

Here are some scores I got my with my friends cards:

Vanilla 6800 450/1020 - 4500
6600GT 600/1180 - 5200
6800GT at 420/1100 with pipes unlocked - 5700

I don't know anyone with an x800GT so I can't compare. But it makes a big difference what pipes and OC'ing can do. That and my dry ice

Oh, and btw you are way off about the 6800GT coming 100+ mhz out of the box.

eVGA 6800GT: 350mhz core: http://www.newegg.com/Product/Produc...82E16814130215

BFG 6800GT OC: 370mhz core: http://www.newegg.com/Product/Produc...82E16814143025

Take a look at the price difference, about $25 for a 20mhz increase.
Both the 6800 GT and Ultra have full 16 pipes enabled, so what are you unlocking???

The BFG 7800GTXs run their memory clocks at 1300MHz instead of the stock 1200MHz, but this doesn't apply to the 6800 predecessors. Notice I didn't name any Nvidia GPUs in my last post. Also, you're confusing core clock with memory clock speeds.

An extra $25 for +20MHz, a lifetime warranty, a better heatsink, a higher grade core, and 24/7 support is easily worth it. This is a no-brainer.

twicky_kid

Furnace Stoker

Join Date: Jun 2005

Quite Vulgar [FUN]

few questions for ya. i have been having a problem after i added my video card. it only happens in certain situations but i haven't been able to figure it out.

system specs

1.6 ghz athlon cpu
1 ghz ddr ram
R9600XT-VIO ATI video card

that video card is smoking. it has features that the normal 9600 doesn't have. it has a 500 mhz core speed (built-in overdrive for safe OC). not sure what the speed is after OC (probably 600+). only 4 pipelines - didn't know about that till i read this post. i'll be sure to look at that when i buy a new card.

my monitor is a sony, not sure of the model, made for graphics engineering and is HD ready ($60 for damn cables). right now i'm set on 1024x768 resolution (highest i can go is 2048x1536).

the problem that i have is an acute drop in fps. i ran a torture test and monitored the fps. this program had ball bearings flying around the screen while the camera was moving extremely fast. it held steady at 35 fps, but every 5 seconds or so it would drop to 3 fps then come back up. when i play GW i run it in windowed mode. my 1 gig of ram lets me run many different things while i play (mainly music on winamp). if i minimize my GW window, open my browser or winamp, and then bring GW back up, i get the acute drop in fps and the game skips. i cannot get this to stop while i'm playing. i shut the game down completely and bring it back up; sometimes it stops and sometimes there is nothing i can do but restart the comp. this is very annoying in 8vs8 when there is a lot going on. the skips are killing my playability.

i have installed the drivers that came with the card. after that i checked for updates. i installed the most recent update and it shot everything to hell, so i rolled it back. toned down the resolution, still no help. stopped running other programs and it will still skip if i minimize and restore. can't figure this out.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by twicky_kid
few questions for ya. i have been having a problem after i added my video card. it only happens in certian situations but i haven't been able to figure it out.

system specs

1.6 ghz athlon cpu
1 ghz ddr ram
R9600XT-VIO ATI video card

that video card is smoking. it has features that the normal 9600 doesn't have. it has a 500 mhz core speed (built in overdrive for safe OC). not sure what speed is after OC (probly 600+). only 4 pipelines didn't know about that till i read this post i'll be sure to look at that when i buy a new card.

my monitor is a sony, not sure of model, made for graphics enginering and is HD ready ($60 for damn cables). right now i'm set on 1024x768 resolution (highest i can go is 2048x1526).

the problem that i have is an acute drop in fps. i ran a torture test and monitored the fps. this program had ball bearings flying around the screen while the camera was moving extremely fast. it held steady at 35 fps but every 5 seconds or so it would drop to 3 fps then come back up. when i play GW i run it window mode. my 1 gig of ram allows me to run many different things while i play (mainly music on winamp). if i minimize my GW window and run open my browser or winamp and start it up when i bring GW back up i get the acute drop in fps and the game skips. i cannot get this to stop while i'm playing. i shut the game down completely and bring it back up. sometimes it stops and sometimes there is nothing i can do but restart the comp. this is very annoying in 8vs8 when there is alot going on. the skips are killing my playablity.

i have installed the drivers that came with the card. after that i checked for updates. installed recent update and shot everything to hell so i rolled it back. toned down resolution still no help. stop running other programs and will still skip if i minimize and restore. can't figure this out.
You might want to check for spyware in the background... these can cause momentary spikes in CPU load while trying to log keys and make outbound connections.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Ok so I added the following:
  • x800GTO/x800GT under recommended
  • ATI/Nvidia Popular Manufacturers

Hope that suits everyone.

NeXuS8

Lion's Arch Merchant

Join Date: Jul 2005

Wild Bladez

W/Mo

Quote:
Originally Posted by EternalTempest
Does the 6600GT support sm3.0 (where the x800GTO card doesn't)?
Yeah, it does, and none of the ATI line does except the new X1800 series.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Techie
Ok so I added the following:
  • x800GTO/x800GT under recommended
  • ATI/Nvidia Popular Manufacturers

Hope that suits everyone.
Looks good...

Now we need a thread for mice! j/k

A NERD1989

Krytan Explorer

Join Date: Aug 2005

Elite Black Ops

W/Mo

what is overclocking? i can't find out anywhere. what does it mean? what does it do?

swaaye

Frost Gate Guardian

Join Date: May 2005

I wouldn't put X800GT in there. It's an 8-pipe card that costs almost the same as the 12-pipe X800GTO (stupid naming!!).

Couple of other ideas:

Add links to these excellent card roundups:
Tom's Hardware's VGA Charts Summer 2005 PCIe
Tom's Hardware's VGA Charts Summer 2005 AGP
Beyond3D's fantastic board/chip info chart <-- This is bar-none the best place to find out what you actually are buying or have already. Incredible.

Also, please tell people that they should NOT buy Geforce FX cards. As in any NV card with FX in the title.

My Radeon 8500 can run Guild Wars at 1680x1050 high quality. The post processing effects should be turned off though. But 8500 can definitely do a lot better than low quality. I tried GW on a Radeon 7200 PCI and a Geforce 2 MX200 (lol) and those cards had to run about 800x600 low quality.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Ok I will add the following, and the 8500 correction.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

I'm still not sure why the 6800 Ultra isn't even mentioned in the full-eye-candy card list. The reason can't be price, since the 7800GTX is also listed. It is noticeably faster than my 6800GT in the same box (yes, I have both) and is still one of the quickest AGP solutions available.

I also agree with Swaaye's previous post mentioning that the FX-series is utter trash...

Old Dood

Middle-Age-Man

Join Date: May 2005

Lansing, Mi

W/Mo

The November issue of Millinum PC magazine has a nice video card list that breaks cards down into price groups. Good review. The October issue has a decent how-to article on building a computer.

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Quote:
Originally Posted by swaaye
Also, please tell people that they should NOT buy Geforce FX cards. As in any NV card with FX in the title.
Quote:
Originally Posted by lord_shar
I also agree with Swaaye's previous post mentioning that the FX-series is utter trash...
My Geforce FX 5700 Ultra 128MB does very well and plays GW great, as well as many other games. I agree there is no reason to go with an FX card now, but that is because the Geforce 6x series is out and the 6600 is extremely affordable, with an AGP version as well. I do admit that later cards were much better, but at the time it was a positive step up from the Geforce 4 series, and the 6xxx series did not exist yet.

swaaye

Frost Gate Guardian

Join Date: May 2005

If it runs well I'm quite happy for you. Guild Wars is not pixel shader heavy and that is why. In fact I believe only the post-processing effect uses a shader.

It is not debatable that the Geforce FX series has absolutely terrible pixel shading performance. The 4 year old Radeon 9700, which came out even before the first GeforceFX card, is several times faster than the fastest GeforceFX at pure shading. Most games still are not very heavy on shading (undoubtedly because of the userbase of cards with limited shading capability). However games like FarCry, Everquest 2, Half Life 2, FEAR, and others like them will run EXTREMELY poorly unless the game has a specifically optimized path for GeforceFX.

The biggest reason R300 (9700) is faster than NV30 (FX) is that 9700 is really a more standard GPU design than NV30 was. nVidia created a pretty unique chip, one that changed the way graphics pipelines worked, and it was really ahead of its time. This unfortunately carried with it serious performance challenges though. ATI played it safe while NV played for the future. Neither was a better decision really, but for performance ATI definitely won out. 9700 was an absolute beast for the time. NV3x has less processing resources for DX9 effects than R300 by a long shot, and NV3x requires advanced compiler technology in the driver along with proper utilization by the game to get decent performance. R300 was a LOT easier to work with.

Half Life 2, for example, actually runs FX cards in DirectX 8 mode because although they have "full" DX9 support, the performance is just too terrible for them to run at that level, even with the developers attempting to tweak the game specifically for these cards.

The GeforceFX is basically a highly tuned DirectX 8-level card with DX9 shading capabilities present as features, but without viable performance. The Geforce 4 Ti series is actually faster than most Geforce FX cards, though your 5700 is probably faster than or at least equal to a Ti4600; a Geforce FX5600 or lesser card is not. I had a friend upgrade from a Ti4400 to an FX5600 Ultra once, and he told me flat out that he was quite disappointed.

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

The reason why I responded the way I did was due to the "harsh" language used to slam the FX.

I grant you that the ATI 96xx-and-up generation was better than the Nvidia Geforce FX 5700 Ultra at the time, and its pixel shader was much better in that battle than the Nvidia FX pixel shader.

The situation has since flipped: the Nvidia 6 and 7 series are better than current ATI, and the brand-new upper-end ATI stuff just now coming out is on par, slightly better in some cases. In 6 months, who knows; it will be next-gen ATI vs. next-gen Nvidia, and I will be reviewing many different websites and reviews to see which is better.

Quote:
However games like FarCry, Everquest 2, Half Life 2, FEAR, and others like them will run EXTREMELY poorly unless the game has a specifically optimized path for GeforceFX.
A lot of games will drop to DX8 mode due to the sheer number of cards out there that don't support full DX9. I do own Far Cry, HL2, and Doom 3; they all play on at least medium or higher settings, look great, and are very playable. I also play with EAX turned on (Doom 3 got the support from the most current patch plus having the OpenAL drivers installed).

I'm also aware that the FX line is a bit out of date for the next-gen games coming out now; it's two generations behind. I will be picking up FEAR very soon and expect to play it on low to medium settings.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by EternalTempest
The reason why I responded the way I did was due to the "harsh" language used to slam the FX.

...<SNIP>...
Sorry about that -- I'll keep the rhetoric more toned down next time.

I tend to favor NVidia cards over ATI. I've owned every generation of GeForce GPU until the FX5000 line, which I completely skipped due to its serious performance issues. My current graphics card is a 6800 Ultra. The 7800 GTX is my next planned video card, but I haven't purchased it yet since I'm trying to decide between new PCIe-equipped desktop vs. laptop (Dell M170 w/ Nvidia 7800-Go GTX).

Ironsword

Academy Page

Join Date: May 2005

Newport News Va

Unknown Warriors of Ascalon

W/R

I've got an odd problem. I have a desktop PC with 768MB of system RAM, an Nvidia FX 5200 128MB card, and a 3.0GHz CPU, but my game locks up when I open my inventory and the textures flicker quite a bit in-game. Meanwhile my laptop, with a 2.8GHz processor, 512MB of system RAM, and an ATI 9200 64MB video card, runs perfectly on high settings. I really want to play GW on my desktop PC, but I can't deal with the game locking up the way it does, so I'm forced to play on an inferior laptop because it runs the game better.

I don't know why this is happening; it shouldn't be, since the video card in my desktop is better and there's more system RAM and CPU power. You guys may say that even though ATI sponsors the game there won't be any difference with Nvidia cards, but I've noticed otherwise, and it's really frustrating because I had the exact opposite problem with BF2, which is why I took an ATI card out of my desktop and replaced it with the Nvidia. For now I'll stick to the laptop until I can get a really good ATI video card that will run all my games properly.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Ironsword
I've got an odd problem. My desktop PC has 768 MB system RAM, an Nvidia FX 5200 128 MB video card, and a 3.0 GHz CPU, but the game locks up when I open my inventory, and textures flicker quite a bit in-game. Meanwhile my laptop, with a 2.8 GHz processor, 512 MB system RAM, and an ATI 9200 64 MB video card, runs perfectly on high settings. I really want to play GW on my desktop PC, but I can't deal with the game locking up the way it does, so I'm forced to play on an inferior laptop because it runs the game better.

I don't know why this is happening, but it shouldn't be, since the video card in my desktop is better and there's more system RAM and CPU power. You may say that even though ATI supports the game, there won't be any difference with Nvidia cards, but I've noticed otherwise, and it's really frustrating because I had the exact opposite problem with BF2, which is why I swapped the ATI card in my desktop for the Nvidia one. For now I'll stick to the laptop until I can get a really good ATI video card that will run all my games properly.
If you've swapped video cards with different GPU brands on your desktop, then you might have mixed ATI + NVidia drivers loaded in memory. There are a couple of video driver cleaner utilities out there, but I can't think of any names at the moment.

Also, the latest NVidia WHQL drivers have a known texture-corruption bug in GW. Loading the previous NVidia driver set should correct this.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Well oddly enough, DriverCleaner is the name of a great program :O

Google it and check it out.

And I will add 6800 Ultra, but seriously for that dough just get a 7800GTX.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Techie
...<SNIP>...

And I will add 6800 Ultra, but seriously for that dough just get a 7800GTX.
I agree, but the 7800GTX is only available for PCIe slots. Older motherboards will only have AGP slots available, so for an older system, the 7800-series isn't an option. And yes, the 6800Ultra can run full eye candy at all resolutions.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

Yes, I added it. It's great for running games at high resolutions; I'll agree with you on that. And the fact that it's AGP is another bonus. I'd recommend it if you want full eyecandy at the highest, or close to highest, resolutions.

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Nvidia just added the GeForce 6800 GS to their lineup.
The 68xx series cards, from worst to best:

GeForce 6800 LE -> GeForce 6800 -> **GeForce 6800 GS** -> GeForce 6800 GT -> GeForce 6800 Ultra.

I also read that the 6800 Ultra is no longer being made.
http://www.guru3d.com/article/Videocards/278/

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Strange, they have two 7800GTX models listed, but the 512MB version has a much higher GPU and memory clock!
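For a sense of why that memory clock gap matters: peak memory bandwidth scales linearly with the effective memory clock. Here's a quick back-of-the-envelope sketch; the clock figures in the comments are approximate launch specs for illustration, not verified numbers.

```python
# Rough peak-bandwidth estimate: effective memory clock (MHz) x bus
# width (bits, divided by 8 to get bytes) gives bytes/s of throughput.
def peak_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s for a given clock and bus width."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Approximate specs (assumed, for illustration): both cards use a
# 256-bit bus; the 512MB 7800GTX shipped with much faster GDDR3.
gtx_256 = peak_bandwidth_gb_s(1200, 256)  # 7800GTX 256MB, ~600 MHz DDR
gtx_512 = peak_bandwidth_gb_s(1700, 256)  # 7800GTX 512MB, ~850 MHz DDR
```

On these assumed clocks, the 512MB card's higher memory clock alone buys roughly 40% more peak bandwidth on the same bus width.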

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Quote:
Originally Posted by lord_shar
Strange, they have two listed 7800GTX, but the 512MB version has a much higher GPU + memory clock!
I didn't notice it until you pointed it out. Something else in the conclusion caught my eye: this card beats the yet-to-be-released ATI x1600 XT at the same price point. The one thing I couldn't find is whether the new card comes in AGP and PCI-Express or PCI-Express only. I'm pretty sure they tweaked the core GPU to tune it better than the existing 6x series cards.

Techie

Frost Gate Guardian

Join Date: Nov 2005

Fairfield, Ohio

Mo/W

The x1600XT will be a highly talked-about card once released; I might actually be able to test one, depending on availability.

Oh and the 6800LE is a superb card for unlocking pipes and shaders. Leadtek's stock fan can do wonders.

MaglorD

Jungle Guide

Join Date: May 2005

Quote:
Originally Posted by EternalTempest

The situation has flipped: the Nvidia 6 and 7 series are better than current ATI, and the brand-new upper-end ATI stuff just now coming out is on par, slightly better in some cases. In 6 months, who knows; it will be next-gen ATI vs next-gen Nvidia, and I will be reviewing many different web sites and reviews on them to see which is better.

Comparing the Nvidia 6000 series and ATI's X800 series, Nvidia isn't better. The 6800 Ultra cannot match the ATI X850 XTPE when AA/AF are notched up. Except for SM3.0 support, which arguably favours Nvidia over ATI's SM2.0b, the Nvidias aren't better.

The Nvidia 7000 series is meant to be compared with ATI's X1800 series. The jury is still out on this...

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by MaglorD
Comparing Nvidia 6000 series and X800 series from ATI, Nvidia isn't better. The Nvidia card 6800 Ultra cannot match the ATI X850 XTPE when AA/AF are notched up. Except for SM3.0 support, which arguably favours Nvidia over ATI's SM2.0b, the Nvidias aren't better.
ATI's benchmarks with the x800s and x850s are a bit skewed by the Catalyst driver's dynamic texture filtering. Also referred to as "bri-linear" filtering, it dynamically shifts between bilinear and trilinear filtering modes to achieve the best benchmarks. However, the X800s/X850s could not perform true trilinear filtering unless you turned off ATI's filtering optimizations, and once you disabled this feature, they fell behind NVidia's 6800s. ATI caught a lot of flak for this and finally conceded by adding an "off" switch to its optimized texture filtering.

Why does the above matter? Simple: ATI was compromising video quality for the sake of benchmarks. NVidia did this in the past as well with their fx5000 series, so they're not squeaky-clean either. However, neither company should be resorting to such driver tweaks given the speed of their current card lines.
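To make the bilinear/trilinear distinction above concrete, here's a toy sketch (not either vendor's actual hardware path): trilinear filtering takes bilinear samples from the two mipmap levels bracketing the desired level of detail and blends them. That blend between mip levels is exactly the step a "bri-linear" optimization cuts corners on to save fill rate.

```python
# Toy model of texture filtering. Mipmaps are 2D lists of floats;
# level 0 is full resolution, each later level is half the size.
def lerp(a, b, t):
    return a + (b - a) * t

def bilinear_sample(mip, u, v):
    """Bilinearly blend the four texels around fractional coords (u, v)."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(mip[0]) - 1)
    y1 = min(y0 + 1, len(mip) - 1)
    fx, fy = u - x0, v - y0
    top = lerp(mip[y0][x0], mip[y0][x1], fx)
    bot = lerp(mip[y1][x0], mip[y1][x1], fx)
    return lerp(top, bot, fy)

def trilinear_sample(mips, u, v, lod):
    """Blend bilinear samples from the two mip levels bracketing `lod`.

    A "bri-linear" shortcut would snap `t` toward 0 or 1 for most of
    the range, doing the cheaper single-level bilinear fetch instead.
    """
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    t = lod - int(lod)
    # Scale texture coordinates down for each half-size mip level.
    s_lo = bilinear_sample(mips[lo], u / (2 ** lo), v / (2 ** lo))
    s_hi = bilinear_sample(mips[hi], u / (2 ** hi), v / (2 ** hi))
    return lerp(s_lo, s_hi, t)
```

The visible cost of skipping the final blend is banding at mip transitions, which is why reviewers at the time tested with the optimizations disabled.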

Drivers can always be updated, but you're stuck with the video card until you toss it, so you might as well get the best hardware possible until the next best thing comes out.

Quote:
Originally Posted by MaglorD
The Nvidia 7000 series is meant to be compared with ATI's X1800 series. The jury is still out on this...
Actually, it's the other way around -- NVidia dropped a bombshell with the 7800GTX's release and immediate availability in mass volume. ATI could not immediately counter this move, so here we are 5 months later, just seeing the X1800s go through the usual trickle of a retail release.

ATI's X1800 series will be faster in some games (e.g., Far Cry) and slower in others (Quake 3/4, Doom 3, etc.). So it looks like equilibrium will be restored between the two competitors, which is good news for consumers like us.