graphics cards?

Mss Drizzt

Lion's Arch Merchant

Join Date: Mar 2005

communist state of NJ

I would love it if they used an AMD 2400+ for the GPU. Then we would have one kick-ass graphics card.

vas moon

Ascalonian Squire

Join Date: Mar 2005

shiverpeak

Zealots of Shiverpeak

Quote:
Originally Posted by Kirbie
Are you sure you didn't have any problems with the GeForce 6800 GT (256MB DDR3) in Guild Wars?

I also have a GeForce 6800 GT and a gig of RAM, but I have two problems, which I posted in the previous thread... and which I reported to ArenaNet.

What is the manufacturer of your card, and what driver were you using?
Mine is from BFG, and my driver was ForceWare 71.84.

If the driver was different, it makes sense (that it is a driver problem)... but if we were using the same driver....

And yes... I notice that the image with the Nvidia card (6800 GT) looks more luxurious.

I'm using the 6.6.9.3 (10/29/2004) driver from Nvidia, and the card is also from BFG. Did you get the 'new' driver to work?

Kirbie

Lion's Arch Merchant

Join Date: Mar 2005

Quote:
Originally Posted by vas moon
I'm using the 6.6.9.3 (10/29/2004) driver from Nvidia, and the card is also from BFG. Did you get the 'new' driver to work?

Hmm... it must be a driver problem.
Yes, my new driver works well on the desktop... except for one visible problem:
the temperature display shows the temperature around 20% higher than it should.

Until the next beta event... I don't know whether the graphics problem I'm having is fixed or not.
But I know for sure now that it is a driver problem, since 66.93 worked well... and 71.84 doesn't.
Thanks, vas moon.

LordFu

Ascalonian Squire

Join Date: Feb 2005

Mid-MO

|Dickheads 1n Excess|

Quote:
Originally Posted by Kirbie
Hmm... it must be a driver problem.
The newest drivers have a confirmed bug that causes certain cards to run significantly hotter, along with several unconfirmed ones. I suggest rolling back your drivers until the next version. Newer is not always better.

Kirbie

Lion's Arch Merchant

Join Date: Mar 2005

Quote:
Originally Posted by LordFu
The newest drivers have a confirmed bug that causes certain cards to run significantly hotter, along with several unconfirmed ones. I suggest rolling back your drivers until the next version. Newer is not always better.
Are you sure about this?
Because according to the driver's manual, it doesn't say the driver will "make" your card run hotter, but it does say it will "report" the temperature 20% higher than the actual temperature.
Which means... technically, if my card is running at 50 Celsius, the temperature display will report that my card is running at 60 Celsius. So my card is not really running at 60 Celsius, but at 50.
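
Just to spell that arithmetic out (a quick sketch, assuming the over-report really is a flat 20%):

Code:
#include <cstdio>

// Undo a flat percentage over-report: reported = actual * (1 + over_report)
double actual_temp_c(double reported_c, double over_report = 0.20) {
    return reported_c / (1.0 + over_report);
}

int main() {
    // A reading of 60 C corresponds to a real temperature of 50 C, matching the example above.
    std::printf("reported 60 C -> actual %.0f C\n", actual_temp_c(60.0));
    return 0;
}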

What was your source of information?
I got mine from the official Nvidia driver manual... and that was the reason I kept the driver. If it said the driver would "make" my graphics card run hotter... then I obviously would have rolled back to the previous one.
After all... no one wants to run things hotter (technically speaking), right?

So which is the correct information? Please help me with this... so that I can make my decision based on the correct information.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Nvidia has shaded the truth a bit at times in the past.

Have you taken a temp probe and checked the actual temp?

It might be reporting what it actually sees.

If it is simply reporting high, then that is something very easy to fix.

AS FOR VALUE

I am waiting for the results of a 6600 GT SLI test, which should be very interesting if it blows off a $500 6800 Ultra.

LordFu

Ascalonian Squire

Join Date: Feb 2005

Mid-MO

|Dickheads 1n Excess|

You're right, the temps just display incorrectly according to Nvidia. There are some issues with the drivers, though. Check this out and see if any of these confirmed and open issues are affecting you.

http://download.nvidia.com/Windows/7...Notes_7184.pdf

Mss Drizzt

Lion's Arch Merchant

Join Date: Mar 2005

communist state of NJ

The problem that Nvidia cards have is with temperature. If the card is running hot, then you will not see the performance that you should. The same goes for the RAM on the card.

I second the suggestion to take an outside temp probe and see what is what.

Kirbie

Lion's Arch Merchant

Join Date: Mar 2005

Thanks, all.
Just in case... I rolled back my driver to the previous one.
Better to prevent it than to feel sorry later... so I decided to play it safe.
The current driver has a few problems (like the two I mentioned previously) with Guild Wars anyway.
Again... thanks.

Oh... SLI will perform well... the GeForce 6600 is a pretty good card too.
I hear it will roughly double performance, as each card renders half of the screen.
It may raise the framerates... but I'm not sure about improving the graphical effects.
Thanks.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Quote:
Originally Posted by Kirbie

Oh... SLI will perform well... the GeForce 6600 is a pretty good card too.
I hear it will roughly double performance, as each card renders half of the screen.
It may raise the framerates... but I'm not sure about improving the graphical effects.
Thanks.
I know.

The only question is whether a pair of sub-$200 6600 GT cards will beat a $500-550 6800 Ultra single-card solution.

The detail will be the same, but in this case the FPS is the variable.

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

No, they will not.

Two 6600s are almost equal to a single 6800 GT, but not quite, since the GT has 256 MB of RAM while the 6600s have only 128 MB each, and the memory is not doubled in SLI.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Sometimes real-life figures fly in the face of paper assumptions, which is why I am waiting for the test results.

What they benchmark at will be of interest.

Academic, of course, as I could not afford either solution.

Mss Drizzt

Lion's Arch Merchant

Join Date: Mar 2005

communist state of NJ

Here you go, check it out for yourselves.

http://www.guru3d.com/article/Videocards/186/1/

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

Or
http://www.techreport.com/reviews/20...i/index.x?pg=1


And that other article supports it, saying: "6600GT in SLI mode? You will pay the same price and performance wise the 6800 GT will be better in higher resolutions as it has 256-bit memory in an actual 256 MB configuration. From that point of view you can't beat the 6800 GT.

I'm throwing in yet another little something. Buy a 6800 GT and you will be able to upgrade to SLI in the future when you need it. If you buy 2 6600GT's now your PCI-Express graphics card slots are full.

So the only reason not to buy a 6800 GT would be your budget. If you can not afford a 6800GT, well then a 6600GT makes a lot more sense. With that in mind you can buy another 6600GT when you can afford it. So the logic behind this is a little bit complex. But if we forget the "cool" factor then buying a 6800 GT would make more sense."

Lansing Kai Don

Banned

Join Date: Mar 2005

Kansas

Quote:
Originally Posted by Virtuoso
The 9800Pro will blow you away if you find the FX5200 acceptable. And the 256MB of RAM on the FX5200? A marketing gimmick. That card will see minimal performance increase, if any, by bumping it up to 256MB RAM.

-Virt
I guess my audiophile nature took priority, because I feel like I lost value here. The 5200 did just as beautifully (if not more so... because now UT2004 lags like the dickens... on lower settings than I used with the 5200). Sorry, but I am NOT impressed AGAIN with ATI... my vote is for nVidia (you live and learn, I guess).

Lansing Kai Don

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

I have used both, and I like both equally. But I am sorry, the FX series from nVIDIA was crap.

Kirbie

Lion's Arch Merchant

Join Date: Mar 2005

Quote:
Originally Posted by Lews
I have used both, and I like both equally. But I am sorry, the FX series from nVIDIA was crap.
Yes... I have used both and like them equally too... each company has its own unique pros and cons.
So I go with performance for the price when picking my card.
This year... the GeForce 6800 GT has been chosen... hopefully it will last a long time, though. Technology is such a headache sometimes.

Mss Drizzt

Lion's Arch Merchant

Join Date: Mar 2005

communist state of NJ

Quote:
Originally Posted by Kirbie
Yes... I have used both and like them equally too... each company has its own unique pros and cons.
So I go with performance for the price when picking my card.
This year... the GeForce 6800 GT has been chosen... hopefully it will last a long time, though. Technology is such a headache sometimes.

18 months. That is how fast things have been doubling since the beginning, with a 3-year life span. Gee, that just sucks now that I see it in writing.
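
Doing the math on that (a back-of-the-envelope sketch, assuming a clean doubling every 18 months):

Code:
#include <cstdio>
#include <cmath>

int main() {
    const double doubling_period_months = 18.0;
    const double lifespan_months = 36.0; // the 3-year life span mentioned above
    // 36 / 18 = 2 doublings, so the top cards end up roughly 4x faster by the end of the life span.
    double factor = std::pow(2.0, lifespan_months / doubling_period_months);
    std::printf("After %.0f months the top cards are ~%.0fx faster than what you bought.\n",
                lifespan_months, factor);
    return 0;
}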

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

http://www.bestbuy.com/site/olspage....&ref=10&loc=01

The price has been chopped to $59.99, and an additional $20 rebate makes it $39.99. The card is better than the recommended card for the game, so do you think it'd be a decent card to buy just based on the price? It seems like a fairly good card, especially if it's only $40.

Edit: Ehh, but it needs an AGP interface, and last I knew my motherboard only has PCI slots...

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Most recent motherboards (the last 5 years or so, and newer) have PCI slots and an AGP SLOT.

Look at the motherboard and see if there are 4-5 places to insert a card in one color, plus a different-colored slot offset maybe an inch from the others (4 in a row and 1 maybe an inch higher).

It is a 40 dollar card, and in this case you WILL get only what you pay for.

Note that it also has ONLY a 64-bit memory bus, while the others have 128-bit and 256-bit buses.

CHOKE POINT for data transfer.
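
To put rough numbers on that choke point (an illustrative sketch; the 400 MHz effective memory clock below is just an example, not the spec of any of these cards):

Code:
#include <cstdio>

int main() {
    // Peak memory bandwidth = (bus width in bytes) * effective memory clock.
    const double effective_clock_hz = 400e6; // illustrative DDR clock, not a real card's spec
    const int bus_widths_bits[] = {64, 128, 256};

    for (int bits : bus_widths_bits) {
        double gb_per_s = (bits / 8.0) * effective_clock_hz / 1e9;
        std::printf("%3d-bit bus: ~%4.1f GB/s\n", bits, gb_per_s);
    }
    // Same clock, very different ceilings: ~3.2, ~6.4, and ~12.8 GB/s.
    return 0;
}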

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

Quote:
Originally Posted by Loviatar
Most recent motherboards (the last 5 years or so, and newer) have PCI slots and an AGP SLOT.

Look at the motherboard and see if there are 4-5 places to insert a card in one color, plus a different-colored slot offset maybe an inch from the others (4 in a row and 1 maybe an inch higher).

It is a 40 dollar card, and in this case you WILL get only what you pay for.

Note that it also has ONLY a 64-bit memory bus, while the others have 128-bit and 256-bit buses.

CHOKE POINT for data transfer.
Screw it then, might as well just wait. Kind of figured that it was just a product they wanted to get rid of, and I'd rather get something that will be decent in quality and value.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Quote:
Originally Posted by William of Orange
Screw it then, might as well just wait. Kind of figured that it was just a product they wanted to get rid of, and I'd rather get something that will be decent in quality and value.
Just what I do.

Drool at the top stuff and wait until it comes way down in price.

Then buy the budget version of the top dog.

6600 GT instead of 6800 Ultra comes to mind.

And it will last me for its full 3-year warranty (Leadtek even supplies an overclocking utility of their own).

Zantos

Pre-Searing Cadet

Join Date: Mar 2005

I am using a laptop (PCG-GRV680) with a built-in ATI Mobility Radeon 9000.

Will I have problems running the game?

I came across a post on the Guild Wars VN boards called
"Any fix for the ATI Radeon bug yet?"

It says that laptops with the ATI 9000 will not work and must use "-noshaders". Is this true?

What exactly is "-noshaders"?

I am really looking forward to the release and to playing the game on my laptop, but I will be disappointed if it will not run.

tastegw

Krytan Explorer

Join Date: Mar 2005

SoCal

E/

I use a GeForce 4 MX 420, which is very small compared to what's out now, but it runs super smooth so far.

The only problem I have is that I don't see the aura on the bosses, which really sucks.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

Quote:
Originally Posted by SuperJ24
I use a GeForce 4 MX 420, which is very small compared to what's out now, but it runs super smooth so far.

The only problem I have is that I don't see the aura on the bosses, which really sucks.
That is because Nvidia cut the cost of the MX line by removing everything that made it a GF4 and reduced it to a fast GF2 video card.

What made the difference between the GF2 line and the GF3/GF4 was pixel and vertex shaders.

The reason you don't see the glow is that it uses DirectX 8/9 features, and the MX line can only handle DirectX 7 features (same as GF2 cards).
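
If you want to see what your card actually exposes, a quick Direct3D 9 caps check will print the shader versions the driver reports (a minimal sketch; it assumes the DirectX 9 SDK headers and linking against d3d9.lib):

Code:
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 is not available\n"); return 1; }

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        std::printf("pixel shader %u.%u, vertex shader %u.%u\n",
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
        // A card with no programmable pixel shaders (like the GF4 MX) reports 0.x here,
        // which is why DirectX 8/9 effects such as the boss glow get skipped on it.
    }
    d3d->Release();
    return 0;
}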

jdwoody

Krytan Explorer

Join Date: Feb 2005

Austin

I have a 6800 GT, and Guild Wars looks and plays beautifully! I showed a guy at work one of my screenshots (1280x1024 with 4x anti-aliasing) and he called BS that it was really a game...

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

So I just bought myself the ATI Radeon 9250 PCI card, 256 MB version. Just getting ready to install it now, so hopefully everything goes well. I'll be checking back here if anything goes wrong, and I'm guessing I'll mess it up the first time around. Hopefully this "ginormous" Getting Started book keeps me on the right track...

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

$130:

http://www.newegg.com/app/ViewProduc...102-369&depa=1

$125:
http://www.newegg.com/app/ViewProduc...102-510&depa=1

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

Hallelujah!!!

The new card's in, and the graphics are absolutely beautiful. I opened up the Guild Wars login screen, there was practically no lag this time, and I could see an amazing change in how the picture looks. Best $129.99 (+tax) I've ever spent.

Divinity

Academy Page

Join Date: Apr 2005

Would a Radeon 7000 AGP with 64 megabytes work fine?

Mss Drizzt

Lion's Arch Merchant

Join Date: Mar 2005

communist state of NJ

Maybe. I'm not quite sure.

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

I would think that running 64 MB on your video card would be selling yourself short. If anything, a 128 MB card would probably work decently; 256 MB is excellent, just like the one I'm using (except mine is a 9250 PCI card). The card is definitely nice; as I'm typing this, I'm sprawled out on the floor with the computer hooked up to the TV instead of the monitor, just because I can.

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

Haha. Lucky....

No, an ATI 7000 will not work. Can't you just buy one of these: http://www.flybear.com/ati9600xt_256.html

Cult_Of_One

Ascalonian Squire

Join Date: Mar 2005

Quote:
Originally Posted by William of Orange
Hallelujah!!!

The new card's in, and the graphics are absolutely beautiful. I opened up the Guild Wars login screen, there was practically no lag this time, and I could see an amazing change in how the picture looks. Best $129.99 (+tax) I've ever spent.
Glad to hear it, because I have a GEcube ATI Radeon 9250 256 MB card ordered. I don't know where you got your card, but mine was $76.00 from Komusa. I'm hoping the GEcube version has a slight performance increase... I read the documentation on a bunch of 128 MB cards, and the GEcube had a core clock speed of 270 MHz as opposed to the standard 250 MHz.

Divinity

Academy Page

Join Date: Apr 2005

I need to know for SURE if an ATI 7000 will work. I mean work, not look like ultimate graphics, just work adequately.

William of Orange

Krytan Explorer

Join Date: Feb 2005

La Crosse, Wisconsin

Thousand Tigers Apund Ur Head, The Consulate

Well, I'm sure that the card would work, but I couldn't guarantee much. The game ran on my computer when it was just the on-board Intel Extreme Graphics card, but it produced some lag issues when complicated visuals came onto the screen. Your card is most likely a step up from the on-board one, but I would guess that the outcome would be somewhat similar.

Cult: The only reason mine cost more was that it was actually manufactured by ATI, rather than just being ATI technology manufactured by somebody else. I was asking around on the IRC channel about it a few days ago, and the person I was talking to said the non-ATI-manufactured ones were always of slightly lesser quality; they still get the point across, but not as well as the actual brand-name one. But who knows? Maybe you just happened to find a nice bargain, or maybe it was an AGP card, since most of the ones I've seen that looked higher end but were less expensive were AGP. I've never looked into it a whole lot though, so don't quote me on that.

Divinity

Academy Page

Join Date: Apr 2005

Well, I think/HOPE it'll work. It ran fine with Star Wars Galaxies, so I'm really hoping it will work.

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

It will run 10 times better on a $100 card.

Sin

Banned

Join Date: Mar 2005

The Joint :p

Mine is a PCI interface on a clunky system, and I started with a Radeon 7000. I could not run a Warcraft III map with more than 2v2 at 640x480x16 without excessive lag, and even the 2v2 was somewhat choppy. At first I thought the awesome power of my 1.3 GHz Celeron had been seriously exceeded; however, I sprang for an Nvidia FX 5200 PCI. Now I can host a 4v4 or FFA at 800x600x32 with only slight lag when all armies are at max food. Guild Wars has no lag at all for me at 1024x768x32. So IMHO, I would seriously consider not using a Radeon 7000 to play Guild Wars unless I absolutely had to and didn't mind having all features off and viewing the game at its lowest possible resolution, and you will likely still have intolerable lag even then.

Hope this was useful.

P.S. If you look up some benchmarks on the internet, the Radeon 7000 is really, really slow.

Lews

Desert Nomad

Join Date: Mar 2005

Seattle, Washington

R/E

It will not work. Read the requirements: ATI 8500.