Will Guild Wars ever add support for SLI cards?

Pey

Frost Gate Guardian

Join Date: Aug 2005

United Spanish Buddies

W/E

Well, I'm quite sad since I saw the GWEN Onlinewelten videos (mostly the fiery woods). I have a 7950GX2 and I'm afraid it's not going to run properly (it doesn't even run nicely now at 1280x1024, 4x AA and high details, except for medium shadows).

Are the devs going to add support in the future, or do they not even care?

Cheers

lucifer_uk

lucifer_uk

Wilds Pathfinder

Join Date: Nov 2006

Nottingham, England

The Venerable Truth [TvT] The Venerable Alliance [TvH] [TvL]

R/

I should hope SLI will be looked at for GW2. I have my board ready to take another card, but there's no point for me as most games don't even support the tech, let alone fully use it.

Loviatar

Underworld Spelunker

Join Date: Feb 2005

My way-overclocked dual eVGA 7900 GS SLI setup works perfectly, so it must be you, not GW.

You would need game compatibility with dual processors, but not with the video signal sent to the cards.

Pey

Frost Gate Guardian

Join Date: Aug 2005

United Spanish Buddies

W/E

As far as I know, with SLI you don't gain FPS, not even one.

Predator [CD]

Frost Gate Guardian

Join Date: Apr 2006

Quote:
Originally Posted by Pey
Well, I'm quite sad since I saw the GWEN Onlinewelten videos (mostly the fiery woods). I have a 7950GX2 and I'm afraid it's not going to run properly (it doesn't even run nicely now at 1280x1024, 4x AA and high details, except for medium shadows).

Are the devs going to add support in the future, or do they not even care?

Cheers
What do you mean? GW doesn't work with SLI or what?

Tachyon

Tachyon

Forge Runner

Join Date: Nov 2005

Stoke, England

The Godless [GOD]

W/

The 7950GX2 doesn't utilise SLI though, as it's a single card with two independent GPUs. So I doubt it's SLI that's causing your problem.

What exactly do you mean by "it doesn't run nice"?

Pey

Frost Gate Guardian

Join Date: Aug 2005

United Spanish Buddies

W/E

Quote:
Originally Posted by Predator [CD]
What do you mean? GW doesn't work with SLI or what?
Exactly, it doesn't support SLI; you get no gain whether you use one GPU or two.

Quote:
Originally Posted by Azagoth
The 7950GX2 doesn't utilise SLI though, as it's a single card with two independent GPUs. So I doubt it's SLI that's causing your problem.

What exactly do you mean by "it doesn't run nice"?
The 7950GX2 does use SLI, since it's two cards in one. It has two GPUs and, as I said before, I get no gain in FPS whether I use one or two.

At 1280x1024, 4x AA, High (except shadows) and multisampling (from the NV control panel), in some places, such as outside the Grand Court of Sebelkeh, I get 20-25 fps.

Tachyon

Tachyon

Forge Runner

Join Date: Nov 2005

Stoke, England

The Godless [GOD]

W/

No it's not! SLI is the use of two cards connected by the SLI bridge. The 7950GX2 is just two GPUs on the same card.

If you're getting performance that bad from a GX2 then I suggest you take it back and get a replacement. I can run GW at full settings and get at least 150+ fps from my 512MB 7900GS (600/1500), so in theory you should be getting at least the same.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

SLI support is something that can be forced in any game, just like SLI 64x AA (which I use, LOL).

Either go into your video card control panel and change the SLI render mode from Auto (the driver option that is supposed to choose the best SLI mode but does NOT work) to Split-Frame Rendering, which renders half of the image on each video card and reassembles it at a cost of about 10% performance, or choose Alternate Frame Rendering, which renders every other frame on each video card.

It is best to choose the Alternate Frame Rendering method over Split-Frame Rendering, as the latter has a performance hit and has issues with various games.

SLI Auto render mode is HIGHLY discouraged as it does not work.

On a second note, if you're using the newer NVIDIA drivers, absolutely hate the horrendous new control panel and wish to revert to the classic control panel, you can do so with a simple search over the Internet. I also use this control panel in conjunction with nHancer for Nvidia cards, which can be found at http://www.nhancer.com/ and is very simple to use. Either create a profile for Guild Wars or use the global mode.

It is possible to enable SLI 64x AA with nHancer... squares look ROUND! O_O

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

On a second note: even if your video cards are set to SLI in your video card control panel...

SLI will NOT take effect in most cases if the rendering method is left at the default setting of Auto, which gives the driver the option of which render method to choose. This option does NOT work.

You must set your rendering method to either Alternate Frame Rendering or Alternate Frame Rendering 2.

Split-Frame Rendering: this method splits the same image into two parts, which are rendered on two different cards and then put back together. This results in about a 10% performance loss.
Alternate Frame Rendering: this method renders every other frame on a separate video card and is the rendering method of choice. Alternate Frame Rendering 2 is suggested over method 1.
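The two modes described above boil down to how work is divided between GPUs. Here's a toy sketch of the split (illustrative only; real drivers schedule this at the hardware level, and the function names are made up for the example):

```python
# Toy illustration of the two SLI render modes described above.
# AFR: each whole frame goes to the next GPU in turn.
# SFR: every frame is split, and each GPU renders one part of it.

def alternate_frame_rendering(num_frames, num_gpus=2):
    """Map each frame to the single GPU that renders it (round-robin)."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def split_frame_rendering(num_frames, num_gpus=2):
    """Map each frame to ALL GPUs, since every card renders a slice."""
    return {frame: list(range(num_gpus)) for frame in range(num_frames)}

print(alternate_frame_rendering(4))  # frame 0 -> GPU 0, frame 1 -> GPU 1, ...
print(split_frame_rendering(4))      # every frame touches both GPUs
```

This also shows why AFR tends to scale better: each GPU works on an independent frame, while SFR has to split, synchronise and recombine every single frame, which is where the overhead comes from.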

Kuldebar Valiturus

Kuldebar Valiturus

Desert Nomad

Join Date: Nov 2006

Garden City, Idaho

The Order of Relumination (TOoR)

R/

All I found in reference to SLI at the Guild Wars Support site:


Quote:
SLI and CrossFire
Question

Are Scalable Link Interface (SLI) and/or CrossFire supported?

Answer

Scalable Link Interface (SLI) and CrossFire are supported for all of our games. However, please be aware that for some games you will not see a significant performance increase. If you encounter any problems, we recommend using Alternate Frame Rendering (AFR) mode as opposed to Split Frame Rendering (SFR) for SLI or Scissor for CrossFire. If that does not help, try temporarily disabling SLI/CrossFire and using only one card for testing purposes.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

lol, that's what I said

it's best to use Alternate Frame Rendering

Although, as mentioned, you may not see a significant increase in performance, with Alternate Frame Rendering I have been able to raise graphics settings such as AA sampling (standard 4x in Guild Wars) to something considerably higher while keeping stable FPS.

Etta

Etta

Forge Runner

Join Date: Jun 2006

Mancland, British Empire

Quote:
Originally Posted by Pey
At 1280x1024, 4x AA, High (except shadows) and multisampling (from the NV control panel), in some places, such as outside the Grand Court of Sebelkeh, I get 20-25 fps.
7900 GT SLI here, highest settings and everything on at the same resolution, and I still get high double-digit to triple-digit FPS. 20-25 fps doesn't sound right. Have you got Vsync on? What's your driver version? XP or Vista? Monitor refresh rate? CPU, memory?

Pey

Frost Gate Guardian

Join Date: Aug 2005

United Spanish Buddies

W/E

I'm currently using:
Vista x64 @ drivers 162.22
C2D 6300 @ 2200MHz
2GB OCZ 800MHz
1280x1024 @ 60Hz (17" CRT)
Vsync in game is off
Whether I use single GPU or any of the alternate frame rendering modes, it's the same
Enabling 16x aniso doesn't affect performance

Don't get confused: inside towns (with few people) I get 150-200fps,
but in some places outside towns I get 20-25, in others between 40-80fps.

These are my NVIDIA control panel settings:


Can someone post their SLI config and what drivers they're using? Maybe it's a Vista thing, I don't know.

EDIT: I just noticed that:
With 4x AA, for example, outside the Gate of Fear I get 25fps
With 2x AA, 60fps
With no AA, 100fps

It's a bit strange; I know AA is quite heavy to run... but not that much. Any clues?

scrinner

Wilds Pathfinder

Join Date: Jan 2006

I don't believe Guild Wars was made to use x64. The only reason it's running is 32-bit emulation, I believe. Try using a 32-bit operating system.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

Vista 64-bit also has numerous issues that the 32-bit version does not, and it should be avoided.

I also recommend going to 32-bit Vista.

Etta

Etta

Forge Runner

Join Date: Jun 2006

Mancland, British Empire

Well, there are some noticeable differences between our setups: I'm still using XP Pro, I have 3GB of RAM and my CPU is AMD. The reason I'm sticking with XP for now is, well, just have a peek at the Nvidia user forum; you'll see tons of angry villagers and their thoughts about the Nvidia Vista SLI driver.

AA & AF are overrated IMO; they hurt my eyes sometimes. I always have Vsync off as well.

steamee

Pre-Searing Cadet

Join Date: Jul 2007

Nvidia's site lists Guild Wars as supported on their game list.
http://sli.nvidia.com/object/slizone2_game.html

I have dual Asus EN7900GT TOP cards in SLI. I run SLI and get excellent frame rates all around at 1600x1200. Can't remember the settings off-hand... at work.

Flightmare

Flightmare

Lion's Arch Merchant

Join Date: Nov 2006

NL

Infinite Omega Negatives

N/

Quote:
Originally Posted by scrinner
I don't believe Guild Wars was made to use x64. The only reason it's running is 32-bit emulation, I believe. Try using a 32-bit operating system.
GW runs flawlessly on my XP Pro x64; a game isn't a driver that just won't work. Downgrading is really a loss of money AND performance.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

It's not an issue of whether or not people should use Windows XP 64-bit; it's just that many flaws found their way into Windows Vista 64-bit that are not in the 32-bit version. Reviewers have been telling people to stay away from it for some time now.

On the other hand, 64-bit processing can only be taken advantage of if the software, application or game supports it, just like multi-core support, and I don't know of any games that support 64-bit as far as I can tell. Sure, you can still play them, you just won't see any benefit between the two.

The software that I DO know takes advantage of 64-bit processing is development software such as the Adobe Creative Suite line, CAD software, and video and sound editing applications.

Dex

Dex

Wilds Pathfinder

Join Date: Dec 2005

Chicago, IL

Black Belt Jones

R/Me

Quote:
Originally Posted by Azagoth
No it's not! SLI is the use of two cards connected by the SLI bridge. The 7950GX2 is just two GPUs on the same card.

If you're getting performance that bad from a GX2 then I suggest you take it back and get a replacement. I can run GW at full settings and get at least 150+ fps from my 512MB 7900GS (600/1500), so in theory you should be getting at least the same.
Actually, the 7950GX2 is SLI. I know it's using only one PCI-E slot, but it uses SLI internally. You even get all of the SLI options in the nVidia drivers, and you can switch between the various types of SLI rendering. It really is two GPUs in SLI with an internal bridge, sharing a single x16 PCI-E slot.

That being said, what has already been said about SLI is applicable. Certain types of SLI rendering work better with certain games, and some games don't benefit from it much at all. Personally, I see SLI as a waste of GPU power unless you need ultra-high resolutions. Other than that SLI doesn't offer the types of speed increases at "normal" resolutions that would warrant all that extra hardware. It's just plain inefficient.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

Quote:
Originally Posted by Dex
That being said, what has already been said about SLI is applicable. Certain types of SLI rendering work better with certain games, and some games don't benefit from it much at all. Personally, I see SLI as a waste of GPU power unless you need ultra-high resolutions. Other than that SLI doesn't offer the types of speed increases at "normal" resolutions that would warrant all that extra hardware. It's just plain inefficient.
OK, let's shut this guy down.

http://www.bjorn3d.com/read.php?cID=1024&pageID=2907
http://www.bjorn3d.com/read.php?cID=1024&pageID=2908

SLI results in a performance increase of anywhere from 30-60% up to 100%.

Read any review

Dex

Dex

Wilds Pathfinder

Join Date: Dec 2005

Chicago, IL

Black Belt Jones

R/Me

Quote:
Originally Posted by Zodiak
OK, let's shut this guy down.

http://www.bjorn3d.com/read.php?cID=1024&pageID=2907
http://www.bjorn3d.com/read.php?cID=1024&pageID=2908

SLI results in a performance increase of anywhere from 30-60% up to 100%.

Read any review
Lol. "Shut me down"?

"Visual performance increase"? Hilarious. SLI provides so little performance gain for the dollar that I'd rather save my money and upgrade my GPU more often. You're entitled to your opinion, though. In my 22 years of gaming I've found that maintaining a single high-end GPU is FAR more cost-effective. 0-60% (under IDEAL circumstances, and typically not in framerate, so the gain is subjective) is not great for 100% more cost, IMHO, especially when you can wait a bit and simply upgrade to the next generation of GPUs 6 months later for what that second card cost you, and reap better benefits. I'm not telling you what to spend your money on, so there's no need to "shut me down".

However, the "bang for your buck" with SLI is pathetic (from my perspective), so I guess it's all a matter of how much you care about what you're getting for your money. If you want to buy two video cards just to get insane AA (which I can hardly tell is there, honestly), that's your choice. That's just my opinion, although I am a computer engineer who's been gaming for 22 years (and yes, I do read plenty of reviews), so I do have SOME knowledge of the subject.

No need to get aggressive, though; I'm not attacking your glorious video cards.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

22 years of gaming and you call SLI pathetic?

I would re-examine those 22 years, sir. A 30%, 60%, up to 100% performance increase not worth your dollar? Ideal conditions? You'd better tell the hundreds of hardware review websites that they were wrong and to go back to bed.

Just because you're wrong doesn't mean you have to be grouchy about it.

Dex

Dex

Wilds Pathfinder

Join Date: Dec 2005

Chicago, IL

Black Belt Jones

R/Me

Quote:
Originally Posted by Zodiak
22 years of gaming and you call SLI pathetic?

I would re-examine those 22 years, sir. A 30%, 60%, up to 100% performance increase not worth your dollar? Ideal conditions? You'd better tell the hundreds of hardware review websites that they were wrong and to go back to bed.

Just because you're wrong doesn't mean you have to be grouchy about it.
I'm not "wrong" because my opinion differs from yours. Paying 200% of the price for 130-160% of the performance under ideal conditions (I know exactly what the increases are like, thanks) I don't consider a good value, especially since what you're actually getting over a single high-end card is often hardly noticeable until next-generation games come out, at which time a superior single-GPU solution will be available for the price you would have paid for that second card.

You're just not getting what I'm saying. I own a 7900 SLI setup (along with a few other systems, including a single-GPU X1950XTX, which is far nicer than the SLI setup for several reasons), and I regret it. I can tell you first-hand that a significant number of games hardly take advantage of it, and some even have major problems if it's enabled, you don't have the right drivers, etc.

I've done the math plenty of times, and for my money it's more cost-effective to simply upgrade your single-GPU solution more often than it is to run SLI. Yes, SLI is a little faster, but you don't see those fabled 60-100% gains in most games until you get up into 2000x2000+ resolutions. I know the facts, and I simply don't think it's worth it. It's my opinion, and I'm not alone (just Google "is sli worth it" and start reading if you don't believe me). Just because it differs from yours (in that I don't care to spend 200% to get 30-60% gains when they don't make a hugely noticeable difference in current games) doesn't make me wrong. I simply prefer to make the most of my upgrade dollars. The hardware sites don't contradict what I'm saying; it's our interpretation of the numbers that differs. Sorry, but I just disagree with you on the practical value of it.

I don't think it's worthwhile to pay $800 for 140fps when I'm already getting 100fps on max settings for $400. Guild Wars is a great example of this: nobody is going to notice the difference in how well a dual-8800GTX SLI system runs GW over a single 8800GTX. Conversely, if I'm only getting 30fps in a "bleeding edge" game, getting 42fps isn't going to be an enormous coup considering it costs double the price. I'll hold onto my $400 for a few months until I can get my next-gen card and get 70fps, and perhaps the next generation of shader/GPU features. Get what I'm saying? In my personal experience, Oblivion was hardly a better gaming experience with dual 7900s than it was with one 7900. My $350 purchase of the second card was pretty much a waste of money.
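The value argument here is simple arithmetic. A quick sketch, using only the dollar and FPS figures from the post above:

```python
# Cost-per-frame comparison using the figures quoted in the post:
# one card: $400 for 100fps; SLI: $800 (double the price) for 140fps.

def cost_per_fps(price_dollars, fps):
    """Dollars paid per frame per second delivered."""
    return price_dollars / fps

single_card = cost_per_fps(400, 100)
sli_setup = cost_per_fps(800, 140)

print(f"single card: ${single_card:.2f} per fps")
print(f"SLI setup:   ${sli_setup:.2f} per fps")
```

Under these numbers each frame costs roughly 40% more with SLI, which is the "bang for your buck" point being made; whether that trade is worth it at a given resolution is the part the two posters actually disagree on.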

By the way, 60-100% gains are extremely few and far between. The huge 100% gain people expect is largely nVidia marketing. You're only going to get those numbers in mainstream games with good SLI support under ideal conditions (higher resolutions, excellent SLI support in the game, the newest drivers, the SLI rendering mode set properly for the game, etc). Any hardware review that covers more than the same old suite of games nVidia uses in their own marketing will support that statement. We don't disagree on the numbers; we disagree on the practical real-world benefits and what they're worth in terms of price. Yes, the bang-for-the-buck you get from SLI and Crossfire is pretty pathetic, IMO. It's not a good value: you pay a lot for little significant NOTICEABLE effect. I understand what the numbers mean. My opinion that it's not worth the cash is my own (and well-founded from my perspective).

Seriously, though, you bought a nice high-end SLI setup, so I understand your stance, and I'm not going to argue with you about it. All I'm saying is that for many people it's not a wise investment, ok? Sorry if I came off as harsh, but implying that I don't know what I'm talking about simply because I disagree with you is silly.

Zodiak

Jungle Guide

Join Date: May 2005

Gatineau, Qc, Canada

Kiss of Anguish [KISS]

P/W

Quote:
Seriously, though, you bought a nice high-end SLI setup, so I understand your stance, and I'm not going to argue with you about it. All I'm saying is that for many people it's not a wise investment.
I can seriously agree that SLI is not for everyone; all I'm saying is that the increased performance is there. I do agree, though, that to get that increased performance you must be running the most demanding high-end games, and yes, Guild Wars is not one of them.

Quote:
Originally Posted by Dex
I'm not "wrong" because my opinion differs from yours. Paying 200% price for 130-160% performance under ideal conditions
By the way, 60-100% gains are extremely few and far-between. The huge 100% gain people expect is largely nVidia marketing.
However, this statement is the exact same thing as buying a new CPU: very often you WILL end up paying almost 200% of the price of the old one and getting less than stellar performance from it. Just thought I would point out the comparison.

I am also done arguing. I think we both had good points; I just had to let people know that yes, SLI can be a great thing, it's just not for everyone, and you must know how and where to take full advantage of it.