Will Guild Wars ever add support for SLI cards?
Pey
Well, I'm quite sad since I saw the GWEN Onlinewelten videos (mostly the fiery woods). I have a 7950GX2 and I'm afraid it's not going to run properly; it doesn't even run well now at 1280x1024 with 4xAA and high details (except for medium shadows).
Are the devs going to add support in the future, or do they not even care?
Cheers
lucifer_uk
I should hope SLI will be looked at for GW2. I have my board ready to take another card, but there's no point for me, as most games don't even support the tech, let alone fully use it.
Loviatar
My way-overclocked dual EVGA 7900 GS SLI setup works perfectly, so it must be you, not GW.
You would need game compatibility with dual processors, but not with the video signal sent to the cards.
Pey
As far as I know, with SLI you don't gain FPS, not even one.
Predator [CD]
Quote:
Originally Posted by Pey
Well, I'm quite sad since I saw the GWEN Onlinewelten videos (mostly the fiery woods). I have a 7950GX2 and I'm afraid it's not going to run properly; it doesn't even run well now at 1280x1024 with 4xAA and high details (except for medium shadows).
Are the devs going to add support in the future, or do they not even care? Cheers

What do you mean? GW doesn't work with SLI or what?
Tachyon
The 7950GX2 doesn't utilise SLI though, as it's a single card with two independent GPUs on it. So I doubt it's SLI that's causing your problem.
What exactly do you mean by "it doesn't run nice"?
Pey
Quote:
Originally Posted by Predator [CD]
What do you mean? GW doesn't work with SLI or what?
Quote:
Originally Posted by Azagoth
The 7950GX2 doesn't utilise SLI though, as it's a single card with two independent GPUs on it. So I doubt it's SLI that's causing your problem.
What exactly do you mean by "it doesn't run nice"?
At 1280x1024, 4xAA, High details (except shadows), and multisampling (from the NV control panel), in some places, such as outside the Grand Court of Sebelkeh, I get 20-25 FPS.
Tachyon
No it's not! SLI is the use of two cards connected by the SLI bridge; the 7950GX2 is just two GPUs on the same card.
If you're getting performance that bad from a GX2, then I suggest you take it back and get a replacement. I can run GW at full settings and get at least 150+ FPS from my 512MB 7900GS (600/1500), so in theory you should be getting at least the same.
Zodiak
SLI support is something that can be forced in any game, just like SLI 64x AA (which I use, LOL).
Either go into your video card control panel and change the SLI render mode from Auto (the automated driver option that is supposed to choose the best SLI mode, but does NOT work) to Split-Frame Rendering, which renders half of the image on each video card and reassembles it at a cost of about 10% performance, or choose Alternate Frame Rendering, which renders every other frame on one of the video cards.
It is best to choose Alternate Frame Rendering over Split-Frame Rendering, as the latter has a performance hit and has issues with various games.
SLI Auto render mode is HIGHLY discouraged, as it does not work.
On a second note, if you're using the newer NVIDIA drivers, absolutely hate the horrendous new control panel, and wish to revert to the classic control panel, you can do so with a simple search over the Internet. I also use this control panel in conjunction with nHancer for NVIDIA cards, which is very simple to use and can be found here: http://www.nhancer.com/ Either create a profile for Guild Wars or use the global mode.
It is possible to enable SLI 64x AA with nHancer... squares look ROUND! O_O
Zodiak
On a second note: even if your video cards are set to SLI in your video card control panel...
SLI will NOT take effect in most cases if the rendering method is left at the default setting of Auto, which gives the driver the option of which render method to choose. This option does NOT work.
You must set your rendering method to either Alternate Frame Rendering or Alternate Frame Rendering 2.
Split-Frame Rendering: this method splits the same image into two parts, which are rendered on two different cards and then put back together. This results in about a 10% performance loss. Alternate Frame Rendering 2 is suggested over method 1.
Alternate Frame Rendering: this method renders every other frame on a separate video card, and is the rendering method of choice.
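To make the difference between the two modes concrete, here is a toy illustration of how they divide the work (a minimal Python sketch, purely illustrative; real drivers do this internally, and the GPU numbering here is made up):

Code:
# Toy illustration of how SLI render modes divide work between two GPUs.
frames = ["frame0", "frame1", "frame2", "frame3"]

# Alternate Frame Rendering (AFR): whole frames alternate between GPUs.
for i, frame in enumerate(frames):
    gpu = i % 2  # even frames -> GPU 0, odd frames -> GPU 1
    print(f"AFR: GPU{gpu} renders all of {frame}")

# Split-Frame Rendering (SFR): each frame is split across both GPUs and
# then recombined -- that recombination step is part of the overhead that
# makes SFR the slower choice described above.
for frame in frames:
    print(f"SFR: GPU0 renders the top half of {frame}, "
          f"GPU1 the bottom half, then the halves are merged")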
Kuldebar Valiturus
All I found in reference to SLI at the Guild Wars Support site:
Quote:
SLI and CrossFire
Question: Are Scalable Link Interface (SLI) and/or CrossFire supported?
Answer: Scalable Link Interface (SLI) and CrossFire are supported for all of our games. However, please be aware that for some games you will not see a significant performance increase. If you encounter any problems, we recommend using Alternate Frame Rendering (AFR) mode as opposed to Split Frame Rendering (SFR) for SLI, or Scissor for CrossFire. If that does not help, try temporarily disabling SLI/CrossFire and using only one card for testing purposes.
Zodiak
lol, that's what I said:
it's best to use Alternate Frame Rendering.
Although, as mentioned, you may not see a significant increase in performance, I believe that through Alternate Frame Rendering I have been able to raise graphics settings such as the AA sampling (the standard is 4x in Guild Wars) to something considerably higher while keeping a stable FPS.
Etta
Quote:
Originally Posted by Pey
At 1280x1024, 4xAA, High details (except shadows), and multisampling (from the NV control panel), in some places, such as outside the Grand Court of Sebelkeh, I get 20-25 FPS.
Pey
I'm currently using:
Vista x64 @ drivers 162.22
C2D 6300 @ 2200MHz
2GB OCZ 800MHz
1280x1024 @ 60Hz (17" CRT)
Vsync in game is off.
Whether I use single GPU or any of the alternate frame rendering modes, it's the same.
Enabling 16x aniso doesn't affect performance.
Don't get confused: inside towns (with few people) I get 150-200 FPS,
but in some places outside towns I get 20-25, and in others between 40-80 FPS.
This is my NVIDIA control panel setting.
Can someone post their SLI config and what drivers they are using? Maybe it's a Vista thing, I don't know.
EDIT: I just noticed that:
With 4xAA, for example outside the Gate of Fear, I get 25 FPS.
With 2xAA, 60 FPS.
With no AA, 100 FPS.
It's a bit strange; I know AA is quite heavy to run... but not that much. Any clues?
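For what it's worth, the drop looks less mysterious in frame times than in FPS; here's a quick sanity check on those numbers (a throwaway Python sketch using only the figures posted above):

Code:
# Convert the reported FPS outside the Gate of Fear into frame times.
for label, fps in [("no AA", 100), ("2xAA", 60), ("4xAA", 25)]:
    ms = 1000.0 / fps  # milliseconds the GPU spends on each frame
    print(f"{label}: {fps} fps = {ms:.1f} ms per frame")

# Output: 10.0 ms, 16.7 ms, 40.0 ms per frame. Seen this way, 4xAA is
# costing ~30 ms of extra GPU work per frame versus no AA, which points
# at the card struggling with the multisampling load rather than at
# anything exotic in the game itself.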
scrinner
I don't believe Guild Wars was made to use x64; the only reason it's running is 32-bit emulation, I believe. Try using a 32-bit operating system.
Zodiak
There are also numerous issues that Vista 64-bit has that the 32-bit version does not, and it should be avoided.
I also recommend going to 32-bit Vista.
Etta
Well, there are some noticeable differences between our setups: I'm still using XP Pro, I have 3GB of RAM, and my CPU is AMD. The reason I'm sticking with XP for now is, well, just have a peek at the NVIDIA user forum; you'll see tons of angry villagers and their thoughts about the NVIDIA Vista SLI drivers.
AA & AF are overrated IMO; they hurt my eyes sometimes. I always have Vsync off as well.
AA & AT are overated imo hurt my eyes sometime, I always have Vsync on off as well.
steamee
NVIDIA's site lists Guild Wars as supported in their game list:
http://sli.nvidia.com/object/slizone2_game.html
I have dual Asus EN7900GT TOP cards in SLI and get excellent frame rates all around running at 1600x1200. Can't remember the settings off-hand... I'm at work.
Flightmare
Quote:
Originally Posted by scrinner
I don't believe Guild Wars was made to use x64; the only reason it's running is 32-bit emulation, I believe. Try using a 32-bit operating system.
Zodiak
It's not an issue of whether or not people should use 64-bit Windows; it's just that many flaws found their way into the 64-bit version of Windows Vista that are not in the 32-bit one. Reviewers have been telling people to stay away from it for some time now.
On the other hand, 64-bit processing can only be taken advantage of if the software, application, or game supports it, just like multi-core support, and I don't know of any games that do, as far as I can tell. Sure, you can still play them; you just won't see any benefit between the two.
The software that I DO know takes advantage of 64-bit processing is development software such as the Adobe Creative Suite line, CAD packages, and video and sound editing tools.
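As an aside, scrinner's "32-bit emulation" point is easy to see for yourself: a 32-bit program on 64-bit Windows runs under the WOW64 compatibility layer. A minimal Python sketch (assuming Python is installed) shows the process-vs-OS distinction:

Code:
import platform
import struct

# Pointer size of *this process*: a 32-bit build prints 32, a 64-bit one 64.
print("process:", struct.calcsize("P") * 8, "bit")

# Architecture of the underlying machine/OS (e.g. 'AMD64' on 64-bit Windows).
print("machine:", platform.machine())

# A 32-bit process on a 64-bit OS -- the Guild-Wars-under-WOW64 case --
# reports a 32-bit process on a 64-bit machine: it runs fine, but it can't
# use 64-bit pointers, so it gains nothing from the wider architecture.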
Dex
Quote:
Originally Posted by Azagoth
No it's not! SLI is the use of two cards connected by the SLI bridge; the 7950GX2 is just two GPUs on the same card.
If you're getting performance that bad from a GX2, then I suggest you take it back and get a replacement. I can run GW at full settings and get at least 150+ FPS from my 512MB 7900GS (600/1500), so in theory you should be getting at least the same.
That being said, what has already been said about SLI is applicable: certain types of SLI rendering work better with certain games, and some games don't benefit from it much at all. Personally, I see SLI as a waste of GPU power unless you need ultra-high resolutions. Other than that, SLI doesn't offer the kind of speed increase at "normal" resolutions that would warrant all that extra hardware. It's just plain inefficient.
Zodiak
Quote:
Originally Posted by Dex
That being said, what has already been said about SLI is applicable: certain types of SLI rendering work better with certain games, and some games don't benefit from it much at all. Personally, I see SLI as a waste of GPU power unless you need ultra-high resolutions. Other than that, SLI doesn't offer the kind of speed increase at "normal" resolutions that would warrant all that extra hardware. It's just plain inefficient.

OK, let's shut this guy down.
http://www.bjorn3d.com/read.php?cID=1024&pageID=2907
http://www.bjorn3d.com/read.php?cID=1024&pageID=2908
SLI results in a visual performance increase from 30-60%, up to 100%.
Read any review.
Dex
Quote:
Originally Posted by Zodiak
OK, let's shut this guy down.
http://www.bjorn3d.com/read.php?cID=1024&pageID=2907
http://www.bjorn3d.com/read.php?cID=1024&pageID=2908
SLI results in a visual performance increase from 30-60%, up to 100%. Read any review.
"Visual performance increase"? Hiliarious. SLI provides so little performance gain for the dollar I'd rather save my money and upgrade my GPU more often. You're entitled to your opinion, though. In my 22 years of gaming I've found that maintaining a single high-end GPU is FAR more cost-effective. 0% - 60% (under IDEAL circumstances, and typically not in framerate, so the gain is subjective) is not great for 100% more cost IMHO, especially when you can wait a bit and simply upgrade to the next generation of GPUs 6 months later for what that second card cost you and reap better benefits. I'm not telling you what to spend your money on, so no need to "shut me down".
However, the "bang for your buck" with SLI is pathetic (from my perspective), so I guess it's all a matter of how much you care about what you're getting for your money. If you want to buy two video cards just to get insane AA (which I can hardly tell is there, honestly), that's your choice. That's just my opinion, although I am a computer engineer that's been gaming for 22 years (and yes, I do read plenty of reviews), so I do have SOME knowledge on the subject.
No need to get aggresive, though, I'm not attacking your glorious video cards.
Zodiak
22 years of gaming and you call SLI pathetic?
I would re-examine those 22 years, sir. A 30%, 60%, up to 100% performance increase not worth your dollar? Ideal conditions? You'd better tell the hundreds of hardware review websites that they were wrong and to go back to bed.
Just because you're wrong doesn't mean you have to be grouchy about it.
Dex
Quote:
Originally Posted by Zodiak
22 years of gaming and you call SLI pathetic?
I would re-examine those 22 years, sir. A 30%, 60%, up to 100% performance increase not worth your dollar? Ideal conditions? You'd better tell the hundreds of hardware review websites that they were wrong and to go back to bed. Just because you're wrong doesn't mean you have to be grouchy about it.
I don't think it's worthwhile to pay $800 for 140 FPS when I'm already getting 100 FPS on max settings for $400. Guild Wars is a great example of this: nobody is going to notice the difference in how well an SLI dual-8800GTX system runs GW over a single 8800GTX. Conversely, if I'm only getting 30 FPS in a "bleeding edge" game, getting 42 FPS isn't going to be an enormous coup considering it costs double the price. I'll hold onto my $400 for a few months until I can get my next-gen card and get 70 FPS, and perhaps the next generation of shader/GPU features. Get what I'm saying? In my personal experience, Oblivion was hardly a better gaming experience with dual 7900s than it was with one 7900; my $350 purchase of the second card was pretty much a waste of money.
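To put numbers on that value argument, a quick back-of-the-envelope comparison (a throwaway Python sketch; the prices and frame rates are just the examples from the post above, not benchmarks):

Code:
# Frames-per-second per dollar, using the example figures above.
single = {"cost": 400, "fps": 100}   # one high-end card
sli    = {"cost": 800, "fps": 140}   # two of them in SLI

for name, rig in [("single card", single), ("SLI pair", sli)]:
    print(f"{name}: {rig['fps'] / rig['cost']:.3f} fps per dollar")

# single card: 0.250 fps per dollar
# SLI pair:    0.175 fps per dollar -- a 40% fps gain for 100% more money,
# which is exactly the bang-for-the-buck complaint being made here.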
By the way, 60-100% gains are extremely few and far between. The huge 100% gain people expect is largely NVIDIA marketing; you're only going to get those numbers in mainstream games with good SLI support under ideal conditions (higher resolutions, excellent SLI support in the game, the newest drivers, the SLI rendering mode set properly for the game, etc.). Any hardware review that covers more than the same old suite of games NVIDIA uses in their own marketing will support that statement. We don't disagree on the numbers; we disagree on the practical real-world benefits of them and what they're worth in terms of price. Yes, the bang-for-the-buck you get from SLI and Crossfire is pretty pathetic, IMO. It's not a good value. You pay a lot for little significant NOTICEABLE effect. I understand what the numbers mean; my opinion that it's not worth the cash is my own (and well-founded from my perspective).
Seriously, though, you bought a nice high-end SLI setup, so I understand your stance, and I'm not going to argue with you about it. All I'm saying is that for many people it's not a wise investment, OK? Sorry if I came off as harsh, but implying that I don't know what I'm talking about simply because I disagree with you is silly.
Zodiak
Quote:
Originally Posted by Dex
Seriously, though, you bought a nice high-end SLI setup, so I understand your stance, and I'm not going to argue with you about it. All I'm saying is that for many people it's not a wise investment.
Quote:
Originally Posted by Dex
I'm not "wrong" because my opinion differs from yours. Paying 200% price for 130-160% performance under ideal conditions
By the way, 60-100% gains are extremely few and far between. The huge 100% gain people expect is largely NVIDIA marketing.
I am also done arguing. I think we both had good points; I just had to let people know that yes, SLI can be a great thing, it's just not for everyone, and you must know how and where to take full advantage of it.