Below 60FPS?

Hey_homies

Hey_homies

Lion's Arch Merchant

Join Date: Jul 2006

W/

Many people seem to run Guild Wars maxed out with older cards at a constant 60 FPS. I have a GTX 260 (192 core), running GW at 1920x1080 windowed, maxed out. I have v-sync on, but my FPS sometimes dips to about 40 and can get as low as the 30s in Kamadan, even with v-sync off. Anyone know why? It can't be overheating, because the card stays at around 55-60°C. I know my card is well capable of running Guild Wars... Here are my specs:

E8400 @ 4.2GHz
4GB RAM @ 920MHz
GTX 260 (192 core)
ASUS P5Q SE

tom999

tom999

Frost Gate Guardian

Join Date: Jan 2009

House of Lord Darcia [HoLD]

W/

Think yourself lucky.
My FPS is usually around 15 lol,
but it all runs smoothly.

Now OT: it dips just because of the amount of incoming data, as you noticed in Spamadan, where everyone is selling like there's no tomorrow.

riktw

Wilds Pathfinder

Join Date: Jul 2008

netherlands

Mo/E

It's normal: GW only uses one CPU core, and in some areas it's CPU-limited.
You need some sick clock speeds to get a constant 60 FPS in Spamadan.
I have an HD 4870 X2 and I get FPS dips in Spamadan too.
[email protected]

Killamus

Guest

Join Date: Oct 2008

That's odd. I'm running an E8400 at 3.0GHz, 2 gigs of RAM, and an 8800GT, and I'm usually at 60 FPS with vsync on, and 100-ish with it off, even in Spamadan. Of course, turning really fast while in the upper area drops this like a fly, but with me idling there isn't a problem. Try turning off AA; I find that helps me a lot.

Bob Slydell

Forge Runner

Join Date: Jan 2007

Quote:
Originally Posted by Hey_homies View Post
Many people seem to run Guild Wars maxed out with older cards at a constant 60 FPS. I have a GTX 260 (192 core), running GW at 1920x1080 windowed, maxed out. I have v-sync on, but my FPS sometimes dips to about 40 and can get as low as the 30s in Kamadan, even with v-sync off. Anyone know why? It can't be overheating, because the card stays at around 55-60°C. I know my card is well capable of running Guild Wars... Here are my specs:

E8400- 4.2ghz
4Gb RAM- 920Mhz
260GTX Core 192
ASuS P5Q SE
You can get good FPS on an OLD computer with maxed settings, but these people most likely have smaller screens = smaller resolution.

Your giant resolution of 1920x1080 means your graphics card must pump out more pixels per second, resulting in a lower FPS.

I have an ATI Radeon X1600 with 256 MB of video RAM (pretty dated card, but it gets the job done successfully, even on maxed, nice-looking gfx), maxed out at 1680x1050 on a 20" widescreen. My FPS on average is in the 30s-40s and probably only reaches 60 looking at the ground or a wall, rofl.

Elder III

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

You should be getting better FPS with your setup - I suspect you have a driver "incompatibility" - search the web for different drivers - I have a 7900GS that only got 18-20 FPS in Kamadan and 45-ish outside - I put some Forceware drivers in there and it went up to 50-ish in Kamadan and a steady 60 (vsync) outside - Vista 32-bit, 1440x900 res btw.

magao

Academy Page

Join Date: Jul 2008

Australia

Order of Pussycat Mountain [OPCM]

N/

One other thing - you should try enabling DirectX 9 triple buffering using D3D Overrider (comes with RivaTuner). This *may* help somewhat with the dips.

Braxton619

Braxton619

Desert Nomad

Join Date: Jul 2008

A/W

I am running 60FPS with VSYNC with these specs:

500GB Hard Drive
3GB RAM
Microsoft Windows XP Home Edition
ATI Radeon HD 2600 XT

I can run most games on highest settings. Another thing that will help is installing the latest drivers.

Bob Slydell

Forge Runner

Join Date: Jan 2007

Quote:
Originally Posted by Leet Tankur View Post
I am running 60FPS with VSYNC with these specs:

500GB Hard Drive
3GB RAM
Microsoft Windows XP Home Edition
ATI Radeon HD 2600 XT

I can run most games on highest settings. Another thing that will help is installing the latest drivers.
That's because the Radeon HD 2600 is an awesome card.

moriz

moriz

Über tëk-nïsh'ün

Join Date: Jan 2006

Canada

R/

actually, it's quite bad compared to the OP's.

Hey_homies

Hey_homies

Lion's Arch Merchant

Join Date: Jul 2006

W/

Quote:
Originally Posted by Leet Tankur View Post
I am running 60FPS with VSYNC with these specs:

500GB Hard Drive
3GB RAM
Microsoft Windows XP Home Edition
ATI Radeon HD 2600 XT

I can run most games on highest settings. Another thing that will help is installing the latest drivers.
Wait, so you're saying your 2600XT is more powerful than my 260GTX?
I have the latest Nvidia drivers, 186.18. I don't get what could be the problem...

Bob Slydell

Forge Runner

Join Date: Jan 2007

Quote:
Originally Posted by Hey_homies View Post
Wait, so you're saying your 2600XT is more powerful than my 260GTX?
I have the latest Nvidia drivers, 186.18. I don't get what could be the problem...
Nah, no one is saying anything is better than anything else; just look for the latest drivers instead of waiting around here for useless posts lol.

moriz

moriz

Über tëk-nïsh'ün

Join Date: Jan 2006

Canada

R/

your graphics card might be defective, or maybe it is not getting sufficient power. it should EASILY handle GW at that resolution. btw, what's your performance at full screen? graphics cards usually perform worse in windowed mode.

Wrath Of Dragons

Wrath Of Dragons

Burninate Stuff

Join Date: Aug 2005

New Mexico

E/Mo

Download this: http://www.techpowerup.com/gpuz/
Run it, and get back to us with the numbers it reports, and with what it should be running at (from the box, or the manufacturer's product page).

Quaker

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

If you are running WinXP, try running the game full screen (not windowed) and see if it makes a difference.
You could also try running the cpu at stock speed to see if it changes things.

Edit: I have a computer connected to my TV which runs at 1920x1080. It has an AMD X2 7750 CPU and an HD 4850 video card. GW keeps a steady 60 fps, but it's running at less than max settings. 1920x1080 is a lot of pixels to move around, so that FPS might not be too unusual.

Hey_homies

Hey_homies

Lion's Arch Merchant

Join Date: Jul 2006

W/

Quote:
Originally Posted by moriz View Post
your graphics card might be defective, or maybe it is not getting sufficient power. it should EASILY handle GW at that resolution. btw, what's your performance at full screen? graphics cards usually perform worse in windowed mode.
I have a 550 Watt Corsair with 41 amps on the 12v rail, so it should have enough power. It's not defective, because I can run all other games fine, such as Crysis. I don't know what's wrong. I even overclocked the hell out of it to 740/1481/1235 and it still gets below 60. Even in full screen!


Here are the stock speeds:



Overclocked:

moriz

moriz

Über tëk-nïsh'ün

Join Date: Jan 2006

Canada

R/

my HD4890 gives me 120+ fps in 8v8 combat, with all settings maxed at 2048x1152. comparatively, the GTX 260 should easily play GW at 1920x1080 with similar performance.

Elder III

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

"Latest drivers" doesn't always mean they are the best or most optimized for a game like GW - which is 4 yrs old, btw. Try some non-NVIDIA drivers and see if it makes any difference.

moriz

moriz

Über tëk-nïsh'ün

Join Date: Jan 2006

Canada

R/

what elder meant by "non nvidia" drivers are modified nvidia drivers. don't try installing radeon drivers... those won't work.

alternatively, try forceware version 169.21. that one has always worked well for most people.

Lord Sojar

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Quote:
Originally Posted by Chrisworld View Post
Nah, no one is saying anything is better than anything else; just look for the latest drivers instead of waiting around here for useless posts lol.
Or I could post, and all hints of uselessness fade away faster than beer at a NASCAR event.

" Try some non NVIDIA drivers and see if it makes any difference."

No.


This could be an issue our driver department has at low priority right now, unfortunately.

The GT200 series has a known issue with texture reads from large single files (over 3GB); GW.dat is a clear example of this. It happens to be more of an issue with Vista than XP (go figure).

It affects about 0.9-1.4% of systems on average.

HOWEVER... I don't believe that to be the case here.

Your power supply is most likely the culprit.

With 41A on the +12v rail, and a TVR of 550w, it is highly unlikely that your PSU is feeding the GTX 260 enough power (especially since it is a 192 core rev0 card).

Your best route would be to upgrade the power supply in the future, something 650w+ with a 50A+ combined 12v+ rail. In the meantime, disable PhysX acceleration (if it is enabled), and disable AA or set it to 1x max.

The reasoning? Although you have "enough" power to run the card, the 192 core rev0 GTX 260 was more power hungry than its predecessors, and you have a greatly overclocked version (meaning it is drawing more power). You could try dropping the clock speeds (by about 50+ MHz on everything). If that fails, I could do a custom delta-clocked vBIOS for you to try to alleviate the issue until you upgrade the PSU.

Post results from my recommendations here, and we can take steps to improve performance.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

I bet you $5 the problem is disk access, and you probably have my ksmod installed too, which will slow down player loads by 4x. GW streams textures on the fly, and this is the #1 issue for low frame rates. If you have crappy disk access speeds, a fragmented GW.dat, or other programs using your disk in the background (BitTorrent), you will get low frame rates regardless of your hardware. Every time a player loads into a map, the GW engine pauses while their texture is generated and then loaded.

GW also suffers from EvictManagedResources() bugs from time to time (driver errors), where it wipes video memory completely every so often and forces everything to be reloaded. It happens on certain maps when the camera is looking in certain directions. Blocking the call to EMR() solves the problem, and so does updating or reverting your drivers to another version.

Quote:
Your power supply is most likely the culprit.

With 41A on the 12v+ rail, and a TVR of 550w, it is highly unlikely that your PSU is feeding the GTX260 enough power (especially since it is a 192 core rev0 card)
His GPU pulls 14 amps MAX. In GW, more like < 10.
His CPU pulls 7 amps overclocked.
His mobo, HDD, RAM, fans, another 5 amps.

Total pull is 300 watts. And that is being liberal because no game or program can max out all system components at once.

550w PSU's easily run overclocked i7's and 2 GPU's. His old dual core and single GPU setup is no match for even a no-name brand 400w PSU.
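Kuntz's tally reduces to simple +12V-rail arithmetic. A minimal sketch, using the post's own ballpark amp figures (not measured values):

```python
# Rough +12V-rail budget for the OP's system, using the per-component
# current estimates from the post above. These are ballpark figures,
# not measurements.
RAIL_VOLTAGE = 12.0  # volts on the +12V rail

draw_amps = {
    "GTX 260 (gaming load)": 14.0,    # "14 amps MAX"
    "E8400 overclocked": 7.0,
    "mobo / HDD / RAM / fans": 5.0,
}

total_amps = sum(draw_amps.values())       # 26 A
total_watts = total_amps * RAIL_VOLTAGE    # 312 W, ~the "300 watts" quoted

rail_capacity_amps = 41.0                  # the OP's stated +12V rating
headroom_amps = rail_capacity_amps - total_amps

print(f"Estimated +12V draw: {total_amps:.0f} A ≈ {total_watts:.0f} W")
print(f"Rail headroom: {headroom_amps:.0f} A of {rail_capacity_amps:.0f} A")
```

Even with generous rounding, the estimate sits well inside the stated 41 A rating, which is the crux of Kuntz's argument.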

Lord Sojar

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Quote:
Originally Posted by Kuntz View Post
His GPU pulls 14 amps MAX. In GW, more like < 10.
His CPU pulls 7 amps overclocked.
His mobo, HDD, RAM, fans, another 5 amps.

Total pull is 300 watts. And that is being liberal because no game or program can max out all system components at once.

550w PSU's easily run overclocked i7's and 2 GPU's. His old dual core and single GPU setup is no match for even a no-name brand 400w PSU.
Let me point out a few flaws in that argument. First off, it assumes that I don't know anything about the internal core design of the GT200 series. It also assumes that amperes are "pulled", which is laughable at best.

Your GPU doesn't "pull amperes"; it needs a specific amount at the given DiELG junctions to operate smoothly (this being after conversion and structuring). Amperes measure current, that is, electrons per point of relation. If you care to argue the semantics of electrical physics, I have a PhD to back me up, and you have...?

You think of a PSU as a two-dimensional object, whereas in the realm of DC (direct current, if you aren't up to date on electrical terms), the PSU serves many more dynamics. Typically the 12v+ rails are shared as well, because manufacturers are cheap and don't want to build specific capacitor/resistor series, as it costs extra. PSUs are also not perfectly efficient, as nothing in the world is. Best case scenario, you are looking at ~86% efficiency at the PCapacitor, and down the wire?... maybe 70% or less... not great, I am afraid.

There is significant iL5 gate leakage in rev0 of the GTX 260, and subsequently the GTX 280. In addition to that leakage, there are ramp-up issues with core-to-display output. In English? That means the card uses and wastes power. This is a fundamental flaw with most first-revision cards. In a perfect world, with no DiELG leakage or e- inversion, sure, maybe (and that is a huge maybe), but real physics doesn't permit perfection, I'm afraid...

I will not argue what PSU requirements are. A 550 watt PSU (assuming 2 hard disks, 2 optical drives, a Core 2 Quad, a mid-high-end motherboard, high-speed low-latency RAM, and 3+ peripherals, with a GTX 260 rev0) isn't really enough unless you ran extremely low settings, and even that is stretching it. Now, one could argue that if you set up the system PERFECTLY, using very specific parts, it probably would be. But in all honesty, his system is not the case in point.

Sorry, but until we get superconductive materials perfected, you won't see a PSU that can output near what it should at the wire. This is fact, not fiction. If you want to debate me, fine, by all means.

The fact is, with iL5 leakage, and the fact that PSUs and PC electrical systems are FAR from perfect (Ohm can back me up here, as will Kirchhoff in regards to internal complex!), you can find all the data you want, but unless his PSU is running at the maximum theoretical efficiency it can output, and his components are at the most minimal leakage that can be allowed (all his parts are "perfect" off the assembly line), I am afraid it isn't enough.

Again, keep in mind, the driver issue is magnified by the lack of power.
The problem isn't so much that there isn't enough power, but that there isn't enough power to overcome the driver issues with large texture unpacking.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

You are incorrect. There is not one article or review on the internet that shows any single GPU setup pulling more than 550 watts from an outlet.

Not.
Even.
Close.

I know you work for nVidia, but you were horribly wrong on memory latencies and on why bus width has a direct effect on the size and quality of a graphics chip. For someone who claims to work for nVidia, your knowledge of even the most basic computer areas is non-existent.

I can't really argue against your GTX/PSU post, because you basically just posted a bunch of mumbo-jumbo lingo that has nothing to do with my point. It is a fact: no single-GPU system, regardless of HDDs, ODDs, fans, and motherboard, can pull over 500 watts. The most hardcore quads with the most hardcore GPUs pull around 300 watts during gaming.

Hell, this single link proves my entire point:

http://www.anandtech.com/video/showdoc.aspx?i=3408&p=9



Note that even the monster GTX 280 only pulls 313 watts from the wall under load; even on a liberal 85%-efficient PSU, that is a total system draw of 266 watts. A 266 watt PSU could run a quad-core QX9770 Extreme @ 3.2 GHz, 4 sticks of RAM, and a GTX 280.
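The wall-to-DC conversion in that sentence is a single multiplication (the 85% efficiency figure is the post's assumption, not a measurement):

```python
# Convert measured wall draw to the DC load the PSU actually delivers.
# 313 W is the wall figure quoted from the linked review; 85% is the
# assumed PSU efficiency at that load.
wall_draw_watts = 313.0
psu_efficiency = 0.85

dc_load_watts = wall_draw_watts * psu_efficiency
print(f"DC load on the PSU: {dc_load_watts:.0f} W")  # ≈ 266 W
```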

I know many people who run OC'd 4GHz quad-cores and 2 GPUs on 550 watt PSUs and never have issues. Considering 750 watt PSUs are known to run 4-GPU systems for benchmarking and Folding@Home... I'm pretty sure a 550 watt PSU can easily handle 1 GPU.

MBTW

MBTW

Academy Page

Join Date: Jan 2009

W/

I run the game at 3-7 fps at all times. I can't play most of NF and EotN because my graphics card is not compatible. I'm not sure about Factions. So... I am below 60 fps.

Martin Alvito

Martin Alvito

Older Than God (1)

Join Date: Aug 2006

Clan Dethryche [dth]

As a dispassionate observer:

Kuntz, you are talking past him, and he's not going to take you seriously until you debate him on the merits. His argument logically boils down to: "pull" doesn't exist as a theoretical construct, but is merely an oversimplification of the physics. As an approximation, it is a useful explanatory tool...except when it does not work. He argues that it does not work in this particular case.

You attempt to dismiss his argument as technical mumbo-jumbo, but if you cannot offer a coherent explanation for why "pull" is a valid interpretation of the physics of this problem, you have lost this battle. Your external citation proves a lot less than you think it does. A measurement of system wattage does not address the core question of whether this card is getting the power it needs to operate well.

You argue that he was technically wrong on two previous issues but offer no proof in either case. Additionally, the reference of JUST your post in the thread where he responded and you did not take up the gauntlet appears intellectually disingenuous.

Of course, the only way of actually resolving who is correct would be through resort to empirical evidence. The best advice I can provide the OP is to choose the recommendation you feel to be most accurate, implement it, and report your findings. Only move on to the other recommendation if the initial resolution proves unsatisfactory.

KZaske

KZaske

Jungle Guide

Join Date: Jun 2006

Boise Idaho

Druids Of Old (DOO)

R/Mo

If the OP is using a Corsair VX550 PSU, it has only a single +12v rail and is certified to be 80% efficient at full load. HardOCP actually awarded it their rare gold award. The VX550 has not been out very long, so I have no clue as to the specific model he is using.
As for the driver issue mentioned by Rahja, that is the primary reason I will not update the driver for my 9600GT or the 7900GS (AGP). The newer the driver, the worse the performance. Shoot, the 7900GS is still using a 96-series driver that provides 50+ FPS on a 1440x900 monitor. If the OP's PSU is not a VX series, then Rahja hit the nail on the head. The previous generations of PSUs put out by Corsair were consistently rated beyond any reason. I have a PSU from Corsair marked 550w with a max power output of 490w.
I would be willing to bet that the overclocking he did to the card did not result in a corresponding increase in FPS. If it did, the PSU is good, but I bet that the increase was less than 25% of what was expected.
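KZaske's rule of thumb can be sketched numerically. The FPS figures below are placeholders (the thread gives no before/after numbers); 576 MHz is the GTX 260's reference core clock, and 740 MHz is the OP's stated overclock:

```python
# If FPS scaled proportionally with core clock, a 576 -> 740 MHz bump
# should give roughly a 28% FPS increase. KZaske's heuristic: if the
# observed gain is well under ~25% of that expectation, suspect something
# other than the GPU (PSU, CPU, or driver bound).
stock_clock_mhz, oc_clock_mhz = 576.0, 740.0   # GTX 260 stock vs the OP's OC
stock_fps, oc_fps = 40.0, 42.0                 # placeholder measurements

expected_gain = stock_fps * (oc_clock_mhz / stock_clock_mhz - 1.0)
actual_gain = oc_fps - stock_fps

ratio = actual_gain / expected_gain
print(f"Observed gain is {ratio:.0%} of the proportional expectation")
```

With these placeholder numbers, the observed gain is well under a quarter of the proportional expectation, which under the heuristic would point away from the GPU itself.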

Hey_homies

Hey_homies

Lion's Arch Merchant

Join Date: Jul 2006

W/

Quote:
Originally Posted by KZaske View Post
If the OP is using a Corsair VX550 PSU, it has only a single +12v rail and is certified to be 80% efficient at full load. HardOCP actually awarded it their rare gold award. The VX550 has not been out very long, so I have no clue as to the specific model he is using.
As for the driver issue mentioned by Rahja, that is the primary reason I will not update the driver for my 9600GT or the 7900GS (AGP). The newer the driver, the worse the performance. Shoot, the 7900GS is still using a 96-series driver that provides 50+ FPS on a 1440x900 monitor. If the OP's PSU is not a VX series, then Rahja hit the nail on the head. The previous generations of PSUs put out by Corsair were consistently rated beyond any reason. I have a PSU from Corsair marked 550w with a max power output of 490w.
I would be willing to bet that the overclocking he did to the card did not result in a corresponding increase in FPS. If it did, the PSU is good, but I bet that the increase was less than 25% of what was expected.
I thought Corsairs were quality PSUs? The PSU is a VX series, and if I'm not getting enough power from my PSU, why am I able to run Crysis with decent frames? I did a Crysis benchmark. You be the judge of whether my card is getting enough juice or not.

BTW, This was done with my card overclocked

Run #1- DX10 1280x1024 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 32.175
Run #2- DX10 1680x1050 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 24.895
Run #3- DX10 1920x1080 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 21.42

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by Martin Alvito View Post
As a dispassionate observer:

Kuntz, you are talking past him, and he's not going to take you seriously until you debate him on the merits. His argument logically boils down to: "pull" doesn't exist as a theoretical construct, but is merely an oversimplification of the physics. As an approximation, it is a useful explanatory tool...except when it does not work. He argues that it does not work in this particular case.

You attempt to dismiss his argument as technical mumbo-jumbo, but if you cannot offer a coherent explanation for why "pull" is a valid interpretation of the physics of this problem, you have lost this battle. Your external citation proves a lot less than you think it does. A measurement of system wattage does not address the core question of whether this card is getting the power it needs to operate well.

You argue that he was technically wrong on two previous issues but offer no proof in either case. Additionally, the reference of JUST your post in the thread where he responded and you did not take up the gauntlet appears intellectually disingenuous.

Of course, the only way of actually resolving who is correct would be through resort to empirical evidence. The best advice I can provide the OP is to choose the recommendation you feel to be most accurate, implement it, and report your findings. Only move on to the other recommendation if the initial resolution proves unsatisfactory.
You both have an excellent ability to sound like you know what you're talking about, using sophisticated words and lingo. Too bad you're both wrong.

Quote:
Originally Posted by Martin Alvito View Post
A measurement of system wattage does not address the core question of whether this card is getting the power it needs to operate well.
Yes it does. That system was running on a 1200 Watt PSU. If the GTX 260 and GTX 280 needed more power, they would have consumed more power.

You're basically saying that in homies' 550 watt system, the GTX 260 has somehow magically detected it's a 550 watt PSU and is purposely pulling less power? Then why is it that, in a related article, a GTX 260 is shown to only pull 274 watts from the wall, TOTAL SYSTEM? Did that card magically detect that there was only a 1200 watt PSU and decide to pull less power too? What kind of special Unicorn Fairy Magic PSU does the GTX 260 need to work at full power then? LOL

Quote:
Originally Posted by Hey_homies View Post
I thought Corsairs were quality PSUs? The PSU is a VX series, and if I'm not getting enough power from my PSU, why am I able to run Crysis with decent frames? I did a Crysis benchmark. You be the judge of whether my card is getting enough juice or not.

BTW, This was done with my card overclocked

Run #1- DX10 1280x1024 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 32.175
Run #2- DX10 1680x1050 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 24.895
Run #3- DX10 1920x1080 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 21.42
Do yourself a favour and ignore the PSU-related BS in this topic. They have you running around wasting your time testing things. Check out these two related benchmarks, both on a 3.2GHz quad-core, and check your frame rates in relation to them:







Crysis is one of the most benchmarked games around now, so with a bit of searching you could probably find someone with a system similar to yours and check your FPS against theirs.

KZaske

KZaske

Jungle Guide

Join Date: Jun 2006

Boise Idaho

Druids Of Old (DOO)

R/Mo

Quote:
Originally Posted by Hey_homies View Post
I thought Corsairs were quality PSUs? The PSU is a VX series, and if I'm not getting enough power from my PSU, why am I able to run Crysis with decent frames? I did a Crysis benchmark. You be the judge of whether my card is getting enough juice or not.

BTW, This was done with my card overclocked

Run #1- DX10 1280x1024 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 32.175
Run #2- DX10 1680x1050 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 24.895
Run #3- DX10 1920x1080 AA=4x, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 21.42
Please rerun the tests with the card at default speeds. We already have the stats for default and overclocked, so it should be easy to see.
Corsair does make a good product, now. When they started out, they were like everyone else, buying someone else's product and putting their label on it.

After looking over the benchmarks posted so kindly by Kuntz, I would advise you to roll back your driver to a previous version. If you choose to do that, you will want to clean off the leftovers that the uninstaller leaves behind. Addressing Kuntz's assurances that it is not the power supply: WRONG. It still could be. I have seen it happen on my own system. When I upgraded the power supply, I got an improvement of more than 20% in frame rates on the 9600GT. The PSU was the ONLY thing changed.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by KZaske View Post
Addressing Kuntz's assurances that it is not the power supply: WRONG. It still could be. I have seen it happen on my own system. When I upgraded the power supply, I got an improvement of more than 20% in frame rates on the 9600GT. The PSU was the ONLY thing changed.
It is already proven as fact, using science, beyond any doubt, that a GTX 260 quad-core system with 4 sticks of RAM only needs a ~233 watt PSU. His PSU is 550 watts.

His PSU is well known for running two 4870s in CrossFire or two 260s in SLI. Two GTX 260s in SLI need about a 400 watt PSU (if you want to cut it close). Obviously, that's why people with two 4870s/260s buy 550 watt PSUs.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

http://www.extreme.outervision.com/PSUEngine

His system, guessing some stats, needs around a 277 Watt PSU.

Lets have some fun though, lets see what you can run on a 550 Watt PSU:

-E8400 at 100% utilization
-GTX 260 216-Core at 100% utilization
-4 sticks of RAM
-TEN SATA Hard Drives
-FIVE Optical DVD Drives
-TEN 120mm Fans (For all those HDD's lmao)
-Entire system at 100% utilization all at once (impossible in real-world conditions)

Wattage needed: 549

lol

Edit: Within 10 minutes I found numerous forum posts where people run Folding@Home on 260 SLI systems with quad-cores on 500w and 550w PSUs. And if there is ANYONE on Earth who is anal-retentive about power consumption vs PPD, it's Folding@Home junkies.

KZaske

KZaske

Jungle Guide

Join Date: Jun 2006

Boise Idaho

Druids Of Old (DOO)

R/Mo

Quote:
Originally Posted by Kuntz View Post
http://www.extreme.outervision.com/PSUEngine

His system, guessing some stats, needs around a 277 Watt PSU.

Lets have some fun though, lets see what you can run on a 550 Watt PSU:

-E8400 at 100% utilization
-GTX 260 216-Core at 100% utilization
-4 sticks of RAM
-TEN SATA Hard Drives
-FIVE Optical DVD Drives
-TEN 120mm Fans (For all those HDD's lmao)
-Entire system at 100% utilization all at once (impossible in real-world conditions)

Wattage needed: 549

lol

Edit: Within 10 minutes I found numerous forum posts where people run Folding@Home on 260 SLI systems with quad-cores on 500w and 550w PSUs. And if there is ANYONE on Earth who is anal-retentive about power consumption vs PPD, it's Folding@Home junkies.
I am not arguing that point at all. His 550w PSU should be able to run it just fine. But have you considered that his PSU may not be performing as it should?
Edit: I too am a "Folding@Home junkie"; that is how I found out that the previous PSU I had in my system was not functioning correctly.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

Yes, his PSU could be faulty. It is unlikely, though, and he can always run OCCT's PSU test to find out.

Jessica Pariah

Jessica Pariah

Frost Gate Guardian

Join Date: Jul 2006

Warrior's Isle

LF PvP/GvG Guild.

AMD Phenom II X3 720 BE @ 2.8GHz
4GB of RAM
ATi Radeon HD 4770
Coolermaster Real Power M520 520watt PSU

Constant 60 fps, except in.. you guessed it.. Kamadan.
Settings: all on highest, 1680x1050, 4x AA.
Quote:
Originally Posted by tom999 View Post
now OT: it dips just coz of amount of data incoming
No.

Notorious Bob

Notorious Bob

Frost Gate Guardian

Join Date: Mar 2009

Gwen's underwear drawer

The Curry Kings

R/

Quote:
Originally Posted by Kuntz View Post
http://www.extreme.outervision.com/PSUEngine

His system, guessing some stats, needs around a 277 Watt PSU.

Lets have some fun though, lets see what you can run on a 550 Watt PSU:

-E8400 at 100% utilization
-GTX 260 216-Core at 100% utilization
-4 sticks of RAM
-TEN SATA Hard Drives
-FIVE Optical DVD Drives
-TEN 120mm Fans (For all those HDD's lmao)
-Entire system is at 100% utilization all at once (impossible in real world condictions)

Wattage needed: 549
What credibility you *had* just got flushed by this dazzlingly naive post.

While I don't work for nVidia or make other claims about *expertise*, I do work in reality, and real-world experience proves that simply adding up watts and then making ludicrous claims based on the result is truly dazzling in its naivety!

Total power output for a PSU is about as useless a number as it gets when it comes to understanding if your system components are going to drag your PSU into over-current fail safe hell.

Far more important are things like the distribution and number of 12V rails, the devices hanging off them, and the current and power available on each of those rails. I.e., is your single 12V rail trying to run your CPU, bus, and GPU?

I've seen crap 500W PSUs (especially stock OEM PSUs) keel over with single high-power GPUs because of poor rail design. Likewise, there are numerous high-quality, lower-rated PSUs that have great designs and can manage under similar load conditions.

-----------

For the OP, this really isn't a question of power; it's really just life. Leave Crappy-dan behind, venture out, and see how your frame rate soars.

@Jessica: I have a similar setup and my 4770 charges along at 75 fps.

rattex

rattex

Ascalonian Squire

Join Date: Jun 2009

South Africa

The ZA Illuminati

Rt/

60 fps?
I have an AMD X2 5400+, 4GB RAM, and an HD 4850,
keeping an average of 140 fps, even in Kamadan. 24" LCD, 1920x1200.
I think some tweaking needs to be done to your systems.

Malician

Oak Ridge Boys Fan

Join Date: Jun 2007

E/P

Quote:
Originally Posted by Notorious Bob View Post
What credibility you *had* got flushed by this dazzlingly naive post.

While I don't work for nVidia or make other claims about *expertise*, I do work in reality, and real-world experience proves that simply adding up watts and then making ludicrous claims based on the result is truly dazzling in its naivety!

Total power output for a PSU is about as useless a number as it gets when it comes to understanding if your system components are going to drag your PSU into over-current fail safe hell.

Far more important are things like the distribution and number of 12V rails and the devices hanging off them and the current and power available on each of those rails. i.e. is your single 12V rail trying to run your CPU, bus & GPU?

I've seen crap 500W PSUs (especially stock OEM PSUs) keel over with single high-power GPUs because of poor rail design. Likewise, there are numerous high-quality, lower-rated PSUs that have great designs and can manage under similar load conditions.
He made those statements because it appears that Corsair PSU can deliver goodly wattage over its 12V. It's very likely that the crap OEM PSUs you're talking about had a maximum true rating of 150 or 200 watts on the 12V, which was extremely common in older designs.

I know this because finding a decently priced power supply capable of supplying over 250 watts on the 12V rail at ZipZoomfly was surprisingly hard (they had cheap shipping to Alaska at the time).
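The arithmetic behind these 12V rail figures is just P = V × I. A quick sketch using the 41 A rail rating quoted later in this thread; the ~182 W GTX 260 TDP is an assumption for illustration and is not stated anywhere in the thread:

```python
# 12V rail capacity vs. GPU board power: P = V * I.
# RAIL_AMPS is the Corsair 550W's quoted 12V rating from this thread;
# GPU_TDP_WATTS (~182 W for a GTX 260) is an assumed figure.
RAIL_VOLTS = 12.0
RAIL_AMPS = 41.0
GPU_TDP_WATTS = 182.0

rail_watts = RAIL_VOLTS * RAIL_AMPS       # total power the 12V rail can supply
headroom = rail_watts - GPU_TDP_WATTS     # margin left after the GPU's draw

print(f"12V rail capacity: {rail_watts:.0f} W")    # 492 W
print(f"Headroom over GPU TDP: {headroom:.0f} W")  # 310 W
```

By this back-of-the-envelope math, a single 41 A rail has ample headroom for one GTX 260, which is the point being argued here.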

Rahja, if the Corsair PSU needs to produce over twice the GTX 260's TDP to run just that card capably, the card is so far beyond defective it would never have been allowed out of nVidia's testing labs. To suggest otherwise is a vicious insult against a company I've gladly purchased the following from:

geforce 2 TI
geforce 6600 GT
geforce 9600 GT

had my friends purchase, due to my recommendation:

7600 GT
9600 GSO
8800 GTS 320 mb
8800 GT
9800 GT

NONE of these had this unhinged, unrestrained power use. It's obvious you know far more about electronics than I do, but what you are saying is completely unsupportable.

Brett Kuntz

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by Notorious Bob View Post
What credibility you *had* got flushed by this dazzlingly naive post.

While I don't work for nVidia or make other claims about *expertise*, I do work in reality, and real-world experience proves that simply adding up watts and then making ludicrous claims based on the result is truly dazzling in its naivety!

Total power output for a PSU is about as useless a number as it gets when it comes to understanding if your system components are going to drag your PSU into over-current fail safe hell.

Far more important are things like the distribution and number of 12V rails and the devices hanging off them and the current and power available on each of those rails. i.e. is your single 12V rail trying to run your CPU, bus & GPU?

I've seen crap 500W PSUs (especially stock OEM PSUs) keel over with single high-power GPUs because of poor rail design. Likewise, there are numerous high-quality, lower-power-rated PSUs that have great designs and can manage under similar load conditions.

-----------

For the OP this really isn't a question of power, it's really just life. Leave Crappy-dan behind venture out and see how your frame rate soars.

@Jessica I have a similar setup and my 4770 charges along at 75fps
Even if I lack credibility at least I can read a simple sentence:

Quote:
I have a 550Watt Corsair with 41amps on 12v so it should have enough power. Its not defective because i can run all others games fine such as crysis. I don't know whats wrong. I even overclocked the hell out of it to the speeds of 740/1481/1235 and still it gets below 60. Even when its full screen!
The sheer amount of stupidity in this topic is amazing.

Rahja, who claims to work for nVidia, has posted so many untrue and wild claims, I'm not the one with credibility issues:

http://www.guildwarsguru.com/forum/s...php?t=10288946

The following are all Rahja quotes:

Quote:
The Latency of a module of RAM is determined by its TIMINGS.
Incorrect. The latency is a combination of the timings and the clock speed. Everyone that knows anything about computers knows this, except Rahja for some reason... but he works for nVidia!

Quote:
Latency is measured in ns (nano seconds)
Latency is measured in Clock Cycles...but he works for nVidia!

http://en.wikipedia.org/wiki/Cas_latency

Quote:
Originally Posted by Wikipedia
In synchronous DRAM, the interval is specified in clock cycles, and must be multiplied by the cycle time (i.e. divided by the clock frequency) to convert to nanoseconds.
For someone who claims to have a PhD and works for nVidia, he really has no clue when it comes to anything computer-related.
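For what it's worth, the conversion both sides are arguing about is simple: absolute latency in nanoseconds equals the timing in clock cycles divided by the clock frequency. A minimal sketch; the DDR2 figures below are illustrative, not taken from this thread:

```python
def cas_latency_ns(cas_cycles: int, io_clock_mhz: float) -> float:
    """Absolute CAS latency in ns: cycles divided by clock frequency."""
    return cas_cycles / io_clock_mhz * 1000.0  # one cycle at f MHz lasts 1000/f ns

# DDR2-800 clocks its I/O bus at 400 MHz; DDR2-667 at 333 MHz.
print(cas_latency_ns(5, 400.0))  # CL5 @ 400 MHz -> 12.5 ns
print(cas_latency_ns(4, 333.0))  # CL4 @ 333 MHz -> ~12.0 ns
```

This is why slightly higher CAS on much faster RAM can still mean roughly equal absolute latency, plus more bandwidth.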

I have noticed over the last few months he consistently posts misinformation, and people eat it up because he keeps bragging about his summer internship at nVidia.

Lord Sojar

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Kuntz... normally I would just delete posts of this inflammatory nature... but honestly, you have taken this one step too far. Insulting my education, belittling my career, and now telling me latency is measured in Hz? ARE YOU OUT OF YOUR MIND?!

First off... the statements made in that RAM guide are correct, albeit vague. In an attempt to make it more accessible for average readers, one must use layman's terms. Sorry, but I am not about to go into the specifics of memory timings and operations for the tech area of a Guild Wars forum. I have better things to do with my time, and far, far more important things to deal with. Most people wouldn't even bother to read it, much less understand it. I am glad that you took the time to read up on exactly how it works.

However, latency is always measured in ns, not Hz. The two are inherently linked; I wouldn't dare dispute that. But the standardized measurement for latency isn't clock cycles, it is nanoseconds.

Timings across a parallel circuit and a complex circuit are ALWAYS measured in Hz. This will never change. The CAS and RAS are set as clock cycles, and of course speed makes a difference, as it does with total theoretical bandwidth. Clock speed affects all numbers within a given cell, as that determines the disparity between low-end and high-end RAM. You are overcomplicating a generic blanket statement, which is very unwise to do.

Timings =/= Latency. Latency is measured in units of time; timings are measured in cycles (Hertz). I never confused the two. I did use the wrong notation at one point (which is where I can see the confusion came from regarding this whole issue). It is rather apparent, though, if you aren't trying to gun me down while reading through it, that timings and latency are separate but related terms in the guide.

Quote:
Originally Posted by Rahja the Thief
The Latency of a module of RAM is determined by its TIMINGS.
Do you know the definition of determined?

Of course, you neglected to read in context, as I clearly stated just below that...

Quote:
Originally Posted by Rahja the Thief
CAS Latency (CL) is the most widely understood number. However, do not be fooled by it. Slightly higher CAS on a much higher speed RAM is a good thing.
Now why on earth would I suggest that if I meant CAS is directly measured in nanoseconds? Speed would have no bearing if that was what was intended by the layman statements, now would it?

Oh, but thanks for pointing that out and trying to pick out a statement that suited your purposes... Gee, how nice. Feel free to find more of my posts with layman statements that can be removed and taken out of context. I'm sure it will be a valuable use of your time.

Kuntz, you are not keeping in mind a few simple things...

Because there is no additional charge being applied to the wiring leading to graphics cards, and there will be some EM and ES interference (depending on area and environment, this can be very low or high), this can greatly affect total current (amperes). The wattage of a PSU is irrelevant, as voltage is in a constant state of flux (amperes in relation to volts and watts; OMG, physics!).

This would, again, affect on-wire and at-gate current. I'm sorry, but I doubt you understand how forward gates and conversion work in processors, as most people don't (it isn't exactly something you can understand by reading and citing Wiki...).

But, in short, energy is lost through each medium it passes through. This electron loss in a processor is known as "leakage". DiELG (Dielectric Gate) leakage is the most common cause for excess heat in a processor of any type. With elevated iL5 gate leakage in the GT200 rev0 chips, this can be seen as power loss.

Again... following me here? Coupled with a driver flaw that is currently being worked on (that is processing very large texture files and/or small texture files out of a large file) [in this case, the gw.dat is an EXCELLENT example], power draw will be elevated, but current must remain stable. If there is ANY issue with current (which isn't pull, as you are citing... pull is wattage, which is VERY different), the card will be forced to throttle to pass the texture files through. Overly simplified, yes; correct nevertheless.

As for my position at nVidia, I am a Senior Process Design and Physical Design "Engineer". I use the term engineer loosely, as it isn't engineering in the traditional sense; it mainly deals with process design and sampling, most specifically VLSI design and implementation. Don't deface me, please... But the good news for you is: I won't be working for NV after October, as I am heading over to TSMC as a Jr. Process Engineer (almost the same job as at NV, but with more room for career advancement). So then, I guess I won't have claim to the NV throne. You try to glorify my position, but fail to realize that I have the same job that 90+ others do in this company alone.

The fact is...

Quote:
Originally Posted by Malician
Rahja, if the Corsair PSU needs to produce over twice of the GTX 260's TDP to run just that card capably, the card is so beyond defective it would've never been allowed out of nVidia's testing labs. To suggest otherwise is a vicious insult against a company I've gladly purchased the following from
The Corsair PSU doesn't need to produce twice of anything. There are so many factors that play into at-the-wire current, and without detailed power specifics, it is hard to tell.

If you want, we can chalk this up to a driver issue and call it a day. The driver issue is (in all likelihood) presenting itself, and that PSU isn't delivering enough current after conversion and e- inversion to correctly power the GTX 260. The card itself isn't terribly flawed; it is within operational parameters.

The multiplicative issues that present themselves are the problem, not one or the other. You cannot simply pick and choose one possible issue. In all likelihood, this is a driver/power/current issue; a cumulative effect, so to speak.

And Kuntz... I use my position at nVidia to help people on these forums with hardware related issues. My expertise has helped many, and that is why people trust my opinion. They don't blindly take my advice; they take it because it has been proven effective over the course of 2+ years.

Instead of bashing me and trying to find layman, simplified statements to prove me wrong, why don't you post on the merits of your background in electrical physics, and actually provide me proof that Ohm and Kirchhoff are wrong... Kirchhoff's two laws alone prove that my logic is sound... enlighten yourself before you deface me.

You are taking this way too personally, and I will not stand for any more flaming. If you want to continue this argument, do so in a non-inflammatory manner. Provide evidence aside from total energy pull (wattage) that supports your argument. Wattage means nothing in relation to the current needed to overcome a driver flaw (that, again, is being corrected). Trace leakage, gate leakage, latent loss, electron inversion, converters, etc. all play into total power. If there are enough flaws, the current will become unstable/too low to compensate for this driver issue, resulting in performance issues.

Sorry, this is the way of things, Kuntz. I am but the messenger, not the creator. But thanks for pointing out the notation issues. I just went ahead and removed the ns from those, as it was totally unintentional. I do make mistakes when posting 3-page essays, sorry. As you have clearly shown, I do make mistakes from time to time in my posts, especially when the original version of said posts was made at 01:49, 8th Jul 2008 (that's nearly 2 AM). So again, I made a notation error on the numbers (and it wasn't just on timings, after reading through it). I went ahead and corrected the errors that I found in the post just for you. I am human, I do make mistakes, but my knowledge on the subject is sound.

Lord Sojar

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

This needs to be separated from the wall of text above, as it is rather important in finding a solution for this issue.

Hey_homies, I can try to craft a custom vBIOS for you. In order to do this, I need you to create a copy of your vBIOS so I know the exact model, revision, and manufacturer of your card.

I might be able to balance the delta multiplier in the card, which should balance performance out in Guild Wars. I've done this for several others, but know that it still carries some risk. If you would like to try this as a possible solution to the issue, just let me know.

As for two other solutions to help alleviate driver issues, try the following settings:


Head to your nVidia Control Panel.
Under the 3D Settings category, you will see Manage 3D Settings
Go to the Program Settings tab
Select Guild Wars (if it isn't there, navigate to it using the "Add" function. Make sure you add gw.exe and not a shortcut or other file)

Make sure Extension Limit is set to ON.
Multi Display/Mixed GPU Performance is set to Single Display Performance
Threaded Optimization is set to OFF
Maximum Pre-Rendered Frames is set to 2

Give those settings a try and see what happens.