Problem with my FPS.

Avectius

Academy Page

Join Date: Oct 2008

Here are my specs:

ATI HD4670
2 GB of RAM
Intel Dual-Core 2.66Ghz

And it's really weird, because this card is powerful enough to run GW at max settings and my system specs are fine, yet I never get past 60 frames per second. O.o What could be causing this? Is it my connection? (It's not supposed to affect FPS, but let's consider all possibilities here.) Or could a virus or infected file actually affect a card's performance? I could really use some advice on this. Plus, I'm getting some lag... any help on that too?

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

VSYNC. It will keep your FPS from going over your monitor's refresh rate, which is usually 60 Hz. You will need to disable it to get higher FPS.

Seriously though, 60 fps is more than enough to play GW seamlessly.
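To see why vsync produces those particular frame-rate caps, here's a minimal sketch of the usual double-buffered vsync model. This is purely illustrative, not game code; the function name and numbers are made up:

```python
import math

# With vsync on, a finished frame waits for the next refresh boundary,
# so the effective frame rate is the refresh rate divided by how many
# refresh intervals each frame takes to render.
def vsync_fps(refresh_hz, render_ms):
    interval_ms = 1000.0 / refresh_hz                    # one refresh interval
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals

print(vsync_fps(60, 10))   # fast frames: capped at 60 fps
print(vsync_fps(60, 20))   # a frame that misses one refresh: 30 fps
```

Note this assumes plain double buffering; with triple buffering the drops are less steplike.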

[DE]

Hugs and Kisses

Join Date: Oct 2005

Scars Meadows

I'd hardly call that an FPS problem; your eyes can only see 29.9 fps a second or something around that, so the difference between 60 fps and higher is negligible.

Try to be more specific about the lag problem. It's probably just Wintersday lag.

Avectius

Academy Page

Join Date: Oct 2008

But you don't understand... when I'm in towns things get worse :/ I get like 29 to 43 frames per second :/

Well, when I'm in Ascalon City (and there weren't that many players there yesterday), I was walking around and every now and then I kept getting minimal lag (quick mini-freezes... lag, right?), but not when I'm outside of towns or outposts.

Abedeus

Grotto Attendant

Join Date: Jan 2007

Niflheim

R/

Quote:
Originally Posted by [DE]
your eyes can only see 29.9 fps a second or something around that
24 fps. A bit less, but rounded up it's 24.

Besides, most (if not all) widescreen monitors support only a 60 Hz refresh rate. That means only 60 fps will be shown, even if the graphics card can output 100 or more.

That's why you don't see a difference between Diablo 2 with 400 FPS and Diablo 2 with 75 FPS.

The difference is SLIGHT when under 60 FPS, but only with high-speed moving objects. As long as it's above 24, you won't see much choppiness.

And the Wintersday event makes towns harder to render, thanks to snow and decorations.

moriz

über tĕk-nĭsh'ən

Join Date: Jan 2006

Canada

R/

If you play GW competitively, it's best to turn vsync off. For some strange reason, camera rotation is tied to FPS: the higher your FPS, the faster the camera can rotate.

Slowdowns in busy areas are pretty common and not really a cause for concern.
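If camera rotation really is applied per rendered frame, as claimed above, the effect is easy to model. The function and numbers below are purely illustrative (nothing here comes from GW itself):

```python
# If the camera turns a fixed angle each rendered frame, the resulting
# rotation speed in degrees per second scales linearly with FPS.
def rotation_speed(deg_per_frame, fps):
    return deg_per_frame * fps

print(rotation_speed(0.5, 60))    # 30.0 deg/s at a vsynced 60 fps
print(rotation_speed(0.5, 120))   # 60.0 deg/s: twice as fast with vsync off
```

This is why uncapping FPS makes the camera feel more responsive in a game that doesn't scale rotation by frame time.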

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

That "the eye can only see X fps" mumbo jumbo is an urban legend. What the legend is based on in reality is what your eye comprehends as fluid movement, which is actually ~90 FPS. The eye can in theory see 4000+ FPS in full color, and billions in black and white. Biology lesson over.

As for your FPS dropping in town: larger towns with more people will make your FPS take a nosedive because of the hundreds of thousands of additional polygons your GPU has to render. Unfortunately, there is no way around this. 25+ FPS is fine, though.

Abedeus

Grotto Attendant

Join Date: Jan 2007

Niflheim

R/

Quote:
Originally Posted by Rahja the Thief
That eye can see mumbo jumbo is an urban legend. What that legend has based in reality is what your eye can comprehend as fluid movement, which is actually ~90FPS. The eye can in theory see 4000+FPS in full color, and billions in black and white. Biology lessons over.
Then explain why subliminal messages exist, and why people don't notice 1 or 2 extra frames in a movie running at 60-90 FPS?

Also, in theory our brain uses only 10-12% of its power, so in THEORY we could learn 200 languages in one year. Or solve hyper-complicated math problems almost as fast as a machine. Or remember everything since the day we were born.

In theory, we could travel in space by building a spaceship that can withstand an atomic blast, then use H-bombs to cover enormous distances. Theorycraft is nice, but it doesn't work most of the time.

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

You don't comprehend them. That doesn't mean you don't see them. The cones and rods in your eye can easily capture the images, and your subconscious can process them. The eye is an amazing thing.

And a movie uses a camera, which captures motion blur. Blur = normalized movement. That is why newer PC games use motion blur: to hide lower frame rates. In games that don't use it, you need 60+ FPS for the game to appear to be running fluidly, and even then some people can see issues with that. Enter the 120 Hz LCD.

Also, we use far more than 10% of our brain. Try more on the order of 100%. That is, again, an urban legend. The amount of energy our brain consumes and the amount of resources devoted to its operation nullify that silly 10-12% theory, which came about from the cell types in the brain (neurons vs. glial cells).

As for your last theory, the Orion project? Oh dear god... yeah, that was a proposal back in the '50s. Fusion or antimatter propulsion would be far better. LOL

Abedeus

Grotto Attendant

Join Date: Jan 2007

Niflheim

R/

Wait, if you need 60+ FPS in a game to make it look fluid, why are some of them capped at 30 FPS? Just one example: Dead Space, which is pretty smooth in terms of FPS, is capped at 30.

Quote:
Enter the 120Hz LCD.
Aren't CRTs superior in terms of refresh rate?

About the brain part: how can anyone be sure how much of its actual power we use? I mean, come on, it's the human brain. It's not like you can remove it, cut it to pieces, and then put it back in, like a heart. To run such tests, you would have to experiment on a living subject or a fresh corpse. Fresh = a few minutes, because brain death usually occurs within seconds or minutes.

Quote:
As for your last theory, the Orion project? Oh dear god... yeh, that was a theory in the 50's.... Fusion or antimatter accelerators are far better. LOL
Actually, I read an article about that half a year ago, and the project is still going.

But hey, we've gone a bit off topic.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

Quote:
Originally Posted by Avectius
ATI HD4670

And its REALLY weird, cause this card is powerful enough to run GW at max settings,
What makes you say that? The HD4670 is only a mid-range card, and the fps depends on both the resolution you are running GW at and the graphics settings.
The larger your monitor and/or rez, the more GPU power it takes. You would probably need at least an HD4850 to run GW at 1680x1050 with everything maxed.
An HD4670 would probably need a smaller rez to run everything at max.

One of the easiest ways to increase your fps is to reduce the AA level in the game. You may have it set to 4x; try 2x or none.
You could also try "Auto Detect" and see what it comes up with.

moriz

über tĕk-nĭsh'ən

Join Date: Jan 2006

Canada

R/

An HD4670 should be able to run GW at 1680x1050 at full settings with 4x AA, with power to spare.

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

My 4850 runs GW at 200+ fps with everything maxed and 4x AA @ 1680x1050... sometimes it's lower in Kamadan, but never below 60 (or 100, for that matter)...

My 8600GTS runs GW at max settings @ 1440x900 and maintains 60 fps everywhere but Kamadan, where it may drop under 40 on occasion... (for reference)

Blackhearted

Krytan Explorer

Join Date: Jan 2007

Ohio, usa

none

Mo/

Quote:
Originally Posted by Quaker
What makes you say that? The HD4670 is only a mid-range card, and the fps would depend upon both the resolution you are running GW at and the graphics settings.
The larger your monitor and/or rez, the more Gpu power it takes. You would probably need at least an HD4850 to run GW at 1680x1050 with everything maxed.
An HD4670 would probably require a smaller rez to run everything at max.

One of the easiest ways to increase your fps would be to reduce the AA level of the game. You may have it set to x4 - try x2 or NONE.
You could also try using "Auto Detect" and see what it comes up with.
Lol. You're kidding, right? Guild Wars is almost 4 years old; almost any respectable, modern (non-integrated) GPU can max this game out at 1680x1050 with little to no problem. The only case where you might need something like a 4850 to max this ancient game out is at a huge res like 2560x1600.

Abedeus

Grotto Attendant

Join Date: Jan 2007

Niflheim

R/

Guild Wars: Nightfall is not 4 years old, and the engine was updated.

And GW:EN uses new fire animation, for example. Go to the burning forest, then to the Ring of Fire, and compare.

Blackhearted

Krytan Explorer

Join Date: Jan 2007

Ohio, usa

none

Mo/

Quote:
Originally Posted by Abedeus
Guild Wars: Nightfall is not 4 years old. Engine was updated.

And GW:EN uses new, for example, fire animation. Go to the burning forest, then to Ring of Fire and compare.
Finally discovering the pixel shader and making moderate use of it is about the only visible change to the engine. Most other things, such as textures, look the same maxed out as they did two and a half years ago, when I first played GW.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

I can tell you that my HD2900XT was NOT capable of running GW at max settings (at 1680x1050) without the fps dipping below acceptable levels. I'm not sure how the 4670 compares to the 2900; let me check...

From what I can see in various charts, the 4670 is not as powerful as a 2900XT. So I'd say you can't expect the HD4670 to run GW with everything maxed at a consistent 60 fps.

Of course, the OP never said what his rez was/is.