What is FPS and how high should it be?
beserk
The little dot in the top right corner says FPS on it. I'm not sure what it means, and it shows me numbers... what is it supposed to be at?
Calista Blackblood
Quote:
Frame rate, or frame frequency, is the measurement of the frequency (rate) at which an imaging device produces unique consecutive images called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems. Frame rate is most often expressed in frames per second (FPS) and in monitors as Hertz (Hz).
Painbringer
Frames per second
It is your graphics card's ability to process the info on the screen, and how many times it can do it per second.
Example: my onboard graphics gave me 20-25 FPS.
I installed a video card and it jumped to 45-60 FPS.
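To make that concrete, here's a rough sketch of what an FPS counter does (plain Python; `render_frame` is a made-up stand-in for whatever work the game actually does each frame):

```python
import time

def render_frame():
    # Stand-in for the real per-frame work (game logic + drawing).
    # Here we just burn ~8 ms, which caps this loop around 120 FPS.
    time.sleep(0.008)

frames = 0
window_start = time.perf_counter()

for _ in range(600):                      # run a few seconds' worth of frames
    render_frame()
    frames += 1
    now = time.perf_counter()
    if now - window_start >= 1.0:         # once a second...
        print(f"{frames / (now - window_start):.0f} FPS")
        frames = 0                        # ...reset the one-second window
        window_start = now
```

The number in the corner is just that: how many frames finished in the last second.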
Snow Bunny
I run around with 25-35 FPS at any given time and I can still dshot your 3/4 cast WoH.
It's not really that choppy at <40 FPS, and it's really ping that matters, truth be told.
Calista Blackblood
Quote:
Originally Posted by Snow Bunny
I run around with 25-35 FPS at any given time and I can still dshot your 3/4 cast WoH.
It's not really that choppy at <40 FPS, and it's really ping that matters, truth be told.

beserk
k thanks guys
Elder III
Frames Per Second. Most monitors have a refresh rate of 60 Hz, which means that if you have V-Sync enabled you will not top 60 FPS while playing. For GW, 30+ is good enough in most cases. I can get 220+ with my new GPU, but there is no point..... V-Sync enabled is recommended for a smoother picture in GW.
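To illustrate the cap (a simplified sketch, not how the driver really implements V-Sync): the card finishes a frame, then holds it until the next 60 Hz refresh boundary before presenting it, so it can never show more than 60 frames per second no matter how fast the GPU is.

```python
import time

REFRESH_HZ = 60
FRAME_PERIOD = 1.0 / REFRESH_HZ        # ~16.7 ms between monitor refreshes

def simulate_render(workload_s):
    """Pretend the GPU takes `workload_s` seconds to draw a frame."""
    time.sleep(workload_s)

next_refresh = time.perf_counter() + FRAME_PERIOD
presented = 0
start = time.perf_counter()

while time.perf_counter() - start < 2.0:   # simulate two seconds
    simulate_render(0.004)                 # fast GPU: frame done in 4 ms
    # "V-Sync": hold the finished frame until the next refresh boundary.
    wait = next_refresh - time.perf_counter()
    if wait > 0:
        time.sleep(wait)
    next_refresh += FRAME_PERIOD
    presented += 1                         # frame goes to the screen here

print(f"{presented / 2.0:.0f} FPS with V-Sync")   # ~60, never more
```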

beserk
Quote:
Originally Posted by Elder III
Frames Per Second. Most monitors have a refresh rate of 60 Hz, which means that if you have V-Sync enabled you will not top 60 FPS while playing. For GW, 30+ is good enough in most cases. I can get 220+ with my new GPU, but there is no point..... V-Sync enabled is recommended for a smoother picture in GW.
V sync?
Quaker
We need to make a distinction here between the FPS of the game and the FPS of the monitor.
The FPS of the game is basically how many times per second the client can run through a loop which calculates the position of all the objects in the game. This depends on a number of factors, such as the speed of the CPU, the GPU, and the internet connection.
The FPS of the monitor is how many times per second the monitor's screen is updated. In the case of most LCD monitors, this is 60 Hz (60 fps).
Regardless of the game's FPS, the monitor is only updated at the 60 Hz rate.
You want the game's FPS to be high enough to appear as smooth motion. Motion pictures are actually only 24 fps, but realistically you don't want the game to go much below 30, and getting it up near the monitor's 60 Hz would be ideal.
Without getting too technical, there is a period during a standard video signal in which no information is sent to the screen (it gives the circuits time to reset to the start of a new scan, and is a holdover from CRT days). With V-sync enabled, the graphics card only updates the screen during this period. This keeps the screen information from changing partway through a scan, which can cause video artifacts such as tearing. In spite of what some people may say, there is really no point in turning V-sync off; all that does is make the video card work harder by calculating extra frames that won't be displayed anyway (because the monitor only updates at 60 Hz).
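If it helps, here's a toy version of that loop (plain Python; the class and numbers are invented for illustration). Each pass updates every object's position using the time elapsed since the previous pass, and the FPS you see is just how many passes complete per second:

```python
import time

class GameObject:
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y          # position
        self.vx, self.vy = vx, vy      # velocity, in units per second

    def update(self, dt):
        # Move according to velocity and the time since the last frame,
        # so motion stays the same speed whether FPS is 30 or 60.
        self.x += self.vx * dt
        self.y += self.vy * dt

objects = [GameObject(float(i), 0.0, 1.0, 0.5) for i in range(100)]
last = time.perf_counter()

for _ in range(300):                   # a few seconds of simulated frames
    now = time.perf_counter()
    dt = now - last                    # elapsed time since the previous pass
    last = now
    for obj in objects:
        obj.update(dt)
    # drawing would happen here; the slower this whole pass runs
    # (CPU, GPU, waiting on the network), the lower your FPS.
    time.sleep(0.016)                  # pretend rendering takes ~16 ms
```

Scaling movement by `dt` is why the game runs at the same speed at 30 FPS and 60 FPS; only the smoothness changes.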
beserk
Ohh!!! OK, thanks for that, I understand it clearly now.
Dante the Warlord
Oh... to get technical, just curious, but are you suggesting that it's not my terrible, terrible internet connection *cough* Comcast *cough* giving me the lag, it's my graphics card? Maybe I'm misunderstanding some of the things you are saying.
Quaker
Quote:
Originally Posted by Dante the Warlord
Oh... to get technical, just curious, but are you suggesting that it's not my terrible, terrible internet connection *cough* Comcast *cough* giving me the lag, it's my graphics card? Maybe I'm misunderstanding some of the things you are saying.
CPU - the CPU needs to be fast enough to run the loop. This is not normally a limiting factor in any newer computer (but it is one reason why they specify a minimum).
GPU - the GPU (Graphics Processing Unit - the main chip(s) on the video card) needs to be fast enough to calculate the color and brightness of all the pixels on the screen. This obviously depends upon the number of pixels to be calculated - the higher the screen resolution, the more pixels. Also involved are the game's graphics settings, such as AA, texture detail, lighting, shadows, etc. The GPU is usually the main limiting factor for FPS. Low FPS from the GPU can be offset by reducing the graphics goodies and/or the resolution.
Internet connection - the speed and/or stability of the connection will affect how often the client and server get updates from each other. If it's really bad (laggy) or slow (dial-up), it can affect the overall FPS of the game.
Basically, if the CPU and GPU are up to snuff, and seem to work fine in offline games (like, say, Half-Life 2), but GW is laggy or has low FPS, it's most likely that the network/internet connection is at fault. I have no personal experience with Comcast, other than hearing people bitch about it on Ventrilo.
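One quick way to check the connection on its own (a rough sketch; `example.com` is just a placeholder, point it at anything near the game servers): time a TCP connection instead of rendering anything. If these numbers are high or jumpy while your offline games run fine, the link is the problem, not the video card.

```python
import socket
import time

def tcp_ping(host, port=80, timeout=3.0):
    """Rough round-trip estimate: time how long a TCP connect takes.
    Not a true ICMP ping, but good enough to spot a laggy link."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0   # milliseconds

for _ in range(5):
    print(f"{tcp_ping('example.com'):.0f} ms")      # placeholder host
    time.sleep(1.0)
```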

GranDeWun
I switched from Comcast (cable) to AT&T (DSL). Although my bandwidth is lower, which most people associate with 'speed', my latency (i.e. ping times) is also much lower, which is better for gaming.
This is most likely due to the software Comcast runs to 'shape' their network (i.e. kill P2P applications), which involves inspecting packets.
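The arithmetic backs that up. Game updates are tiny packets, so bandwidth barely matters and round-trip latency dominates. A back-of-the-envelope sketch (all numbers invented for illustration):

```python
PACKET_BYTES = 200   # assumed size of a small game-state update

# Invented example links: a slower-but-low-latency DSL line
# versus a faster-but-higher-latency cable line.
links = {
    "DSL":   {"bandwidth_bps": 3e6,  "rtt_ms": 30.0},
    "cable": {"bandwidth_bps": 20e6, "rtt_ms": 90.0},
}

for name, link in links.items():
    transmit_ms = PACKET_BYTES * 8 / link["bandwidth_bps"] * 1000
    total_ms = transmit_ms + link["rtt_ms"]
    print(f"{name}: {transmit_ms:.2f} ms to transmit "
          f"+ {link['rtt_ms']:.0f} ms round trip = {total_ms:.1f} ms")
```

On the 'slow' DSL line the packet costs about half a millisecond to transmit and arrives in ~30 ms total; on the 'fast' cable line it arrives in ~90 ms. For gaming, the lower-latency link wins.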