FPS
Res Surection
What is the highest FPS you can get? I hear of people being under 100, but my bro's friends seem to be over 900. I mean overall, not just in GW.
briando
=/ depends on your rig and whether your refresh rate caps the FPS
Res Surection
mine runs at 818 fps, an AGP card...
Tachyon
Quote:
Originally Posted by Res Surection
mine runs at 818 fps, an AGP card...
Res Surection
Hey, I'm not the one saying 818; some guy at a PC shop said it ran at 818 fps... I was kind of amazed myself, lol. I think I'm gonna return it and get another video card anyhow, an ATI X800 now.
Bgnome
Quote:
Originally Posted by briando
=/ depends on your rig and whether your refresh rate caps the FPS
Tell those "friends" of your brother and that "guy" at "some PC shop" that they are wasting perfectly good clock cycles. They should get better monitors with better refresh rates and play at the highest resolutions. If they are still wasting frames, then they should get more monitors and go for multi-screen gaming!
Xenrath
Uh... so anyway, biggest money-saving tip ever: don't let yourself get fixated on numbers such as fps. Beyond a certain point it makes little difference to your eyes anyway; after that it's just down to bragging rights about how big your numbers are...
I'd say if you're getting 70-100fps you're doing just fine
Edit: "overall" is a pretty useless comparison, since it doesn't really exist. Your mileage will vary from game to game because of the way each has been programmed, the drivers, the hardware, the version of DirectX/OpenGL/custom code, etc. You could get 999999999+ fps playing a really ancient 3D game on the latest hardware, for example; it's pretty meaningless.
Narada
Quote:
Originally Posted by Res Surection
Hey, I'm not the one saying 818; some guy at a PC shop said it ran at 818 fps... I was kind of amazed myself, lol. I think I'm gonna return it and get another video card anyhow, an ATI X800 now.
Josh
^^ Lmfao, I remember that.
Same guy who also said this?
Quote:
Originally Posted by Res Surection
You're lucky you got PCI-E.
The guy at the PC shop has some $700 256 MB video card imported from Japan; that's the one with over 900 fps. Mine didn't run at 818 fps, he was telling me to buy one at that, lol. I asked my bro to call him and ask.
Res Surection
Yes, it is actually the same guy at the shop who said the thing about the electric bill; two of them said that. And are you serious, Ukrainian?
Res Surection
Quote:
Originally Posted by Bgnome
What is the use of a frame that isn't even drawn? Your refresh rate limits the number of frames that are displayed, effectively capping any FPS output. If your monitor is set at 85 Hz, it can't draw more than 85 frames per second; anything beyond that is pure fluff. Once you get beyond that point, FPS is completely meaningless.
Tell those "friends" of your brother and that "guy" at "some PC shop" that they are wasting perfectly good clock cycles. They should get better monitors with better refresh rates and play at the highest resolutions. If they are still wasting frames, then they should get more monitors and go for multi-screen gaming!
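The refresh-rate cap Bgnome describes can be sketched in a few lines of Python. This is a deliberately simplified model (it assumes the monitor shows at most one rendered frame per refresh cycle, ignoring tearing and frame blending); the numbers are the ones tossed around in this thread, used purely for illustration.

```python
# Simplified model: a monitor displays at most one frame per refresh cycle,
# so frames rendered beyond the refresh rate are never actually seen.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second the viewer can actually see (simplified model)."""
    return min(rendered_fps, refresh_hz)

if __name__ == "__main__":
    for rendered in (60, 85, 300, 818, 900):
        shown = displayed_fps(rendered, refresh_hz=85)
        wasted = rendered - shown
        print(f"rendered {rendered:>3} fps on an 85 Hz monitor -> "
              f"{shown:.0f} shown, {wasted:.0f} wasted")
```

So an 818 fps or 900 fps figure on an 85 Hz monitor means hundreds of frames per second that are computed and then thrown away.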
BuD
Is there a command to see your FPS? Or are you all using Fraps?
Res Surection
On your desktop, right-click the Guild Wars icon and go to Properties. Add this at the end of the Target field:
-perf display fps
It should look like this afterwards:-
"D:\Games\Guild Wars\Gw.exe" -perf display fps
I can't 100% remember whether it should go inside the inverted commas, though; if it doesn't work, then try adding " -perf display fps" instead.
Obviously on your system it'll look a little different, as you may not have installed to the same location as me.
- as said by Azagoth
You just need -fps, as said by Ghozer.
That's in Guild Wars, though, as far as I know.
swaaye
People can most definitely perceive higher than 30 fps in a game. Frames on a computer are not like photographic frames; there is no innate motion blur to interpolate between what you see.
http://en.wikipedia.org/wiki/Frame_rate
Teklord
Yes, you can perceive over 30. At about 60-70 (refresh rate allowing) you will no longer perceive a difference. At least, that's what I read at my favorite little review site.
OneArmedScissor
The human eye can't perceive over 16 FPS (give or take a few)... hence why, when you see a wheel on a car appearing to spin backwards on a highway, etc., there are too many FPS for your eye to keep up. But notice one thing about the wheel: its motions are very smooth.
Motions, etc., become very smooth at high FPS.
Narada
I believe that Teklord and swaaye are correct about the range of frequencies in human perception. If I remember correctly, my electronics teacher said that some people are able to notice a difference in television picture quality when going from the USA to the UK (or the other way around), as apparently TVs in the US operate at a higher refresh rate (60 Hz) than they do in the UK (50 Hz). A 'flicker effect' comes into it: apparently, when your eyes become used to a faster frequency, they'll pick up on the flicker, or the time between raster scans, much more easily on slower display devices, which can sometimes cause headaches or mild discomfort.
I don't think the frequency discrepancy applies as much to more modern viewing devices anymore, though. I could be wrong, however; I haven't worked on video for about a semester. Anyway, this link would probably be helpful.
Tachyon
From my perspective, as someone who's spent a hell of a lot of time Stateside, the only difference in TV output that I noticed is in the picture quality, i.e. the difference between our 50 Hz 625-line PAL and your 60 Hz 525-line NTSC. I also believe that your NTSC format only utilises 480-ish lines for the picture itself, with the remainder used for other things such as captions and the like.
As for our 50 Hz, yes, on older CRT televisions you could very occasionally notice a bit of flicker. However, my old CRT was 100 Hz, and so I never got so much as a single flicker, running in both PAL and NTSC formats.
Ghozer
In Unreal Tournament, Half-Life 1, etc., on a Radeon 9800 Pro (128 MB) I have had upwards of 300 fps (1280x1024), on a 160 Hz refresh monitor, so effectively 160 fps, but it was scan-doubled, so I actually got... 300 fps.
On Half-Life 2 and UT2004, on the same card, I had between 70 and 90 fps (1280x1024).
With the Radeon X1600 I have now (still AGP, old motherboard) I have had upwards of 450 fps (reported) on UT/HL1 (still on the same monitor, mind) and between 95 and 180 fps on HL2/UT2004.
- 900 fps is achievable IF you have everything on low, at a low resolution, on a plasma screen (as they have no refresh rate, not in the same sense anyway).
Res Surection
Oooo, I want the new Unreal Tournament. AGP isn't too bad; it just costs more for a better card.
capblueberry
I have around 25 fps and I hardly ever lag.
BuD
Thanks a ton!
This is what worked for me
"C:\Program Files\Guild Wars\Gw.exe" -perf display fps
Res Surection
Ooo, I helped Lycan with that one.
25 fps? What video card are you running?
shinja
As noted above, the human eye can tell the difference up to about 60 fps.
While anything beyond this isn't noticed during normal gameplay, the times when it helps to have a machine capable of more are when it suddenly has to render a lot more on your screen, i.e. a bunch of players or mobs suddenly pop in. Even if you sustain 60 in normal play, under certain conditions your PC can drop to half that, causing visual lag.
Therefore, the benefit of having a potential 100 FPS is a lower chance of visual lag when your computer is working hard, especially in towns or during frantic mass battles... which hopefully Factions will bring about.
That being said, anyone telling you they get 800 or better fps in just about any game, regardless of whether or not they work at a shop, is full of it.
If you need computer support, you are probably better off posting somewhere like here for tips and learning to build/configure your own computer. It is nowhere near as difficult as some make it out to be, and it's a pretty fun hobby with actual earning potential.
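shinja's headroom argument can be made concrete with a little frame-time arithmetic. The sketch below uses a made-up load multiplier (the 2x figure for a crowded town is an illustrative assumption, not a measurement): if a busy scene roughly doubles the per-frame rendering cost, the frame rate roughly halves, so the machine with more headroom is the one that stays above the perceptible threshold.

```python
def fps_under_load(base_fps: float, load_multiplier: float) -> float:
    """Frame rate after per-frame rendering cost grows by load_multiplier.

    Simplified model: fps is the reciprocal of frame time, so doubling
    the work per frame halves the frame rate.
    """
    frame_time_ms = 1000.0 / base_fps  # time spent on one frame at baseline
    return 1000.0 / (frame_time_ms * load_multiplier)

if __name__ == "__main__":
    for base in (60, 100):
        # a crowd of players/mobs popping in, modelled as 2x per-frame cost
        busy = fps_under_load(base, 2.0)
        print(f"{base} fps baseline -> {busy:.0f} fps in a busy scene")
```

Under this model a 60 fps machine drops to 30 in a crowd, while a 100 fps machine still holds 50, which is the whole point of having headroom.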
Res Surection
Dude, I never said it was in a game, just a raw 900+ for his server PC on a crazy $700 PCI-E card.
awesome sauce
From http://en.wikipedia.org/wiki/Movie_projector:
Not that Wikipedia is a very reliable source, but I happen to know this is true. My fps ranges from 20 to 50 depending on where I am in the game, and I can't tell the difference.
Quote:
The frequency at which flicker becomes invisible is called the flicker fusion threshold, and it is dependent on the level of illumination. Generally, a frame rate of 16 frames per second (fps) is regarded as the lowest frequency at which continuous motion is perceived by humans.
Teklord
So 16 fps is the lowest frequency at which continuous motion is perceived. I can understand that. So all frame rates above 16 would be perceived as continuous motion; this also makes sense. However, it seems to me that there would easily be a perceptible difference between that 16 fps low end and, say, double that at 32 fps: the motion is going to look much more fluid. And beyond that, if you jump from 30 to 60, a difference, although less noticeable, would still be present. Now, taking that same range of 30 to 60 and increasing it in intervals of 2 fps over 15 seconds would probably go unnoticed, versus a direct jump from 30 to 60 in one second, if I'm getting across what I'm trying to say. However, even large jumps starting above 60-70 fps are simply not perceivable; say, a jump from 80 to 120. Given a display device that could actually show that jump, that is...
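The diminishing returns Teklord describes show up directly in frame times. A small sketch: since frame time is the reciprocal of fps, each step up the fps ladder buys a smaller and smaller absolute improvement in how long each frame lingers on screen.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent displaying each frame at a given frame rate."""
    return 1000.0 / fps

if __name__ == "__main__":
    # the jumps discussed in the thread
    for low, high in [(16, 32), (30, 60), (60, 120), (80, 120)]:
        saving = frame_time_ms(low) - frame_time_ms(high)
        print(f"{low:>3} -> {high:>3} fps: frame time shrinks by {saving:.1f} ms")
```

Going from 30 to 60 fps shaves about 16.7 ms off each frame, while going from 60 to 120 fps shaves only about 8.3 ms, which is one plausible reason jumps above 60-70 fps are so hard to notice.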
OT personal funny thought: this is reminding me of the conversation I had about Artificial Intelligence back last September. That was gooder too.
Revivalizt
I'm sorry, I'm a nub.
Is there any other command you can use, other than the one for seeing the frames per second?
I saw a thread somewhere... I couldn't find it with search. Forgive me.
Opeth11
Go to your Guild Wars icon, right-click, and select Properties.
Now add '-perf' to your Target. It should look like this: "C:\Program Files\Guild Wars\Gw.exe" -perf
Sax Dakota
Quote:
Originally Posted by Azagoth
From my perspective, as someone who's spent a hell of a lot of time Stateside, the only difference in TV output that I noticed is in the picture quality, i.e. the difference between our 50 Hz 625-line PAL and your 60 Hz 525-line NTSC. I also believe that your NTSC format only utilises 480-ish lines for the picture itself, with the remainder used for other things such as captions and the like.
As for our 50 Hz, yes, on older CRT televisions you could very occasionally notice a bit of flicker. However, my old CRT was 100 Hz, and so I never got so much as a single flicker, running in both PAL and NTSC formats.
There's a HUGE difference between NTSC and your computer monitor, though. When we get into scanlines and so on, the way that NTSC scans means you only effectively get 30 fps, mainly because of the interlaced scans.
And I get 45-70 fps at 1280x1024, max graphics quality, 6x AA, 16x AF.
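The "effectively 30 fps" figure follows from NTSC's interlacing, and the arithmetic is short enough to sketch. This uses the standard NTSC field rate of 60000/1001 ≈ 59.94 Hz, with each full frame assembled from two interlaced fields (odd lines, then even lines).

```python
# NTSC arithmetic: two interlaced fields combine into one full frame.
NTSC_FIELD_RATE_HZ = 60000 / 1001  # ~59.94 fields per second
FIELDS_PER_FRAME = 2               # odd-line field + even-line field

full_frames_per_second = NTSC_FIELD_RATE_HZ / FIELDS_PER_FRAME

if __name__ == "__main__":
    print(f"NTSC: {NTSC_FIELD_RATE_HZ:.2f} fields/s -> "
          f"{full_frames_per_second:.2f} full frames/s")
```

That works out to roughly 29.97 complete frames per second, which is where the colloquial "30 fps" for NTSC comes from.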
Lurid
Quote:
Originally Posted by Res Surection
Yes, it is actually the same guy at the shop who said the thing about the electric bill; two of them said that. And are you serious, Ukrainian?
lord_shar
60 fps on my laptop; most LCD screens are capped at 60 Hz due to the DVI standard. This also protects LCDs from premature pixel degradation.