Post your GW performance!
Yanman.be
This picture has NOT been photoshopped.
Faer
I call shenanigans.
Xero Silvam
bullshit imo
makosi
How do you get your FPS that high? Mine's capped at 75.
Yanman.be
Makosi, that's probably because you have Vertical Sync activated. Vsync caps your fps at your screen's refresh rate, in this case 75 Hz. (Is it a CRT?)
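For anyone curious what that cap looks like in practice, here's a rough Python sketch of a render loop with and without a vsync-style frame wait. The 75 Hz figure and the fake per-frame render cost are just assumptions for illustration, not anything GW-specific:

import time

REFRESH_HZ = 75                    # assumed refresh rate, matching the 75 Hz cap above
FRAME_BUDGET = 1.0 / REFRESH_HZ    # duration of one refresh interval

def render_frame():
    # Stand-in for rendering an almost empty scene: a ~0.5 ms busy-wait.
    t0 = time.perf_counter()
    while time.perf_counter() - t0 < 0.0005:
        pass

def measure_fps(seconds=2.0, vsync=True):
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        frame_start = time.perf_counter()
        render_frame()
        if vsync:
            # Wait for the next "refresh" before presenting the frame;
            # this is what pins fps to the monitor's Hz.
            remaining = FRAME_BUDGET - (time.perf_counter() - frame_start)
            if remaining > 0:
                time.sleep(remaining)
        frames += 1
    return frames / seconds

if __name__ == "__main__":
    print(f"vsync on : ~{measure_fps(vsync=True):.0f} fps (capped near {REFRESH_HZ})")
    print(f"vsync off: ~{measure_fps(vsync=False):.0f} fps (limited only by render cost)")

With the wait in place the loop can't exceed the refresh rate no matter how cheap the scene is; without it, fps is limited only by how fast each frame renders.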
The fps is 100% guaranteed real.
Here's the full picture, you'll know why it's so high now http://home.scarlet.be/yanman/gw029.bmp
makosi
I understand now, thanks. VSync was enabled, and I do not have a CRT! How dare you. :P
Yanman.be
Then you have a pretty new LCD or TFT (maybe even plasma?). Most of them still run at 60Hz; 75Hz and 80+Hz were CRT refresh rates.
Artdeux
Small number of triangles, low settings, looking into a rock: yeah, it's possible. Not sure about the ping though, that seems very low.
Now go into Window mode and make it even smaller.
Yanman.be
I think window mode should drop the fps...
New boost found: http://home.scarlet.be/yanman/gw033.bmp
The key is to get your triangle count as low as possible, and turning off the GUI helped a lot too. Next step: meddling with my drivers.
Made it a straight link until you decide to make it a decent filesize.
RTSFirebat
Moved to off topic, not really a screenshot thread.
Trakata
Decided to move this to Tech Corner; it's not really off-topic either. I think it's best suited here.
Tachyon
I'd still rather have my 180+ FPS in combat than 1,500+ when staring at nothing.
acidic
30 fps in the Abaddon's Mouth mission, 60-80 fps everywhere else.
lord_shar
I get a solid 60fps at 1920x1200 resolution (vsync on). My system is a Dell XPS M1710 17" gaming laptop with a GF7900GTX-512. It may not be the fastest system, but when you can play GW anywhere, it's pretty nice.
EDIT: BTW, if you're playing on an LCD, there's really no point in playing with vsync off unless you want to kill your LCD sooner. There's a very good reason for DVI being capped at 60Hz for LCDs...
Lonesamurai
25-30 FPS in towns/outposts and in GvG, 75 to 100 FPS everywhere else.
bhavv
29-30 FPS minimum in crowded outposts, 40-60+ everywhere else. Max settings, AA and AF, 1280x1024.
That's with my PC at stock speeds. Now if I whack in my 500MHz CPU + FSB overclock, 100MHz on the RAM, and take the X1900XTX from 650/1550 to 700/1640...
It would be better
Yanman.be
Quote:
Originally Posted by lord_shar
BTW, if you're playing on an LCD, there's really no point in playing with vsync off unless you want to kill your LCD sooner. There's a very good reason for DVI being capped at 60Hz for LCDs...
Can you give some more info on that? I'm running 1 TFT and 1 LCD; my 8800GTX has 2 DVI ports, the screens are both VGA, so I have little converters attached to them... any problem if I run without vsync?
JonnyWarhawk
You don't even want to know. Mine is running like crud.
lord_shar
Quote:
Originally Posted by Yanman.be
Can you give some more info on that? I'm running 1 TFT and 1 LCD; my 8800GTX has 2 DVI ports, the screens are both VGA, so I have little converters attached to them... any problem if I run without vsync?
LCDs don't have the pixel strobe-fade problems CRTs have, so they don't need high refresh rates to maintain image clarity and smoothness without flicker. LCD pixels have constant luminance: once set to a certain color, they stay at that color until instructed to change. CRTs, on the other hand, start losing pixel luminance immediately after being set to the desired color, so CRT pixels require faster refresh strobes to maintain a static image. This is why CRTs flicker horribly below 72Hz, while LCDs have no such problems at 60Hz.
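A back-of-the-envelope Python sketch of that difference, using a completely made-up phosphor decay constant just to show the trend (the more brightness a CRT pixel loses between strobes, the more visible the flicker):

import math

# The decay constant below is a made-up number purely for illustration;
# real phosphor persistence varies a lot by tube and phosphor type.
PHOSPHOR_DECAY_MS = 8.0

def crt_brightness_before_next_strobe(refresh_hz):
    # Model phosphor brightness as an exponential fall-off between refresh strobes.
    frame_ms = 1000.0 / refresh_hz
    return math.exp(-frame_ms / PHOSPHOR_DECAY_MS)

def lcd_brightness_before_next_update(refresh_hz):
    # LCD pixels hold their state until told to change, so nothing decays.
    return 1.0

for hz in (60, 72, 85):
    crt = crt_brightness_before_next_strobe(hz)
    lcd = lcd_brightness_before_next_update(hz)
    print(f"{hz:>3} Hz: CRT down to {crt:.0%} of peak between strobes, LCD stays at {lcd:.0%}")

At 60Hz the modelled CRT pixel falls to roughly an eighth of its peak brightness before the next strobe, while the LCD value never changes, which is the gist of the point above.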
Yanman.be
So because my vid card is DVI it will automatically send out @ 60Hz?
lord_shar
Quote:
Originally Posted by Yanman.be
So because my vid card is DVI it will automatically send out @ 60Hz?
easyg
Just installed GW on a new computer (built a couple of weeks ago).
Graphics settings are: Full screen 1680x1050 (monitor Samsung 226BW 22" widescreen), AA 4X, visual quality highest....
Getting ~60 FPS in towns (Ascalon City, LA, etc.), 200-300 FPS everywhere else; usually around 230-240 in most places.
AW9D (I975X)
E6600 (3.2GHz - multiplier dropped to 8 x 400MHz FSB)
Ballistix DDR2 800 (running in sync w/ bus)
Sapphire HD 2900XT (stock settings)
Catalyst drivers (XP pro): 6.14.10....
As everyone has said about the HD 2900XT, performance takes a major hit with AA at higher resolutions. FPS goes up to 300-400+ with AA turned off lol!
Etta
lol, Yanman, what resolution setting is that? Max detail or lowest? I used a rather high resolution, max settings, 4x anti-aliasing. In a town like, say, LA1, I got 50-60. Outside I got a lot higher fps, 80 upward with real people, triple digits with heroes.
The pings are 70-250 on a good day, and went above the 750 mark on a bad day.
Yanman.be
The high fps is of course because everything is on low: 800x600, low settings, etc. I'm going to test various command line flags to improve the fps.
Also, if anyone finds a spot with less than 5000 triangles, please post it!
Malice Black
60-80 FPS on the notebook (specs below), running 1680x1050 on a 17" screen with everything on high and 4xAA.
Rock CTX Pro
2GHz
2GB RAM
7950GTX
Got rid of the desktop. Will build a new one when games start demanding DX10
N E D M
My CRT monitor has a default refresh rate of 60Hz, and my LCD has 75Hz...
onerabbit
LoL .. wth.
mine's like
FPS: 10 - 30
Ping: 60 - 200 ..
my comp is like 3 years old though =p
Yanman.be
New frontier: trying to pass 2800 fps.
Boone
Yanman, what's your normal fps then? on what settings? :P
Oh, I gave it a try... got to 1600 fps o.o
Yanman.be
I've never seen my fps drop below 60. 200 fps on average.
Nite_Creeper
In towns I get 70-150 fps
Outside 200-350 fps
1280x1024 resolution
4x AA
Max Settings
running a 8800GTS 640MB
[email protected] GHz (had it at 3.2GHz stable, but temps went over 50°C at 100% load under Orthos on the stock HSF)
2 GB G.Skill Ram
680i LT mobo
Quaker
One thing that needs to be cleared up here is that there 'may' be a difference between fps (frames per second) and the game's 'update rate'. I'm not sure about this in GW but I've seen similar things in other games.
To put it simply-ish - your graphics card/driver is responsible for updating your screen. If you turn 'vsync' on so the graphics update only happens during the vertical interval, you will get an fps that is the same as your monitor's vertical rate. (For most LCDs, this is 50-75Hz) If you turn it off, you will get faster 'fps' but the card may not be displaying anything different in those extra frames.
The update rate (fps) of your video card isn't necessarily tied directly to the rate at which the server updates the game data (your position, other characters' positions, who shot who, etc.), and doesn't necessarily affect the 'smoothness' of the game play or anything - unless it gets really low. Ping is a factor, however.
As you all know, sometimes the game play gets choppy anyway, regardless of your video fps, because the data coming from the GW server is lagging or slow.
For example, the OP was getting '1200' fps - but 1200 fps of the same blank wall. There was nothing for the video card to 'update', so it could calculate it very quickly. Meanwhile, the game, still going on, would be updating his position and the positions/actions of others at some rate that I'm sure would be much less than 1200 times per second.
Years ago, I ran a TFC (a first-person shooter) server. I could watch the performance of the server while it was in use. Because of the server's CPU, RAM, etc., it usually ran at an 'update rate' of around 80 cps (cycles per second) - that is, the server would complete a 'cycle' and update the game's information 80 times per second. I had people connect to the server who said that, with vsync turned off, they were getting 150-200 fps. Well, no matter how many times per second their video cards were updating their screens, they were only getting 80 'new' frames from the server per second anyway.
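A small Python sketch of that decoupling, borrowing the 80 ticks/second figure from the TFC example; the loop structure and numbers are purely illustrative, not how GW or TFC actually work internally:

import time

SERVER_TICK_HZ = 80            # matches the ~80 cycles/second TFC server above
TICK_INTERVAL = 1.0 / SERVER_TICK_HZ

def simulate(seconds=2.0):
    start = time.perf_counter()
    next_tick = start
    server_ticks = 0
    rendered_frames = 0
    frames_with_new_data = 0
    state_version = 0
    last_drawn_version = -1

    while True:
        now = time.perf_counter()
        if now - start >= seconds:
            break

        # "Server": advance the game state at a fixed rate, independent of rendering.
        while now >= next_tick:
            state_version += 1
            server_ticks += 1
            next_tick += TICK_INTERVAL

        # "Client": render as fast as possible (vsync off).
        rendered_frames += 1
        if state_version != last_drawn_version:
            frames_with_new_data += 1
            last_drawn_version = state_version

    print(f"server ticks per second     : {server_ticks / seconds:.0f}")
    print(f"rendered frames per second  : {rendered_frames / seconds:.0f}")
    print(f"frames showing new data/sec : {frames_with_new_data / seconds:.0f}")

if __name__ == "__main__":
    simulate()

Run it and the rendered fps comes out enormous, but only about 80 of those frames per second contain anything the server hadn't already sent.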