VisionTek Radeon X1300 256MB PCI - Anyone else have one?

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Since the video card works with your other games and the problem only occurs with Guild Wars, it is not a problem with the video card or its driver. The game's manufacturer should release an update to correct the problem. We are limited to troubleshooting specific to a game. If our new driver does not resolve the problem, then the problem lies within the game's software.



Thank you,

Visiontek Tech Support

-----------------------

Straight from them.

I was told I'd need to find enough people with the same problem before they would patch it.

The problem with the VisionTek Radeon X1300 PCI (not PCI Express) is that you only get 10 FPS max, 5 FPS normally. When this PCI card can run FEAR at high graphics settings with a playable framerate, there's no excuse for it not to run Guild Wars.


So does anyone else have this card?

I've done Google searches and found that others have had this same problem.

Drakkenauken

Pre-Searing Cadet

Join Date: Nov 2006

W/

It's a GW problem. ANet should fix the performance issues that have come up lately.

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

I told them that it was on their side, and they simply closed the support ticket without saying anything about fixing it.

Tachyon

Forge Runner

Join Date: Nov 2005

Stoke, England

The Godless [GOD]

W/

Just how do you know it's down to the graphics card? It could be one of many things causing your frame rate problem.

Also, I doubt very much that you can run FEAR at "high" settings on that card, well unless you're running at 800x600x16 that is. For a start, it's PCI so the graphics bandwidth is severely limited.

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Why would I lie about it? Seriously?

Resolution is 800x600x32 for FEAR.

If a 9200 SE 128MB (PCI) can run GW at a playable framerate (usually around 20-25 FPS), and after completely removing the 9200 SE and all of its drivers the X1300 can't run it at playable frame rates, then it's nothing but the Guild Wars build.

dronex

Lion's Arch Merchant

Join Date: Dec 2005

Mo/

Quote:
The problem with the VisionTek Radeon X1300 PCI (not PCI Express) is that you only get 10 FPS max, 5 FPS normally. When this PCI card can run FEAR at high graphics settings with a playable framerate, there's no excuse for it not to run Guild Wars.
Duh, yeah, lol, playable frame rates in FEAR with an X1300 PCI, come on! And high graphics... I barely run it on minimum @ 1024 with a 6800 Ultra (FEAR Combat).

http://www.gpureview.com/Radeon-X1300-PCI-card-460.html From what I see, this card shouldn't perform better than an old 9000 Pro in DX8 mode (with which I get 10 FPS max).

And after many threads about this, I still DON'T think Guild Wars has a problem.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

I think this problem plagues people with mid-range cards. It's probably something to do with a bug in some of the new graphics, which older cards do not support. My GeForce FX 5500 PCI can run the game at 1280x1024 resolution with high textures with no problems: I get around 18-40 FPS.

Basically, the cards that can currently run the game with good FPS are old cards and really cutting-edge cards.

Stemnin

Krytan Explorer

Join Date: Nov 2005

Mo/Me

Quote:
CNET editor's take
ATI's Radeon X1300 series isn't suited for gaming, even by a low-end card's standards. Fortunately, its video capabilities save it from the trash heap.
Tom's Hardware didn't have much else to say about this card. How much RAM and what CPU does your computer have, by the way?

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

I have 2 GB of RAM (the max my motherboard supports) and a 3.5 GHz HT P4.


My computer only has PCI slots, so I'm limited. From the reviews I researched, the X1300 was the best PCI card available.

And I'm not lying about FEAR, for Christ's sake. Why in the hell would I lie about it? Buy the card and try it for yourself if you don't believe me. I didn't say my FPS was through the roof or anything, but it's playable for me; it hovers around 20-30.

On Source I usually get 40-60 FPS with all recommended settings (high).


Quote:
Originally Posted by techspot.com

Cons:
There are just a few games that this card just doesn't like. Guild Wars and Call of Duty 2 just don't like this card. Guild Wars (my system specs are over the requirements) runs at only 7 FPS at max settings and only ~10 at the lowest settings (compared to 15-20 on a 9200 SE). Call of Duty 2 also hates this card. It runs at 10 FPS maxed out without AA; with AA it goes down to 1-3 (compared to 20-30 on a 9200 SE). Even when I tried the lowest settings (like 800x600, all details lowest or none and such) it only went to around 17 FPS or less.

If I had read that review first, I would not have bought the card, because I wanted more FPS in GW; I just didn't think to Google "Guild Wars" and "X1300 PCI".

I figured that if it could run BF2 and FEAR, Guild Wars was not going to be a problem.

ducktape

Krytan Explorer

Join Date: Jul 2005

W/R

What exactly did you say to Guild Wars support? If they ask you to do something and you just say "no, it's your fault, fix it," I can see why they would view that as being uncooperative and close your ticket.

Did you give them links to all of the posts from other users or reviewers saying that their X1300 cards run terribly in GW? Did you tell them that your 9200 gets better frame rates than the X1300? That's the kind of stuff that can get them to patch the game or harass ATI.

It is actually most likely a problem with ATI's drivers; their drivers over the last few months have sucked on cards that are not PCI Express. You should try running Catalyst 6.4 - it was recommended by ATI as the most stable version for the X1xxx series cards up until Catalyst 6.10, though I found that the Catalyst 6.9 drivers worked best for me. If using the old Catalyst does not improve your card's performance in GW, try opening a new ticket with Guild Wars support and play along with any diagnostic steps they suggest that are not terribly inconvenient (i.e., anything short of formatting and reinstalling Windows), and you'll find that they are actually quite helpful.

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Quote:
Originally Posted by ducktape
What exactly did you say to Guild Wars support? If they ask you to do something and you just say "no, it's your fault, fix it," I can see why they would view that as being uncooperative and close your ticket.

Did you give them links to all of the posts from other users or reviewers saying that their X1300 cards run terribly in GW? Did you tell them that your 9200 gets better frame rates than the X1300? That's the kind of stuff that can get them to patch the game or harass ATI.

It is actually most likely a problem with ATI's drivers; their drivers over the last few months have sucked on cards that are not PCI Express. You should try running Catalyst 6.4 - it was recommended by ATI as the most stable version for the X1xxx series cards up until Catalyst 6.10, though I found that the Catalyst 6.9 drivers worked best for me. If using the old Catalyst does not improve your card's performance in GW, try opening a new ticket with Guild Wars support and play along with any diagnostic steps they suggest that are not terribly inconvenient (i.e., anything short of formatting and reinstalling Windows), and you'll find that they are actually quite helpful.

I was not rude, and I did all of their tests: ISP tests, downloading programs and running them for them, and they just kept sending me in circles.

I linked them to the other posts, and they just asked me to run more tests, which I did.

I gave them all of the information, inside and out. Eventually, after they made me run through more of the SAME tests, getting the same results, and went "hmm, everything looks to be fine... try this," I told them "it's on GW's side, not my side, can you please look into it?" and they closed the ticket.

Reading up, I saw that 6.3 is supposed to be the best driver for OpenGL games, and from what I've read and remember, GW is an OpenGL game.

I'm going to download it and try it. I remember downgrading drivers trying to get it to work before, but the lowest I went was 6.6 or 6.5, I believe.

Horseman Of War

Desert Nomad

Join Date: Jun 2006

The Cult of Doom

P/

I use an ATI card, and in the time that I've had it, 6.4 has definitely been the best driver to use, although as for the Catalyst Control Center, I've found the best use of it is to never install it. I've gone through a lot of different versions (I think up to 6.7) before reaching this conclusion; I didn't do a lot of forum searching, just my own experience over the last half a year or so.

And dude, I know you aren't lying about FEAR, because I was able to get Oblivion to run on my old card.

MegaMouse

Wilds Pathfinder

Join Date: Jan 2006

south mississippi

Warriors Of Melos WOM

E/N

As stated earlier in this thread, the problem is not with Guild Wars, it's your card. The bandwidth of the old-style PCI bus is shared with everything on that bus. If you have a sound card and modem in any other PCI slots, they are taking bandwidth away from the video card. It really won't matter what card you use if you stay with PCI-style cards; they will all perform about the same: crappy. If you can afford it, I would get a computer with AGP, or even better the new PCI Express slots, and upgrade.
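
(For rough numbers, assuming a standard 32-bit/33 MHz PCI bus: plain PCI tops out around 133 MB/s, shared by every device on the bus, while AGP 8x gives roughly 2.1 GB/s and PCI Express x16 roughly 4 GB/s in each direction, dedicated to the graphics card.)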

Mega Mouse

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Quote:
Originally Posted by MegaMouse
As stated earlier in this thread, the problem is not with Guild Wars, it's your card. The bandwidth of the old-style PCI bus is shared with everything on that bus. If you have a sound card and modem in any other PCI slots, they are taking bandwidth away from the video card. It really won't matter what card you use if you stay with PCI-style cards; they will all perform about the same: crappy. If you can afford it, I would get a computer with AGP, or even better the new PCI Express slots, and upgrade.

Mega Mouse
As I stated earlier, I'm using a freakin' 9200 SE PCI 128 MB and it runs GW fine.

One that runs at a 200 MHz engine core and 200 MHz graphics core.


I put in an X1300 256 MB, with a 465 MHz engine core and 265 MHz graphics core, and it barely runs.

The 9200 can't even play half of the games I own; the X1300 can play them all at playable framerates.


It runs every game fine except a non-graphics-intensive game like Guild Wars. I'm not saying GW has bad graphics, but it's definitely not a system hog like some MMORPGs I've played (EQ2).


For those that care: I tried downgrading my drivers to 6.4, same FPS issue; tried 6.3, same thing.
I really think the game just does not like the PCI version of this card.

dronex

Lion's Arch Merchant

Join Date: Dec 2005

Mo/

Quote:
Originally Posted by Trizkit
As I stated earlier, I'm using a freakin' 9200 SE PCI 128 MB and it runs GW fine.

One that runs at a 200 MHz engine core and 200 MHz graphics core.


I put in an X1300 256 MB, with a 465 MHz engine core and 265 MHz graphics core, and it barely runs.

The 9200 can't even play half of the games I own; the X1300 can play them all at playable framerates.


It runs every game fine except a non-graphics-intensive game like Guild Wars. I'm not saying GW has bad graphics, but it's definitely not a system hog like some MMORPGs I've played (EQ2).


For those that care: I tried downgrading my drivers to 6.4, same FPS issue; tried 6.3, same thing.
I really think the game just does not like the PCI version of this card.
Whereas the 9200 is a DX 8.1 card and the X1300 is a DX9 card.

Have you actually tried running the X1300 in DX8 mode?!

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

I've actually tried that, but every time I try to install DX8, it tells me the version I already have is newer than the one I'm installing.


Is there a way to switch "modes"?

I've never heard of such trickery.

Tachyon

Forge Runner

Join Date: Nov 2005

Stoke, England

The Godless [GOD]

W/

Searching and reading FTW!

http://www.guildwarsguru.com/forum/s...ad.php?t=94814

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Didn't work.

dronex

Lion's Arch Merchant

Join Date: Dec 2005

Mo/

Quote:
Originally Posted by Trizkit
I've actually tried that, but every time I try to install DX8, it tells me the version I already have is newer than the one I'm installing.


Is there a way to switch "modes"?

I've never heard of such trickery.
OK, since the forum search doesn't work for me and I can't link the thread with the command-line options here... if you know how to use switches, add -dx8 to your shortcut; if not, here's how you do it:

Right-click on your GW shortcut, choose Properties, and add -dx8 to the end of the Target line.
Then check your graphics options in-game and see what the renderer says; if it says DirectX 8, it worked, and you can see if there is any FPS difference.
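
For example (assuming a default install path, which may differ on your machine), the shortcut's Target line would end up looking something like:

"C:\Program Files\Guild Wars\Gw.exe" -dx8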

MegaMouse

Wilds Pathfinder

Join Date: Jan 2006

south mississippi

Warriors Of Melos WOM

E/N

Look at the core and memory speeds on the X1300 vs. the same on the 9200; you will see a big difference. The main problem is that the X1300 is a budget card and does not have the support for several games. I am not saying that it can't run other games fine. You will need to get a card that can handle the graphics that Guild Wars has. The 9200 was a top-of-the-line board when it came out (I still have one in a desktop that runs fine). I still say you should upgrade to a better computer that has a slot for a better video card.

Mega Mouse

cannonfodder

Tech Monkeh Mod

Join Date: May 2005

Good Old North East of England

Mo/Me

As I have said numerous times, comparing memory clock and core speeds is pointless. What matters in the case of graphics chips/cards is pixel pipelines, vertex shader units, whether your memory is DDR, DDR2, GDDR3, or the newest GDDR4, and also the memory bus width, i.e. 32-bit, 64-bit, 128-bit, or 256-bit; here again, higher is always better.

Sadly, as has been said, the X1300 is a low-end card and will not give the performance you deem acceptable. The 9200, although older technology, is actually a better card.
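
(A rough worked example of why the bus matters more than the clocks, assuming the 533 MHz memory clock quoted from Newegg below is the effective DDR rate: the X1300's on-card memory bandwidth is about 533 MHz x 128 bits / 8 = ~8.5 GB/s, yet the 32-bit/33 MHz PCI slot feeding it can move at most ~133 MB/s, shared with every other PCI device, so the slot rather than the core or memory clock is usually the limiting factor.)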

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Then why can the X1300 run every other game EXCEPT Guild Wars better than the 9200?

Since it's a better card?

Please make sense.

And if clock and core speeds are pointless, why is that what you're adjusting when you overclock?

cannonfodder

Tech Monkeh Mod

Join Date: May 2005

Good Old North East of England

Mo/Me

Quote:
Originally Posted by Trizkit
Then why can the X1300 run every other game EXCEPT Guild Wars better than the 9200?

Since it's a better card?

Please make sense.

And if clock and core speeds are pointless, why is that what you're adjusting when you overclock?
I do make sense. Please read again what I wrote regarding the performance of graphics cards: I said comparing clock speeds is irrelevant and pointless; the points I made regarding the cards, i.e. pixel shaders etc., are the important factors in performance.

Now, before you make a fool of yourself, please read up on it. As for overclocking, it only comes into consideration on the card you're overclocking.

Please do not make this into an argument over something so trivial; after all, I am right.

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Quote:
Originally Posted by cannonfodder
I do make sense. Please read again what I wrote regarding the performance of graphics cards: I said comparing clock speeds is irrelevant and pointless; the points I made regarding the cards, i.e. pixel shaders etc., are the important factors in performance.

Now, before you make a fool of yourself, please read up on it. As for overclocking, it only comes into consideration on the card you're overclocking.

Please do not make this into an argument over something so trivial; after all, I am right.
(From Newegg, the same cards I have.)
X1300 PCI specs:

Brand: VisionTek
Model: VTKX1300256PCI
Interface: PCI
Chipset Manufacturer: ATI
GPU: Radeon X1300
Core Clock: 450 MHz
Pixel Pipelines: 4
Memory Clock: 533 MHz
Memory Size: 256 MB
Memory Interface: 128-bit
Memory Type: GDDR2
DirectX: DirectX 9
OpenGL: OpenGL 2.0
D-SUB: 1
DVI: 1
TV-Out: HDTV/S-Video/Composite Out
VIVO: No
Vista Ready: Yes
Dual-Link DVI Supported: Yes
Tuner: None
RAMDAC: 400 MHz
Max Resolution: 2560x1600
Cooler: With Fan
Operating Systems Supported: Windows XP, Windows XP Media Center Edition, Windows 2000, Windows Vista



My 9200 specs:

Brand: ATI
Model: 100-436009
Interface: PCI
Chipset Manufacturer: ATI
GPU: Radeon 9200
Pixel Pipelines: 4
Memory Size: 128 MB
Memory Interface: 128-bit
Memory Type: DDR
DirectX: DirectX 8
OpenGL: OpenGL 1.3
D-SUB: 1
TV-Out: S-Video/Composite Out
VIVO: No
Vista Ready: No
RAMDAC: 400 MHz
Max Resolution: 2048x1536
Cooler: Fanless
Operating Systems Supported: Windows 98/ME/2000/XP


How can you say the 9200 is a better GPU than the X1300?

Give me links proving this. I'm not arguing with you or trying to make myself or YOU look like an ass, so quit trying to do the same to me. I said please; it isn't like I cursed at you or anything.

Besides, explain why the 9200 can't get over 20 FPS in Source on high graphics but the X1300 gets 40-60. I know why the 9200 can't run some games at all (certain pixel shader support, I believe; I'm not 100% sure), but it also doesn't run as fast as the X1300. It only has a heatsink too, though that shouldn't affect its performance.

I have to overclock the 9200 just to get it to run at a steady 15 FPS or so in GW, though I know it's not smart to overclock PCI cards, especially ones without a heatsink-and-fan cooler.

Point is, the X1300 stands a lot higher than the 9200. I could see the comparison if it were a 9200 Pro, but it's just a 9200 SE. It doesn't have a better GPU than the X1300.

If you want to give me links and prove me wrong, more power to you. I'm not trying to flame you, so don't take it that way.

I'm just looking for support as to why this card can't run GW but can run any other game. VisionTek support (the makers of my X1300 card) says it's the game, because the card can run other games, including games with a lot higher requirements than GW.



And besides, even if the 9200 were better than the X1300 (which it's not, GPU-wise), the X1300 still has enough power to run GW at playable framerates, not 5-10 FPS. I mean, I've run GW on integrated graphics at those framerates before.


If you would, just lock this thread; I'm obviously not going to get anywhere with it. I figured someone in the GW Guru universe might have an X1300 PCI card too, but I guess I was wrong. The thread has just gone off topic and become pointless.


I'm just going to go buy a GeForce 6200. This thread is pointless.

dronex

Lion's Arch Merchant

Join Date: Dec 2005

Mo/

Read here: http://www.gpureview.com/show_cards....1=69&card2=460

Good luck with the 6200...

Trizkit

Ascalonian Squire

Join Date: Jul 2006

Team HAXX

Rt/Mo

Looks to me like the PCI X1300 still comes out on top, stats-wise, and that 9200 is an AGP 8x card, not a PCI 9200.

dronex

Lion's Arch Merchant

Join Date: Dec 2005

Mo/

Quote:
Originally Posted by Trizkit
Looks to me like the PCI X1300 still comes out on top, stats-wise, and that 9200 is an AGP 8x card, not a PCI 9200.
Yeah, they don't have PCI cards on that site, which is why I chose AGP, but still... the X1300 doesn't have the power to run GW in DX9.
And come on... PCI? It's really time for an upgrade, don't you think?