Problems deciding on new Video Card for New Computer

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

Hello, just recently I've decided to buy a new computer, and I have already picked out one that is perfect for me
Only problem is, I can't really decide on a video card to purchase for it, as I've been told and read online that the video chip that's included isn't really all that great.
My friend, who knows a lot more about computers than I, has suggested something that's "overclocked" and I did find some that were like this but.. after that I can't really tell what's better than what when the prices are nearly all the same.

Here are the specs for the computer I'll be buying-
http://reviews.cnet.com/desktops/gat...-32953387.html

I'd like to know everyone's opinion on a video card that's around $100 and that would be better than the chip this computer comes with

Thanks for your time.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

just because something is overclocked doesn't mean it's any good. an overclocked 8600GTS and a stock HD3850 tend to be very close in price, but the HD3850 will annihilate it in every category.

unfortunately, that system you posted is not very upgradable. it uses an old chipset, it has a fairly weak processor by today's standards, and the power supply it uses won't be able to power a good graphics card.

usually, trying to upgrade a cheap prebuilt computer won't work, simply because manufacturers use parts that don't leave a lot of upgrading headroom. they want you to buy another system instead of upgrading. if you want a system that's cheap and has exactly what you want, you're better off building it yourself, or paying someone to build it for you.

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

..Okay, so maybe the video card doesn't have to be the BEST, just something that's better than what this computer comes with.
And as for the computer not being very upgradable, that's okay, because it's a hell of a lot better than the comp I'm using now :S

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

something like this perhaps:

http://www.newegg.com/Product/Produc...82E16814131088

that should give you enough power to play GW, and other older games easily.

just make sure there isn't another expansion card near the PCI-e x16 slot, since the cooler on that card will block it off.

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

Quote:
Originally Posted by moriz
something like this perhaps:

http://www.newegg.com/Product/Produc...82E16814131088

that should give you enough power to play GW, and other older games easily.

just make sure there isn't another expansion card near the PCI-e x16 slot, since the cooler on that card will block it off.
So you're sure that'll run fine on the comp I've chosen? Like the power supply will handle it?

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

pretty sure. that card does not need an extra molex connector, so you should be safe.

Tarun

Technician's Corner Moderator

Join Date: Jan 2006

The TARDIS

http://www.lunarsoft.net/ http://forums.lunarsoft.net/

Moving to Computer Buying & Building

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

Okay, thanks for all your help moriz.

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

I personally recommend nVidia cards due to a wider range of compatibility with games and APIs. I've been an OpenGL programmer since the mid-'90s and I can tell you that ATI has trouble with GL-based games to this day. They tend to have slightly faster clock speeds, but that gets trumped by having to run specific things through software.

If you only play Guild Wars, you'll be fine with that card, but if you plan on playing other games, you'll find very few that sport the ATI logo on the box, and many that display nVidia's logo. Oh, and before somebody flames over the whole ATI versus nVidia debate: I own both ATI and nVidia-based systems, so this is a genuinely factual standpoint based on real-world experience, not some kid spouting off what he or she heard from others.

By the way, why are you buying a prebuilt if you know them on the inside? It's cheaper to build your own, and you generally get longer warranties on the internals.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

for the OP's price range and system specs, he's limited to either the HD3600 series or the geforce 8600GT/GTS. the HD3650 is generally the more powerful solution, since the 8600 series is pretty much made of fail.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

Quote:
Originally Posted by Syco Masato
So you're sure that'll run fine on the comp I've chosen? Like the power supply will handle it?
It's quite possible that the power supply can handle it, but it's also possible that it won't. It depends a lot upon the quality of the power supply, and how much (if any) headroom it has. (That is, how much extra power is available beyond what the system normally needs).
Run the system for a while without the new card to make sure everything is working fine. Then install the new card - if the system starts to act a bit "flaky" (or doesn't start at all), you may want to get a larger power supply. Hopefully, you can get a standard power supply for it - I haven't heard of Gateway using proprietary connectors (a la Dell).
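The advice above can be put in rough numbers. A minimal sketch of the headroom idea, where every wattage figure is an illustrative assumption rather than a measured value:

```python
# Headroom is simply PSU capacity minus what the system actually draws.
# All wattage figures here are illustrative assumptions, not measured values.

def headroom_w(psu_rating_w: int, system_draw_w: int) -> int:
    """Watts to spare; a negative result means the supply is overloaded."""
    return psu_rating_w - system_draw_w

# a modest prebuilt drawing ~190W at load, before and after adding
# a hypothetical ~65W graphics card
print(headroom_w(300, 190))        # headroom before the upgrade
print(headroom_w(300, 190 + 65))   # headroom after the upgrade
```

Whether the leftover margin is "enough" depends on the quality of the supply, exactly as described above.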

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

The 8600 series runs Oblivion on max detail in 1280x1024 here, so if it can't handle GW, which is much less intense, I'd like to know why.

Also, the power supply is incredibly weak. A 300W PS is just enough to push the CPU, much less anything else. You do realize that the way to determine whether or not you have enough power is to simply add up the consumption under max load of your primary devices, namely the CPU and GPU. For example, if that CPU averages 300W under full load, you're maxed out for now, and one or the other will starve when playing a demanding game.

My system has a P4 in it, that requires 350W under a full load. My 7800GS requires 300W under a full load. That's 650W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience though, that running 450W PS in a system with a 350W CPU and a 250W GPU will result in burned molex connectors and a fried PS after about a year.

Get the right PS to begin with, no matter what CPU/GPU combo you choose. It'll save you a TON of headaches in the long run.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

ATI recommends a 400w power supply for the HD36xx series.

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

Thanks Quaker, I was trying to locate their specs and was having trouble. Still, that sounds weak, because if his CPU is only 300W, that means that the card in question only requires 100W of power? Normally they're 250W or more. I need the tech specs on those ATI cards, and I always have a heck of a time finding them when I need them.

Dark Kal

Krytan Explorer

Join Date: Dec 2006

Quote:
Originally Posted by Baratus
Also, the power supply is incredibly weak. A 300W PS is just enough to push the CPU, much less anything else. You do realize that the way to determine whether or not you have enough power is to simply add up the consumption under max load of your primary devices, namely the CPU and GPU. For example, if that CPU averages 300W under full load, you're maxed out for now, and one or the other will starve when playing a demanding game.

My system has a P4 in it, that requires 350W under a full load. My 7800GS requires 300W under a full load. That's 650W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience though, that running 450W PS in a system with a 350W CPU and a 250W GPU will result in burned molex connectors and a fried PS after about a year.
Your CPU most certainly does not use 350W, you crazy person. Most CPUs use around 50-100W. There's no need to go above a 500W power supply (in most cases) if you run a single GPU. The quality of the PSU is just as important as its wattage. So congrats on wasting 200+W.

Edit: The 7800GS requires around 100/150W MAX at load.

Evil Genius

Lion's Arch Merchant

Join Date: Dec 2006

Australia

Mo/

Fixed it for you.
Quote:
Originally Posted by Baratus
Also, the power supply is incredibly weak. A 300W PS is more than enough to push the CPU. For example, if that CPU averages 300W under full load (which it does not), you're maxed for now, and one or the other will starve when playing a demanding game.

My system has a P4 in it, that requires possibly 100W under a full load. My 7800GS requires 200W under a full load. That's 300W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience though, that running 450W PS in a system with a 350W CPU (that don't exist) and a 250W GPU will result in burned molex connectors and a fried PS after about a year.
Syco Masato: Have you considered building your own? Or choosing parts and getting the local computer store to build it? This is a better method as it means you aren't paying for cheap generic shit. Maybe consider sending the specs of the $500 PC in moriz's thread to a local store for a quote? In Australia stores charge an AU$70-80 build fee. Even with this fee you're getting a much better deal.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

the HD3600 series does not require an extra molex connector, so it is limited to the power provided via the PCI-e slot. that's a maximum of 75W. it's probably a lot lower than that, even during gaming. the upcoming HD4670 will use around 59W max, so the HD3600 series (which is less power hungry) should be less than that.

and for the life of me, i can't imagine the rest of his system using 225W. his phenom processor is a low power version, and the rest of his components won't draw much power either. his 300W power supply should be safe.

just as a reference, AMD recommends a 450W power supply for my HD4850, which is a ridiculously powerful (and power hungry) graphics card for its price. it draws a maximum of ~140W under full load. compared to that, the 400W requirement for the HD3600 series seems pretty ridiculous.

EDIT: ok, found some official numbers on the HD3650: http://www.hardware-infos.com/news.php?news=2322
definitely does not need a 400W power supply.
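The point about the missing molex connector follows directly from the PCI Express spec: a card powered only by the x16 slot is capped at 75W. A small sketch of that rule (the per-card draw figures are rough numbers from this thread, not official specs):

```python
# A card with no auxiliary 6/8-pin connector can only draw what the
# PCI-e x16 slot supplies, which the spec caps at 75W.
PCIE_X16_SLOT_MAX_W = 75

def needs_aux_connector(card_max_draw_w: float) -> bool:
    """True if the card's max draw exceeds slot power and it needs its own connector."""
    return card_max_draw_w > PCIE_X16_SLOT_MAX_W

print(needs_aux_connector(59))    # HD4670-class draw -> slot power is enough
print(needs_aux_connector(140))   # HD4850-class draw -> needs a connector
```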

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

Quote:
Originally Posted by Quaker
ATI recommends a 400w power supply for the HD36xx series.
I would have added some more to this, but I was in a hurry.

I think, when ATI recommends a certain size power supply, they are probably taking into account the power demands of a typical system, plus some headroom to allow for extra components and reduce the number of Tech Support calls.

That CPU, btw, is listed as requiring 95 watts.

So, as I said before, depending upon the quality of the power supply, 300 watts may be enough, but it would have to be a good power supply with a true 300 watts. Without going into details, power supplies can be rated in various ways and under various conditions, so sometimes the rating is more theoretical than actual.
So, also as I said before, it may be worthwhile to try it with the 300 watt supply, but be prepared to need a larger one. As with any such component, it should be relatively easy to get a good, barely used 350-450 watt PS from some computer hobbyist who's upgraded. I've got a couple of 350 watt units and a 535 watt just kicking around gathering dust.

Here's an interesting link:
http://www.extreme.outervision.com/PSUEngine
the Lite version can be used online for free. Plug in the components from the computer and see what you get.
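The idea behind a calculator like that is just addition: sum each component's estimated maximum draw, then add some headroom. A minimal sketch; every wattage figure below is an illustrative guess, not a vendor spec:

```python
# Sum estimated max draw per component, then add a safety margin.
# All wattage figures are illustrative assumptions, not official specs.
COMPONENT_DRAW_W = {
    "cpu": 95,         # the 95W figure quoted for the OP's CPU
    "gpu": 65,         # rough guess for an HD3650-class card
    "motherboard": 30,
    "ram": 6,          # two sticks at ~3W each
    "hdd": 10,
    "optical": 20,
    "fans_usb": 15,
}

def recommended_psu_w(draws: dict[str, int], headroom: float = 0.3) -> tuple[int, int]:
    """Return (estimated max draw, suggested PSU rating with margin)."""
    total = sum(draws.values())
    return total, round(total * (1 + headroom))

total, rec = recommended_psu_w(COMPONENT_DRAW_W)
print(f"estimated max draw: {total}W, suggested PSU: ~{rec}W")
```

With these guesses the system lands well under 300W at the wall, which is why the thread's debate comes down to the quality (true sustained rating) of the stock supply rather than the label on it.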

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

I would recommend you all elarn computers and get a degree before posting bogus information. For starters, I have experienced a PS failing due to running a 5700LE and a P4 at the same time under high demand. I actually burned up one of the connectors, the +12V for the CPU.

Oh, and have I mentioned that I do this kind of work for a living? You would be amazed at how often these weak power supplies cause failure or poor performance in systems. Ever hear of Perot Systems? Yeah, we don't know what we're doing.

Anyway, it's your choice. I have been in this field since the 80's, have been trained by Intel, AMD, nVidia, and others. My resume speaks for itself. You're welcome to take the advice of some HS kid if you want. It WILL be cheaper doing it this way, but the long-run costs will be greater.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

^so says the person who claimed that a pentium 4 requires 350W by itself. you can flaunt your supposed qualifications all you want, but it all evaporates as soon as you post something ridiculous like that.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

I can't decide whether this clown is serious or just trolling, but I've got time to take the bait.
Quote:
Originally Posted by Baratus
I would recommend you all elarn computers and get a degree before posting bogus information.
Is that your excuse for posting bogus information? Personally, I don't think you should post bogus info whether you have a degree or not.
Quote:
For starters, I have experienced a PS failing due to running a 5700LE and a P4 at the same time under high demand. I actually burned up one of the connectors, the +12V for the CPU.
If you are as qualified as you think you are, you would be aware that any PS except perhaps a really, really cheap one would simply shut down in the case of an overload. The fact that yours actually caused damage to a connector would indicate that it (or the mobo) had some sort of hardware failure which could have happened no matter what size it was.
Also, of course, you should be aware that the system the OP was interested in has an AMD processor requiring 95 watts, not a P4.

Quote:
Oh, and have I mentioned that I do this kind of work for a living?
I worked for 30+ years for the Defence Dept doing electronic systems maintenance on torpedo and missile systems. Some of the people I worked with (who were doing it for a living) I wouldn't ask to fix a toaster. I'm not impressed.
Quote:
Ever hear of Perot Systems?
Actually no, I haven't - sorry.

Quote:
I have been in this field since the 80's, have been trained by Intel, AMD, nVidia, and others. My resume speaks for itself.
See above - I'm still not impressed.
Quote:
You're welcome to take the advice of some HS kid if you want.
I suppose most of us were HS (I assume that's High School) kids once, but in my case it was a long time ago. But thanks anyway.

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

Well, I normally don't stick my nose into this type of thread, and I don't consider myself an expert, just a person with an avid interest in computers. However, I know for a fact that you do not need 350W just for a P4.... come on, I have an AMD Phenom 9850 X4 that sucks up 125 watts, which is high for a processor, even a Quad Core beast like that.... maybe you meant to type a different number????

Oh yeah, the 8600 DOES NOT run Oblivion at Max detail..... been there and tried it. lol

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

it will probably run it at max detail at 800x600 resolution though

baratus is right on one thing though: the quality of the power supply is suspect, since it is a prebuilt pc, and manufacturers will try to use the cheapest power supply they can get away with.

for the OP: can you tell us if 300W is the maximum "burst" power, or the maximum "sustained" power? there is actually a pretty big difference between the two.

Evil Genius

Lion's Arch Merchant

Join Date: Dec 2006

Australia

Mo/

Quote:
Originally Posted by Baratus
I would recommend you all elarn computers and get a degree before posting bogus information.
I assume I'm one of those that needs to "elarn computers" before posting "bogus information". So here's my critique of your comments:

Quote:
Originally Posted by Baratus
For starters, I have experienced a PS failing due to running a 5700LE and a P4 at the same time under high demand. I actually burned up one of the connectors, the +12V for the CPU.
You have experienced a PSU failing first hand? Wow, that's really special. /sarcasm. It is well known that some/most cheap/generic power supplies do not deliver the wattage/amps advertised and have a higher RMA rate than quality PSUs such as Seasonic.

Quote:
Originally Posted by Baratus
Oh, and have I mentioned that I do this kind of work for a living? You would be amazed at how often these weak power supplies cause failure or poor performance in systems.
I wouldn't be amazed actually.

Quote:
Originally Posted by Baratus
Ever hear of Perot Systems?
Nice appeal to authority argument there. I would trust a hardware enthusiast/extreme overclocker rather than an arrogant know-it-all who spouts his (supposed?) qualifications while defending his credibility. This is where you lost it, by the way:
Quote:
Originally Posted by Baratus
My system has a P4 in it, that requires 350W under a full load. My 7800GS requires 300W under a full load.
Quote:
Originally Posted by Baratus
You're welcome to take the advice of some HS kid if you want. It WILL be cheaper doing it this way, but the long-run costs will be greater.
Nice ad hominem argument there. I'm just not sure where I suggested going the cheaper generic way? Just to clarify: I value the importance of quality components, especially the power supply. Hence the Corsair HX-520 in my rig. Let's have a look at what I actually posted, rather than your fallacious straw man argument:
Quote:
Originally Posted by Evil Genius
Syco Masato: Have you considered building your own? Or choose parts and get the local computer store to build it? This is a better method as it means you aren't paying for cheap generic shit.
Hmm, obviously because I suggested building his own or having a computer store build it with parts he selects/we suggest, that means I am suggesting poor quality components? Anyone with a brain would know large manufacturers cut costs by choosing generic components for their budget PCs.

Quote:
Originally Posted by Baratus
Yeah, I don't know what I'm doing.
Fixed it for you.

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

Tell ya' what, I'll try to find the freaking box the thing came in and take a picture of it. What then, smart guy? Oh and mind you, this is a 32bit P4, not one of those first-generation 64bit P4s with throttling and such. Mine runs at 3.20GHz constantly, and cannot be throttled. Even if it could, at max performance it would use more power than at the lowest speed.

*EDIT*

Found the video card box, it was lying nearby with other hardware boxes. The CPU box is in a box with other things and I won't be digging it out soon. When I do, I'll post it as well. For now however, how about a picture of a video card that requires 400W with 20A on the +12v lead?!

Video Card

The image was taken from a camcorder, not my digital camera. Sorry for the low resolution. Oh and note that this is a MINIMUM requirement! This doesn't speak for a system with a real hardware soundcard (in other words, not a built-in soundcard) and a demanding CPU.

I also note that I am talking to a bunch of kids in school. Odd how your post times are always after 3:00PM. Guess that's proof enough as to who has been playing with torpedoes and missiles, or as to who has much technological knowledge of a computer at all.

Oh and just for spite, pictures of Oblivion on this 7800GS with max detail at 17~25fps simply due to a properly configured system! Note the 1280x1024 resolution. I now say that if your 8600 cannot perform at LEAST this well, you need to throw out your computer and get back to auto-mechanics.

Link
Link
Link
Link
Link
Link
Link
Link

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

Holy crap, what has my thread become :O
To the numerous people who suggested building my own- Yes, I know it's better, and I plan on doing that in the future, but for now the computer I have picked out will suffice.

Rustjive

Krytan Explorer

Join Date: Feb 2006

Quote:
The Pentium 4 brand refers to Intel's line of single-core mainstream desktop and laptop central processing units (CPUs) introduced on November 20, 2000 (August 8, 2008 was the date of last shipments of Pentium 4s). They had the 7th-generation architecture, called NetBurst, which was the company's first all-new design since 1995, when the Intel P6 architecture of the Pentium Pro CPUs had been introduced. NetBurst differed from the preceding Intel P6 - of Pentium III, II, etc. - by featuring a very deep instruction pipeline to achieve very high clock speeds (up to 4 GHz) limited only by max. power consumption (TDP) reaching up to 115 W in 3.6–3.8 GHz Prescotts and Prescotts 2M
Quote:
# Quick Roundup:
# GPU: GeForce FX5800 Ultra
# 128MB DDRII
# Core clock: 500MHz; Memory clock: 1GHz
# Board weight: 600gr
# 75 watt power consumption under full load!
Instead of pretending you're an expert, you should just learn to Google, idiot.

Rustjive

Krytan Explorer

Join Date: Feb 2006

Quote:
Originally Posted by Baratus
Anyway, it's your choice. I have been in this field since the 80's, have been trained by Intel, AMD, nVidia, and others. My resume speaks for itself. You're welcome to take the advice of some HS kid if you want. It WILL be cheaper doing it this way, but the long-run costs will be greater.
Also, here's a fun fact. Click Baratus, then click view public profile, then look at your age.

Quote:
Age: 28
You've been working on CPUs since you were 10?

U R WINNAR!

Black Belt

Ascalonian Squire

Join Date: Feb 2006

Eternum Purriah[EP]

Mo/

hey now, some people might start that early.....


no this kid is just an idiot.

gg ownage

Syco Masato

Frost Gate Guardian

Join Date: Mar 2007

New Jersey

D/

Quote:
Originally Posted by Rustjive
Also, here's a fun fact. Click Baratus, then click view public profile, then look at your age.



You've been working on CPUs since you were 10?

U R WINNAR!
NO U WINRAR
12chars

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

You're telling me you weren't playing with Ataris and Commodores back in the day? Here's a fun fact, I even souped up my Ataris! Remember that 48K RAM they came with? They had three slots, each about twice as wide as an NES cartridge, and each one on my systems now holds a 128K RAM chip. That was the max on those systems. So yes, I have been into computers since 1986, when I started programming in Atari BASIC. Any other questions I can shoot down for you?

Quote:
Instead of pretending you're an expert, you should just learn to Google, idiot.
Actually, I already shot down your theory about the 75W GPU crap you posted. See the picture of the box, or are you blind?

Also, it's time for first-grade addition since so many of you are obviously incapable of such a feat. Assume you are correct, that the CPU requires 115W and the GPU only 75W. That's 190W, correct? Now, if a high-end GPU and high-end CPU only require that little, why on earth are we selling 1.5KW power supplies? Why do we not simply plop in a 250W PS or a 300W PS and go with it? Well, common sense has to kick in at some point, and you must realize that there is a reason that Gateway, Dell, HP, and others include a 300W PS with their stock systems, which generally have built-in video. It's because the system needs that kind of power, and when you add a high-end video card, it needs more.

I should also inform you that a P2 system needs at least a 200W PS, and some required a 300W PS. If we were using 200W power supplies in 1994 with those old systems, and we're now running powerful P4, Core2Duo, and Core2Quad systems, doesn't common sense dictate that you need more power to push these things? Simple laws of physics dictate that you can't get something from nothing, so if we're still using P2/P3 power supplies, where is the extra power coming from?

Oh and yes, I know about improvements in the pipe, HT, dual-cores, and all that crap, but you still need power to push it.

Rustjive

Krytan Explorer

Join Date: Feb 2006

The recommendation is for your entire system - your other PCI cards, hard drives, everything uses power.

Anyways, apparently you're the only one in the world that can do 'simple math':

http://www.xbitlabs.com/articles/vid...i_9.html#sect0

That's xbitlabs doing a quad SLI configuration and an Athlon FX-60, with a 660W PSU. Each of the GeForce 7900s they are testing recommends a 400W PSU per 'the side of the box'.

Are you telling me that you need a 2000W PSU to run quad SLI?

http://en.wikipedia.org/wiki/GeForce...Force_7800_GTX

Do me a favor, read that chart, check out the max power consumption, and clean yourself up. You look like an idiot and everyone can see.

Edit: http://www.techreport.com/articles.x/11211/16

There's another, with power measured from the wall socket, for the FULL system. Seriously, you better get Intel to retrain you.

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

Quote:
Originally Posted by Baratus
Well common-sense has to kick-in at some point, and you must realize that there is a reason that Gateway, Dell, HP, and others include a 300W PS with their stock systems which generally have built-in video.
You're a real piece of work, Baratus. First you try to tell us that we need 800 watts or so for a PS, and then you counter your own arguments by saying that all these companies use 300w. Lol

And by your own reasoning, your last 2 posts were after 3pm, so you must be in high school too. I'm not surprised that you're 28 and still in HS.
(actually, I'm smart enough to know that the post times are adjusted for my particular time zone - I don't actually know what your local time was.)

At any rate, common sense and reading comprehension would tell you:

1. CPU and GPU power requirements have generally gone down for equivalent performance.
2. Most manufacturers would include a 300 watt power supply in their systems because it's not cost effective, given modern circuitry, to build a power supply with less. Plus, it gives some headroom. It's not necessarily so that they all require 300 watts.
Much the same way as they include at least a 150gig hard drive because it's not cost effective to build a smaller one. (or more accurately, the drive makers don't build smaller ones.)
3. The system under discussion does NOT use a P4, so the power requirements of a P4 don't matter in this case. Also, people were not suggesting a "high end" video card or multi-SLI setup.
4. Given the above info, it's not easily determined whether or not the supplied 300watt power supply would have enough headroom to run the suggested video card. Most people suggest the OP get at least a 400watt PS anyway - which was your "expert" opinion too.

My suggestion for you, Baratus, is to stop posting these rants, because they make you look like a doofus. (or a Troll) But you can keep it up if you want - until the mods close the thread or something. I've got lots of time to read and answer your posts, and we can always use a good laugh.

Baratus

Lion's Arch Merchant

Join Date: Jul 2005

Elizabethtown, NC

Deathkings of The Dark Citadel

D/Me

Your laughter shows off a lack of knowledge. I noted that my retaliation to the "8600 can't run Oblivion" comment - pictures of an old, AGP 7800 running it at a decent framerate with max details and high resolution - was ignored, because I proved you wrong.

I cannot physically prove that you need the extra power with these modern pieces of hardware short of plopping a 400W PS in my system, showing you the dip in framerate in various games, putting my big PS back in, and showing you the difference, and that isn't happening because it's a hell of a lot of trouble to make a point that those of us who do this for a living already know. Hell, they teach this much at the community college to first-year students! I also know that it is taught at larger colleges in the area, such as State. You're entitled to your own beliefs, but this is a cold, hard fact, and not believing it shows off both a closed mind and immaturity.

I repair systems every day that are turning off randomly, and the problem is always a crappy PS. My old PS was a 450W ThermalTake with dual fans on it. I'd start playing WoW and after a while the system would hard-lock, or simply power off without shutting down. WoW was the ONLY game doing this, so I blamed it on WoW. Turns out it was the power supply shutting down to protect itself from too much being drawn. The end result was that my +12V lead burned up badly and ruined both the PS and the motherboard. So don't tell me that a 450W PS is enough power for a P4/3.20GHz Prescott with an Audigy II and 7800GS. I don't have any other expansion cards in the system and I don't use any USB devices beyond my keyboard and mouse. The simple fact is that the load was too damn much for that 450W supply. I upgraded the PS and can play games like UT3 without a hitch. By your reasoning, I only needed around 200~300W. So explain this magical phenomenon to me, because I supposedly had 150W to spare.

moriz

Über tek-nish'un

Join Date: Jan 2006

Canada

R/

i'd say your power supply was going bad, and you are basing your entire (incorrect) opinion on one case.

you are also basing your knowledge on older/inefficient gear. a pentium 4 (and stop calling it P4. P4 is actually a pentium 2, iirc) is well known for being expensive/power hungry/inefficient. since then, CPUs have generally had lower power requirements while putting out more performance. the fact that you are drawing all your examples from a pentium 4 suggests that you've been out of the loop for at least 3 years. maybe you should go back to school and update your knowledge before coming back.

if you still think you are right, guess the power supply wattage of my current build. here are its specs:

intel core2duo E7200 @ 3.2ghz OC
4gb corsair dominator DDR2 800
asus P5K-VM
HiS radeon HD4850
total of 3 case fans
500gb SATA2 harddrive
LG SATA DVD/CD combo writer

keep in mind that this system is completely stable, and puts out 60+ fps in HL2:EP2 at 1680x1050 resolution, with AAx24 and AF16+trilinear filtering (and no, i'm not joking about the AAx24 part).

Evil Genius

Lion's Arch Merchant

Join Date: Dec 2006

Australia

Mo/

Quote:
Originally Posted by Baratus
Tell ya' what, I'll try to find the freaking box the thing came in and take a picture of it. What then, smart guy?
Ooh nooos I have been thwarted! /sarcasm. You don't even know the power supply requirement on the box is for the whole system? For example, my 8800 Ultra box says:
Quote:
525W PCI Express-compliant system power supply with a combined 12V current rating of 37A or more*
*Minimum system power requirement based on a PC configured with an Intel Core 2 Extreme QX6700 processor
Quote:
Originally Posted by Baratus
Oh and note that this is a MINIMUM requirement! This doesn't speak for a system with a real hardware soundcard (in other words, not a built-in soundcard) and a demanding CPU.
Actually it does. The minimum requirement has easily enough headroom for a few hard drives, demanding CPU etc. And a "real hardware soundcard": I doubt they use more than a few watts (10?).

Quote:
Originally Posted by Baratus
I also note that I am talking to a bunch of kids in school. Odd how your post times are always after 3:00PM.
It's called Australia. I know it must be hard for a self-centered person like yourself to consider other countries exist, but they do. Also some/most people finish work at about 5:00pm so I guess they're schoolkids too?

Quote:
Originally Posted by Baratus
Also, it's time for first-grade addition since so many of you are obviously incapable of such a feat. Assume you are correct, that the CPU requires 115W and the GPU only 75W. That's 190W, correct? Now, if a high-end GPU and high-end CPU only require that little amount, why on earth are we selling 1.5GW power supplies?
Rustjive posted 75W as the power requirement for an FX5800 Ultra. Since when were a Pentium 4 and an FX5800 Ultra high end? I will tell you why 1500W power supplies are sold:
1) For enthusiast high end rigs. For example, a member of OCAU has:
Quote:
QX9770 @ 4.25ghz - phase change cooler - Corsair XMS3 DDR3 - XFX 790i - 3 x gtx 280 in Tri Sli - X-FI xtremegamer - 1200w Thermaltake Toughpower - Antec Twelve Hundred - 2 x WD 150gb Raptors in Raid - PB 3dmark06 22322 - PB 3dmark Vantage 23500
As you can see, 1200 watts is safe for a beast like that.

2) Because people are gullible and get conned into thinking they need 1500 watts for systems that would need something like 650 watts.

Quote:
Originally Posted by Baratus
My old PS was a 450W ThermalTake with dual fans on it. I'd start playing WoW and after a while the system would hard-lock, or simply power off without shutting down.
I'm not going to address anecdotal evidence.

The point of this debate is not that we believe people should use generic, low-wattage power supplies for their computers. Quite the contrary: everyone here agrees good quality power supplies are essential. The point is that you obviously have great deficiencies in your knowledge of computers and should therefore not bother attacking others.

Quote:
Originally Posted by Baratus
See the picture of the box, or are you blind?

Lord Sojar

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Quote:
Originally Posted by Baratus
Your laughter shows off a lack of knowledge. I noted that my retaliation to the "8600 can't run Oblivion" comment with pictures of an old, AGP 7800 running it at a decent framerate with max details and high-resolution was ignored because I proved you wrong. I cannot physically prove that you need the extra power with these modern pieces of hardware to run the new systems shy of plopping a 400W PS in my system, showing you the dip in framerate in various games, putting my big PS back in, and showing you the difference, and that isn't happening because it's a hell of a lot of trouble to make a point that those of us who do this for a living know. Hell, they teach this much at the community college to first-year students! I also know that it is taught at larger colleges in the area, such as State. You're entitled to your own beliefs, but this is a cold-hard fact and not believing it shows off both a closed mind and immaturity.

I repair systems every day that are turning off randomly, and the problem is always a crappy PS. My old PS was a 450W ThermalTake with dual fans on it. I'd start playing WoW and after a while the system would hard-lock, or simply power off without shutting down. WoW was the ONLY game doing this, so I blamed it on WoW. Turns out it was the power supply shutting down to protect itself from too much being drawn. The end result was that my +12V lead burned up badly and ruined both the PS and the motherboard. So don't tell me that a 450W PS is enough power for a P4/3.20GHz Prescott with an Audigy II and 7800GS. I don't have any other expansion cards in the system, and I don't use any USB devices beyond my keyboard and mouse. The simple fact is that 450W wasn't enough for the system. I upgraded the PS and can play games like UT3 without a hitch. By your reasoning, I only needed around 200~300W. So explain this magical phenomenon to me, because I supposedly had 150W to spare.

mmmk.... this thread is getting old. <font color="red" size="3"> Let me set everyone straight. Any more arguing on this point will result in post deletion. We do not need to fight. </font>

Modern GPU cores are very energy efficient. A 450w PSU is more than enough to run what you are describing, unless you have 4+ HDDs connected with dual ROM drives.

The issue you are describing is faulty amp draw coupled with insufficient rail amp push. Basically, your PSU was low quality or defective from the start.

The power requirements for cards are actually relatively low, with the exception of the GTX 280, and even then, at full load an 800-900w unit will power 2 of them. The issue with many PSUs is simply a lack of rail amperes, be it on the 12v splits or the 3.3v. Usually the 5v is fine, but I have seen cases where it too is low.

Baratus, you are confusing TDP with core draw. Core draw is relatively low on modern graphics cards, especially with DiEC rerouting and zoned powerdown. The requirements printed for cards are based on standardized system layouts, with average components in a typical build. Those are not the actual power usages of the cards (especially the core).

And Pentium 4 processors didn't use THAT much power. They simply were inefficient at using said power, and they had terrible DiEC gate leakage, which caused thermal pocketing. The only reason they were able to run so hot without extensive damage was their long pipeline design with extensive cobalt nMOS and COI. P4s are an extinct technology, and were a terrible creation in the first place. So really, that discussion is irrelevant.

Syco, I advise you budget a bit more money, and get this card:

http://www.newegg.com/Product/Produc...82E16814133242

Yes it is a PNY, but the card has been well received and got great marks in our failsafe labs. Not only that, but it is only 30 dollars more than your budget, and worth every penny.

As for power requirements: You need about 24A on the 12v rail, and about a 450watt PSU or better. I would recommend a 500w min, with 26A on the 12v rail. That card screams.
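To put those amperage numbers in perspective, a rail's capacity in watts is just voltage times current (P = V × I). A minimal sketch, using the 24A/26A figures from the recommendation above:

```python
# Convert a 12V rail amperage spec into watts (P = V * I).
# The 24A / 26A figures come from the recommendation above.
def rail_watts(volts, amps):
    return volts * amps

minimum = rail_watts(12, 24)      # watts available on a 24A 12V rail
recommended = rail_watts(12, 26)  # watts available on a 26A 12V rail

print(minimum, recommended)
```

This is why rail amperage matters more than the headline wattage: a nominally "500W" unit with a weak 12V rail can still starve a graphics card that draws most of its power from 12V.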

ATi cards, while impressive, draw more power than the new 9800GTX+ and 9800GT under most circumstances (particularly under load).

Tamuril elansar

Tamuril elansar

Wilds Pathfinder

Join Date: Jul 2007

N/

i think the 9800GT is a very good card, however you'll have to upgrade the power supply.

if you're afraid to upgrade your power supply, i suggest getting a lower end card like the 9500GT. not sure about the power that card uses, though.

Quaker

Quaker

Hell's Protector

Join Date: Aug 2005

Canada

Brothers Disgruntled

Quote:
Originally Posted by Rahja the Thief
Not only that, but it is only 30 dollars more than your budget, and worth every penny.

As for power requirements: You need about 24A on the 12v rail, and about a 450watt PSU or better. I would recommend a 500w min, with 26A on the 12v rail. That card screams.
I think everyone, including Rahja, needs to go back to Syco's posts and consider what he has said.

Basically, he has chosen a particular computer, but wants to know what sort of video card would be better than the on-board graphics. I assume, from reading his posts, that upgrading the power supply is not an option at this time. This somewhat limits his choice of video card given the 300-watt power supply, but he can still get a decent card that will play GW quite well, depending upon the rez of his monitor and the graphics settings.

So, even though a 9800GT is "only $30 more" (not including any tax) - the card plus a 500watt power supply surely is outside the budget.

@Syco - given your OP, this is what I recommend (or similar):

http://www.newegg.com/Product/Produc...82E16814102765

The recommended power supply for a 2600 is only 300 watts.

You might want to read this too:

http://www.tomshardware.com/reviews/...hics,1786.html

Evil Genius

Evil Genius

Lion's Arch Merchant

Join Date: Dec 2006

Australia

Mo/

Why not just recommend a different PC rather than a new video card + PSU? Syco hasn't yet bought that Gateway; it's just the one he thought he would buy. Are you open to different suggestions? If so, it looks to me that the Gateway costs $440-$550. I will assume you were intending to pay $440. Therefore $440 + $100 (for the video card) = $540. Maybe something like: http://www.newegg.com/Product/Produc...82E16883227086 or http://www.newegg.com/Product/Produc...82E16883229024.