Another year, another tax return, another upgrade...
Faer
...and another thread in which I post things and request the opinions of my fellow PC gamers. Some of you may remember my last thread about my current PC. I am happy to report that it is still kicking and still has more power than I need most of the time, for games and work alike.
HOWEVER! I have been offered the opportunity to get new guts for the beast in exchange for my old ones. Never being one to turn down such an offer, this means it's time to get myself a new CPU, Motherboard, RAM, and potentially a new GPU.
Here's what I'm thinking:
Core i5 750
ASUS Maximus III Formula
G.Skill Ripjaws DDR3 1333 (4GB)
And unless GT300 comes out in the next month or two and has a better card at the same price point as the 5850, I'll be getting one of those too. There's still the matter of a good CPU cooler, but I'm sure somebody around here has a 1156 system and can give me good input on that.
Suggestions? Comments? Offers to sleep with me?
Lord Sojar
No. Just no....
CPU:
http://www.newegg.com/Product/Produc...82E16819115214
MOBO:
http://www.newegg.com/Product/Produc...82E16813157172
RAM:
http://www.newegg.com/Product/Produc...82E16820231278
Spend basically the same on memory but get more bandwidth, spend less on the motherboard, and offset the cost of the better CPU. Almost the same total price, more performance.
As for the GPU... wait and see.
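(A rough sketch of what "more bandwidth" means in DDR3 terms - the exact kit behind that truncated link isn't recoverable, so the DDR3-1600 figure below is only an illustrative step up from the DDR3-1333 Faer listed, not a claim about the linked RAM.)

# DDR3 peak bandwidth: effective transfer rate (MT/s) x 8 bytes per transfer, per channel.
def ddr3_bandwidth_gb(mt_per_s, channels=2):
    """Theoretical peak bandwidth in GB/s for a DDR3 configuration."""
    return mt_per_s * 8 * channels / 1000.0

for speed in (1333, 1600):  # 1600 is an assumed example speed, not the linked kit
    print("DDR3-%d, dual channel: %.1f GB/s peak" % (speed, ddr3_bandwidth_gb(speed)))

# Output:
# DDR3-1333, dual channel: 21.3 GB/s peak
# DDR3-1600, dual channel: 25.6 GB/s peak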
Faer
I can only wait so long, bro. When the time comes to do this, it's getting done - whether or not nvidia has decided to stop pushing back the GT300 release.
RotteN
Quote:
Assuming I don't need to update the BIOS before it will work with the CPU, that'd be fine. Not as cool looking though, and looks matter a lot. Gotta have that ricer factor.
http://www.newegg.com/Product/Produc...82E16813128412
It's a nice midrange mobo with everything you'll ever need for a normal gaming PC, and at least the colors sort of match.
Quaker
For a cooler:
http://www.newegg.com/Product/Produc...82E16835103065
Note the MaximumPC "KickAss" rating.
And, I agree with Rahja - you may as well get the i7-860 now instead of later. It's only $80 more.
Lord Sojar
Theo, no... go with what I recommended. That ASRock board is one of the best for the price, period. And why buy a Core i5 now only to upgrade to a Core i7 860 later? Just get the 860 now and save yourself the money in the long run. The Core i7 860 has Hyper-Threading, which is a big boon in multithreaded games.
And yes, the cooler Quaker linked is a great idea.
The problem with that Gigabyte UD3, RotteN, is that it only has one PCIe 16x slot and the other is a 4x, while the ASRock has a 16x and an 8x, so it won't bottleneck a 2nd graphics card should she choose to put one in.
RotteN
True, forgot to mention that the board I linked to isn't a good choice if you aspire to building CrossFireX or SLI setups.
The ASRock board is indeed a very solid one that leaves pretty much all options open (you can for example slide in a second HD5850 down the road for a quick GPU boost at an affordable price).
As for the CPU: right now, games don't really bottleneck any i5 or i7 CPU. They're rarely optimized to take full advantage of multiple cores. But as hardware improves, so does software, so you can indeed expect multithreaded applications to be the future.
All in all, if you're set on that price, take a better CPU and a midrange (but quality) mobo instead of a mid-range CPU and a top-of-the-line mobo.
Faer
Quote:
It won't bottleneck a 2nd graphics card should she choose to put one in.
At any rate, the only reason I'd have two cards in there is to have one dedicated to PhysX (like I currently do). I actually have no idea if that would cause my first slot to go down to 8x or not, though - I heard some things about putting any sort of card (sound, whatever) in there doing that, but ehhh. My current motherboard is dual 16x so I'm safe for now, at least.
Lord Sojar
Quote:
INB4 I have a vagina.
At any rate, the only reason I'd have two cards in there is to have one dedicated to PhysX (like I currently do). I actually have no idea if that would cause my first slot to go down to 8x or not, though - I heard some things about putting any sort of card (sound, whatever) in there doing that, but ehhh. My current motherboard is dual 16x so I'm safe for now, at least.
If you go with that ASRock board I linked, the 2nd slot will be 8x, while the 1st always stays at 16x. 8x is more than enough for a PhysX card. GF100 will be a monster, and it releases first week of March, so the wait is certainly worth it. Pricing should be equal to that of ATi's lineup.
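(For the slot-width question, a back-of-the-envelope PCIe 2.0 sketch - this assumes the graphics slots on the boards under discussion actually run at full PCIe 2.0 rates, which is the usual case for the CPU-fed 16x/8x slots on this platform.)

# PCIe 2.0 runs each lane at 5 GT/s with 8b/10b encoding,
# leaving about 500 MB/s of usable bandwidth per lane, per direction.
PCIE2_MB_PER_LANE = 500

def slot_bandwidth_gb(lanes):
    """Usable one-way bandwidth of a PCIe 2.0 slot, in GB/s."""
    return lanes * PCIE2_MB_PER_LANE / 1000.0

for lanes in (16, 8, 4):
    print("x%-2d -> %.1f GB/s" % (lanes, slot_bandwidth_gb(lanes)))

# Output:
# x16 -> 8.0 GB/s
# x8  -> 4.0 GB/s
# x4  -> 2.0 GB/s

So an x8 slot keeps half the bandwidth of a full x16, which is why it holds up much better for a second rendering card than an x4 does, and why it is far more than a dedicated PhysX card needs.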
Snograt
Never underestimate the usefulness of that additional PhysX card. Batman's cape just doesn't flap so nicely without it.
Plus, of course...
...no, there is no plus.
Faer
Quote:
If you go with that ASRock board I linked, the 2nd slot will be 8x, while the 1st always stays at 16x. 8x is more than enough for a PhysX card. GF100 will be a monster, and it releases first week of March, so the wait is certainly worth it. Pricing should be equal to that of ATi's lineup.
Then again, ATi cards likely will drop in price... But 40% more oomph than a 295 is still awesome.
Hey man you never know, the transition to using Havok for everything ever may fail, and then my 9800GT will be hella useful!
Lord Sojar
moriz
the one thing that i took away from that article is that GF100 is going to have a serious geometry rendering advantage over everything out there right now, due to its heavier utilization of its tessellator.
this can be a good and a bad thing. the good thing is that GF100 will give noticeable IQ improvements over the radeon HD5000 line. however, that difference will only show up if a game is specifically designed to take advantage of it - as in, games will have to ship with more detailed displacement maps to feed the improved tessellator. this worries me, because we could potentially be seeing the next "the way it's meant to be played" shenanigan: games with that tag will look significantly different (better?) on nvidia's hardware, or run significantly faster, but ONLY if they choose to take advantage of the improved tessellator. i can't see any game developers doing this unless money changes hands.
anyways, it seems like a very interesting and potentially excellent product. however, from all indications, general availability probably won't come in the first week of march.
Faer
Wouldn't be the first time!
This is also going to be a factor for me. But we'll see how that works out I guess.
Elder III
Waiting to see the prices. It sounds as if GF100 will give better FPS at high resolutions with all the eye candy turned on, but I think it will come down to heat, power requirements (and price, of course) whether it's the best buy or not. Also waiting on more DX11 games - I really want to see tessellation in action.
Faer
Rahja says they run coolish. Everybody else says they run hot hot hot. I wonder who is right! TIME WILL TELL, EH RAHJA? EH?
Also, I may be scrapping this entire plan and doing this instead:
(Some of you may remember that I own a Samsung 2333SW already, so the 2333HD will match it perfectly on my desk - which is why I am grabbing it instead of a cheaper monitor)