my yearly upgrade

moriz

über tĕk-nĭsh'ən

Join Date: Jan 2006

Canada

R/

as some of you might know, i built my pc last year. its specs were:

-core 2 duo E7200 @ 3.2ghz (OC)
-corsair dominator 1gb x4 RAM (PC2-6400)
-Asus P5K-VM motherboard (G33 chipset)
-HiS Radeon HD 4850 512mb
-Samsung SyncMaster 2243BW
-various SATA2 hard drives, etc

for this year's upgrade, i decided to keep the CPU and motherboard and go with beefier graphics, along with a new monitor to match. since i was already using a Radeon and the SyncMaster series has been good to me so far, i went with these two upgrades:

-Sapphire Radeon HD 4890 OC (901/1000)
-Samsung SyncMaster 2342BWX (2048x1152 resolution)

so far, i've had it for one day, and the performance increase is quite amazing. even at a dramatically higher resolution (i went from 1680x1050 to 2048x1152), all my games are performing much better. not only are maximum and average FPS up, but there is noticeably less stuttering during fight scenes, especially in Mass Effect and my heavily modded Oblivion (4x MSAA, 16x AF). needless to say, the new card eats GW for a snack.
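
for anyone wondering how much extra work that resolution jump actually is, here's a quick python back-of-the-envelope on raw pixel counts (rough numbers only):

# raw pixel count comparison between the old and new monitors
old_w, old_h = 1680, 1050    # SyncMaster 2243BW
new_w, new_h = 2048, 1152    # SyncMaster 2342BWX

old_px = old_w * old_h       # 1,764,000 pixels
new_px = new_w * new_h       # 2,359,296 pixels

print(f"increase: {(new_px / old_px - 1) * 100:.0f}%")   # -> increase: 34%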

i've also tried overclocking to see what the maximums are. so far, i've managed a rather weak 940/1035. the highest i've tried was 950/1045, and that immediately crashed my entire system. idle temperature is around 60C, and it goes up to around 90C under Furmark with the overclock applied.
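
to put that in perspective, here's the same sort of quick python math against the card's factory OC clocks (the 901/1000 core/memory listed above):

# overclock headroom relative to the factory OC clocks
core_stock, mem_stock = 901, 1000    # factory OC (MHz)
core_max, mem_max = 940, 1035        # highest stable clocks so far (MHz)

print(f"core: +{(core_max / core_stock - 1) * 100:.1f}%")   # -> core: +4.3%
print(f"mem:  +{(mem_max / mem_stock - 1) * 100:.1f}%")     # -> mem:  +3.5%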

Wrath Of Dragons

Burninate Stuff

Join Date: Aug 2005

New Mexico

E/Mo

Same monitors, moriz. Love 'em!

moriz

über tĕk-nĭsh'ən

Join Date: Jan 2006

Canada

R/

and i can semi-confirm that the overclock + dual-monitor bug that plagues the HD 4850 does not carry over to this particular card, because the card keeps the vRAM running at full speed at all times, even when idle. this might be the reason why the 4890 uses more power at idle than the GTX 275.

however, the underlying problem is still there; AMD merely made a workaround for it.

i'll be able to fully confirm this once i move back to my parents' place in august/september, where i'll have the room to hook up both monitors at the same time.

Brett Kuntz

Core Guru

Join Date: Feb 2005

o/

I have the same monitor. It is literally the best monitor you can buy right now that isn't crazy expensive. I got mine for $215 CAD, and the brightness blinds me like a mofo.

Other than that, don't run 4 sticks of RAM. Only use 1 stick per channel, or else you can't OC as well, it fails much sooner, and most mobos can't even handle factory-OC'd RAM if you use more than one stick per channel.

DDR2-1066, and DDR3 anything above 1333 <- Those generally won't work with more than 1 stick per channel, or you generally need to bump the voltage or loosen the timings to compensate.
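
For context on those speed grades, here's a rough Python sketch of theoretical peak bandwidth per channel (assumes a standard 64-bit channel and ignores real-world efficiency):

# theoretical peak bandwidth of one 64-bit memory channel
# bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes per transfer / 1000
def channel_bw_gbs(mt_per_s, bus_bits=64):
    return mt_per_s * (bus_bits // 8) / 1000

for name, rate in [("DDR2-800", 800), ("DDR2-1066", 1066), ("DDR3-1333", 1333)]:
    print(f"{name}: {channel_bw_gbs(rate):.1f} GB/s per channel")
# -> DDR2-800: 6.4, DDR2-1066: 8.5, DDR3-1333: 10.7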

And I almost bought 2x 4890s with waterblocks, but I'll wait for the 5870s in a month and get those instead.

moriz

über tĕk-nĭsh'ən

Join Date: Jan 2006

Canada

R/

got mine for $240 CAD. had i gone to the store two days earlier, i could've had it for $190. and yeah, i had to mess around with the brightness/colours before i could look at it for prolonged periods. it was too bright and too bluish.

other than maybe tightening the timings, i probably won't bother overclocking the RAM. i bought them cheap (two 2gb kits), and they work well enough. the RAM is currently running at a 1:1 ratio with the FSB, and since i have no intention of overclocking my CPU any further, i don't really need to raise the RAM bandwidth. the timings are pretty loose though; they're 5-5-5-18, if i remember correctly.
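
for the curious, here's the rough python math on why 1:1 is already plenty here (assuming the E7200's stock 9.5x multiplier):

# at 1:1, the DDR2 ends up below its rated PC2-6400 (800 MT/s) speed anyway
cpu_mhz = 3200
multiplier = 9.5                     # E7200 stock multiplier (assumed)

fsb_mhz = cpu_mhz / multiplier       # ~337 MHz base clock
ddr2_rate = fsb_mhz * 2              # DDR2 at 1:1 -> ~674 MT/s effective

print(f"FSB: {fsb_mhz:.0f} MHz, RAM at 1:1: ~{ddr2_rate:.0f} MT/s (rated 800)")
# -> FSB: 337 MHz, RAM at 1:1: ~674 MT/s (rated 800)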

Brett Kuntz

Core Guru

Join Date: Feb 2005

5-5-5-12~ on DDR2 is about the best you're gonna get at 1066~ speeds. The diff between 12 and 18 is pretty much nothing, since that last number is tRAS (row active time), a delay that only comes into play once in a blue moon while playing games. CAS is the only value that will give you anything resembling a real difference.
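
If you want to put those timings in actual time terms, here's a quick Python sketch of first-word CAS latency in nanoseconds (using DDR2-800, whose memory clock runs at 400 MHz):

# CAS latency in nanoseconds = CAS cycles / memory clock
def cas_ns(cl_cycles, ddr_rate_mt):
    mem_clock_mhz = ddr_rate_mt / 2      # DDR: two transfers per clock
    return cl_cycles / mem_clock_mhz * 1000.0

for cl in (4, 5, 6):
    print(f"CL{cl} @ DDR2-800: {cas_ns(cl, 800):.1f} ns")
# -> CL4: 10.0 ns, CL5: 12.5 ns, CL6: 15.0 ns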

But in the future when building a system, keep in mind that both Intel's and AMD's processor docs explain that using more than 1 stick of memory per channel puts a lot more strain on the memory controller.