How much memory is enough?
Mss Drizzt
OK you can NEVER have enough.
I run 1 gig rdram and I love it.
768 is the min, due to XP and the memory it requires.
Lansing, can you make your shut-down program available for download?
Luggage
Quote:
Originally Posted by Mss Drizzt
OK you can NEVER have enough.
Slightly OT but...
Well, 3 to 4GB actually is too much (or too little) atm because of how the address space is used in 32-bit computing. A normal Win XP PC with 4GB of RAM will only see between 3 and 3.7GB, depending on what hardware you have in the machine.
http://support.microsoft.com/?id=279151
http://www.microsoft.com/whdc/system...AE/PAEmem.mspx
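If you want a feel for the arithmetic, here's a rough sketch; the reservation sizes below are invented placeholders, not measured values (the KB articles above have the real details):
Code:
# Rough sketch of why 32-bit XP "loses" RAM below 4GB: devices claim
# physical addresses just under the 4GB line, crowding out real memory.
# The reserved sizes here are illustrative guesses, not measurements.
ADDRESS_SPACE_MB = 4 * 1024  # everything a 32-bit address can reach

reserved_mb = {
    "AGP aperture / video memory": 256,
    "PCI device windows": 64,
    "BIOS / APIC / firmware regions": 32,
}

visible = ADDRESS_SPACE_MB - sum(reserved_mb.values())
print("Installed RAM : 4096 MB")
print(f"Visible to XP : ~{visible} MB ({visible / 1024:.2f} GB)")
# A beefier video card or more PCI devices reserves more space, which is
# why reported figures range anywhere from about 3.0 to 3.7 GB.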
Lansing Kai Don
Quote:
Originally Posted by Luggage
Slightly OT but...
Slightly OT but... Well, 3 to 4GB actually is too much (or too little) atm because of how the address space is used in 32-bit computing. A normal Win XP PC with 4GB of RAM will only see between 3 and 3.7GB, depending on what hardware you have in the machine. http://support.microsoft.com/?id=279151 http://www.microsoft.com/whdc/system...AE/PAEmem.mspx
Lansing Kai Don
John TrickKnee
Okay, I know you've all been on pins and needles wondering what (if anything) the Village Idiot is going to do to his computer. After careful consideration, I've decided to drag it behind my car until I get pulled over. When the policeman says to me, "Do you know you're dragging a computer behind you?" I will respond, "I'm sorry, I can't hear you. I've got two 128Mb 40ns PC800 RDRAM RIMMs in my ears!" *RIMM shot*
But seriously folks, I just bought two more 128MB RDRAM units to bring my system total up to 512MB. I found two sticks (used) on eBay for 82.99 USD shipped. Hopefully, they will arrive before Friday. Now, I think my nVidia GeForce 4 (64MB, model 420) will be the bane of my existence.
Upon further research, I've discovered that there is indeed an "access lag" with RDRAM due to its serial architecture (the signal has to go through all 4 slots; that's why you need the continuity RIMMs in what would otherwise be empty slots). DDR RAM has no such lag because its architecture provides access to each stick in parallel. The practical consequences are that DDR is better for more numerous and smaller memory reads/writes, such as in servers, and RDRAM is better for fewer but larger memory reads/writes, such as in games. The Dell 8200 is intended as a higher-end gaming machine (or was 3 years ago).
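If you model a memory access as a fixed latency plus the transfer time, that trade-off falls right out. A toy sketch; the latency and bandwidth figures below are invented for illustration, not measured from real modules:
Code:
# Toy model of the trade-off above: lower first-word latency vs. higher
# streaming bandwidth. All numbers are hypothetical, chosen only to
# illustrate the crossover between small and large transfers.

def access_time_ns(latency_ns, bandwidth_mb_s, transfer_kb):
    """One access = fixed latency + transfer size / bandwidth."""
    transfer_ns = (transfer_kb * 1024) / (bandwidth_mb_s * 1e6) * 1e9
    return latency_ns + transfer_ns

DDR = dict(latency_ns=40, bandwidth_mb_s=2100)    # hypothetical PC2100 DDR
RDRAM = dict(latency_ns=60, bandwidth_mb_s=3200)  # hypothetical dual PC800

for size_kb in (0.064, 4, 256):  # a cache line, a page, a bulk texture read
    t_ddr = access_time_ns(transfer_kb=size_kb, **DDR)
    t_rd = access_time_ns(transfer_kb=size_kb, **RDRAM)
    winner = "DDR" if t_ddr < t_rd else "RDRAM"
    print(f"{size_kb:>7} KB: DDR {t_ddr:9.0f} ns   RDRAM {t_rd:9.0f} ns   -> {winner}")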
Sin
So the RDRAM is more like a sequential file and DDR is more like a random access file. Good going, John, I knew you could figure it out!
Glad you got the ram; that's all you really need. As for your video card, a more aged engine will likely be fine with it. Whatever you do, do not upgrade the drivers if it's that 71.xx nvidia package, or if you do, be ready to back them down to the older drivers, because those seem to be troublesome. Just play this next BWE; you've got 10 days after that to get another video card should it be a problem.
John TrickKnee
You remind me of my ex-wife. And that's not a good thing.
Sin
I bet she would think it's the best thing that ever happened to you: not that someone reminds you of her, but that you realize what a fine, upstanding person she is, one who deserved much greater influence over your life.
Loviatar
Quote:
Originally Posted by John TrickKnee
Now, I think my nVidia GeForce 4 (64MB, model 420) will be the bane of my existence.
if circumstances allow you to upgrade your video card, pick almost any budget card that supports DirectX 9
the problem with the 420 MX is that its memory is single data rate instead of the standard double data rate (for some reason nvidia doesn't prominently advertise this)
the other part they don't trumpet is that they cut costs on the MX line from top (460) to bottom (420) by stripping everything that made the GF3 and GF4 important, which was the programmable pixel and vertex shaders
those are what make all those neat effects that show with DirectX 8/9 but not DirectX 7, which is what the MX line is stuck with
you will notice a dramatic improvement with a better card
Sin
Thank you loviatar. My 5200 is my first nvidia so I don't know much about them, but I have noticed most of the time their older stuff is fine with newer software, although not with all the glossy extras. It appears this time it won't really work at all, huh?
Loviatar
Quote:
Originally Posted by Sin
Thank you loviatar. My 5200 is my first nvidia so I don't know much about them, but I have noticed most of the time their older stuff is fine with newer software, although not with all the glossy extras. It appears this time it won't really work at all, huh?
aside from being slow on higher graphics settings i don't see any problem
PhineasToke
I have played it on 4 systems in the house: 2 with 1GB, 1 with 768MB, 1 with 512MB. All four have the same video cards (5200, one with 256MB) and overclocked (and stable) CPUs: P4 2.8, P4 2.5, AMD 2600, AMD 2200.
Zero problems. No difference except the HD in the 2.5 system has a 2MB cache, which causes a little lag.
I personally think that the game is so friendly, forgiving and well designed that 512 would be plenty.
John TrickKnee
Thanks PhineasToke. Yours was certainly the most on-topic and useful post thus far. Knowing that more than 512MB doesn't help significantly makes me feel better about my decision to add the 256MB of RDRAM (rather than adding 512MB or getting a new DDR RAM motherboard). I did play two public weekends and found the game worked well even with my 256MB.
Sin: It's pointless to respond to someone who can't admit a mistake. But since I'm the Village Idiot, I will anyway. Memory is different from file systems. DDR and RDRAM are BOTH random access memory. The only difference is that RDRAM is faster for gaming, even though the signals have to pass through 4 sticks with RDRAM, and just 1 stick with DDR. RDRAM doesn't have to read every byte of memory in sequence to find what it's looking for. Okay?
Sin, Loviatar: Here are the specs on the 5200. The comment is not mine, but the reviewer's:
Quote:
Chipset: GeForce FX 5200 Memory: 128 MB DX Version: 9.00 Core Speed: 250 MHz Memory Speed: 400 MHz Bus Width: 128 bit Pixel Pipelines: 4x1 Comment: The dreadful 5200. Nvidia's low-end DX 9 card, released in 2003, was designed to compete with ATI's DX 8.1 9000 series. However, the card lacked any real power to handle DX 9 at all, and barely DX 8. The TI 4200 series that it attempted to replace is a faster card even today. Combined with slow processor and memory configurations, along with the FX's horrible Pixel Shader 2.0 implementation, it is not a recommended card.
The good news is that GW does NOT use DirectX 9.0. It uses DirectX 8.0. So it may not be quite so awful as the reviewer states, at least for GW.
Here are the specs on my GeForce4 MX 420 (well, he didn't review the 420; the 420 just has 64MB of memory) from the same reviewer:
Quote:
Chipset: GeForce 4 MX 440 Memory: 128 MB DX Version: 7.00 Core Speed: 275 MHz Memory Speed: 512 MHz Bus Width: 128 bit Pixel Pipelines: 2x2 Comment: The GeForce 4x0 MX series. Often considered a retuned GeForce 2, these cards actually had fewer features than a GeForce 3. Being only DX 7 based, they lacked any features for modern games using Pixel and Vertex shaders. Many of these cards made it into premanufactured PCs (Dell/Gateway/Compaq) and were sold as GeForce 4 graphics. Many consumers assumed GeForce 4 and "upgraded" only to find out the GeForce 3 they replaced was superior.
Clearly, my card's lack of DirectX 8.0 support is the reason the GW auto-detect sets my overall graphics setting to lowest.
So now I'm looking at upgrading my video card. The 6600GT at NewEgg.com costs 175.00 USD for 128MB of GDDR3 memory (AGP). Hang on, I have to take my dog out before he chews through my door... time passes ...
Okay, I'm back. The 6600 at NewEgg.com costs 145.00 USD for 128MB of DDR memory (AGP).
BUT, I can get a used GeForce4 TI 4600 on eBay for around 60.00 to 80.00 USD. Here are the specs on it:
Quote:
Chipset: GeForce 4-TI 4600 Memory: 128 MB DX Version: 8.10 Core Speed: 300 MHz Memory Speed: 650 MHz Bus Width: 128 bit Pixel Pipelines: 4x2 Comment: The TI 4600, released in February 2002, enjoyed huge success in the first half of 2002. It was unrivaled by anything out at the time except its own slower cousins. Featuring 128MB of memory and a 4x2 architecture, it smoked everything in benchmarks of the time. With the added pixel shading unit, the card soared in the first DirectX 8 games released. It enjoyed this unparalleled success until August 2002, when ATI released their 9700 series cards.
Another factor is that I may need a power supply bigger than my pitiful Dell box 250 Watter for a 6600GT. I dunno. My brain hurts again. I know, I'll hurt it back.
Marksmann
I have a couple of P4 2.4GHz PCs, one with an overclocked GeForce 4400 and the other with a 4600, both with 768MB of DDR RAM, and they run GW smoothly with no glitches or hiccups.
John TrickKnee
Thanks Marksmann, that's good to hear.
As it's very late, my brain is thinking evil thoughts. Idiotic evil thoughts, perhaps.
Does anyone think there is a possibility that GW will switch from DirectX 8 to DirectX 9 for the final release? If they stick with DX 8 for the initial release, I think it would be hard for them to switch to DX 9 later, since people would have equipped their systems presuming a DX 8 game. I wonder...
Or GW could provide a choice of DX 8 or DX 9 clients. Hmmmm....
Lansing Kai Don
Quote:
Originally Posted by John TrickKnee
Thanks Marksmann, that's good to hear.
As it's very late, my brain is thinking evil thoughts. Idiotic evil thoughts, perhaps. Does anyone think there is a possibility that GW will switch from DirectX 8 to DirectX 9 for the final release? If they stick with DX 8 for the initial release, I think it would be hard for them to switch to DX 9 later, since people would have equipped their systems presuming a DX 8 game. I wonder... Or GW could provide a choice of DX 8 or DX 9 clients. Hmmmm....
Lansing Kai Don
P.S. Yes, for those of you that missed me, I'll be off for a while, but I'll try to check in (studying and doing homework ahead of time so I don't have to worry about it this weekend).
P.P.S. A little bit of technical information that you probably didn't need to know. To generalize, DDR and RDRAM have the same theoretical bandwidth at a given speed (say 200MHz) and their performance is pretty similar. But as we all know, the more RDRAM sticks the higher the bandwidth (it's beautiful, I know), so DDR came out with Dual Channel, which puts them pretty much the same again (w/ two sticks).
I also want to burst some bubbles... increasing your FSB frequency does not improve things that much. For example, let's take a 100MHz (CL3) bus improved to a 200MHz DUAL CHANNEL (CL 2-2-2-5) bus... you'd think that your memory access time would improve dramatically (around 80-100%). Actually, after the computations, you might reach around 17-20% during an intensive game that doesn't prefetch very well. So a higher FSB will not solve your performance bottlenecks. MORE memory, however, can do wonders depending on how many processes are running and the resources currently allocated. I could go into a long spiel about page tables and usage but I won't... unless you want me to.
This is how I'd mark utilization of memory during a graphics-intensive game. This is completely my OPINION; I haven't even performed simple computations besides probability of page frame replacement, using GW's requirement of 256MB as my basis.
256MB(196) 90% Utilization
384MB(324) 72% Utilization
512MB(452) 58% Utilization
640MB(580) 50% Utilization
768MB(708) 46% Utilization
896MB(836) 43% Utilization
1024MB(964) 41% Utilization
These are just approximations (based on 256MB (sys requirement) minus approx. 60MB (system resources)) and are not to be relied on. I'm not saying they're worthless (I got the simple reduced equations out of an old Operating Systems book); they just show that, to an extent, the program "fills the memory you give it". The more memory you have, the less page frame replacement. But notice that after 512MB you only make minor leaps in utilization (i.e. 8%, 4%, 3%, 2%)... and these are NOT increases in performance.
In fact, it's late, let's have some fun. Say each process spends 80% of its time waiting on I/O (keyboard/mouse), so it uses the CPU 20% of the time. Then CPU utilization is 1 - p^n = 1 - 0.8^(number of processes); for, say, 10 processes on average, 1 - 0.8^10 = 89.3%. Okay, now your CPU is busy 89.3% of the time, or 8.93 sec out of every 10 it is processing and not waiting for input.
Make a random variable x for the event that the CPU is working, a random variable y1 for a read from memory, and y2 for a page frame replacement. Take the gap from 512 to 640 as an example. The probability that the CPU reads from memory is f(x|y1) = f(x,y1)/f(y1), while the probability of a page frame replacement is f(x|y2) = f(x,y2)/f(y2). With the joint density f(x,y) = e^-(x+y), the marginal is f(x) = int_0^infinity e^-(x+y) dy = e^-x. So f(x|y1) = e^-y * 0.017, which equals 0.017 as y approaches 0 and 0 as y approaches infinity. The same process for y2 gives 2%, or 0.02. Remember, y1 and y2 are the events that the CPU reads from memory and that a page frame gets replaced. Now, if you never need to do either, you gain 0.017 + 0.02 = 0.037, a 3.7% increase in performance... if you need to do 2 (every 10 sec or nsec etc., any set of time), then your performance increase is 0.0043, or 0.43% (basically nothing). So for ANY little performance increase you have to do fewer than 2 writes/page replacements.
I did this for the jump from 256 to 384 and found the numbers to be 12.42% (for 0 reads from memory and no page frame replacement) and a whopping 7.39% (2 page frame replacements). NOW, this is roughly an 8% performance increase just because you jumped in RAM; this SHOULD be noticeable (that's an understatement). If my numbers are wrong somebody let me know, it's almost 1 in the morning and I can't really see my calculator. I just felt the need to contribute. SO in essence, for a jump from 256MB to 512MB you add the two and get 17.55% for 0 replacements, and 9.94% for 2 replacements (all in a timespan of 10). A 10% increase you SHOULD very well notice, TrickKnee. But if you had gotten 512MB sticks and gone from 256 to 768... then I add the four (after doing the computation for 640-768): you would see 22.47% for 0 replacements, and 11.04% for 2 replacements... which is very close to just a 1% increase in plausible system performance over the jump from 256 to 512, assuming my numbers aren't completely wrong. Sorry for all the typos. Hope this helped ease your mind.
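For anyone who wants to replay the arithmetic without squinting at a calculator at 1 a.m., here it is in a few lines of Python; the per-RAM-size percentages are my estimates from the table above, not measurements:
Code:
# Replaying the arithmetic above: the textbook 1 - p^n utilization model,
# plus the RAM-step estimates (the poster's own guesses, nothing measured).
p_io_wait = 0.8  # each process waits on I/O 80% of the time
n_procs = 10
print(f"CPU busy with {n_procs} processes: {1 - p_io_wait ** n_procs:.1%}")  # ~89.3%

# Estimated memory pressure at each RAM size (from the table above);
# the deltas show the shrinking payoff of each additional 128MB.
util = {256: 0.90, 384: 0.72, 512: 0.58, 640: 0.50,
        768: 0.46, 896: 0.43, 1024: 0.41}
sizes = sorted(util)
for a, b in zip(sizes, sizes[1:]):
    print(f"{a:>4} -> {b:>4} MB: utilization drops {util[a] - util[b]:.0%}")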
John TrickKnee
I picked up a used TI 4400 card off eBay for 48.00 USD shipped. It's in and working great. My 3DMark 2001 benchmark (the latest one my MX 420 would run) is now 2.5x what it was with the MX 420. The GW graphics slider auto-detects to the position just short of max, rather than the lowest position as with the MX 420. So, I'm as happy as I'm allowed to be.
Lansing Kai Don
Quote:
Originally Posted by Mss Drizzt
OK you can NEVER have enough.
I run 1 gig rdram and I love it. 768 is the min, due to XP and the memory it requires. Lansing, can you make your shut-down program available for download?
Lansing Kai Don
P.S. If you could recommend a free hosting site, I could do it that way, OR just give it to someone that already has hosting. It's no big secret, just automation of the batch commands for killing processes. I also edited it to run my game by entering a letter (but the only way this works is if it creates small batch files for each game in the home directory, so unless you like clutter I'd recommend taking this portion out... or I can do that beforehand).
Lunarbunny
The trick with the socket 939 AMD64s (like the 3200+ I have) is that they have a dual-channel memory controller on the die. That way they can run 2 channels at once, so you match the DDR in pairs, one stick per channel. I'm sure I'm passing RIMM with my dual channel PC3200.
Balthasar
does anyone know a decent graphics card for me to upgrade to?
also, if you wanted to add more storage, could you do this without affecting any of your current files?
John TrickKnee
LunarBunny:
You may well be correct since PC3200 is 400MHz and two channels would effectively be 800MHz, which is the speed of my RDRAM (PC800). But the DDR2 technology is spankin' new. I got PC800 three years ago, and there was PC700 and PC600 before that. So that makes RDRAM at least four years old. And the latest RDRAM is PC1600, I think (1.6GHz).
So DDR is making advances and so is RDRAM. I do realize that RDRAM is probably only 10% of the market. It was only put in high-end gaming machines, like my 3-year old Dell 8200. I guess it isn't easy selling faster RAM at 2.5x the DDR price when it's hard to quantify the advantage. Whereas the price advantage of DDR is immediate and compelling, although DDR2 prices are quite a bit higher than DDR.
Bgnome
Quote:
Originally Posted by Balthasar
does anyone know a decent graphics card for me to upgrade to?
also, if you wanted to add more storage, could you do this without affecting any of your current files?
Lansing Kai Don
Quote:
Originally Posted by Lunarbunny
The trick with the socket 939 AMD64s (like the 3200+ I have) is that they have a dual-channel memory controller on the die. That way they can run 2 channels at once, so you match the DDR in pairs, one stick per channel. I'm sure I'm passing RIMM with my dual channel PC3200.
Lansing Kai Don
Lunarbunny
I'll just say that I'm glad I have competitive speeds for less than Rambus. This runs nice and fast, and every time I looked at RIMM I just couldn't stand the prices.
I don't think AMD ever had a contract with Rambus, so it's not even available for their CPUs.
EDIT: Yes, the true cycle speed is 2[channels]x200[MHz]=400MHz, but in the spirit of DDR, it runs at 2[channels]x200[MHz]x2[DDR]=800MHz (DDR is called what it is because it utilizes the top and bottom of the clock cycle)
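Written out as arithmetic, with the standard module numbers (peak figures only; real-world throughput is lower):
Code:
# The EDIT's clock math written out. DDR moves data on both clock edges,
# so a 200MHz clock gives 400 MT/s; dual channel doubles the bus width,
# not the clock, but the aggregate matches what one "800MHz" channel moves.
clock_mhz = 200
edges = 2      # DDR: rising + falling edge of each cycle
channels = 2   # dual-channel socket 939 setup
bus_bytes = 8  # each DDR channel is 64 bits wide

mt_s = clock_mhz * edges
per_channel = mt_s * bus_bytes
print(f"Per channel : {mt_s} MT/s x {bus_bytes} B = {per_channel} MB/s (hence 'PC3200')")
print(f"Dual channel: {per_channel * channels} MB/s peak")
# For comparison, one PC800 RDRAM channel is 16 bits (2 bytes) at 800 MT/s:
print(f"PC800 RDRAM : {800 * 2} MB/s per channel")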
Lansing Kai Don
Quote:
Originally Posted by Lunarbunny
I'll just say that I'm glad I have competitive speeds for less than Rambus. This runs nice and fast, and every time I looked at RIMM I just couldn't stand the prices.
I don't think AMD ever had a contract with Rambus, so it's not even available for their CPUs. EDIT: Yes, the true cycle speed is 2[channels]x200[MHz]=400MHz, but in the spirit of DDR, it runs at 2[channels]x200[MHz]x2[DDR]=800MHz (DDR is called what it is because it utilizes the top and bottom of the clock cycle)
Lansing Kai Don
Venjance
running with 512MB RAM on my P4 2.8
Just picked up a stick of PNY 512MB PC3200 DDR 400MHz Memory for $20 here:
http://www.tigerdirect.com/applicati...6-3906&afsrc=1