GW2 / Vista / XP / DX10 question

Snograt
rattus rattus
#21
Abedeus, the vast majority of the downsides you posted are due to equipment/software companies not making drivers available - it's not Vista's fault (the companies had PLENTY of time to come up with the goods).

I'm happily running GW at 150fps - so what if XP can run it at 200; I'm not going to see a difference.
zamial
Site Contributor
#22
Quote:
Originally Posted by Abedeus
Oh and that Vista's support ends in 2012, XP's in 2014. And Windows 7 comes out not-so-long-from-here. Vista is like a beta version of W7...

Everything and more here: http://en.wikipedia.org/wiki/Criticism_of_Windows_Vista

This is ALMOST true: support for Vista Service Pack 0 ends in 2012, but we are currently on Vista SP1, and XP will go end-of-life at the release of Windows 7.

your link = fail

That article is based on info from 2006. News flash: that's 2 years old...

Vista is the first stage of the Windows Longhorn project, and Windows 7 is the completed Longhorn project.

KTHXBYE
isamu kurosawa
Desert Nomad
#23
Quote:
Originally Posted by Brianna
Err, suppose it detects what version of DirectX so you don't have to do that? Who knows, but it's already been proven now (Thanks zamial) that GW2 will have DX10, so I don't see what the problem is.

And, assuming A-Net knows what they are doing, the community will not have to piss around with switches, and such. Definitely speculation on how that will work, but like I said I'm sure they know what they are doing.
You specifically stated running a switch. That was all I disagreed with, nothing else.
cebalrai
Jungle Guide
#24
Quote:
Originally Posted by Abedeus
Vista:
- Slow
- High requirements, offers nothing worth it
- The main advantage, the Aero interface, was already in use on other PCs and can be used in XP
- Doesn't work with older machines - the processors and graphics are too old
- A lot of existing ones don't work because there are no good drivers
- DOS programs and epic games work worse than on XP
- The longer it works, the slower it gets
- Some XP games don't work either...
- Good programs don't have licenses under Vista.

And that's it. Oh and that Vista's support ends in 2012, XP's in 2014. And Windows 7 comes out not-so-long-from-here. Vista is like a beta version of W7...

Everything and more here: http://en.wikipedia.org/wiki/Criticism_of_Windows_Vista
That's a horrible Wiki page. It's out of date (badly), therefore it's useless since we're talking about Vista in today's incarnation.

Most notably, it doesn't consider Vista SP1.

Just look at the awful references in that article that try to cite that Vista is slower than XP. They're flawed in ways like 1) possibly pre-SP1, and 2) they compare using systems with 1 GB of RAM.

Windows XP had major issues at release and everyone was asking why they should switch to it from Win 2000. Then XP matured, just as Vista is already doing.

And if one more person says "OMG you can't find Vista drivers for things so it suxx", I'm going to laugh... Really, who is using old fossilized hardware from the last millennium? And what hardware is it?

Buy a decently modern computer IMO. Or put Linux on your old machine and freeze time!
zamial
Site Contributor
#25
Quote:
Originally Posted by cebalrai
That's a horrible Wiki page. It's out of date (badly), therefore it's useless since we're talking about Vista in today's incarnation.

Most notably, it doesn't consider Vista SP1.

Just look at the awful references in that article that try to cite that Vista is slower than XP. They're flawed in ways like 1) possibly pre-SP1, and 2) they compare using systems with 1 GB of RAM.

Windows XP had major issues at release and everyone was asking why they should switch to it from Win 2000. Then XP matured, just as Vista is already doing.

And if one more person says "OMG you can't find Vista drivers for things so it suxx", I'm going to laugh... Really, who is using old fossilized hardware from the last millennium? And what hardware is it?

Buy a decently modern computer IMO. Or put Linux on your old machine and freeze time!
QFT! Or even better, turn it into a myst box like the rest of us (not the game)!

I actually read that article, and even for being dated it went on to say how almost all of the "problems" were fixed or false.
KZaske
Jungle Guide
#26
Not really wanting to bash Vista, all I can say is the DRM built into that OS sucks. You cannot even time-shift some TV shows unless the broadcast company chooses to let you time-shift that show.
As for the real topic, DX10 support in GW2: I got the idea from reading Gaile's posts that it would be an option. If you are using Vista, you will have the option of using DX10; if you are using Win XP, you will be running DX9 or not playing. I do not remember Gaile ever specifying which version of DX9, but the most common video cards at the time she made the comment were DX9c. Given developments since then and the improvements offered by DX10.1, I would guess the game will scale up to that level - most likely an auto-switch detecting the highest level your video subsystem can use and loading that feature set.
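If it helps to picture it, an auto-switch like that could be as simple as trying the newest API first and falling back - purely my own speculation, nothing ArenaNet has confirmed; the calls below are just the standard DirectX SDK ones:

Code:
// Speculative sketch only - not ArenaNet code. Needs the DirectX SDK headers
// and links against d3d10.lib / d3d9.lib.
#include <windows.h>
#include <d3d10.h>
#include <d3d9.h>

enum RenderPath { RENDER_DX10, RENDER_DX9, RENDER_NONE };

// Probe for the highest Direct3D level the system supports, newest first.
RenderPath DetectRenderPath()
{
    // Try to create a temporary D3D10 device; this fails on XP or on
    // cards/drivers without DX10 support.
    ID3D10Device* dev10 = NULL;
    if (SUCCEEDED(D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                    0, D3D10_SDK_VERSION, &dev10)) && dev10)
    {
        dev10->Release();   // we were only probing; the real device comes later
        return RENDER_DX10;
    }

    // Fall back to Direct3D 9 for XP users and older cards.
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d9)
    {
        d3d9->Release();
        return RENDER_DX9;
    }

    return RENDER_NONE;     // no usable Direct3D at all
}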
Just my guess.
isamu kurosawa
Desert Nomad
#27
I have to say I found Vista quite decent myself. The only issues I had were:

It tried to install drivers for various pieces of hardware twice. Not a big problem, but annoying.

Most importantly, though, there was a serious problem caused by a variety of things that made some people lose the ability to use USB storage devices. I got hit by that and went through three solutions found by other people before dropping back down to XP.
I run a variety of USB storage devices, so my system was crippled without them.

Try searching around the web and you will find a variety of issues in Vista that can cause this to happen. It may have been fixed in SP1, but I'll just stick with XP till I do another large upgrade.

I suggest people who buy a machine with Vista pre-installed give it a go, though, or at the very least dual-boot XP.
Rip_Snag
Ascalonian Squire
#28
Will GW run on a 64-bit Vista setup?

ty in advance
Snograt
rattus rattus
#29
Ooh, I just replied to a thread asking the very same question...

YES IT DOES!

Very happily, in fact
Rip_Snag
Ascalonian Squire
#30
ty ty, appreciate the help... and can't wait for that speed lol
Lord Sojar
The Fallen One
#31
Vista SP1 is fine and dandy. It is only slow if your PC is slow. Vista may be a bit bloated, but M$FT is improving on the bloat; SP1 did a lot for that issue.

I am going to be upgrading this PC I am on right now to Vista after June 20th. Ooops, I said June 20th. *cough*

Just kidding guys! But here is a little known fact...

We (nVidia) will be releasing our newest graphics solution. I present to you... the Geforce GTX 280 and Geforce GTX 260.

These new cards will feature the PhysX stream processor built directly into the PCB. In addition, the Geforce GTX 280 will feature 240 unified stream processors, while its younger brother, the Geforce GTX 260, will feature 192 unified stream processors.

Finally, the Geforce GTX 280 will feature a 512bit memory interface with 1GB GDDR3 onboard, and the Geforce GTX 260 will feature a 448bit memory interface with 896MB of GDDR3 memory onboard.

Excited? You should be! I was on the team that developed the fabs for the 280, so I am pretty proud of my new baby. Get ready for some amazing power regulation and heat reduction kids!
Snograt
rattus rattus
#32
Sell me PhysX...

If you can without breaking that NDA thang

I read nVidia were going to be incorporating it, but with the almost-zero take up of the PhysX boards, I wonder why.
Lord Sojar
The Fallen One
#33
Quote:
Originally Posted by Snograt
Sell me PhysX...

If you can without breaking that NDA thang

I read nVidia were going to be incorporating it, but with the almost-zero take up of the PhysX boards, I wonder why.
Sales and marketing isn't my thing, lol.

But, the CUDA enabled GPUs we make can use the PhysX code perfectly, because GPUs excel at mass number calculation. CUDA enables us to essentially use the leftover processing power (the parts of the GPU not being used for rendering) to perform very fast, accurate math, thus creating 3D physics for PhysX enabled games.

CUDA in and of itself is a self-contained C programming environment, used primarily for massive number crunching. GPUs excel (just like the Cell processor in the PS3) at number crunching, and they run at very high speeds, with hundreds of stream processors. CUDA programming allows us to use a GPU more like a CPU without affecting CPU or GPU overall performance. PhysX calculations would take a massive chunk of the CPU's time to do, but because the Geforce cards actually are never fully utilized, we can tap the extra "wiggle room" and force it to calculate physics code rather than just sitting there doing nothing.

If you are savvy about technology, I would advise you to read up on CUDA a bit more.
http://www.nvidia.com/object/cuda_home.html

The website explains a lot, and you can see what we and our partners use CUDA for. It is fairly interesting. Remember, my focus is the fab production methods (hardware), not software design. So, I know how to use some basic CUDA functions and can write code with it, but I am nowhere near as skilled as our software engineers are (those guys are geniuses)
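Just to give a flavour of what CUDA code looks like - this is a toy sketch I'm typing off the top of my head, not anything from the actual PhysX codebase - here is a kernel that nudges a pile of particles forward one timestep, one GPU thread per particle:

Code:
// Toy example only. Compiled with nvcc; uses only the standard CUDA runtime API.
#include <cuda_runtime.h>
#include <stdio.h>

// Each GPU thread advances one particle's position by velocity * dt.
__global__ void integrate(float* px, float* py, float* pz,
                          const float* vx, const float* vy, const float* vz,
                          float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                      // guard the last, partially filled block
    {
        px[i] += vx[i] * dt;
        py[i] += vy[i] * dt;
        pz[i] += vz[i] * dt;
    }
}

int main()
{
    const int n = 1 << 20;          // about a million particles
    const size_t bytes = n * sizeof(float);

    // Allocate position and velocity arrays on the GPU and zero them.
    float *px, *py, *pz, *vx, *vy, *vz;
    cudaMalloc(&px, bytes); cudaMalloc(&py, bytes); cudaMalloc(&pz, bytes);
    cudaMalloc(&vx, bytes); cudaMalloc(&vy, bytes); cudaMalloc(&vz, bytes);
    cudaMemset(px, 0, bytes); cudaMemset(py, 0, bytes); cudaMemset(pz, 0, bytes);
    cudaMemset(vx, 0, bytes); cudaMemset(vy, 0, bytes); cudaMemset(vz, 0, bytes);

    // Launch enough 256-thread blocks to cover every particle.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    integrate<<<blocks, threads>>>(px, py, pz, vx, vy, vz, 0.016f, n);
    cudaDeviceSynchronize();        // wait for the GPU to finish

    printf("integrated %d particles on the GPU\n", n);
    cudaFree(px); cudaFree(py); cudaFree(pz);
    cudaFree(vx); cudaFree(vy); cudaFree(vz);
    return 0;
}

The point is that all of those updates happen in parallel across the stream processors, which is exactly the kind of work a CPU would crawl through one element at a time.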
Brianna
Insane & Inhumane
#34
Quote:
Originally Posted by Rahja the Thief
Sales and marketing isn't my thing, lol.

But, the CUDA enabled GPUs we make can use the PhysX code perfectly, because GPUs excel at mass number calculation. CUDA enables us to essentially use the leftover processing power (the parts of the GPU not being used for rendering) to perform very fast, accurate math, thus creating 3D physics for PhysX enabled games.

CUDA in and of itself is a self-contained C programming environment, used primarily for massive number crunching. GPUs excel (just like the Cell processor in the PS3) at number crunching, and they run at very high speeds, with hundreds of stream processors. CUDA programming allows us to use a GPU more like a CPU without affecting CPU or GPU overall performance. PhysX calculations would take a massive chunk of the CPU's time to do, but because the Geforce cards actually are never fully utilized, we can tap the extra "wiggle room" and force it to calculate physics code rather than just sitting there doing nothing.

If you are savvy about technology, I would advise you to read up on CUDA a bit more.
http://www.nvidia.com/object/cuda_home.html

The website explains a lot, and you can see what we and our partners use CUDA for. It is fairly interesting. Remember, my focus is the fab production methods (hardware), not software design. So, I know how to use some basic CUDA functions and can write code with it, but I am nowhere near as skilled as our software engineers are (those guys are geniuses)
Looks like I'm building a new computer soon then..
Lord Sojar
The Fallen One
#35
Quote:
Originally Posted by Brianna
Looks like I'm building a new computer soon then..
Indeed, but from what I hear, AMD might be releasing some really crazy cards as well, so keep your eyes on their side of the court too. My position at nVidia doesn't mean I won't advocate other products; that's just my personal opinion. From what I hear, the new AMD cards use GDDR5 memory... they might be just what AMD needs to stay in the game at this point: highly competitive graphics cards with a low price tag. We shall see in early to mid June!
zamial
Site Contributor
#36
AMD, did I hear AMD? I'll bring the steaks and mallows if we are going to use those chips! At least those will cook 'em.
Serafita Kayin
Exclusive Reclusive
#37
Yeah, AMD has a 280-killer in the works. That GDDR5 is something to consider.

If you work there, can ya get a bro some sponsorship for his project? I swear it's worth your time...
demonblade
Academy Page
#38
For those who run on budget computers - they are not meant for graphics-intense gaming.
For those who run on high-end computers - try Vista 64-bit instead.
Cyb3r
Lion's Arch Merchant
#39
Say hello to Vista 64-bit.

I happily run 3 OSes on this PC. WinXP is for games like Freelancer and one other oldie, Quake 3 (both of those sadly don't work flawlessly in MP under Vista), for one coding tool that refuses to boot up in Vista at the moment (the Vista version is a work in progress ^^), and for 2 music apps which don't work at all in Vista. There are newer versions out, but I don't have the money to buy the upgrades at the moment, and I tried the demos and don't like the new versions either => another reason I keep XP.

Second OS: Ubuntu 64-bit - say hello to the penguin. All the coding except one compile is done under here (both my coding for Project Crosus (a huge mod manager/mod downloader tool) and Sirius Reborn (a mod for Freelancer)).

Third OS: Vista 64-bit, for newer games and other stuff ^^

And while there are things I don't like in all three OSes I use, one at least makes up for the other, so I'm happy.

Oh, and Rahja, when is nVidia releasing better drivers for Linux?
And I can't wait to see those new cards either.
Evil Genius
Lion's Arch Merchant
#40
Quote:
Originally Posted by Rahja the Thief
I am going to be upgrading this PC I am on right now to Vista after June 20th. Ooops, I said June 20th. *cough*

Just kidding guys! But here is a little known fact...

We (nVidia) will be releasing our newest graphics solution. I present to you... the Geforce GTX 280 and Geforce GTX 260.

These new cards will feature the PhysX stream processor built directly into the PCB. In addition, the Geforce GTX 280 will feature 240 unified stream processors, while its younger brother, the Geforce GTX 260, will feature 192 unified stream processors.

Finally, the Geforce GTX 280 will feature a 512bit memory interface with 1GB GDDR3 onboard, and the Geforce GTX 260 will feature a 448bit memory interface with 896MB of GDDR3 memory onboard.

Excited? You should be! I was on the team that developed the fabs for the 280, so I am pretty proud of my new baby. Get ready for some amazing power regulation and heat reduction kids!
Wow, you work for NVIDIA! I have always been an nVidia owner (Geforce 3, 5, 7950GX2, 8800 Ultra), and it sounds like it's time to sell my Ultra and tell my friends the release date is the 20th of June, lol.

Unfortunately those leaked pictures of the card don't say much about its performance, so can you leak any 3DMark 06 scores/Crysis benchmarks please?
(http://www.fudzilla.com/index.php?op...73&Itemid=34)

O and btw its not quite a "little known fact": tech sites have been going on for months about it, especially in recent weeks.