ATI to launch DirectX 11 GPUs in seven weeks (Sept 1x)

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
For the past week and a half, we've been hearing rumors citing various dates for the launch of AMD's DirectX 11 graphics cards… ranging from GDC in China to newly invented delays that would push the launch window into November.

However, all of these rumors are false, given that AMD has firmed up its "Cinema 3.0/DirectX 11" launch. As tradition goes, the North American launch will happen in seven weeks in San Francisco, at a very special place indeed - but we're not at liberty to say where the event will take place. All we can say is that it will be someplace that neither Intel nor nVidia would even remember.
http://www.brightsideofnews.com/news...ven-weeks.aspx

vamp08

Krytan Explorer

Join Date: Nov 2006

PA, USA

[COPY]

D/

wow, where is nvidia?

Oh, and on a second note: why is DX11 coming out when almost 95% of games are still on 9.0c?

own age myname

Desert Nomad

Join Date: Sep 2007

Minnesota

[TAS]

R/

Quote:
Originally Posted by vamp08 View Post
wow, where is nvidia?
I was thinking the same lol.

Faer

La-Li-Lu-Le-Lo

Join Date: Feb 2006

Quote:
Originally Posted by vamp08 View Post
...why is DX11 comming out when almost 95% of games are still in 9.0c?
Because it's used for more than just gaming.

kupp

Jungle Guide

Join Date: Feb 2008

The Shiverpeaks

[KISS]

W/

I kinda feel cheated, having shelled out a good 300 euros (easily 450 dollars for you American folks) on an Nvidia 9800GTX for DX10 a year ago. Now that DX11 is being released I'll have to buy a new card for it. And I'm not one of those people who rebuilds his computer, or even part of it, every year or so.

However, DirectX 11 games shouldn't be coming out any time soon *I think*.

On Volt

Ascalonian Squire

Join Date: Jan 2008

Ethereal Light of Dwayna

A/

I think they will, since W7 will be much more widely adopted than Vista was when it launched DX10.

Nanood

Wilds Pathfinder

Join Date: Aug 2005

Supermans Crystal Palace

Legion Of The Dark Sun

No DX11 games will appear for ages; it took just as long for decent DX10 software to show up. I guess the people who must have the latest will jump on board and ooh and aah for a while, and the rest of us will pick one up when our current card is up for replacement, at a much more affordable price.

Ec]-[oMaN

Desert Nomad

Join Date: May 2005

Toronto, Ont.

[DT][pT][jT][Grim][Nion]

W/

By the time a DX11 title comes out, you'd probably already want the successor to the upcoming generation of GPUs anyway... Come September is the perfect time to buy the fastest last-gen GPU for half its launch price.

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

dx11 capable hardware will likely see improvements in dx10/10.1 games. however, i feel graphics card power has outpaced software requirements for the most part. if you have a capable graphics card right now, there's no need to upgrade to the first generation of dx11 cards.

Bob Slydell

Forge Runner

Join Date: Jan 2007

Quote:
Originally Posted by Ec]-[oMaN View Post
By the time a dx11 title comes out you'd probably already want the successor to the upcoming release of GPUs anyways.....Come September is the perfect time to buy the fastest last gen GPU anyways for half the price since it was released.
I'm about to go out and get one of these cards I've drooled over for the past year, if that's true.

Brett Kuntz

Core Guru

Join Date: Feb 2005

There are 6 major DX11 games scheduled for release by the end of 2009!

And never compare anything to DX10. DX10 was a flop, 50% because it was Vista only, and 50% because nVidia demanded Microsoft nerf DX10 because they didn't have the technology to create some of the things DX10 was supposed to have, like hardware tessellation and multi-core rendering. That's why Microsoft had to create a special version of DX10 for ATI only, called DX10.1!

And in the end, DX11 cards will run DX10 games faster, just like 10 ran 9 and 9 ran 8 etc.

Elder III

Furnace Stoker

Join Date: Jan 2007

Ohio

I Will Never Join Your Guild (NTY)

R/

ATI sure sounds like a good deal this Autumn, although I'm waiting for some reputable sites to run a few benchmarks and reviews before I'll buy any - either for myself or for builds I do for other people.

Killamus

Guest

Join Date: Oct 2008

Quote:
Originally Posted by Kuntz View Post
There are 6 major DX11 games scheduled for release by the end of 2009!
Could you link to these? I've only been able to find 4, and none of them qualify as "major" titles. Most of the upcoming games are coming out as DX10.

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by Killamus View Post
Could you link to these? I've only been able to find 4, none of them qualify as "Major" titles. Most of the upcoming games are coming out as DX10.
Nope! The articles state 6 but do not specify which, and I don't care enough to investigate, since I am buying a 5870 regardless!

vamp08

Krytan Explorer

Join Date: Nov 2006

PA, USA

[COPY]

D/

Quote:
Originally Posted by Kuntz View Post
And never compare anything to DX10. DX10 was a flop, 50% because it was Vista only, and 50% because nVidia demanded Microsoft nerf DX10 because they didn't have the technology to create some of the things DX10 was supposed to have; like hardware tessellation and multi-core rendering. That's why Microsoft had to create a special version of DX10 for ATI only, called DX10.1!
ATI may (and I REALLY mean MAY) have had an edge over nvidia in terms of tech for dx10, but they have the worst first-party drivers in the business.

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

you, sir, need to update your knowledge. it is at least 5 years out of date.

refer

Jungle Guide

Join Date: Jan 2009

US

Quote:
Originally Posted by kupp View Post
I kinda feel cheated, having boasted out a good 300 euros, that's easily 450 dollars for you american folks on an Nvidia 9800GTX for DX10 a year ago. Now that DX11 is beeing released I'll have to buy a new card for it.
You have a problem. Will you see any real difference? Probably not. So why are you buying it? To keep up with the Joneses?

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by vamp08 View Post
ATI may (and I REALLY mean MAY) have had an edge over nvidia in terms of tech for dx10, but they have the worst first-party drivers in the business.
That couldn't be further from the truth; ATI's drivers are top quality. They're so good I don't even need to reinstall them if I change GPUs, change the number of GPUs I have, or even change which PCI-E slots those GPUs sit in.

The only people I hear having trouble with ATI's drivers are the same people who would have trouble with nVidia's drivers, because both companies offer pretty much the same "driver experience". I would know; I own and deal with them all, all the time, including their laptop variants. Both companies offer a "driver install" and a "driver uninstall", and if you do both correctly you'll never have a driver issue, unless you run into a rare bug with a specific game or scenario. Even in those cases, if you follow the #1 rule of "don't update unless you absolutely need to", it's unlikely you'll run into that problem either.

Tarun

Technician's Corner Moderator

Join Date: Jan 2006

The TARDIS

http://www.lunarsoft.net/ http://forums.lunarsoft.net/

Quote:
Originally Posted by Kuntz View Post
There are 6 major DX11 games scheduled for release by the end of 2009!

And never compare anything to DX10. DX10 was a flop, 50% because it was Vista only, and 50% because nVidia demanded Microsoft nerf DX10 because they didn't have the technology to create some of the things DX10 was supposed to have; like hardware tessellation and multi-core rendering. That's why Microsoft had to create a special version of DX10 for ATI only, called DX10.1!

And in the end, DX11 cards will run DX10 games faster, just like 10 ran 9 and 9 ran 8 etc.
I'm using an ATI DirectX 10.1 card and I've never been more pleased. Even the previous card I had in this system was ATI, and it outperformed the nvidia card I tried.

I especially liked this:

Direct3D 10.1 is an incremental update of Direct3D 10.0 which is shipped with, and requires, Windows Vista Service Pack 1. This release mainly sets a few more image quality standards for graphics vendors, while giving developers more control over image quality. It also adds support for parallel cube mapping and requires that the video card support Shader Model 4.1 or higher and 32-bit floating-point operations. Direct3D 10.1 still fully supports Direct3D 10 hardware, but in order to utilize all of the new features, updated hardware is required. As of June 16, 2009, only ATI's Radeon HD 4000 and HD 3000 series, NVIDIA's GeForce 200M series and S3's Chrome 4xx GTX series of GPUs are fully compliant; NVIDIA has yet to release a DirectX 10.1 compliant desktop card. - Source

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

not to mention, the company that holds the "worst first-party drivers in the business" would be Creative (assuming you meant tech companies in general).

Omega X

Ninja Unveiler

Join Date: Jun 2005

Louisiana, USA

Boston Guild[BG]

W/Me

Quote:
Originally Posted by Tarun View Post
NVIDIA has yet to release a DirectX 10.1 compliant desktop card. - Source
Nvidia stated that they were skipping DX10.1 a long time ago as it only makes certain things mandatory from the 10.0 spec.

DX10's failure was all down to Vista's bad rap, with the press chewing it up. Regardless, there were a couple of good games that looked and played better in DX10. DirectX 11 is largely backwards compatible with DX10 hardware, so there's no need to shell out for a new graphics card if yours isn't that old, unless you've just got to have Shader Model 5.0 and hardware tessellation that badly. A lot of DX10 ATI cards already have a tessellation capability anyway.

Hopefully with Windows 7's success, we can finally see game makers take full advantage of the API instead of just god rays, higher-resolution textures and volumetric smoke.

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

There is a lot more to the story when it comes to us not adopting DX10.1.

It wasn't so much an issue of us being lazy or stiffing the customer. It was purely that DX10 was seen as a roadblock to R&D, and the feature sets and runtime code were presented too late (if at all) to our development teams. By the time Microsoft actually allowed a full code release of Vista and DX10, it was too late. Blame Microsoft for that one.

They have taken a different approach with Windows7, and it will show in card release timing, driver stability, and support by game developers this time around.

Thankfully, Microsoft got a clue about how to present code and deal with Ring0 implementation. You can't just drop your baggage at the developers' doorstep and say "pick that up, would you?" They learned their lesson, and are taking steps to fix it.


As for driver issues...

Let me break some shocking news on that front: we and ATi are fed up with game companies expecting us to add driver fixes for their shady game engine code. A few companies that come to mind are Activision, Blizzard (by far the worst...), EA Europe, Capcom, LucasArts, Rockstar, and Bethesda. All of these companies are notorious for leaving significant holes in their code that they expect us to fill, because they can get away with it.

There is an unbelievable number of driver fixes for WoW, Oblivion, and GTA4, to name a few (we are talking millions of lines of code per game). This isn't just us, but ATi as well (though I don't know ATi's specific numbers). Game developers need to take more responsibility for their shady coding techniques and the shortcuts they take that our driver teams have to clean up. If you play WoW, for example, you'll know how terribly Dalaran is coded... and the vast majority of Northrend, for that matter.

But not all developers are bad eggs when it comes to code. Want to know one of the best companies in the world for engine design? iD Software! They make impeccable engines that require little driver optimization. More developers should take lessons from companies like iD and Epic Games (developers of the Unreal engine).


As for nVidia in this whole launch issue:

We are having some issues with our chip design and TSMC's 40nm process; it's true. Yields are low (I cannot discuss details on numbers at this time), and there isn't much we can do about it but do an A2 spin. This issue will be corrected, and we think that GT300 will prove to be an excellent purchase option for Windows7 users who want to get the most out of their OS and gaming experience.

Suffice to say, ATi is experiencing similar yield issues, but I cannot and will not speak for their release time frames. They have a radically different chip design, with more PCB focus, whereas GT300 is more chip-centric. It is our belief that GT300 will prove to be the best card you can buy, and will provide more than enough performance for even the most demanding enthusiast.

In a nutshell... we are confident that GT300 will prove to be an excellent solution and will take better advantage of the Windows7 environment while taking next-generation gaming to the next level. Because we have a chip that can do more than just graphics.

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

i have to agree about bethesda. fallout 3 and oblivion are possibly the buggiest games i've ever played (even though they're also two of the greatest games ever created. go figure).

Brett Kuntz

Core Guru

Join Date: Feb 2005

Fallout 3 was awesome, easily my favorite game in a long time! I just bought it yesterday and am going to play through it again (try-before-you-buy: I downloaded it for my first playthrough!).

And the driver support issue will never go away. Designers create horrible engines, which leaves room for one driver company to create a special mod, just for that game, that raises frame rates. That company then gets to raise its arms in the air and scream "our GPU is faster... see, here's proof!", and the other company has to do the same so it doesn't look like it has a poorly performing GPU. In some cases these optimizations decrease image quality, which is "cheating" frame rates, and both companies have been caught cheating/lowering image quality in 3D benchmarks like 3DMark06. =\

Blackhearted

Krytan Explorer

Join Date: Jan 2007

Ohio, usa

none

Mo/

Quote:
Originally Posted by Rahja the Thief View Post
There is an unbelievable amount of driver fixes for WoW, Oblivion, and GTA4 to name a few (we are talking millions of lines of code per game) This isn't just us, but ATi as well (though I don't know ATi's specific numbers). Game developers need to take more responsibility for their shady coding techniques and the shortcuts they take that our driver teams have to attempt to clean up. If you play WoW, for example, you would know how terrible Dalaran is coded... and the vast majority of Northrend for that matter.
Ha, yeah. The poor coding was pretty obvious in Dalaran. That area, and several other areas in Northrend, run amazingly badly. I also agree that devs need to be less lazy.

riktw

Wilds Pathfinder

Join Date: Jul 2008

netherlands

Mo/E

aw well, i got a second-hand HD4870X2 for 150 euro, so i will wait until i can get an HD5870 or something for 150 euro.
but my HD4870X2 plays everything very well atm.
and for drivers: ati's control center sucks and nvidia's control panel sucks, but for ati you can download the drivers only (20MB).

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

nvidia's control panel is better, purely because you can have profiles bound to individual applications. CCC does not have this ability. i could go and get ATI Tray Tools, but i would really appreciate it if the official software could give me that ability.

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by moriz View Post
nvidia's control panel is better, purely because you can have profiles bound to individual applications. CCC does not have this ability. i could go and get ATI Tray Tools, but i would really appreciate it if the official software could give me that ability.
Uhh, ATI has supported application detection for as far back as I can remember. I've never personally used it, but it's always been there, even before CCC existed, when all your options were still in your Display Properties.

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

that's not actually how it works. it does basically the backward version of what i want: it loads an application IF i load a profile. i want it to load a profile when i load an application.

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by moriz View Post
that's not actually what it looks like. it does basically the backward version of what i want: it loads an application IF i load a profile. i want it to load a profile when i load an application.
You can either hotkey a profile, create a desktop icon (which in turn executes an application/game), or use the little right-click menu. I'm not sure why they don't include application detection outright, though I'm pretty sure they did at one point. I've always just used the defaults for the most part, plus in-game options.

moriz

Über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

i basically want the card to underclock itself whenever i open GW. let's face it, i don't need my 4890 running 900/1000 to maintain 60 fps in GW. heck, my old 4850, with its original buggy vbios, ran GW flawlessly while stuck in "low power 3D" for a month before i noticed.

Blackhearted

Krytan Explorer

Join Date: Jan 2007

Ohio, usa

none

Mo/

Quote:
Originally Posted by moriz View Post
nvidia's control panel is better, purely because you can have profiles bound to individual applications. CCC does not have this ability. i could go and get ATI Tray Tools, but i would really appreciate it if the official software could give me that ability.
I agree quite a bit with that; it's one thing I missed when I got this 4850, the simple-to-use per-application profiles. I don't see why ati can't make that.

Tarun

Technician's Corner Moderator

Join Date: Jan 2006

The TARDIS

http://www.lunarsoft.net/ http://forums.lunarsoft.net/

I've found nvidia's "control panel" to be a piece of junk. On every system I've seen it on, it lags the machine when opening. I have yet to see that issue when opening ATI's control panel, and CCC works quite well, even on old systems.

Brett Kuntz

Core Guru

Join Date: Feb 2005

With ATI, you can just install the display drivers too. So if you don't really use the CCC for anything, there is no point installing it.

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Quote:
Originally Posted by Tarun View Post
I've found nvidia's "control panel" to be a a piece of junk. Every system I've seen it on causes the system to lag when opening. Have yet to see that issue when opening ATI's Control Panel, and CCC works quite well, even on old systems.

It polls system information before it opens; that's what causes the delay. Unfortunately, there's no way around that.

Tarun

Technician's Corner Moderator

Join Date: Jan 2006

The TARDIS

http://www.lunarsoft.net/ http://forums.lunarsoft.net/

There is: it's called async threading and better programming, something nvidia is lacking. Just look at those horrid drivers.

Lord Sojar

The Fallen One

Join Date: Dec 2005

Oblivion

Irrelevant

Mo/Me

Quote:
Originally Posted by Tarun View Post
There is, it's called async threading and better programming; something nvidia is lacking ,just look at those horrid drivers.
Our drivers are very well programmed. There are so many factors involved with modern driver releases, and laziness on the game developer and OS developer side hasn't helped drivers in the last few years. Suffice to say, Windows7 may change that a bit (but that is totally up to MSFT not being complete morons this time around).

Brett Kuntz

Core Guru

Join Date: Feb 2005

Quote:
Originally Posted by Rahja the Thief View Post
We are having some issues with our chip design and TSMC's 40nm process; it's true. Yields are low (I cannot discuss details on numbers at this time), and there isn't much we can do about it but do an A2 spin. This issue will be corrected, and we think that GT300 will prove to be an excellent purchase option for Windows7 users who want to get the most out of their OS and gaming experience.

Suffice to say, ATi is experiencing similar yield issues, but I cannot and will not speak for their release time frames.
ATI has already released two generations of 40nm chips, including the 4770, and is currently fabbing their 3rd generation, which will be the 58xx series in September. They did have issues with 40nm in the first generation, which led to yield/stock problems for the 4770, but the 2nd generation proved successful and the 4770 is now well stocked across all stores. It appears their 3rd-generation 58xx's are going as planned and their September launch will be fine.

nVidia has yet to complete even one generation of 40nm, so they are a little behind in that respect, which is why they are facing a six-month delay in release.

http://service.futuremark.com/hardware

Due to nVidia's huge die size, I think they will lose this generation once again. In fact, there was a great article written by an nVidia engineer on how they learned from the GT200 series that having a huge die size is actually a bad thing. But he also said GPUs are engineered in overlapping multi-year cycles, so the GT300 was being engineered before the GT200 was ever really released. They found out too late that large die sizes are detrimental to the performance and success of a chip.

4890 price: $195
GTX 280 price: $345

Both cards are pretty similar in performance. The GTX 280's die size and large memory bus width are what drive the cost up. Large dies have poor yields and cost more to manufacture, so the cost per good chip climbs much faster than the area does.
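The yield-versus-die-size point can be made concrete with the classic Poisson yield model. The defect density and die areas below are illustrative guesses for a rough 4890-class versus GT200-class comparison, not actual TSMC figures:

```python
import math

# Classic Poisson yield model: yield = exp(-defect_density * die_area).
# Defect density and die areas here are illustrative guesses, not TSMC data.
def die_yield(area_mm2, defects_per_mm2=0.005):
    return math.exp(-defects_per_mm2 * area_mm2)

# Rough small-die vs large-die areas, in mm^2
for area in (256, 576):
    y = die_yield(area)
    # cost per *good* die scales roughly with area / yield
    print(f"{area} mm^2: yield ~{y:.0%}, relative cost per good die ~{area / y:.0f}")
```

With these example numbers, a die 2.25x larger comes out roughly an order of magnitude more expensive per good chip, which is the sense in which cost grows far faster than area.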

There are two ways to increase memory bandwidth: place more physical memory chips on your PCB so you can access more of them at once, or increase the speed the memory runs at.

nVidia decided to use a 512-bit bus width to access 16 64MB GDDR3 chips. Each chip has a 32-bit interface, so the GPU itself needs 512 data pins to hook up to all 16 chips, which increases the size and complexity of the GPU. This gave it a total of 1024MB of memory at ~140GB/s of bandwidth.

ATI decided to build a smaller, cheaper, simpler, easier-to-engineer chip with a 256-bit bus width. That made its die size less than half that of the nVidia part, but also meant it could only get half the memory bandwidth per clock. This is why ATI used GDDR5: GDDR5 delivers twice the bandwidth of GDDR3 and GDDR4. So in the end, both cards get roughly the same memory bandwidth, except the ATI GPU costs far less to manufacture, it's smaller, and ATI can put 2 or even 3 GPUs on a single PCB to increase performance for enthusiasts.
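As a sanity check on the "roughly the same bandwidth" claim: peak memory bandwidth is just bus width times data rate. The data rates below are the commonly cited reference-card figures, used here only as examples:

```python
# Peak bandwidth in GB/s = (bus width in bits / 8 bits per byte) * data rate in GT/s.
def bandwidth_gbs(bus_bits, data_rate_gts):
    return bus_bits / 8 * data_rate_gts

# GTX 280: 512-bit bus, GDDR3 at ~2.2 GT/s effective (double data rate)
print(f"GTX 280: ~{bandwidth_gbs(512, 2.214):.1f} GB/s")
# HD 4870: 256-bit bus, GDDR5 at 3.6 GT/s effective (quad-pumped 900 MHz)
print(f"HD 4870: ~{bandwidth_gbs(256, 3.6):.1f} GB/s")
```

Half the bus width at roughly double the per-pin rate lands the two cards in the same ballpark, on a much cheaper die.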

Tarun

Technician's Corner Moderator

Join Date: Jan 2006

The TARDIS

http://www.lunarsoft.net/ http://forums.lunarsoft.net/

Quote:
Originally Posted by Rahja the Thief View Post
Our drivers are very well programmed
Stop lying, nvidia's drivers are horrid.

A bit old, but still very true:


Quote:
Nearly 30% of logged Vista crashes were due to NVIDIA driver problems, according to Microsoft data included in the bundle. That's some 479,326 hung systems, if you're keeping score at home, and it's in first place by a large margin -- and ATI is fourth with 9.3 percent.
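Taking the quoted figures at face value, the implied totals work out as below (treating "nearly 30%" as exactly 0.30, so these are ballpark estimates only):

```python
# Back-of-envelope estimates from the quoted Microsoft crash data.
nvidia_crashes = 479_326
total_logged = nvidia_crashes / 0.30   # implied total of logged Vista crashes
ati_crashes = total_logged * 0.093     # ATI's quoted 9.3% share
print(round(total_logged))  # roughly 1.6 million logged crashes
print(round(ati_crashes))   # roughly 150 thousand attributed to ATI
```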

Brett Kuntz

Core Guru

Join Date: Feb 2005

Ahah, that's gotta be all those laptops with the nVidia solder-joint problems they kept lying about, until the INQ busted them using an electron microscope or something. Apple also canceled their contract and is trying to sue for their money back, since nVidia basically bricked an entire generation of Apple laptops.