Killamus
Stop lying, nvidia's drivers are horrid.
A bit old, but still very true: |
Faer
You're seriously going to fault Nvidia for their drivers when Microsoft crashes ITS OWN SOFTWARE 18% of the time?
I'd like to see a similar graph for XP, we all know Vista was Microsoft's idea of a bad joke. Or, at least, I hope it was. |
Lord Sojar
Ahah that's gotta be all those laptops with nVidia solder joint problems they kept lying about, and INQ busted them using an electron microscope or something. Apple also canceled their contract and is trying to sue for their money back, since nVidia basically bricked an entire generation of Apple laptops.
|
N E D M
Fril Estelin
Our drivers are very well programmed. There are so many factors involved with modern driver releases, and laziness on the game developer and OS developer side hasn't helped drivers in the last few years. Suffice it to say, Windows 7 may change that a bit (but that is totally up to MSFT not being complete morons this time around)
|
Killamus
Tarun is right, Rahja. I've got nVidia now but used to have ATI before, and the lack of responsiveness and stability is tangible, although not an insurmountable obstacle to usability. I very clearly get the feeling that nVidia's drivers, although not "bad" software, could get a lot better if programmed in a more "intelligent" and cautious manner as to asynchronous polling. ATI drivers generate fewer troubleshooting threads and complaints on online forums, while we hear quite a few about nVidia's.
|
Brett Kuntz
Ok, there is entirely too much misinformation at this point...
The issue with the mobile division's processors wasn't solder joint issues... it was faulty cobalt doping. The solder issue you speak of was a completely different issue, and that was a result of our board partners, not us. The faulty cobalt doping resulted in higher than normal heat fluctuation, and compounded with a soldering method that didn't compensate for that, things crumbled into that bad situation. The cobalt doping issue is why we took a one-time expense to fix the issues. We made good on fixing the issue, so it is really a moot point. We never lied about the issue, but we did argue the end reason for the issue. People act like we released a product knowing full well that would happen. That is the farthest thing from the truth, and it is really insulting that you even suggest that. The people I work with are outstanding individuals who are working to advance the graphics industry, not ruin it. |
The problem is the same one we told you about almost a year ago: the GT200 die is too damn big. Even with a shrink, it is still too damn big, almost twice the size of its closest rival, the ATI RV770/4870. With poor yields and an expensive board, the card loses to its much more economical rival from ATI. This means Nvidia has to fight a price war against an opponent with lower costs. |
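(A rough back-of-the-envelope sketch of why die area drives cost so hard; the wafer cost, defect density, and die sizes below are illustrative assumptions, not foundry figures.)

```python
import math

def cost_per_good_die(die_area_mm2, wafer_cost, defects_per_mm2, wafer_diameter_mm=300):
    # Crude gross-die count: wafer area / die area, ignoring edge loss and scribe lines.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = wafer_area / die_area_mm2
    # Simple Poisson yield model: a bigger die is exponentially more likely to catch a defect.
    yield_rate = math.exp(-defects_per_mm2 * die_area_mm2)
    return wafer_cost / (gross_dies * yield_rate)

# Hypothetical numbers, purely to show the shape of the argument:
big   = cost_per_good_die(die_area_mm2=470, wafer_cost=5000, defects_per_mm2=0.002)
small = cost_per_good_die(die_area_mm2=260, wafer_cost=5000, defects_per_mm2=0.002)
print(f"~${big:.0f} vs ~${small:.0f} per good die")  # roughly 2.5-3x the cost for the bigger chip
```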
While you can expect ATi's offering to be smaller in physical size, do not expect it to be nearly as small as the previous generation. They effectively doubled the amount of shader units, which isn't even mentioning other additions to the chip. |
Lord Sojar
And you think you stand a chance against ATI this round (58xx vs GT300)?
|
Brett Kuntz
I find it laughable that you even cite the Inquirer as a source. After this post, I'm done even arguing this or any other similar topic, seeing where you get your "news" from. It is pointless to argue with anyone who actually cites that man (Charlie) as a source, seeing as how he is the most biased, most anti-nVidia, anti-physics, anti-cGPU, anti-everything-else-our-company-tries-to-do person to ever grace the universe. It's like citing a vegan for a study on meat... the logic is so totally flawed and backwards... perhaps if he took his head out of his ass for 10 seconds, he might actually be worthy of being, at best, a FOX News "reporter".
|
Your math on transistor count is wrong. Total die area is not directly proportional to process node nor wafer area. Primary and co-processor design is key here. You have to take into account cache levels, chip layout, access areas, and unit type. You aren't including any of that in your "math", and neither does Charlie (how interesting!). You are only thinking of area, not density. Transistor density and implementation are far more critical here than chip area. The doping methods for the 40nm node are radically different, and heavier. This drastically cuts into transistor density, and that is a huge factor you have chosen to ignore (I have no idea why...)
|
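(To put the area-versus-density point in numbers: the transistor counts and die sizes below are approximate figures widely reported for that generation, so treat them as ballpark only.)

```python
# Million transistors and die area (mm^2), approximate public figures for illustration:
chips = {
    "GT200 (65nm)": (1400, 576),
    "RV770 (55nm)": (956, 256),
}

for name, (transistors_m, area_mm2) in chips.items():
    density = transistors_m / area_mm2  # million transistors per mm^2
    print(f"{name}: ~{density:.1f} M transistors / mm^2")
```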
Our company is not filled with the bumbling idiots you would like to believe it is.
|
There are not millions of wasted transistors in ATi's design either... you are off your rocker if you think ATi would ever waste that many transistors.
|
Here is some news for you... GT300 is so radically different from GT200 in architecture and design implementation, you can't even logically compare them. You are trying to compare a motorcycle against a car.... you don't do that... Charlie attempted to compare GT300 to Larrabee... that is just hilarious and outright stupid. He is an idiot, and taking anything he says to heart is just as stupid. He is already known to be the laughing stock of the tech reporting community.
|
Here is some really bad news for you... if ATi holds to that release date, and we (nVidia) continue to have yield issues with the TSMC 40nm process... guess what ATi is going to do with their prices... they are going to raise them to kingdom come. Why? Oh, because they can! With no competition, ATi will scrape up every single dollar they can to patch their already terrible quarterly reports. You think nVidia is the only company that does that? HA HA HA. You just wait... come the end of October, you will all be complaining that the prices of the new ATi cards are outrageous, just like you did with GT200 when it released. Companies are not here to please you, they are here to make money. Try to remember that before you go fanboy mode.
|
Lonesamurai
Lord Sojar
ATI could do that, but then they would alienate people into waiting until your chip comes out, and that's probably not a good thing. In fact, that's exactly why the nVidia FUD (Fear, Uncertainty, Doubt) machine is going full tilt. Your best move at this point is to convince as many people as possible not to buy ATI's cards in September.
What ATI will do, however, is release a very inexpensive chip, because they can, since their chip is tiny and cheap to make, and they've been stockpiling 1GB GDDR5 for quite some time. They will do this because everyone who buys ATI in September 2009 won't buy nVidia in March 2010. ATI could only lose if they released expensive cards. There is no strategy in releasing expensive cards, especially when ATI's cards are so cheap to manufacture due to their tiny die size. |
Brett Kuntz
In the world of fantasy land, where businesses that are in the red and have an advantage don't price gouge, sure! Unfortunately, we don't live in fantasy land, we live in the real world.
ATi is in dire straits, and they will price gouge like crazy with no competition. ATi isn't holier than any other company, and any other sane company would do the same. |
Oh, and as I mentioned previously... ATi's newest upper tier cards don't have the smallest die size in the world... unfortunately. Theirs got significantly larger, ours shrank. They are riding their architecture... it will work this generation, but then it's game over.
|
I hope for your sake, ATi suddenly becomes more interested in making people happy than profits... but the truth is, any company that does that goes under. ATi cannot afford to be nicey nice to their customers. If they alienate them, who cares? You either buy a new ATi DX11 card, or you go without. Not much of a choice...
|
Tarun
Lord Sojar
Brett Kuntz
This is pretty funny... you condemn one company for practicing cGPU technology from one side of your mouth, then turn around and praise another? Wow... great work there!
Larrabee is hilarious, at best. It is going to be the biggest flop the graphics card industry has seen in years, and the laughing stock of the tech industry as a whole. If you actually knew anything about its architecture and Intel's pipe dreams with regards to that, you would laugh too. Sorry, now I am really done with this conversation! LOLOLOLOL This is too much! I told a friend of mine who actually works for ATi's chip integration division that you said that, and he laughed too. Absolutely classic example of naivete; assuming that because something sounds good in context, it will be popular and work. Communism sounds great on paper too, but we all know how it plays out in real life.... |
Divinus Stella
riktw
moriz
Brett Kuntz
AMD's RV800 series chips will probably be significantly bigger. If memory serves me correctly (and the rumor source I read was correct), it will have around 2000 stream processors. If there's no significant architecture change, then the RV800 chip will probably end up around 350 mm^2 at least.
|
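(For what it's worth, here is one hypothetical way to land near a ~350 mm^2 guess starting from the RV770 baseline; every number below is an assumption, and real layouts don't scale this linearly.)

```python
rv770_area_mm2 = 256          # approximate RV770 die size
shader_ratio   = 2000 / 800   # rumored SP count vs RV770's 800 SPs
scaling_share  = 0.75         # assume only ~75% of the die grows with shader count
shrink_factor  = 0.65         # assumed 55nm -> 40nm area scaling; the ideal (40/55)^2 ~= 0.53 is never reached

estimate = rv770_area_mm2 * (scaling_share * shader_ratio + (1 - scaling_share)) * shrink_factor
print(f"naive estimate: ~{estimate:.0f} mm^2")  # lands around 350 mm^2 with these assumptions
```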
moriz
riktw
http://www.anandtech.com/showdoc.aspx?i=3573
-180mm^2 die size (4870 was 260mm^2, nVidia's was the size of the Titanic) |
Brett Kuntz
Ah, I guess my source was really off then. However, are we sure that picture is of the RV870, and not a more mainstream chip (RV840)?
I'm quite familiar with AMD's "small chip" strategy. However, using a dual-GPU solution for the very high end can have its drawbacks. The HD4870X2 seems to be completely at the mercy of drivers; if a game doesn't have a CrossFire profile, it won't gain the performance benefit. Of course, I still remember the HD4850 debut, and its +105% CrossFire scaling in COD4. Yikes. |
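(Just to spell out what "+105% scaling" means in practice; the frame rates below are made up for illustration.)

```python
single_fps = 40.0
crossfire_fps = 82.0  # hypothetical result with a working CrossFire profile
scaling = (crossfire_fps / single_fps - 1) * 100
print(f"+{scaling:.0f}% scaling")  # +105%: the second GPU more than doubles the frame rate
```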
Lord Sojar
And Rahja, maybe I should PM this, but what kind of college have you done? Working at ATI or nVidia looks like a great job, as I do way too much with electronics already.
And I guess it's not an easy college, but I can still try |
What I don't get is that two 6-pins = one 8-pin......
And a 6-pin can easily handle 24 amps (288 watts), yet the stupid PCI-E spec limits it to just 75 watts..... And an 8-pin adds no more voltage lines, and yet the rating magically grows to 150 watts.... lol. Basically a 6-pin connector could power all the world's GPUs, including the X2's. |
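(The arithmetic behind that complaint, sketched out; the 24 A figure is the post's own assumption about what the connector's 12V pins can physically carry, not anything taken from the PCI-E spec.)

```python
volts = 12.0
amps_assumed = 24.0                           # e.g. 3 x 12V pins at ~8 A each, as the post assumes
print(volts * amps_assumed)                   # 288.0 W of theoretical headroom on a 6-pin
print({"6-pin spec": 75, "8-pin spec": 150})  # watts actually allowed by the PCI-E spec
```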
riktw