Rahja's FREEEEEEEE!!!!

Lord Sojar
The Fallen One
#1
At midnight (EST, -5 GMT), I will fill in the blanks.... MUHAHAHAA.

GF100 outperforms ATi's 5870 by 46% on average
GF100 outperforms ATi's 5970 by 8% on average

The GF100 gets 148 fps in DiRT2
The GF100 gets 73 fps in Crysis2
The GF100 gets 82 fps in AvP3
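As a sanity check on the claimed averages, the per-game figures can be back-solved: if GF100 really leads the HD 5870 by 46% on average, the implied 5870 frame rates follow by simple division. A rough sketch (illustrative arithmetic only; a per-game lead rarely matches the overall average, and none of the implied numbers below come from the leak itself):

```python
# Back-calculate implied HD 5870 frame rates from the claimed GF100 figures
# and the claimed 46% average lead. Illustrative arithmetic only.

gf100_fps = {"DiRT2": 148, "Crysis2": 73, "AvP3": 82}
claimed_lead = 0.46  # "outperforms ATi's 5870 by 46% on average"

implied_5870 = {game: fps / (1 + claimed_lead) for game, fps in gf100_fps.items()}
for game, fps in implied_5870.items():
    print(f"{game}: implied HD 5870 ~ {fps:.0f} fps")
```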

*GTX misnomers removed due to Business NDA*

GF100's maximum load temperature is 55 C.

The release week of GF100 is March 2nd

Blackberry ordered about a million Tegra2 units for their 2011 smartphone.
Apple ordered a few million Tegra2 units for the 2011 iPhone.
Nintendo ordered several million Tegra2 units for their next gen handheld (the DS2!)

*Removed: Under business NDA*

That's all for now, kiddies! See if you can guess for the time being; each - represents a letter or number

<font color="red" size="3">Extra spoilers!</font>
  • GF100 and GF104 will feature a new 32x Anti Aliasing mode for enthusiast setups.
  • GF100 and GF104 can do 100% hardware-based decoding for 1080p Blu-ray and H.264 playback.
  • GF100 and GF104 feature full SLi capability and a new scaling method for rendering!
  • GF100 and GF104 will provide full on-chip native C++ operation for Windows and Linux environments. This will be further augmented with CUDA and OpenCL.
  • GF104 will feature new technology designed for UHD OLED monitors!
  • GF100 promises to deliver at least 40% more performance than the GTX295 for less money. GF104 promises double that.
End
Forge Runner
#2
YEY

12 chars of yeyness



On a side note... what are you going to do now without a job? Dedicate yourself wholly to the gwguru Technician's Corner?
Ariena Najea
Silence and Motion
#3
It's over 9000!

Also, congrats sort of!
Lord Sojar
The Fallen One
#4
Quote:
Originally Posted by End View Post
YEY

12 chars of yeyness



On a side note... what are you going to do now without a job? Dedicate yourself wholly to the gwguru Technician's Corner?
Go back to school? DUHH
MisterB
Furnace Stoker
#5
Quote:
Originally Posted by Rahja the Thief View Post
At midnight (EST, -5 GMT), I will fill in the blanks.... MUHAHAHAA.

Nvidia's cookies outperform ATi's brownies by 24.89% on average
HAL9000 outperforms ATi's 5870 by 9001% on average

*Removed reference to removed content*


Rahja's maximum load temperature is 40 °C.

The release week of the apocalypse is in 2012, the week of December 22nd. But it's all over on the 21st.

Cyberdyne Systems ordered about a million T-102 units for their 2011 smartphone.
Microsoft ordered a few million Playstation 2 units for the 2011 XBoxRetro.
The US Army ordered several million Stinger 2 units for their next gen handheld.

Philip Morris USA is planning to move their primary focus to the juvenile market and the infant & toddler market.

That's all for now kiddies! See if you can guess for the time being, each - represents a letter or number
Mad libs are fun. I bet all my "guesses" are wrong.
End
Forge Runner
#6
Quote:
Originally Posted by MisterB View Post
Mad libs are fun. I bet all my "guesses" are wrong.
Quote:
The release week of Guild Wars 2's release
Fixed the release date

side note again: over 150 people viewing a Technician's corner thread O.o...
You'd think this was riverside...
Celestial Crown
Frost Gate Guardian
#7
What are the yields at now, 5%?

I like how you don't provide the hardware specs of the system for those benches. Hell, you don't even list resolution, and you're using games that aren't out yet.

Fermi is going to flop. AMD currently holds every single market segment and Nvidia will only take away the high end at a ridiculous price. AMD will just release a higher end single GPU (5890 or something) and still contend in the enthusiast market.

"The architecture is broken, badly designed, and badly thought out. Nvidia does not understand the basics of modern semiconductor design, and is architecting its chips based on ego, not science. The era of massive GPUs is long over, but Nvidia (mis-)management doesn't seem to want to move their egos out of the way and do the right thing. Now the company is left with a flagship part it can't make."

http://semiaccurate.com/2009/12/21/n...-fermi-448sps/
Lord Sojar
The Fallen One
#8
Quote:
Originally Posted by Celestial Crown View Post
What are the yields at now, 5%?

I like how you don't provide the hardware specs of the system for those benches. Hell, you don't even list resolution, and you're using games that aren't out yet.

Fermi is going to flop. AMD currently holds every single market segment and Nvidia will only take away the high end at a ridiculous price. AMD will just release a higher end single GPU (5890 or something) and still contend in the enthusiast market.

"The architecture is broken, badly designed, and badly thought out. Nvidia does not understand the basics of modern semiconductor design, and is architecting its chips based on ego, not science. The era of massive GPUs is long over, but Nvidia (mis-)management doesn't seem to want to move their egos out of the way and do the right thing. Now the company is left with a flagship part it can't make."

http://semiaccurate.com/2009/12/21/n...-fermi-448sps/
Charlie is a fool. He has a journalism degree.

Games that aren't available to the public but use DX11... HMMM, why would I use DX11 games? That's a tough one....isn't it? Any other gems of wisdom you would like to spew? I revealed this info because I am no longer employed by nVidia and several of the technical NDAs expired at midnight, seeing as how Jan 4th was my last day. The business NDAs still stand.

The AvP3 and Crysis2 numbers can be taken with a grain of salt, since those games and their drivers aren't finalized yet. DiRT2 may improve or decline slightly, depending on final drivers.

Those tests were run using a Core i7 920, 6GB of DDR3-1333, and an Intel 64GB SSD paired with a single GF100 card. The tests were run at 1920x1200 with 4x SSAA and 16x AF. Happy? If you want more details, wait for a benchmarking site to run benchmarks like everyone else. I am spoiling you as it is. Don't bite the hand that feeds.
Celestial Crown
Frost Gate Guardian
#9
Quote:
Originally Posted by Rahja the Thief View Post
Charlie is a fool. He has a journalism degree.
Games that aren't available to the public but use DX11... HMMM, why would I use DX11 games? That's a tough one....isn't it? Any other gems of wisdom you would like to spew? I revealed this info because I am no longer employed by nVidia and several of the technical NDAs expired at midnight, seeing as how Jan 4th was my last day. The business NDAs still stand.

The AvP3 and Crysis2 numbers can be taken with a grain of salt, since those games and their drivers aren't finalized yet. DiRT2 may improve or decline slightly, depending on final drivers.

Those tests were run using a Core i7 920, 6GB of DDR3-1333, and an Intel 64GB SSD paired with a single GF100 card. The tests were run at 1920x1200 with 4x SSAA and 16x AF. Happy?

No.

The number of SPs was cut. Nvidia is a horrible company; just look at what they did with Batman: Arkham Asylum. They locked the AA code to be vendor specific, so it disables when an ATI card is detected. This was proven by changing the VendorID to Nvidia on an ATI card, and then magically AA worked in the demo. Disabling PhysX when a card other than an Nvidia one is the primary was also a bad business move; now people are just bypassing it with modified drivers. I could go on and on about why Nvidia has a bad business model, but surely you know from working there. Oh, I almost forgot the fake Fermi that Mr. Jen-Hsun Huang displayed, which Nvidia claimed was real and later had to retract that statement. Oh, guess what else... REBRANDS. Nvidia rebrands everything (8800 = 9800 = GTS 250), and the GTX 3xx cards or whatever are just rebrands of the GTX 2xx mobile cards. Epic failure on Nvidia's part.

You should also mention the ambient temperature at which you measured those 55C load temps. Are you running it outside in some remote location with an ambient of -20C? There's no way a card with 225W of power draw will run at 55C under load, judging by how the stock cooler looked.
Lord Sojar
The Fallen One
#10
Quote:
Originally Posted by Celestial Crown View Post
No.

The number of SPs was cut. Nvidia is a horrible company; just look at what they did with Batman: Arkham Asylum. They locked the AA code to be vendor specific, so it disables when an ATI card is detected. This was proven by changing the VendorID to Nvidia on an ATI card, and then magically AA worked in the demo. Disabling PhysX when a card other than an Nvidia one is the primary was also a bad business move; now people are just bypassing it with modified drivers. I could go on and on about why Nvidia has a bad business model, but surely you know from working there. Oh, I almost forgot the fake Fermi that Mr. Jen-Hsun Huang displayed, which Nvidia claimed was real and later had to retract that statement. Oh, guess what else... REBRANDS. Nvidia rebrands everything (8800 = 9800 = GTS 250), and the GTX 3xx cards or whatever are just rebrands of the GTX 2xx mobile cards. Epic failure on Nvidia's part.

You should also mention the ambient temperature at which you measured those 55C load temps. Are you running it outside in some remote location with an ambient of -20C? There's no way a card with 225W of power draw will run at 55C under load, judging by how the stock cooler looked.
I apologize that your school system didn't teach you to read... such a pity.
Tesla (GT300) is very different from GF100. Do not mistake this again, or so help me Jesus, I will intellectually murder you. Trust me, I am good at it.

Now shoo, pest... you clearly are Charlie's protégé. The man wouldn't be so annoying if he didn't have so much E-Rage towards a company. He fails to realize that without nVidia, ATi would charge insane amounts of money for their GPUs, and couldn't care less about what he thought. ATi is in it for the money, just as every other successful company is.

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
Celestial Crown
Frost Gate Guardian
#11
Quote:
Originally Posted by Rahja the Thief View Post
I apologize that your school system didn't teach you to read... such a pity.
Tesla (GT300) is very different from GF100. Do not mistake this again, or so help me Jesus, I will intellectually murder you. Trust me, I am good at it.

Now shoo, pest... you clearly are Charlie's protégé. The man wouldn't be so annoying if he didn't have so much E-Rage towards a company. He fails to realize that without nVidia, ATi would charge insane amounts of money for their GPUs, and couldn't care less about what he thought. ATi is in it for the money, just as every other successful company is.

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
I assumed Fermi was the backbone for the Tesla and GeForce cards. Either way, I'm right about everything and you have done nothing to refute my statements.


Quote:
<Rahja_the_Thief> GF100 is single, correct
<Rahja_the_Thief> less than or equal to 300w
<Rahja_the_Thief> that hasn't been totally finalized
<Rahja_the_Thief> since A3 silicon just got back
<Rahja_the_Thief> so I am not totally sure

<Rahja_the_Thief> based on the spin, I'd take a stab at 250w
<Rahja_the_Thief> but give or take 50w
<Rahja_the_Thief> Sorry I can't be more specific, but I haven't received new data on this since November
<Rahja_the_Thief> They put you into a blackout period
You said this, taken from the IRC chat. Even at a ~250W TDP, there's no way you're getting 55C load temps unless your testing conditions were like I previously stated.

And trust me, I'm not a fanboy. I couldn't care less about AMD/ATI, as I've owned both Intel and AMD processors, and both Nvidia and ATI GPUs.
Lord Sojar
The Fallen One
#12
Quote:
Originally Posted by Celestial Crown View Post
I assumed Fermi was the backbone for the Tesla and GeForce cards. Either way, I'm right about everything and you have done nothing to refute my statements.




You said this, taken from the IRC chat. Even at a ~250W TDP, there's no way you're getting 55C load temps unless your testing conditions were like I previously stated.

And trust me, I'm not a fanboy. I couldn't care less about AMD/ATI, as I've owned both Intel and AMD processors, and both Nvidia and ATI GPUs.

You are wrong. How many ways can I say this? Wrong? Incorrect? Mistaken? Dumb? Idiot? Invalid data detected? Negative on that, Ghostrider? I mean... really!

The Fermi you were shown was Tesla. GF100 is different on many levels, which you will soon understand (provided you take the time to read through a site other than that abomination Charlie runs).

And yes, there are ways of getting 55C load temps at those TDPs; you just aren't thinking outside the box. Also, in-house testing is biased; expect the temps in most people's systems to be higher, I wouldn't refute that. I just posted the data I have. Take it how you will. (Note: I just realized that said GF104, not GF100; I have changed it, as that was a typo.)
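The 55C dispute above comes down to steady-state thermal arithmetic: load temperature is roughly ambient plus power draw times the cooler's thermal resistance. A minimal sketch, assuming an illustrative 25 °C ambient and a made-up 0.30 °C/W cooler figure (neither value comes from the thread):

```python
# Steady-state thermal model: T_load ~ T_ambient + P * R_th.
# All numeric values here are illustrative assumptions, not measurements.

def load_temp(ambient_c: float, power_w: float, r_th_c_per_w: float) -> float:
    """Estimate steady-state load temperature for a given cooler."""
    return ambient_c + power_w * r_th_c_per_w

# For a 250 W part to hold 55 C in a 25 C room, the cooler/airflow path
# would need a combined thermal resistance of:
required_r_th = (55 - 25) / 250  # 0.12 C/W
print(f"required R_th: {required_r_th:.2f} C/W")

# With the assumed 0.30 C/W air cooler, the same card would run hotter:
print(f"at 0.30 C/W: {load_temp(25, 250, 0.30):.0f} C")
```

Whether 55C is plausible thus hinges entirely on the ambient and on how aggressive the cooling solution is, which is exactly what the two posters are arguing about.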
Bob Slydell
Forge Runner
#13
I've heard nVidia sucked, I've heard ATi sucked... I've heard they both sucked. No one can ever settle it. And whenever I bring up something like this anywhere, the respective enthusiast will try to explain to you why his brand is better than the other, and so on. Btw, I've even heard ATi outperforms nVidia just as this card outperforms the ATi card. A lost cause, IMO. Framerates over 60 aren't even noticeable anyway.
jiggles
Desert Nomad
#14
Quote:
Originally Posted by Rahja the Thief View Post

Short of the above: Stop being a flaming fanboy for a company that doesn't give 2 shits about you.
Biggest piece of irony on this forum? I think so.
jackinthe
Krytan Explorer
#15
Quote:
Originally Posted by Chrisworld View Post
I've heard nVidia sucked, I've heard ATi sucked... I've heard they both sucked. No one can ever settle it. And whenever I bring up something like this anywhere, the respective enthusiast will try to explain to you why his brand is better than the other, and so on. Btw, I've even heard ATi outperforms nVidia just as this card outperforms the ATi card. A lost cause, IMO. Framerates over 60 aren't even noticeable anyway.
Ditto. I think the general consensus is that nVidia is powerful but weak, and ATi is weak but powerful.
I guess we'll find out.
KZaske
Jungle Guide
#16
Sounds like Nvidia is about to open a can of whoop-a$$.
Lord Sojar
The Fallen One
#17
Quote:
Originally Posted by jiggles View Post
Biggest piece of irony on this forum? I think so.

Clearly ironic, seeing as how I own an ATi HD5870.... CLEARLY.
Bob Slydell
Forge Runner
#18
Quote:
Originally Posted by Rahja the Thief View Post
Clearly ironic, seeing as how I own an ATi HD5870.... CLEARLY.
Oh yeah? What's with the nVidia sig then?
Arkantos
The Greatest
#19
Quote:
Originally Posted by Chrisworld View Post
Oh yeah? What's with the nVidia sig then?
He worked for nVidia, so I suppose he's trying to promote/support it, even though he has a competitor's card.
Bob Slydell
Forge Runner
#20
Quote:
Originally Posted by Arkantos View Post
He worked for nVidia, so I suppose he's trying to promote/support it, even though he has a competitor's card.
My apologies. Advertising. Never would have thought of it.