GPU-Based Rendering

Winterclaw

Wark!!!

Join Date: May 2005

Florida

W/

I saw this the other day and wanted to pass it on. It looks like the big thing for 2010 is GPU-based rendering. Most rendering programs I know of (like Poser) work on the CPU instead, so maybe this will speed things up.

Games, I'm not sure about.

moriz

über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

games are rendered on the graphics card to begin with. that's the whole point of having these cards in the first place.

Winterclaw

Wark!!!

Join Date: May 2005

Florida

W/

I figured as much, but you never know. I mean you'd think that the powerful rendering programs would be using them by now, but they aren't.

moriz

über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

that's because of the wildly different GPU architectures out there. it's extremely hard to write one program that can take advantage of every available architecture.

Winterclaw

Wark!!!

Join Date: May 2005

Florida

W/

Which I guess is the reason some games don't work well with every GFX card at release.

Bob Slydell

Forge Runner

Join Date: Jan 2007

Maybe someone can explain this to me. I always thought a GPU was basically an underclocked CPU, so why would a GPU be better than *utilizing* your CPU?

moriz

über tĕk-nĭsh'ŭn

Join Date: Jan 2006

Canada

R/

GPUs are dramatically different from CPUs. one is a general-purpose computational device; the other is a specialized chip designed to accelerate the rendering of 3D objects on a 2D screen. you cannot compare the two on clock speed alone, since they are so radically different.

the modern GPU is made up of a large number of scalable graphics "cores", each of which can be programmed to execute generic code. this is a step up from the older fixed-function and dedicated vertex/pixel shader designs. it means that, while GPUs are still primarily about accelerating graphics, they can also be used for whatever you want. right now, the fastest graphics card on the market, AMD's Radeon HD 5870, has a whopping 1600 of these cores. compare that with the modern CPU, which is more or less limited to 4 cores. this makes graphics cards the best processors for running parallelized code: if you can divide your task up to take advantage of all that parallel processing power, a graphics card will do it dramatically faster than a CPU. in fact, a $5000 desktop computer with four NVIDIA graphics cards can rival a multi-million-dollar computing cluster on certain highly parallel tasks.
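
to make that concrete, here's a minimal CUDA sketch of the model (the names and sizes are just for illustration, not any particular renderer's code): one tiny function, a "kernel", is run across a million array elements, with each GPU thread handling exactly one of them.

    #include <cuda_runtime.h>
    #include <cstdio>

    // each GPU thread computes exactly one output element
    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // this thread's global index
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                           // a million elements
        const size_t bytes = n * sizeof(float);

        // host-side input data
        float *ha = new float[n], *hb = new float[n], *hc = new float[n];
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // allocate GPU memory and copy the inputs over
        float *da, *db, *dc;
        cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // launch enough 256-thread blocks to cover all n elements
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        // copying the result back also waits for the kernel to finish
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %.1f\n", hc[0]);                  // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        delete[] ha; delete[] hb; delete[] hc;
        return 0;
    }

a CPU would walk that loop a few elements at a time; the GPU covers it with thousands of threads per launch.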

video rendering, encoding, and decoding are all tasks that can easily be divided up into little chunks, which means that graphics cards, if properly utilized, are dramatically faster than CPUs at them.
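
same idea for video work: here's a hedged sketch of a per-pixel brightness filter over one frame. every pixel is independent, so the whole frame can be split across the GPU's cores at once (the frame size and filter are made up for illustration).

    #include <cuda_runtime.h>
    #include <cstdio>

    // brighten one pixel per thread, clamping so values stay in 0..255
    __global__ void brighten(unsigned char *px, int n, int delta) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {
            int v = px[i] + delta;
            px[i] = v > 255 ? 255 : v;
        }
    }

    int main() {
        const int n = 1920 * 1080;                   // one greyscale 1080p frame
        unsigned char *frame;
        cudaMallocManaged(&frame, n);                // memory visible to both CPU and GPU
        for (int i = 0; i < n; ++i) frame[i] = 100;  // fake frame data

        brighten<<<(n + 255) / 256, 256>>>(frame, n, 40);
        cudaDeviceSynchronize();                     // wait before the CPU reads the result

        printf("pixel 0 = %d\n", frame[0]);          // expect 140
        cudaFree(frame);
        return 0;
    }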

Bob Slydell

Forge Runner

Join Date: Jan 2007

Quote:
Originally Posted by moriz View Post
GPUs are dramatically different from CPUs. [...] if you can divide your task up to take advantage of all that parallel processing power, a graphics card will do it dramatically faster than a CPU.

That is actually pretty damn amazing; I had no idea. Will we possibly see CPUs with this technology in the coming years?

Winterclaw

Wark!!!

Join Date: May 2005

Florida

W/

Right now, there are quad-core CPUs. After clock speeds got to about 2.5 GHz or so, it became easier and more efficient to add more cores and let each one handle different tasks, like in the rough sketch below.
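
A rough sketch of that "different task per core" idea in plain C++ threads (host-side only, no GPU needed; the task names are made up): each core takes on a different job, instead of every core running the same tiny job the way a GPU does.

    #include <thread>
    #include <cstdio>

    // stand-in jobs; in a real game these would be actual subsystems
    void physics() { std::printf("physics step done\n"); }
    void audio()   { std::printf("audio mixed\n"); }
    void ai()      { std::printf("AI updated\n"); }

    int main() {
        // three different tasks running side by side on three cores
        std::thread t1(physics), t2(audio), t3(ai);
        t1.join(); t2.join(); t3.join();
        return 0;
    }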

I'm not really sure when we'll see a 1600-core CPU for the home user.

Abedeus

Grotto Attendant

Join Date: Jan 2007

Niflheim

R/

The biggest consumer CPUs currently have 4 cores and 8 threads, but Intel is already making experimental CPUs with as many as 48 cores. Those individual cores wouldn't be as powerful as today's quads and i7s, but the potential is huge.

Still, GPUs are much more powerful when it comes to gaming and editing graphics/movies.