LOL - now you've gone and done it :D

Tim Sweeney, chief executive officer of Epic Games, said during his keynote at the High Performance Graphics 2009 conference earlier this month that it is dramatically more expensive to develop software that relies on general-purpose computing on graphics processing units (GPGPU) than to create a program that utilizes central processing units. He also reiterated his earlier claim that the days of GPUs are numbered.

I'm sure JDJ will pop in any minute now to set us straight on how it's really CPUs that are gonna disappear :D.
 

roofus

Distinguished
Jul 4, 2008
1,392
0
19,290
He could have really shortened the interview and said, "We really don't give a **** about DX11, quad cores, etc. If it slows us down from getting it in a pretty box and onto a store shelf, then don't expect us to cater to it." You know... stuff we already knew.
 
Here's my favorite part:
"Although the market of video games has been growing rather rapidly in the recent years, game budgets have not been increasing that rapidly. Mr. Sweeney claims that although performance increase may be 20 times, game budgets will only increase less than two times. As a result, it is vital for game developers to take advantage of the modern hardware capabilities and performance at the lowest possible cost."
What? He wants a vacation? Having your HW 20x more powerful is now a problem for devs? OHHHHH maaaaan
 
“In the next generation we’ll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently,"
So much enthusiasm for DX10.1 & DX11 being shown here. I don't think he has read any of your recent posts, JDJ, or if he has, I get the feeling that he may not be as excited as he is supposed to be, considering how wonderful DX11 is going to be.
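To be fair, what he's describing is a renderer written as ordinary code instead of API calls. Something like this toy C++ sketch (completely made up for illustration, not anything Epic has shown) - the "pixel shader" is just a plain function, no DirectX or OpenGL anywhere:

// Toy "rendering in plain C++" sketch: no DirectX, no OpenGL, just a
// per-pixel loop of the kind a software renderer (or a Larrabee-style
// CPU pipeline) might run. Writes a shaded sphere to stdout as a PPM.
#include <cstdio>
#include <cmath>

struct Color { unsigned char r, g, b; };

// The "pixel shader" is just an ordinary C++ function.
Color shade(float u, float v) {
    float d2 = u * u + v * v;                 // (u, v) ranges over [-1, 1]
    if (d2 > 1.0f) return {16, 16, 32};       // background
    float z = std::sqrt(1.0f - d2);           // sphere surface height
    // Diffuse lighting from direction (1, 1, 1) / sqrt(3).
    float light = 0.3f + 0.7f * std::fmax(0.0f, 0.577f * (u + v + z));
    return {(unsigned char)(255 * light),
            (unsigned char)(180 * light),
            (unsigned char)(120 * light)};
}

int main() {
    const int W = 256, H = 256;
    std::printf("P3\n%d %d\n255\n", W, H);    // PPM header
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Color c = shade(2.0f * x / W - 1.0f, 2.0f * y / H - 1.0f);
            std::printf("%d %d %d\n", c.r, c.g, c.b);
        }
}

Scale that idea up across a few dozen cores and you have the future he keeps pitching. Whether that ever beats dedicated HW is the part he glosses over.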
 
Unfortunately, the move away from fixed-function HW must have eluded him, and a bigger player than nVidia and ATI has overshadowed D3D.
He said "not OpenGL", which implies what? Maybe it was a slip of the tongue, or just his thought process, but he did mention C++ first.
GPUs are going the way of the FSB, no doubt about it. nVidia's G200's transistor count didn't match its gaming ability at all, if you look at how they've historically managed transistor count vs. gaming performance.
But here you have a whole OS diverging from the old ways, new HW complying with that, the DX model doing it as well, and it isn't enough?
Money's talking here, plain to see. I think a lot of them are worried currently, as they've poured more and more into consoles and left the PC market further and further behind. Consoles have hit their architectural limits, and don't use the wider-based structure we see in PC gaming because of the limitations of their fixed function hardware.
So, how do they get around this?
Head in the same direction everyone else is, but currently, until a new arch is made for consoles, they're stuck.
Traditionally, PCs have led in all this, and I find it no surprise they'll lead here as well.
Remember, LRB is coming, and even if it underperforms, and even if MMO is doing quite well, they won't point there; instead they'll have the cloud, and it'll also be dumbed down.
This is what they want. If you as a gamer want this too, that's OK with me, but it's not for me. Either way, you can't stop the winds of change.
 
Good thing he didn't mention OpenCL, though:
"With all the talk about OpenCL and Snow Leopard together and how the spec will allow Apple's upcoming hotness to exploit graphics accelerators, it's easy to lose track of the place where the standard could make its biggest impact: gaming. Yes, OpenGL may have lost favor in that realm in recent years, but OpenCL looks to captivate the hearts and GPUs of gamers everywhere by applying some much-needed standardization to the physics acceleration realm, first shown in public at GDC running on some AMD hardware. Havok is demonstrating its Havok Cloth and Havok Destruction engines, the former of which is embedded below, and we think you'll agree it's quite impressive. OpenCL allows such acceleration to switch between the GPU and CPU seamlessly and as needed depending on which is more available, hopefully opening the door to physics acceleration that actually affects gameplay and doesn't just exist to make you say, "Whoa.""
 

dattimr

Distinguished
Apr 5, 2008
665
0
18,980
It's not like Epic still cares about PC gaming, anyway. Had you asked Activision Blizzard or Valve, you'd probably have heard something reasonably different. Also, Microsoft, Nvidia and Apple certainly have what it takes to push a new "standard".

Don't forget that there was a day when a computer was far bigger than your house and pricier than an airplane. The world once had a market for maybe 4 computers. It wasn't exactly cheap to develop something back then.

Those heavily quoting Jaydee obviously read only the title of the article and perhaps its first two lines. Basically, what Sweeney said is that, as of now, developing an application for GPGPUs is too expensive - compared to developing for CPUs - and "the keynote of Tim Sweeney at High Performance Graphics 2009 was mostly dedicated to the death of the graphics processing units as we know them today."

Not a surprise, since he has always talked of Larrabee as a potentially revolutionary thing. That, and it's obvious that GPGPUs - as we know them today - are "dead", since they are [literally] being moved closer to the CPU by the likes of Intel and AMD, while arguably becoming far more advanced than any other part of your computer.

Further in the article he also says: "In the next generation we’ll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA". Duh.
 
It's under the assumption that using something like CUDA doesn't need a GPU.
It's also under the assumption that where HE goes, everyone will follow; not that there'd be an automatic hole left, only to be filled by greater and smarter people than HE.
 
Could someone tell ATI and NVIDIA that they don't need to bother with the next gen of GPUs then?

We are just going to offload all of that stuff to the spare cores in my 128-core CPU whore. Single socket too ...

The copper waterblock will be mounted with a torque wrench ... 50 lb per 9/16" high-tensile "head bolt".

I'll be putting a 3-core radiator out of a light truck onto the side of the case to keep the temps below critical.

at idle ...

The guy's a shill ... but for who, tho?
 
Missed this lil snippet here:
"TS: Hopefully you'll see synthetic benchmarks replaced with real game benchmarks in that timeframe, because that's all that really matters. Already that's happening on a large scale; people run Unreal Tournament benchmarks and Crysis benchmarks, and those are the numbers people care about. "
http://arstechnica.com/gaming/news/2008/09/gpu-sweeney-interview.ars/3

OK, is this guy just coming off a drunk or what? He certainly needs more sobering up.
I know CPUs tend to get valued by bungholio marks more so than GPUs, but everyone prefers real-life app/usage perf marks, and on the GPU side everyone knows you can't use Vantage from one setup to another. Where's this guy been?
He must think when Cleeve does a review, everyone skips to the 3DMark page (no, we really just skip to the bench numbers, sorry Cleeve heheh), and the same at Anand's.

It's sounding like, when he was a youngster, his rendered snow was 3 feet deep going out to kill the monsters, all uphill to the monsters, and 6 feet deep and even steeper uphill going back to the fortress.

PS, notice the date on my link. This guy just never gives up heheh
 

wuzy

Distinguished
Jun 1, 2009
900
0
19,010
*checks the date of this thread to make sure*

Yah, Spud is back! :D

“In the next generation we’ll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently," said Mr. Sweeney in an interview last year.
Isn't that OpenCL???

Although I gotta say OpenCL is taking off reaaaaaaally slow compared to CUDA. Not good. :(
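And yes, it really is "write once, run on whatever": one kernel string, compiled at runtime for whatever device the driver exposes - Nvidia, Intel, ATI, or a plain CPU. A bare-bones end-to-end sketch (made-up toy kernel, error checks omitted):

// One vendor-neutral kernel, built at runtime for whatever OpenCL
// device is available. Doubles eight floats and prints the result.
#include <CL/cl.h>
#include <cstdio>

static const char* kSrc =
    "__kernel void scale(__global float* v, float k) {"
    "    size_t i = get_global_id(0);"
    "    v[i] *= k;"
    "}";

int main() {
    float data[8] = {1, 2, 3, 4, 5, 6, 7, 8};

    cl_platform_id plat;
    clGetPlatformIDs(1, &plat, nullptr);
    cl_device_id dev;
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    // Built at runtime for this particular device -- vendor-independent.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "scale", nullptr);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, nullptr);
    float factor = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(factor), &factor);

    size_t n = 8;
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, nullptr, nullptr);

    for (float f : data) std::printf("%.0f ", f);   // prints: 2 4 6 8 10 12 14 16
    std::printf("\n");
    return 0;
}

Same host code, same kernel string; swap the card for another vendor's and it still runs. That's the bit CUDA can't promise.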
 
What I've found is, in none of his interviews does he mention DX11, as he believes DX9 was the end-all, be-all.
DX10 and DX11 are heading exactly in the direction he's been spouting since 2000 http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Interviews
as is OpenCL, on which he offers little or no comment.
DX11 allows a mix of shaders to take a much wider approach in their usage, just as DX10 furthered the unification of shaders, while offering features he hates, such as AA, with DX10.1 needing one less pass at 4xAA.

He's shown a hatred for AA, and ironically doesn't want higher res either.

Basically, he's asking for the world to come to him, so he can make our dreams come true the way HE sees it.
 

You are aware, I hope, that the new Batman game you seemed to get upset about a couple of days ago runs under DX9? The point is that two years after the release of DX10, devs are still using DX9 and doing ah heck all in DX10. Gawd alone knows how long it's going to be before game devs start using DX11 as standard, so I for one fail to see the excitement/problem/issue.
 

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
He wants PC-Consoles, where 0x03dff1 points to the end block of memory mapped for UT3.

This way, he can dev easily for everyone, and not to mention everyone will have the HW to play his games!

Man, if he had deved Crysis, then this would be even better.