
Oh the irony.

August 14, 2009 1:18:48 PM


August 14, 2009 4:29:56 PM

LOL - now you've gone and done it :D 

Quote:
Tim Sweeney, chief executive officer of Epic Games, said during his keynote at the High Performance Graphics 2009 conference earlier this month that it is dramatically more expensive to develop software that relies on general purpose computing on graphics processing units (GPGPU) than to create a program that utilizes central processing units. He also reiterated his earlier claims that the days of GPUs are numbered.


I'm sure JDJ will pop in any minute now to set us straight on how it's really CPUs that are gonna disappear :D .
August 14, 2009 4:48:18 PM

He could have really shortened the interview and said, "We really don't give a **** about DX11, quad cores, etc.; if it slows us down from getting it in a pretty box and onto a store shelf, don't expect us to cater to it." You know... stuff we already knew.
August 14, 2009 4:53:39 PM

Since when has anything on PC, other than MMORPG-type games, been promoted well?
It's an Xbox thang, trust me.
Yes, the GPU's days are numbered. So are desktops, BTW. Did he mention that too?
August 14, 2009 4:56:42 PM

Here's my favorite part:
"Although the market of video games has been growing rather rapidly in the recent years, game budgets have not been increasing that rapidly. Mr. Sweeney claims that although performance increase may be 20 times, game budgets will only increase less than two times. As a result, it is vital for game developers to take advantage of the modern hardware capabilities and performance at the lowest possible cost."
What? He wants a vacation? Having your HW 20x more powerful is now a problem for devs? OHHHHH maaaaan
August 14, 2009 6:06:54 PM

Quote:
“In the next generation we’ll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently,"
So much enthusiasm for DX10.1 & DX11 being shown here. I don't think he has read any of your recent posts, JDJ, or if he has, I get the feeling he may not be as excited as he's supposed to be, considering how wonderful DX11 is going to be.
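Just to illustrate what he's on about, here's a minimal sketch (my own toy code, nothing from Epic's engine) of "rendering in a real programming language": plain C++ filling a triangle into a framebuffer with an edge-function test, no DirectX or OpenGL anywhere.

Code:
// Hedged sketch: a tiny software rasterizer in plain C++ -- no DirectX, no OpenGL.
// All names here are illustrative, not taken from any actual engine code.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Edge function: sign tells us which side of edge ab the point c lies on.
static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// Fill one triangle into a width*height framebuffer of packed 0xRRGGBB pixels.
void rasterizeTriangle(std::vector<uint32_t>& fb, int width, int height,
                       Vec2 v0, Vec2 v1, Vec2 v2, uint32_t color) {
    // Bounding box of the triangle, clamped to the framebuffer.
    int minX = std::max(0, (int)std::floor(std::min({v0.x, v1.x, v2.x})));
    int maxX = std::min(width - 1, (int)std::ceil(std::max({v0.x, v1.x, v2.x})));
    int minY = std::max(0, (int)std::floor(std::min({v0.y, v1.y, v2.y})));
    int maxY = std::min(height - 1, (int)std::ceil(std::max({v0.y, v1.y, v2.y})));

    for (int y = minY; y <= maxY; ++y) {
        for (int x = minX; x <= maxX; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};
            // Inside test: the pixel center is inside if all three edge functions agree in sign.
            float w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
            if ((w0 >= 0 && w1 >= 0 && w2 >= 0) || (w0 <= 0 && w1 <= 0 && w2 <= 0))
                fb[y * width + x] = color;
        }
    }
}

Scale that up to millions of triangles and spread it across however many cores you have, and that's roughly the software-rendering future he keeps predicting.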
August 14, 2009 6:16:45 PM

JAYDEEJOHN said:

Yes, the GPU's days are numbered. So are desktops, BTW. Did he mention that too?

They have been saying that for years. Apparently, no one has told Intel or AMD.
August 14, 2009 6:23:26 PM

Unfortunately, the move away from fixed HW must have eluded him, and a bigger player than nVidia and ATI has overshadowed D3D.
He said not OpenGL, which implies what? And maybe it was a slip of the tongue, or just his thought process, but he did mention C++ first.
GPUs are going the way of the FSB, no doubt about it. nVidia's G200 transistor count didn't match its ability in gaming at all, if you look at how they've historically managed transistor count vs. gaming performance.
But here you have a whole OS diverging from the old ways, new HW complying with that, the DX model doing it as well, and it isn't enough?
Money's talking here, plain to see. I think a lot of them are worried currently, as they've poured more and more into consoles and have left the PC market further and further behind. Consoles have hit their architectural limits, and they haven't used and don't use the wider-based structure we see in PC gaming, because of the limitations of their fixed function hardware.
So, how do they get around this?
Head in the same direction everyone else is, but currently, until a new arch is made for consoles, they're stuck.
Traditionally, PCs have led in all this, and I find it no surprise they'll lead here as well.
Remember, LRB is coming, and even if it underperforms, and even if MMO gaming is doing quite well, they won't point there; instead they'll have cloud, and it'll also be dumbed down.
This is what they want. If you as a gamer want this too, that's OK with me, but it's not for me. You can't stop the wind of change, though.
August 14, 2009 7:13:12 PM

Good thing he didn't mention OpenCL, though.
"With all the talk about OpenCL and Snow Leopard together and how the spec will allow Apple's upcoming hotness to exploit graphics accelerators, it's easy to lose track of the place where the standard could make its biggest impact: gaming. Yes, OpenGL may have lost favor in that realm in recent years, but OpenCL looks to captivate the hearts and GPUs of gamers everywhere by applying some much-needed standardization to the physics acceleration realm, first shown in public at GDC running on some AMD hardware. Havok is demonstrating its Havok Cloth and Havoc Destruction engines, the former of which is embedded below, and we think you'll agree it's quite impressive. OpenCL allows such acceleration to switch between the GPU and CPU seamlessly and as needed depending on which is more available, hopefully opening the door to physics acceleration that actually affects gameplay and doesn't just exist to make you say, "Whoa."
August 15, 2009 5:22:28 AM

It's not like Epic still cares about PC gaming, anyway. Had you asked Activision Blizzard or Valve, you'd probably have heard something reasonably different. Also, Microsoft, Nvidia and Apple certainly have what it takes to push a new "standard".

Don't forget that one day a computer was far bigger than your house and pricier than an airplane. The world once had a market for maybe 4 computers. It wasn't quite cheap to develop something back then.

Those heavily quoting Jaydee obviously read only the title of the article and perhaps its first two lines. Basically, what Sweeney said is that, as of now, developing an application for GPGPUs is too expensive compared to developing for CPUs, and "the keynote of Tim Sweeney at High Performance Graphics 2009 was mostly dedicated to the death of the graphics processing units as we know them today."

Not a surprise, since he has always talked of Larrabee as a potentially revolutionary thing. That, and it's obvious that GPGPUs, as we know them today, are "dead", since they are [literally] being brought closer to the CPU by the likes of Intel and AMD, while arguably getting far more advanced than any other part of your computer.

Further in the article he also says: "In the next generation we'll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA". Duh.
August 15, 2009 10:50:55 PM

It's under the assumption that using programs such as CUDA doesn't need a GPU.
It's also under the assumption that where HE goes, everyone will follow, not that there'd be an automatic hole left, only to be filled by greater and smarter people than HE.
August 16, 2009 7:54:57 AM

Again, a nice lil laugh:
"I remember the 6800 launch in Geneva, where Sweeney - to the obvious shock of Nvidia reps present - proclaimed anti-aliasing to be dead. Granted, it's not dead (yet), but the increasing number of titles trading AA-support for some üb0r-effect speaks for itself. "

http://forum.beyond3d.com/showthread.php?p=1321915#post...
August 16, 2009 7:56:24 AM

So, all this Intel x86 mumbo jumbo miracle grow crap? Take it with a grain.
August 16, 2009 8:28:20 AM

Could someone tell ATI and NVIDIA that they don't need to bother with the next gen of GPUs then?

We are just going to offload all of that stuff to the spare cores in my 128 core cpu whore. Single socket too ...

The copper waterblock will be mounted with a torque wrench ... 50 lb per 9/16 high tensile "head bolt".

I'll be putting a 3 core radiator out of a light truck onto the side of the case to keep the temps below critical.

at idle ...

The guy's a shill ... shilling for whom, though?
August 16, 2009 8:59:20 AM

UT3 was a flop; the dude just wants to be relevant again.
August 16, 2009 2:31:11 PM

Epic Fail.
August 17, 2009 7:39:29 AM

Missed this lil snippet here:
"TS: Hopefully you'll see synthetic benchmarks replaced with real game benchmarks in that timeframe, because that's all that really matters. Already that's happening on a large scale; people run Unreal Tournament benchmarks and Crysis benchmarks, and those are the numbers people care about. "
http://arstechnica.com/gaming/news/2008/09/gpu-sweeney-...

OK, is this guy just coming off a drunk or what? He certainly needs more sobering up.
I know CPUs tend to use value bungholio marks more so than GPUs, but everyone prefers real-life app/usage perf marks, and on the GPU side everyone knows you can't use Vantage from one setup to another. Where's this guy been?
He must think when Cleeve does a review, everyone skips to the 3DMark page (no, we really just skip to the bench numbers, sorry Cleeve heheh), and the same at Anand's.

It's sounding like, when he was a youngster, his rendered snow was 3 feet deep going to kill the monsters, all uphill on the way there, and 6 feet deep and even steeper uphill going back to the fortress.

PS, notice the date on my link. This guy just never gives up heheh
August 17, 2009 7:57:11 AM

20 years from now:
Tim Sweeney declares the GPU dead as we know it.
August 17, 2009 9:44:41 AM

30 years from now, Tim Sweeney joins the new board of directors of the DNForeverandever game franchise.

August 17, 2009 12:08:56 PM

*checks the date of this thread to make sure*

Yah, Spud is back! :D 

Quote:
“In the next generation we’ll write 100% of our rendering code in a real programming language – not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently," said Mr. Sweeney in an interview last year.

Isn't that OpenCL???

Although I gotta say OpenCL is taking off reaaaaaaally slow compared to CUDA. Not good. :( 
August 17, 2009 5:18:41 PM

What I've found is, in none of his interviews does he mention DX11, as he believes DX9 was the end all, be all.
DX10 and DX11 are heading exactly in the direction he's been spouting since 2000 http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Int...
as is OpenCL, where he also offers little or no comment.
DX11 allows for a mix of shaders to take a much wider approach in their usage, as DX10 furthered the unification of shaders, offering features he hates, such as AA, and DX10.1 needing one less pass at 4xAA.

He's shown a hatred for AA and, ironically, doesn't want higher res either.

Basically he's asking for the world to come to him, so he can make our dreams come true the way HE sees it.
August 17, 2009 5:36:28 PM

JAYDEEJOHN said:
What I've found is, in none of his interviews does he mention DX11, as he believes DX9 was the end all, be all.
DX10 and DX11 are heading exactly in the direction he's been spouting since 2000 http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Int...
as is OpenCL, where he also offers little or no comment.
DX11 allows for a mix of shaders to take a much wider approach in their usage, as DX10 furthered the unification of shaders, offering features he hates, such as AA, and DX10.1 needing one less pass at 4xAA.

He's shown a hatred for AA and, ironically, doesn't want higher res either.

Basically he's asking for the world to come to him, so he can make our dreams come true the way HE sees it.

You are aware, I hope, that the new Batman game you seemed to get upset about a couple of days ago runs under DX9? The point is that two years after the release of DX10, devs are still using DX9 and doing bugger all in DX10. Gawd alone knows how long it's going to be before game devs start using DX11 as standard, so I for one fail to see the excitement/problem/issue.
August 17, 2009 5:41:00 PM

He wants PC-Consoles, where 0x03dff1 points to the end block of memory mapped for UT3.

This way, he can dev easily for everyone, not to mention everyone will have the HW to play his games!

Man, if he'd dev'd Crysis, then this would be even better.
August 17, 2009 5:53:27 PM

theholylancer said:
He wants PC-Consoles

Now that's a scary concept.
August 17, 2009 6:20:40 PM

The Batman game is but a DX10 port, as the engine itself is DX9.
Haven't I alluded to Sweeney's love affair with DX9? Haven't I also said he hates AA?
So it proves my point: here we have a major influence in gaming, holding back progress as he sees fit, while people are complimenting this?

It's a chicken-or-the-egg thing here.

DX10 and DX11 are here, the HW's been here, but it came first, while it didn't need to. Devs are slow on the uptake of things, and thus the usage of true MT in games, or rather, the lack thereof? So, is that too a good thing?

The Batman problem is that AA usage was ID-blocked by the devs using the nVidia name, and that's where I had a problem with it, as well as nVidia claiming that its PhysX was an open resource while shutting out the ATI cards, which had previously been able to use it.

I guess that's good too now...
August 17, 2009 6:56:19 PM

I never said or felt that it was 'good', just that it's how it is (a small and subtle difference). The point is, rather than get fixated on one or two bits of what is a much larger picture, I like to sit back and take the human and time components into account. As such, these people (Sweeney) and tactics (Nvidia/Batman) will surface from time to time and are just another view/opinion on where PCs and consoles are possibly heading in the future.
August 17, 2009 7:37:49 PM

Problem is, Batman does have physics for the CPU, but we don't know how well it's optimized for it, and it isn't final either, so whether it's to be done on GPU or CPU, it fails here as well.
It's why I totally disdain Sweeney's all-or-nothing approach, as we can't expect the devs to each take a path and expect it to work without guidelines, which, though costly, give us at least some coherency.
So too, as the Batman example shows, this is an example of just that. You have one dev using/promoting one side of things, while we don't actually know how good the CPU physics is as an alternative, and to head full bore into this direction is folly!