
GDC 2010, Day 1: The Missing Middle
By Loyd Case

This year's Game Developer Conference is in full swing, and Loyd Case is on the ground, reporting the latest goings-on. His Day 1 coverage includes AMD's push for open-standard physics, Windows Phone 7, PlayStation Move, real-world gaming, and Surface.

Every Game Developer Conference seems to have an unstated theme, a subtext that yields clues to the overall direction and health of the game industry. This year is no exception.

The center of gravity seems to have shifted away from the middle ground--game consoles--and to the sides, if you will. One side is represented by mobile, handheld devices, particularly smartphones. Microsoft is busy pushing development on Windows Phone 7, an entire set of tracks is devoted to iPhone game development, developers who registered for key mobile sessions received free Google Nexus One phones, and even Palm is showing game development on its webOS.

At the other end of the spectrum is the PC, which has been much maligned in the past few years as a dying gaming platform. Intel announced its Core i7-980X Extreme Edition, along with Napoleon: Total War, a leading game title able to take advantage of the six-core, twelve-thread monster.

But it’s not just one new CPU. 2K Games was showing off Firaxis’ Civilization V, a PC-exclusive sequel in the venerable Civilization franchise--also scalable to many threads. AMD is out pushing its Eyefinity multi-display technology. Microsoft’s public displays now emphasize Games for Windows Live, giving it more prominent placement than the Xbox 360. Even the show keynote will be given by Sid Meier, arguably one of the industry’s most influential designers, with a long and storied history in PC gaming.

With these ideas in mind--mobile gaming coming of age while the PC is resurgent--let’s take a look at the first half of GDC.

AMD Pushes Open Standards

AMD’s ATI graphics group has been shipping a new DirectX 11 GPU every few weeks since the launch of the original Radeon HD 5870. AMD announced a branding strategy revolving around PC gamers, which it's dubbing “AMD Gaming Evolved.”

Branding aside, perhaps the most interesting part of the AMD announcement involved Bullet Physics, an open source physics library gradually gaining steam in the developer community. AMD helped the Bullet Physics team develop libraries that work with both OpenCL and Microsoft’s DirectX 11 DirectCompute APIs. This lets game developers take advantage of GPU acceleration through readily available standards not tied to particular hardware. Bullet Physics will work with GPUs from both Nvidia and AMD, as well as Intel integrated graphics and x86 CPUs.
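To get a feel for what developers would be writing against, here is a minimal CPU-side Bullet rigid-body sketch (my illustration, not AMD's code); the announced OpenCL and DirectCompute backends are meant to accelerate this same kind of simulation behind a similar high-level API:

    // Minimal Bullet Physics example: drop a sphere and step the world.
    // Builds against the open source Bullet SDK.
    #include <btBulletDynamicsCommon.h>
    #include <cstdio>

    int main() {
        // Standard CPU pipeline: collision config, dispatcher, broadphase, solver.
        btDefaultCollisionConfiguration config;
        btCollisionDispatcher dispatcher(&config);
        btDbvtBroadphase broadphase;
        btSequentialImpulseConstraintSolver solver;
        btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
        world.setGravity(btVector3(0.0f, -9.81f, 0.0f));

        // A 1 kg sphere of radius 0.5 m, starting 50 m up.
        btSphereShape sphere(0.5f);
        btVector3 inertia(0, 0, 0);
        sphere.calculateLocalInertia(1.0f, inertia);
        btDefaultMotionState motion(
            btTransform(btQuaternion::getIdentity(), btVector3(0.0f, 50.0f, 0.0f)));
        btRigidBody body(1.0f, &motion, &sphere, inertia);
        world.addRigidBody(&body);

        // Step at 60 Hz for one simulated second of free fall.
        for (int i = 0; i < 60; ++i)
            world.stepSimulation(1.0f / 60.0f);

        btTransform t;
        motion.getWorldTransform(t);
        printf("height after 1 s: %.2f m\n", t.getOrigin().getY()); // ~45.1 m

        world.removeRigidBody(&body);
        return 0;
    }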

Also announced was an initiative to promote open standards for stereoscopic 3D, currently the hot button among consumer electronics suppliers. AMD will be working with makers of stereoscopic glasses of all types (polarized, active shutter, and passive), as well as panel makers and middleware providers, to ensure hardware-independent access to stereoscopic 3D for gaming.
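What "hardware-independent" stereo ultimately has to standardize is per-eye rendering. As a rough sketch (my own illustration, not part of AMD's announcement), here is the common off-axis projection math a renderer runs once per eye:

    // Off-axis stereo sketch: each eye gets a camera shifted along the view's
    // right vector by eyeSign * eyeSeparation / 2, plus an asymmetric
    // near-plane frustum so both views line up at the convergence depth.
    #include <cmath>
    #include <cstdio>

    struct Frustum { float left, right, bottom, top; };

    // eyeSign: -1 for the left eye, +1 for the right eye.
    Frustum eyeFrustum(float fovY, float aspect, float nearZ,
                       float eyeSeparation, float convergence, int eyeSign) {
        float top   = nearZ * std::tan(fovY * 0.5f);
        float halfW = top * aspect;
        // Shift the frustum opposite to the eye offset so the two views
        // converge exactly at the screen (convergence) distance.
        float shift = -eyeSign * (eyeSeparation * 0.5f) * nearZ / convergence;
        Frustum f;
        f.left = -halfW + shift;
        f.right = halfW + shift;
        f.bottom = -top;
        f.top = top;
        return f;
    }

    int main() {
        // Example numbers: 60-degree vertical FOV, 16:9 panel, 6.5 cm eye
        // separation, screen plane 2 m away, near plane at 0.1 m.
        Frustum l = eyeFrustum(1.0472f, 16.0f / 9.0f, 0.1f, 0.065f, 2.0f, -1);
        Frustum r = eyeFrustum(1.0472f, 16.0f / 9.0f, 0.1f, 0.065f, 2.0f, +1);
        printf("left eye:  [%f, %f]\n", l.left, l.right);
        printf("right eye: [%f, %f]\n", r.left, r.right);
        return 0;
    }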

In addition to efforts promoting more open standards for physics and stereoscopic 3D, the company announced a certification program for Eyefinity, so that game developers can more robustly implement the massively multi-screen capabilities of AMD’s latest GPUs. Simply scaling a game up to a huge, six-screen surface isn’t all you need to do--that may be the easiest part of the puzzle. Game developers need to put more thought into user interface and input architectures when so many visible pixels are available.
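To make even that "easy part" concrete: stretching a single view across three or six panels distorts badly unless the game recomputes its field of view from the new aspect ratio. A small sketch of that math (hypothetical, not AMD certification code):

    // Derive the horizontal FOV for a given aspect ratio from a fixed
    // vertical FOV, so a triple-wide Eyefinity surface widens the view
    // instead of stretching it.
    #include <cmath>
    #include <cstdio>

    float horizontalFov(float verticalFovRadians, float aspect) {
        return 2.0f * std::atan(std::tan(verticalFovRadians * 0.5f) * aspect);
    }

    int main() {
        const float pi = 3.14159265f;
        float vFov = 60.0f * pi / 180.0f;  // typical vertical FOV
        // 16:9 single screen vs. a 48:9 three-wide Eyefinity group.
        printf("16:9 hFOV: %.1f deg\n",
               horizontalFov(vFov, 16.0f / 9.0f) * 180.0f / pi);  // ~91.5
        printf("48:9 hFOV: %.1f deg\n",
               horizontalFov(vFov, 48.0f / 9.0f) * 180.0f / pi);  // ~144.1
        return 0;
    }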

Comments
  • Onus, March 12, 2010 3:45 PM (+26)
    nVidia, please take a Bullet for the team. Let PhysX die, and embrace a shared, open standard.
    ATi, if it will help them swallow a bitter pill, do your 3D their way.
    To get the best features, I don't want to be limited to only certain games based on whose GPU I bought. You'll fracture the PC gaming market, and I really don't see how that is in anyone's interests.
  • rad666, March 12, 2010 3:56 PM (+17)
    Quoting jtt283: "nVidia, please take a Bullet for the team. Let PhysX die, and embrace a shared, open standard. ATi, if it will help them swallow a bitter pill, do your 3D their way. To get the best features, I don't want to be limited to only certain games based on whose GPU I bought. You'll fracture the PC gaming market, and I really don't see how that is in anyone's interests."

    I second the motion.
  • JohnnyLucky, March 12, 2010 4:07 PM (+4)
    Interesting developments. I don't think the major players really want to share unless it is absolutely, positively necessary.
  • bitterman0, March 12, 2010 4:18 PM (+1)
    Quoting jtt283: "nVidia, please take a Bullet for the team. Let PhysX die, and embrace a shared, open standard. ATi, if it will help them swallow a bitter pill, do your 3D their way. To get the best features, I don't want to be limited to only certain games based on whose GPU I bought. You'll fracture the PC gaming market, and I really don't see how that is in anyone's interests."

    Quoting rad666: "I second the motion."

    It's called "competition." And it's considered normal to have two or even more (in extreme cases) competing technologies vying to become the "standard." After a while only one technology remains and becomes a de facto standard. Nothing to get yourself worked up about, really.
  • falchard, March 12, 2010 4:24 PM (+8)
    Since nVidia does not have the performance crown, they most likely will be unable to push a closed standard like PhysX or nVision. Any game developer who uses such technology will do so at the cost of shrinking the pool of customers they can reach.
  • shin0bi272, March 12, 2010 4:36 PM (-5)
    I thought AMD didn't want anything to do with physics... are they scared of Nvidia's PhysX now or something? All of a sudden they are pushing for an open source standard, when 3 or 4 years ago, when Ageia was up for sale, they wouldn't touch it with a 10-foot pole. Seems AMD is scrambling to find a solution that will benefit them equally with their competitors, because they screwed up and are now covering their asses while pointing their finger at Nvidia, saying that a hardware-dependent solution is unfair, etc.
  • Trueno07, March 12, 2010 5:18 PM (+6)
    Ahhh, I love seeing this... the rebirth of the PC, and with it, new and flourishing competition.

    Makes me feel warm and fuzzy inside.
  • Onus, March 12, 2010 5:39 PM (0)
    Oh, I'm all for competition. Compete on price and performance though, not on a mutually exclusive feature set that forces uncomfortable choices. People complain about game quality now; how do you think it will get when developers know they're only writing for that portion of the market that uses {ATi | nVidia}? Ugly. Or compete on a value-add. Write a driver that uses one vertical column of pixels at each edge as a sort of "sound level meter," so those of us who are deaf in one ear (or entirely) will know where the sound is coming from; stuff like that.
  • Anonymous, March 12, 2010 6:36 PM (-5)
    I don't see any reason for nVidia to drop PhysX, since Bullet will run on the GeForce chips just fine using OpenCL or DirectCompute, but if the game supports PhysX then nVidia can get a boost in performance since that's specifically designed for their chips.
  • Anonymous, March 12, 2010 6:39 PM (0)
    Isn't the Xbox "Microsoft" Natal compatible with Windows? Wonder how many games will support it.
  • Anonymous, March 12, 2010 6:48 PM (-1)
    As for PhysX, watch it die in the next 2 years, mark my words. A quad core can fully handle the physics equations to trace explosions and bullet damage, and that uses less than 10% of the CPU power of a Q6600. The problem is the game developers who refuse to optimize those calculations for the CPU and rather do them on GPUs; why? You probably know the answer. We are already seeing 6 cores, to remind you, or if you just want the fastest, a 12-core Opteron x 4 sockets = 48 cores. Crysis uses CPU physics while we never see a CPU load higher than 50%. All consoles use CPU-directed physics processing.
  • Anonymous, March 12, 2010 6:50 PM (-1)
    Enabling PhysX on GPUs shows a big hit in FPS; GPU power is used more efficiently when doing only graphics and not damage calculations.
  • Anonymous, March 12, 2010 6:54 PM (0)
    Yea, the most exciting gaming hardware we will soon see will be Project Natal. Coders will find ways to use it creatively, like moving your mouse pointer with your fingers tapping in the air. I know it's not necessary, but that shows a BIG advancement in computer science. I will take one. That would be awesome for a media center with the big TV.
  • joe gamer, March 12, 2010 7:03 PM (+2)
    PC gaming always has an upsurge near the middle of the console "lifespan," because PC hardware grows in a linear fashion and consoles grow in spurts with ever larger lengths of stagnation in between. There is a convergence point where a PC can quite cheaply become much more powerful (and is always more versatile) than consoles.

    Far more terrifying for PC gaming, to my mind, is the proliferation of DRM, publisher greed, and investment stagnation. Everybody of course wants to blame piracy, but that argument doesn't really cut it for me. If I buy the game, I want to play it whenever, however I want. None of this "needs internet for single player" BS. This is all a symptom of games becoming more and more profitable; as publishers (who know jack squat about good games) have all the god damn money, they continue to put more limits on developers and crank out sequel after sequel, and movie-based abominations. It just seems as if everything is getting worse. I think there are what, three different studios with their fingers in the Call of Duty pie? Seriously?

    As games are made for a wider and wider audience, I just can't help but be reminded: most people are idiots. When I see Farmville being touted as "the most successful game of 2009," it hurts my brain, in the middle and a bit toward the back, right in the common-sense portion. I seriously worry about humanity as a whole when this is mainstream entertainment. Let me know when "Ow! My Balls!" airs on Fox so I can start climbing the clock tower.
  • mindless728, March 12, 2010 7:19 PM (-1)
    Quoting rad666: "I second the motion."


    Also, with this notion, not only does it lock out ATI cards, but also people with lower-end NV GPUs that can't handle both.

    Also, most physics can be done on the CPU just fine; look at Havok and the physics engine from Crysis.
  • hannibal, March 12, 2010 9:14 PM (+2)
    Fermi will most probably be a beast at PhysX, but is it enough? If game developers can sell more games because they use a standard that more customers can use, in this case Bullet, it is economically more useful to put time and money into it.
    Nvidia has good connections to game developers, like the Unreal 3 engine upgrade to 3D recently. Hard to say if they can monopolize physics or any other part of these new features. I personally hope for open standards and competition on speed. I am quite sure that Fermi could be very fast in the Bullet physics engine too, because of its compute features, but it would also allow ATI and Intel users to benefit from the same features. It would come down to the speed vs. cost factor then.
  • xcamas, March 12, 2010 11:04 PM (-1)
    Trying to separate me from my best friend is evil.
    What is evil needs to be corrected.
    If we can't correct it, we have one option: KILL IT BEFORE IT KILLS US.

    Let's kill PhysX before it kills us all.
    That's why AMD is pushing for a standard that will save even nVidia, and all gamers.
  • anamaniac, March 13, 2010 4:25 AM (+1)
    Quoting jtt283: "nVidia, please take a Bullet for the team. Let PhysX die, and embrace a shared, open standard. ATi, if it will help them swallow a bitter pill, do your 3D their way. To get the best features, I don't want to be limited to only certain games based on whose GPU I bought. You'll fracture the PC gaming market, and I really don't see how that is in anyone's interests."

    I understand your point, but the drama of actually having real reasons (other than performance) to choose one vendor over the other adds to the excitement. =)
    Quoting dreamphantom_1977: "Alright, anyone who thinks nvidia should make physx open to ati - read this: http://www.tomshardware.com/news/n [...] ,5841.html then read this: http://www.bluesnews.com/s/108344/ [...] x-comments then read this: http://www.ngohq.com/graphic-cards [...] esent.html then read this: http://www.extremetech.com/article [...] 555,00.asp

    Ati fanboys - in short, here is some education. Nvidia paid for physx. Nvidia was going to license physx to Ati extremely cheap. Ati didn't want physx - ati wanted havok. Nvidia wanted Ati to have physx - even on the cheap - so Nvidia built drivers that supported Ati's cards to run physx. Nvidia paid for the coding to include Ati's cards in physx. Ati still didn't want it. Anyone notice how much ati points fingers at nvidia? Nvidia got sick of Ati pointing fingers, and got sick of paying for ATI GPUs to be coded into physx WHEN ATI didn't want it.

    Ati didn't want physx because it wasn't open standards. But nvidia says anyone can use it and write their own software to use it - every system has it, even the iphone, wii, xbox 360, and ps3 - and it's free and legal to write your own software for it, as long as you license it, like just about any other program written. So, let's get this straight: Nvidia didn't BLOCK ati cards from using physx; ati didn't want it, so nvidia just stopped writing software to support ati cards. Ati doesn't care if physx is popular; they aren't gonna pay to license it because it's owned by nvidia, and they will do whatever they can to make nvidia look bad, including make stuff up. Ati doesn't want physX. That's why nvidia pulled support. Nvidia wanted ati to have support. Ati said no, even though it is the most popular physics software. WHY SHOULD NVIDIA WRITE CODE TO SUPPORT ATI IF ATI DOESN'T WANT IT?

    You all are blaming nvidia, but you should be blaming ati for caring more about money, and less about what gamers really want. Nvidia was willing to work with ati; ati said no; now everyone is mad at nvidia. If nvidia didn't buy physx, physx wouldn't be where it is today. NVIDIA has been paying to bring physx to games, because havok wasn't going anywhere and wasn't as popular, and physx was better. Why should nvidia have to pay for everything? If ati wants physx, they should have to pay too. The games that use physx are not blocking ati hardware, as you can see by using different drivers. Physx just doesn't know how to use ati hardware because nvidia isn't including the code (that ati should pay for) to take advantage of physx. The companies that use physx pay coders to code with physx because it's better. Nvidia pays programmers to code physx. Why shouldn't ati have to pay?

    Last question: if it only costs a few pennies - 'PENNIES per gpu' - to license physx and enable physx on your ati hardware, would you do it? Keep in mind that nvidia has to pay programmers to code it, support it, and implement it into the physx software. When was the last time you got free coding? Ati - quit thinking of money; you design graphics cards; you should support the gamers who buy your products. If nvidia didn't own physx today - if asus bought it - ati would have licensed it. It's just because it's their rival."

    I think I remember that.
    I would still prefer an open standard anyway. Too bad ATi didn't play ball though; I'd be satisfied with PhysX on my 5770.

    Something that has me interested, though: people running a computer with both a dedicated GPU (a 5770) and integrated graphics (GMA 4500) could use the crappy integrated chip for physics. You have an extra unused chip that should be able to handle physics better than a CPU, without the need for a second dedicated card, so why not? (Except for us X58 users with no integrated video.)
    Also, for multi-monitor setups, I'd like a true FOV across all monitors. I hate seeing the images stretched and zoomed horribly on my side monitors. Maybe have each monitor use its own unique point of view? (Such as FSX, where you can set up multiple points of view.)
    That'd be sick. It'd also motivate me to finally buy a second 5770.

    Happy gaming all.
  • cheepstuff, March 13, 2010 4:38 PM (+1)
    Quote: "Is this the future of gaming--turning the real world into a giant Skinner box, treating players like hamsters eager for the next virtual food pellet in exchange for real dollars? We can only hope for a backlash."


    Great analogy; this is a dangerous and yet predictable outcome of virtual entertainment. Western culture is based around the idea of happiness and entertainment, which means for the foreseeable future stuff like this will become popular. Hopefully the social changes won't be too severe.
  • JDFan, March 13, 2010 6:02 PM (0)
    Thing is, though, as long as Nv owns PhysX, they might say they will allow others to use it as long as they license it, but what do you think happens once it becomes a standard and games are written for it to the exclusion of other standards? Nvidia decides to raise the licensing fees to ridiculous amounts and tells everyone to either pay or stop using it!