
Can we get Tom's onto this bandwagon about Eyefinity/Surround?

January 21, 2010 1:20:22 PM

http://www.hardocp.com/article/2010/01/08/nvidia_3d_sur...
FTA: "For god’s sake NVIDIA, if you pull this bullshit where NVIDIA Surround "games" will not work on Eyefinity configurations, we are going to beat NVIDIA down repeatedly and publicly for harming the PC gaming industry. Keep those crappy proprietary PhysX policies, but if you start messing with OUR multi-display gaming and not letting it remain "open platform," I will personally lead the mob with the burning torches to the castle gates. And we will be fully prepared to use the torches. I will personally lead a boycott of NVIDIA products if I see NVIDIA harm multi-display gaming in the marketplace through an IP grab. Multi-display gaming belongs to gamers, not NVIDIA."

Please, someone from Tom's, pick this up and make it a front-page issue: support it and make sure there won't be another situation like PhysX, where available CPU resources aren't used.


I will bump this until someone from Tom's responds (either yay or nay). Who is with me?
January 21, 2010 1:32:33 PM

Nice try, I think, with 3D Vision and nFinity tech becoming one, but I think it's rather odd when I see the configuration of "surround monitors"... :) 
January 21, 2010 1:44:34 PM

Well, see, if they leverage TWIMTBP titles to disallow non-standard resolutions when an ATI card is plugged in (i.e., anything that isn't 1920x1200, 1024x768, etc.), because non-standard resolutions are precisely what ATI is doing with Eyefinity, then I would imagine it would be criminal. And even if they get sued, it would take years, and they might be able to blame it on a technical issue and just overwhelm the judge with technical info he didn't need.
January 21, 2010 1:57:06 PM

I agree; the more proprietary crap we get, the less we consumers end up with.
While these features are nice to have, at some point the devs have to take responsibility as well.
They cater to the low end, so why not the high end too? If they haven't figured out where the butter comes from for their bread, they may as well just quit making games for PC. And if nVidia sloughs it off and says it's up to the devs, well, they're also the ones pounding their chests about how much they do, and how often they do it, with those devs.
ATI had better pick it up as well, and make it common access, no flags.
January 21, 2010 2:11:39 PM

Aye, which is why I want to see this issue pushed by Tom's. Compared to [H]ardOCP or other places, Tom's seems to have the most readers AND the most, errrrr, first-time (yeah, first-time, let's go with that) enthusiasts, thanks to the forum and its tolerant nature.
January 21, 2010 2:16:53 PM

I like (respect) HardOCP and Kyle, but I don't agree with about half of what he says in his editorials or reviews. Good for him, but I don't think he even needed to write this piece, except maybe to hear himself bitch. He spews his bias with the "crappy PhysX" rant. After a couple of his video reviews, his attitude towards some things is obviously predetermined, and that turns me off.
Edit: I don't think any gaming output to multi-monitors would ever become non-proprietary (if that's the term you want to use), but Nvidia has been developing 3D and still is, and that will take some unique code (maybe) on behalf of a game. But you see, Kyle has already condemned 3D gaming as well.
January 21, 2010 2:20:52 PM

But he does present a valid point. TWIMTBP titles favor nVidia, and Batman showed us they were willing to take it to the next level by disabling universal features because an ATI card was present...

Most games can and will scale at oddball resolutions; it's not like people are asking them to include special textures for these Eyefinity resolutions, just that they work and aren't locked to Surround.
January 21, 2010 2:21:34 PM

It definitely turns me off as well, and is why I rarely go there, but a point's a point.
You can rant at them too, but even I end up reading some of their stuff.
January 21, 2010 2:32:09 PM

It's not part of Kyle's point, but I would think that "Eyefinity" is trademarked. What part of the technology (3 monitors, 1 resolution) is ATI's intellectual property? I don't know. Does Nvidia have a name for their multi-monitor solution yet? Can they do the same thing and just call it something different?
January 21, 2010 2:40:12 PM

Yeah, they do, and they will; except, if it works on other GPUs, will it? If it's flagged out, there may be a few more people defecting. This is a tough scenario for nVidia, since they've come in last.
Being first, as with PhysX, well, until there's real competition they can slide a little; not this time, though.
January 21, 2010 8:03:45 PM

One question: so you can have Nvidia's 3D Vision technology running with an ATI card, just like you can run PhysX off an Nvidia card with an ATI main? And the author of the article is scared that Nvidia will block such support when its drivers see an ATI card? Point being that I didn't know you could run an Nvidia card with an ATI main and still have 3D Vision, if I read that article correctly.

As a side note, though, I think it's great that Nvidia can enable its Eyefinity equivalent through drivers, allowing old cards access to it. And I also agree with the article that it is disappointing that the Fermi cards don't have 3 outputs.
January 21, 2010 8:20:27 PM

I'm very confused as to the 'how' as far as this goes. I don't understand how a game could be coded to stop a GPU from displaying it any way it wants to. Please explain.

Mactronix
January 21, 2010 8:23:45 PM

IMO this is just holdover tech until the economy picks up; a few years from now I'm going to be rocking a 34" curved OLED screen at 5000 by something that makes sense for my surround setup.
January 21, 2010 8:29:00 PM

Since the games in question are only Nvidia's TWIMTBP games, Nvidia has the ability to tell developers to add some code that checks whether an ATI card or its drivers are present on the system. I'm sure there's a way to do this; if you can open Device Manager or Add/Remove Programs and see whether an ATI card or driver is installed, so can a game.
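
Purely to illustrate the mechanism (a hedged sketch, not code from any actual TWIMTBP title): a Direct3D-era game can ask DXGI for the installed adapters and branch on the well-known PCI vendor IDs, 0x1002 for ATI/AMD and 0x10DE for nVidia. Something along these lines:

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

// Illustrative sketch only: walk the DXGI adapter list and report
// whether an ATI/AMD or nVidia GPU is present. A game could branch
// on the same information.
int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x1002)
            std::printf("ATI/AMD adapter found\n");
        else if (desc.VendorId == 0x10DE)
            std::printf("nVidia adapter found\n");
        adapter->Release();
    }
    factory->Release();
    return 0;
}

None of this needs any help from the driver; the vendor ID is sitting right there for any game that wants to look.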
January 21, 2010 8:40:48 PM

Quote:
That's true; however, this is nVidia. No doubt they could try to make sure that if an ATI card is detected then the available resolutions are limited, as they will have "IP claims" over any assistance in "making multi-displays work".


I see what you are saying, but I think it's a very thin case, even in the extreme; I just can't see a way of making it fly.
I think the guys in the link need to take a chill pill. Not having access to in-game AA because, according to AMD, Nvidia blocked it is an arguable case: "well, our techs helped with the game, and we never told them they couldn't include your code, but it seems you never provided it."
That's one thing, but if they try to stop the basic functionality of a competitor's card, then that's a see-you-in-court kind of case as far as I'm concerned.

Mactronix
January 21, 2010 9:50:53 PM

paperfox said:
One question: so you can have Nvidia's 3D Vision technology running with an ATI card, just like you can run PhysX off an Nvidia card with an ATI main? And the author of the article is scared that Nvidia will block such support when its drivers see an ATI card? Point being that I didn't know you could run an Nvidia card with an ATI main and still have 3D Vision, if I read that article correctly.

As a side note, though, I think it's great that Nvidia can enable its Eyefinity equivalent through drivers, allowing old cards access to it. And I also agree with the article that it is disappointing that the Fermi cards don't have 3 outputs.


How is that question relevant?
[H]ardOCP doesn't want nVidia to leverage the TWIMTBP program to make games incompatible with ATI Radeon cards in multi-monitor setups.
He doesn't care about PhysX & 3D.
January 21, 2010 9:53:54 PM

mactronix said:
I'm very confused as to the 'how' as far as this goes. I don't understand how a game could be coded to stop a GPU from displaying it any way it wants to. Please explain.

Mactronix


The game must support the resolution, which is why Half-Life doesn't work at 2560x1600. The game can also detect the hardware it's running on (that's how it sets the default graphical quality), so when it detects GPU A, dev X can code it to stop or just close.
January 21, 2010 11:53:09 PM

sabot00 said:
How is that question relevant?
[H]ardOCP doesn't want nVidia to leverage the TWIMTBP program to make games incompatible with ATI Radeon cards in multi-monitor setups.
He doesn't care about PhysX & 3D.

Well, given the limited compatibility to begin with, any failure of Eyefinity to display things properly is going to be blamed on TWIMTBP if the game has it, instead of on it just plain not working. ATI has yet to give CrossFire Eyefinity support across the board, which I find odd considering Eyefinity pushes the high resolutions CrossFire is usually used for.
January 22, 2010 1:52:19 AM

I'm surprised to see Kyle write something like this that I agree with. He's seemed to be fine with all the shenanigans in the past; it's nice to see something finally getting him to take a neutral stance, and not the 'what's good for the market leader is good for everyone' stance.
January 22, 2010 2:04:42 AM

This isn't exactly what I'd call neutral, straddling the fence, in the middle, or like terms.
He has made his position clear and rather forceful, a position with which I fully agree.

@IzzyCraft, all of my modern games work (Crysis, Bioshock, L4D, HL2, Assassin's Creed, Far Cry 2); the 9.12 hotfix added CF+Eyefinity for all CrossFire setups.
January 22, 2010 2:20:42 AM

mactronix said:
I'm very confused as to the 'how' as far as this goes. I don't understand how a game could be coded to stop a GPU from displaying it any way it wants to. Please explain.


A lot easier than you'd think.

It would only affect those too 'mainstream/naive' to 'fix' it themselves, but all you'd have to do is auto-detect the card and build the .ini like this (or, more likely, hidden amongst hundreds of lines):

...
[Display]
uVideoDeviceIdentifierPart1=222222222
uVideoDeviceIdentifierPart2=222222227
bForcePow2Textures=1
bForce1XShaders=1
bHighQuality20Lighting=1
bFull Screen=1
fDefaultFOV=75.0000
fMaxFOV=90.0000
bIgnoreResolutionCheck=0
.
.
.

and then the other one shows;

[Display]
uVideoDeviceIdentifierPart1=111111111
uVideoDeviceIdentifierPart2=111111116
bForcePow2Textures=1
bForce1XShaders=1
bHighQuality20Lighting=1
bFull Screen=1
fDefaultFOV=75.0000
fMaxFOV=270.00
bIgnoreResolutionCheck=1
.
.
.

Which one do you think would disable surround gaming on any hardware including eyefinity or Matrox's TH2GO?

Very easy to do, equally easy to undo, but then one is 'officially supported' and the others 'are not supported and may cause instability, blindness, anal leakage, and the apocalypse'?

I would be surprised if nVidia did so after this public statement, but then again, this thing still happens. :pfff: 
January 22, 2010 2:22:51 AM

sabot00 said:
This isn't exactly what I'd call neutral, straddling the fence, in the middle, or like terms.
He has made his position clear and rather forceful, a position with which I fully agree.


True, I meant a stance 'for neutrality', not a neutral stance.
Trying to type while watching the Daily Show. :whistle: 

January 22, 2010 2:24:39 AM

They could code this into the game engine, in a file much harder to open than a simple .ini; however, I doubt they will (and hope they won't) go to such lengths.
January 22, 2010 2:27:51 AM

I realize that; I'm just saying it's even THAT easy, let alone if they want to hide it deeper, like Batman's AA, etc.

January 22, 2010 2:52:34 AM

Well, as the co-founder/co-leader of an indie game studio, I know we won't intentionally lock out any hardware, and we won't implement PhysX.
PhysX doesn't make sense from a dev's standpoint: on the latest Steam survey, PhysX-capable GPUs are probably less than 10% of discrete cards, while practically the whole CPU crowd can run the same physics on the CPU. Plus nVidia's outlook isn't great, so their 60% market share will go down.
January 22, 2010 12:46:46 PM

Umm, OK, for those who don't know how Eyefinity is done, let's recap:

In a traditional multi-screen setup (well, OK, Radeon and GeForce; let's not bring in professional or other specialized cards, people), the card lets you set up 2, 3, or more monitors by extending the desktop onto the other monitors; hence there is a "set primary" option and you see multiple screens in the Windows desktop resolution settings. Now, some games, and especially most movie players, will ONLY play on the PRIMARY screen and not on any of the secondary screens at all, either because they can't due to technical issues or because of some other factor. In this case the OS handles the positioning of the screens.

What Eyefinity does is MERGE all the monitors into one giant monitor at the hardware (or driver) level and present it to the OS as ONE big PRIMARY-only display, which any game or video player should be able to take advantage of without extra coding (think about it: you can make a video window scale past your current screen if you really want to; it's just that full-screen support across multiple screens without the primary flag is buggy as hell on some players). As far as the software is concerned, there is ONE and only ONE screen plugged in (while there are indeed more), making it easier to interface with. The driver handles the positioning of each picture, not the OS.
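
You can actually see that single-big-surface behaviour from the application side with nothing but plain Win32 (a rough sketch, not tied to any particular game): enumerate the modes of the current display device and, with an Eyefinity group active, the merged resolution just shows up as one more mode in the list.

#include <windows.h>
#include <cstdio>

// Sketch: list the display modes Windows reports for the current display
// device. With an Eyefinity display group active, the merged surface
// (e.g. 5760x1200 for three 1920x1200 panels) appears as just another
// mode, so a game needs no special code to offer it.
int main()
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i)
        std::printf("%lu x %lu @ %lu Hz\n",
                    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    return 0;
}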

How could NV or someone else lock out this tech? By limiting the available display resolutions to specific numbers if an ATI card is in place: for example, limit them to 800 x 600, 1024 x 768, ... 2560 x 1600 (and maybe whatever comes after that) and that's it, so a weird resolution made up of six 1024 x 768 screens (e.g., 3072 x 1536 in a 3x2 grid) isn't going to be supported and only works with NV Surround.

That means a weird setup with, say, 5 screens or 4 screens (man, your crosshair would be right in the middle of the bezels...) or whatnot would be effectively locked out of the game or application, and you'd have to run at the highest standard resolution possible, which is just useless for Eyefinity.
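
To make that lockout concrete, here's a hedged sketch of the kind of filter being described; the whitelist, the vendor flag, and the function name are all made up for illustration, not taken from any real title.

#include <algorithm>
#include <utility>
#include <vector>

// Hypothetical illustration of the lockout described above: if an ATI/AMD
// adapter is detected, only 'standard' single-monitor modes survive, so a
// merged Eyefinity resolution never reaches the in-game resolution menu.
using Mode = std::pair<int, int>;   // width, height

static const std::vector<Mode> kStandardModes = {
    {800, 600}, {1024, 768}, {1280, 1024}, {1680, 1050},
    {1920, 1080}, {1920, 1200}, {2560, 1600}
};

std::vector<Mode> FilterModes(const std::vector<Mode>& reported, bool atiCardPresent)
{
    if (!atiCardPresent)
        return reported;               // NV path: every reported mode is allowed
    std::vector<Mode> allowed;
    for (const Mode& m : reported)
        if (std::find(kStandardModes.begin(), kStandardModes.end(), m) != kStandardModes.end())
            allowed.push_back(m);      // anything non-standard is silently dropped
    return allowed;
}

A few lines like that, buried in an engine, and the only 'fix' left to the user is editing config files, if the game even honours them.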
January 22, 2010 1:14:35 PM

Yep, and don't forget field of view: a surround setup limited to a 90-degree FOV would suck on a wrap-around screen setup. That's why I included both in my example, because either a resolution check/block or an FOV limit would kill most of the benefits of Eyefinity gaming, and definitely of surround-screen gaming.
January 22, 2010 1:27:04 PM

Yeah, the FOV would be an issue too, but only if it's offered on NV and not on ATI; if they don't offer it on either, well, never mind.
January 22, 2010 1:29:41 PM

theholylancer said:
How could NV or someone else lock out this tech? By limiting the available display resolutions to specific numbers if an ATI card is in place: for example, limit them to 800 x 600, 1024 x 768, ... 2560 x 1600 (and maybe whatever comes after that) and that's it, so a weird resolution made up of six 1024 x 768 screens (e.g., 3072 x 1536 in a 3x2 grid) isn't going to be supported and only works with NV Surround.

That means a weird setup with, say, 5 screens or 4 screens (man, your crosshair would be right in the middle of the bezels...) or whatnot would be effectively locked out of the game or application, and you'd have to run at the highest standard resolution possible, which is just useless for Eyefinity.


So you think that preventing the use of a widescreen resolution when an ATI card is detected wouldn't be that bad? Have you ever tried Eyefinity at 1920x1200 or something... everything stretches and the image looks ridiculous. And the FOV issue would be just as bad.
January 22, 2010 1:33:16 PM

Yeah, most games set a default FOV just as a standard starting point, but if they slip in an FOV max/cap, either in the .ini like my example or, as Sabot said, buried deeper, then they could limit one vendor but not the other, and we've got to make sure devs understand that's a no-no no matter what the IHVs want.

WideScreenGaming had a list of recommended FOVs, but it appears to be a dead link now. :??: 


January 22, 2010 1:36:26 PM

TheGreatGrapeApe said:
WideScreenGaming had a list of recommended FOVs, but it appears to be a dead link now. :??: 


http://www.widescreengaming.net/wiki/Battlefield_1942

"This setting defaults to 1 for a standard 4:3 monitor. Change to suit your aspect ratio:

1.33333 for 16:9
1.25 for 15:9
1.2 for 16:10 "

Might be applicable.
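
For what it's worth, those multipliers are just the new aspect ratio divided by 4:3, and the usual Hor+ math is easy to sketch; the function and the example setup below are illustrative only, not from any particular engine.

#include <cmath>
#include <cstdio>

// Sketch of standard Hor+ FOV scaling: keep the 4:3 vertical FOV and
// widen the horizontal FOV to match the new aspect ratio.
double HorPlusFov(double baseHFovDeg, double baseAspect, double newAspect)
{
    const double kPi = 3.14159265358979323846;
    double half = std::tan(baseHFovDeg * 0.5 * kPi / 180.0) * (newAspect / baseAspect);
    return 2.0 * std::atan(half) * 180.0 / kPi;
}

int main()
{
    // e.g. a 90-degree 4:3 FOV stretched over three 1200x1920 portrait panels (3600x1920)
    std::printf("%.1f degrees\n", HorPlusFov(90.0, 4.0 / 3.0, 3600.0 / 1920.0));
    // The BF1942-style multiplier is simply newAspect / (4/3),
    // e.g. (16/10) / (4/3) = 1.2, matching the list above.
    return 0;
}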
January 22, 2010 2:22:12 PM

RealityRush said:
So you think that preventing the use of a widescreen resolution when an ATI card is detected wouldn't be that bad? Have you ever tried Eyefinity at 1920x1200 or something... everything stretches and the image looks ridiculous. And the FOV issue would be just as bad.



Well, most games now have an .ini setting to fix it, but they could very well lock that too, eh...
January 22, 2010 2:30:31 PM

Ha ha ha ha ha....

No chance of Tom's joining this... Nvidia pays them far too much for that! :lol: 
January 22, 2010 3:22:22 PM

Well, if this keeps getting bumped for a few months, I hope we at least get a response.
January 22, 2010 6:12:00 PM

Well, they do look in from time to time, but they're rather busy, and many of them are former active forum members (like Don, Paul, Crash). The problem is that this is an editorial matter, and those duties are a little spread out.
January 22, 2010 6:21:02 PM

Does Tom check his own boards? :p 
January 22, 2010 6:41:00 PM

I think due to that field-of-view issue, people started recommending portrait multi-screen setups rather than landscape.
January 22, 2010 8:00:32 PM

So who is Tom?
January 23, 2010 12:59:34 AM

Yes, but it was very funny to those of us who've ever interacted with him way back when. ;) 
January 23, 2010 2:15:02 AM

Quote:
You assume anyone on the staff actually checks the forums; traditionally they have always been separate entities, although the comments section of articles may be different.

As Ape said, some do visit the forum. I think Chris has a look through after he publishes an article, since some threads get posted in addition to article comments. I don't know about any other time.
January 23, 2010 7:05:16 PM

Dang...

Well, who knows; if this stays up for a few months, then we should at least get a response in...