The New Graphics

October 18, 2006 10:51:40 AM

A tale of DirectX 10, and rumors of the hardware to drive it. While demand for DirectX 9 hardware is not slipping, and more graphics cards are constantly being launched, there is much interest in this new standard and the hardware that will support it.

October 18, 2006 11:55:50 AM

A good article, hopefully both these cards stand up to the hype
October 18, 2006 12:10:01 PM

Quote:
That is, until the game developers incorporate more objects into a scene, of course, as we have seen in previews of games such as the Age of Conan, where we will finally be able to cut the limbs of an opponent - like the black knight in Monty Python and the Holy Grail.
Not true. Check out the game Rune. You could chop off heads and arms and beat your enemies with them. :twisted:
October 18, 2006 12:22:39 PM

I'm waiting for the game where I can rip a guy's head off and shove it into his arse.

Then I can die peacefully in gaming oblivion.
October 18, 2006 12:35:04 PM

These cards should work very well in OpenGL 2 on Linux.

DX10? What's that? Oh, yes - a fancy name for a GLSL interpreter on a system that finally dissociated objects, graphics hardware and the main loop.

Seen under *nix with OpenGL, DRI, X11R6.8. Sooo innovative.

Mind you, these cards should really kick @$$.
October 18, 2006 12:35:50 PM

Quote:
That is, until the game developers incorporate more objects into a scene, of course, as we have seen in previews of games such as the Age of Conan, where we will finally be able to cut the limbs of an opponent - like the black knight in Monty Python and the Holy Grail.
Not true. Check out the game Rune. You could chop off heads and arms and beat your enemies with them. :twisted:

You could even do that in Jedi Outcast (by entering a cheat :wink: ).
It was so fun!!! :twisted:
October 18, 2006 1:32:51 PM

>Run very well in OpenGL 2 for Linux

They probably will. Of course, you'll be running the nVidia drivers... which have security holes :) 

>GLSL interpreter

Please, OpenGL != Direct3D. They are totally different 3D standards.
October 18, 2006 1:40:51 PM

...and both include shader languages that actually map almost exactly one to the other - dixit the wine-devel mailing list.
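
Here's roughly what that mapping looks like for a trivial fragment/pixel shader - a minimal sketch, assuming era-appropriate GLSL 1.10 and D3D9-level HLSL (ps_2_0), with the two sources held in C++ string constants whose names are mine, purely illustrative:

    // GLSL fragment shader (OpenGL 2.0): sample a texture, darken it
    const char* glsl_frag =
        "uniform sampler2D tex;\n"
        "varying vec2 uv;\n"
        "void main() {\n"
        "    gl_FragColor = texture2D(tex, uv) * 0.5;\n"
        "}\n";

    // The near word-for-word HLSL equivalent (Direct3D 9, ps_2_0)
    const char* hlsl_frag =
        "sampler2D tex;\n"
        "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
        "    return tex2D(tex, uv) * 0.5;\n"
        "}\n";

uniform becomes a global sampler, varying becomes an input semantic, gl_FragColor becomes the COLOR output, texture2D becomes tex2D - the rest is punctuation.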

As to Direct3D and OpenGL being completely different 3D standards: maybe at one time, but less and less so.

Makes you wonder why Microsoft didn't just implement the OpenGL language from the get-go and dub it MSGL (since the specs are open; only the name is trademarked), like 'others' did with Mesa.

As to the driver, well, true - for now. There's been some progress toward accelerating most (if not all) recent Nvidia hardware in 3D in the 'nouveau' project. When it works I'll probably switch - less hassle.
October 18, 2006 2:00:06 PM

So, basically, you agree with what I said?

OpenGL and D3D are becoming different languages to address the same features. D3D is not an interpreted OpenGL.


>makes you wonder why Microsoft didn't just implement the OpenGL language from the get go

Because they're Microsoft!

>driver

Yeah, the benefit of the OSS community is that they can fix said problems. Sadly the nVidia drivers aren't OSS and Nouveau isn't *quite* there yet, but we shall see!


(incidentally, I'm not a Microsoft apologist; though I use XP for the desktop, I am a Solaris sysadmin, I develop on a Kubuntu box, I have a Smoothwall firewall, and my laptop runs... well, whatever is flavour of the month. I've had Knoppix, Kubuntu 5/6, FC 3/4 and RH9.2 on it in the last 2 years) :) 

Next project: MythTV... *shudder* ;) 
October 18, 2006 2:13:19 PM

Quote:
That is, until the game developers incorporate more objects into a scene, of course, as we have seen in previews of games such as the Age of Conan, where we will finally be able to cut the limbs of an opponent - like the black knight in Monty Python and the Holy Grail.
Not true. Check out the game Rune. You could chop off heads and arms and beat your enemies with them. :twisted:

also, the game Blade of Darkness had this feature. It came out before rune. I love that game.

*correction* blade of darkness came after rune. my fault.
October 18, 2006 3:01:49 PM

Quote:
A good article, hopefully both these cards stand up to the hype


That was a mediocre article; all it said was "now we have geometry shaders." Two pages, and that's all the info there is. I guess it isn't a terrible summary of where things are, but anyone who even knows what a G80 is knows that already.

Quote:
So, basically, you agree with what I said?

OpenGL and D3D are becoming different languages to address the same features. D3D is not an interpreted OpenGL.


>makes you wonder why Microsoft didn't just implement the OpenGL language from the get go

Because they're Microsoft!


A little thing we like to call vendor lock-in. Apparently Microsoft actually left the OpenGL review board in 2003. I would guess that's probably right about the time they started developing DX10.

I would like to see them all get back together, agree on one standard and have everyone use it, but it won't happen, because MS is making too much money to give up DX now.
October 18, 2006 3:03:22 PM

>vendor lock-in

I kinda implied this with "Because they're Microsoft", but ok! :) 
October 18, 2006 4:00:11 PM

and I'm completing a full switch to Linux - because I like to spend my cash and time on stuff that deserves it.
October 18, 2006 4:22:55 PM

So, then, is it true that the new G80s might not use the DX10 engines?
October 18, 2006 5:04:29 PM

Oh, it probably will - it's just that while Nvidia keeps going with fixed-role shader units (pixel OR vertex, fixed in hardware), Ati will keep churning out mixed-role shader units (pixel OR vertex depending on the case). Now, since DX10 was developed in close collaboration with Ati, it would stand to reason that Ati would make 'real' DX10 cards, while Nvidia won't.
Considering that vertex shaders can in fact be emulated by the CPU at little cost, graphics cards may end up including only pixel shaders, and not waste cycles on emptying a shader unit's pipeline, changing its role, then refilling its pipeline to do it all over again.
Frankly, the DX10 'revolution' happens more at the driver level than at the hardware level - DX10 cards won't be fundamentally different from DX9 cards, merely adding a handful of functions to justify the upgrade.
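
(To put the CPU-emulation point in concrete terms: D3D9 already offers a software vertex processing mode, and at its core it is just this kind of loop. A minimal sketch, with illustrative names that come from no real driver:

    #include <cstddef>

    struct Vec4 { float x, y, z, w; };
    struct Mat4 { float m[4][4]; };  // row-major

    // Apply the model-view-projection matrix to every vertex on the CPU -
    // the core of what a fixed-role vertex shader unit would otherwise do.
    void transform_vertices(const Mat4& mvp, const Vec4* in, Vec4* out, std::size_t n)
    {
        for (std::size_t i = 0; i < n; ++i) {
            const Vec4& v = in[i];
            out[i].x = mvp.m[0][0]*v.x + mvp.m[0][1]*v.y + mvp.m[0][2]*v.z + mvp.m[0][3]*v.w;
            out[i].y = mvp.m[1][0]*v.x + mvp.m[1][1]*v.y + mvp.m[1][2]*v.z + mvp.m[1][3]*v.w;
            out[i].z = mvp.m[2][0]*v.x + mvp.m[2][1]*v.y + mvp.m[2][2]*v.z + mvp.m[2][3]*v.w;
            out[i].w = mvp.m[3][0]*v.x + mvp.m[3][1]*v.y + mvp.m[3][2]*v.z + mvp.m[3][3]*v.w;
        }
    }

A few million multiply-adds per frame is small change for a modern CPU - that's the 'little cost' part.)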

Remember when Nvidia released the 'Ti' series in the GeForce2 range - and they turned out to be the exact same cards with a better driver? Well, it's pretty much the same thing here.

You may not expect your DX9 card to run DX10 games with all the eye candy, but I'll be damned if there's no gifted demo maker able to make a demo run so similarly on DX9 and DX10 hardware that no one would notice the difference.
October 18, 2006 5:44:19 PM

Well, I am expecting the GeForce FX fiasco to repeat itself, for some reason.
October 18, 2006 5:52:11 PM

I too was a bit underwhelmed by the article :/  It was pretty brief; I would've liked it to at least mention things like the early and mainstream games that will take advantage of DX10 (Crysis, for example). They could've also told us when DX10 hardware will come out from ATI and nVidia... or when DX10 will be out, period! They also didn't mention compatibility between XP and DX10, or the lack thereof. I tend to prefer my THG articles to be >5 pages with lots of nummy details.

Ah well, beggars can't be choosers as they say. Though if any THG people are reading this, I at least would appreciate it if this article had a bit more meat to it.
October 18, 2006 7:33:11 PM

well, like they said at the start of the article... they put as much meat in there as they could. Frankly, as stranger mentioned, there is not much known beyond what we have all read on these forums already. Kinda like the calm before a storm. Once the hardware hits, there will be tornadoes of reviews claiming +27 1337 performance for the g600ultraXTX that blows the doors off of "last gen" hardware.

Not saying that those won't be valid claims, as those upcoming cards will be smokin'. But just like previous generations of advancements, it will be a while before a rockin' DX10 game comes out that truly taxes them. Remember DX8 and the whole shader "revolution"? It took forever for games beyond tech-demo quality to surface. Watch and you will see that DX9 will still be predominant through 2007. So let it be written, so let it be done. ;) 
October 18, 2006 8:41:55 PM

How much better will DX 10 cards perform on DX 9 games vs. DX9 cards?
October 18, 2006 9:00:52 PM

[consults magic 8-ball]

"Answer unclear Ask later"

[scratches head bewildered]
October 18, 2006 10:02:19 PM

You guys are assuming Windows will cost more, which is why you're just going to go to Linux and "spend money on those who deserve it".

Like most of you guys aren't pirates anyway. lol
Windows is... ahem... free.

lol. Actually, I own more than 8 copies of Windows XP Home (OEM) and 2 copies of Pro (1 OEM). They just come with the computers... I don't care. I think it's BS that they can't be transferred, though, but it's just as easy to download a copy of XP off the internet...

As far as rendering engines... I can't remember the last game I played that had the D3D logo on it. I think it might have been Star Wars: Rogue Squadron... a fun game.
October 18, 2006 10:11:06 PM

Quote:
That is, until the game developers incorporate more objects into a scene, of course, as we have seen in previews of games such as the Age of Conan, where we will finally be able to cut the limbs of an opponent - like the black knight in Monty Python and the Holy Grail.
Not true. Check out the game Rune. You could chop off heads and arms and beat your enemies with them. :twisted:
Ahh, the memories. I should dig it out and play it again. I loved the hammer - smashing my foes into meatwaffles. Or the freakshow that Odin grants you after what seems like endless skeleton soldiers in that pit.

I can feel the bloodrage surging to my balls as we speak!

Quote:
I'm waiting for the game where I can rip a guy's head off and shove it into his arse.

Then I can die peacefully in gaming oblivion.

So far the closest I've come to that was watching the cutscene in Duke Nukem 3D where Duke rips the head off the final boss and proceeds to sh!t down his neck.
October 18, 2006 10:58:05 PM

I was referring to the fact that even when you own the OS, you sometimes need to resort to other methods. I meant no offence.
October 19, 2006 3:44:17 AM

Quote:
Although Ageia is gaining ground


Ageia is gaining ground? O_o I must have missed that; I thought it was screwed... (I was going to put some analogy there, but the only ones I could think up were a bit tasteless, so I restrained myself :p )
October 19, 2006 6:16:29 AM

Quote:
"Frankly, as stranger mentioned there is not much known beyond what we have all read on these forums already"

That's the problem then ...you guys should read more than just these forums.

These other guys already have a g80 and have released a few interesting bits ...like the fact that the g80 DOES have unified shaders:

http://dailytech.com/article.aspx?newsid=4442
http://dailytech.com/article.aspx?newsid=4441


hmm... if you actually believe that no one here has read that, then you are a bit narrow in the mindset... we have already seen it. Maybe if you were half as boned up on reading as you think you are, then you would know that until the animal is actually seen in production, instead of some grainy bigfoot pics, most of the info that comes around like that is FUD.

Nv and ati both spread disinformation on new products regularly. The non-unified aspect is from a more solid foundation but is nowhere near 100%... that Nv has lately been touting in their press releases that unified is not the way we need to go yet only moves it closer. Ati has already made a unified part, so that is partly why Nv talks it down; but based on history they will wait another gen before they jump to fully unified, otherwise they are contradicting themselves - and that can't happen for another generation. ;)  (JMO of course)

Regardless, while dailytech can bring some good scoops, they can also put out unreliable info. Everything on this subject is rumor; you just have to separate the wheat from the tares. The truth will be fully known only when the card is revealed.

Myself and others on here have been wrong before, so don't take this as an "I know all" statement. And please, don't believe everything you read from the rumor mill.
October 19, 2006 7:38:18 AM

I have Win98SE, Win2k and WinXP Pro licences. I have never downloaded a pirated Windows version - my previous job let me have a go at enough original CDs to care :D  and I know how to 'upgrade' those with a text editor and a command line.

I use Linux anyway.
October 19, 2006 8:31:30 AM

I owned 95, 98, ME and a copy of Office 2000 Enterprise - all still wrapped and never used. Of course I pirate software, but for the most part it's the "see if I even want it" kind. When it comes to Windows and MS software, however, I might own a real copy but I sure as hell am not using it. I think I have a copy of Windows XP Pro somewhere, but I use the RTM cracked-all-to-hell version that doesn't blow up when I change stuff in my computer. Personally I don't care if MS or anyone has a problem with that; I don't want to deal with the stupid reactivation BS since I'm constantly changing out hardware, upgrading and rebuilding my computer.

Needless to say, if I could just "plug and play" when it came to games, I would be using something like Linux. Since I like things like Office and gaming, I'll stick with Windows as long as I'm not required to use the real copy. The last game I got that plays on OpenGL, though, I think was Half-Life :-/ I really miss OpenGL. Not sure if it's the same anymore, but back then running in D3D made everything look like crap - distorted transparencies and reduced frames just by using D3D.
October 19, 2006 9:32:13 AM

I still have an Office 2000 licence somewhere - never used (when I got it, Office 97 was much more stable, and after that I switched to OpenOffice). My big bro was the one coming home with pirated software, and now I just don't bother (I don't need to - FOSS software rulez).
October 19, 2006 1:29:47 PM

hehe... so you are telling me dailytech is openly lying when they say they have the g80 in their hands?

We have only a few weeks to go now. You can bet this thread will be revisited.

:) 
October 19, 2006 2:49:52 PM

Quote:
hehe... so you are telling me dailytech is openly lying when they say they have the g80 in their hands?

We have only a few weeks to go now. You can bet this thread will be revisited.

:) 


not calling anyone a liar here; re-read what I posted... just saying that until the real product comes out to market there is little way of knowing which rumor is true. "Most" analysts have stuck to the point that Nv's part will not be "fully" unified. I will look for your revisit for sure. If I am wrong I have no issue admitting it (wonder if you would do the same...?), but we will see when the sasquatch is revealed... is it a monkey suit or a lost species? I for one would be happy either way (one to see Nv go the way of the future and not stay in the past, the other to know that there is consistency in how the market has been working).
October 19, 2006 4:43:20 PM

Windows XP barebones activated install.

From Linux, dd the Windows XP barebones activated install partition to an .img.

Copy .img to DVD

Customize Windows as I see fit.

New hardware, Re-install????

dd the Windows XP barebones activated .img back to the Windows partition.

You should be able to run Windows Recovery to adjust for the new hardware without having to re-activate but I haven't tested that yet.

On a similar note, you could do the same thing with G4L, but it wouldn't recognize my RAID when I had one, so I went with the above method (TY linux_0). G4L would be much easier because it wouldn't require a Linux distro to be installed, but since I use Linux for everything else and XP for games only, it works out OK for me.
October 19, 2006 5:20:12 PM

Congrats on your 3000th post. Have you yourself become unified with all those posts yet?
October 19, 2006 5:25:52 PM

yup, like I said... FUD until we actually see it and the "official" specs are released.
October 19, 2006 5:30:07 PM

Quote:
Congrats on your 3000th post. Have you yourself become unified with all those posts yet?

Wow 3000 posts... I just got past 400 and was kind of proud haha.
October 19, 2006 5:41:05 PM

Actually, according to DaSickNinja, post count = computer knowledge. So at 3000, you are many times more informed than I am....


Time to get back on track.


Does this scenario sound familiar? Every new cool gadget that comes out, we have this discussion. Someone says they got a demo product, and has ALL the specs and programs needed to test it. Nobody's gonna get it until it's for sale... and then we will know for sure...


Reminds me of all the hype behind Windows ME... until it was actually available, that is.

DaSickNinja.... I'm j/p
October 19, 2006 5:43:19 PM

I don't get you guys:

"DailyTech received its first looks at a GeForce 8800 production sample today, and by the looks of it, the card is a monster: at least with regard to size and power requirements.
"

How does this remotely equate to just spreading around rumours we have already seen?

These guys actually have the g80. So, like I said... you have to call them liars if you use the word rumour and dismiss what they are saying as FUD. Guess what? You have (called them liars) :) 

Most rumours have qualifiers like... speculate/imply/think/possibly/maybe... etc...

Dailytech is not holding back... they have made absolute statements. They have a g80 sample... the g80 has unified shaders (up until that point, every other source had speculated that the g80 does not have unified shaders, based on the notion that nvidia has been dismissing the need for them).

Now, as far as the other post about admitting if I am wrong... I will assuredly say I was wrong in believing dailytech's word that they had the g80 in their own little mitts, if time reveals the info to be false. For now though, I accept their word at face value - especially as they use absolute phrasing. I consider them well above tabloids like the Register. I am not one to discount EVERYTHING just to be safe :) 

I will be more than a little displeased, to be sure, that my trust was misplaced should it transpire that dailytech lied... After all, I have no insider connections and so rely on places like Tom's and anandtech etc. to provide me with useful info.

edit: now, since ati has just released a card but changed its specs from the review samples... there is some wiggle room on that front. Though dubious, it is possible nvidia might be playing a game.
October 19, 2006 6:20:36 PM

happy place time man. (think midgets, women in lingerie and beer... lol)


seriously though, you are right. w/o a test you have to go on someone's word. Plus, engineering samples are not final, so all the pics showing water cooling and such are not "for sure" things. The chip otoh is kind of a nebulous thing. So far both Nv and ati have put all reviewers under NDA in the past w/ new reviews and then let them all off at the same time. So if dailytech has this "sample", then how did they get it? If it was an official Nv release, then they would be under NDA if it was true (or else other review sites would have "previews" too)... and if not official, then how can you trust the info w/o a test?

Like I said, it would be cool if it was unified... but I really doubt it to be true that it is. We shall see...


FUD
October 19, 2006 6:57:56 PM

I'm thinking about all of what we're hearing/reading here. Maybe, yes maaaaybe it's a sample, and maaaybe it's nowhere near a final product, BUT it incorporates many of the final product's hardware/abilities. I believe Anand, as I would something from Tom's. NDA is more for the final product end, and maaaybe since this isn't a final product they can write a lil about it. Sort of like an appetiser, so to speak, NOT the main meal IMHO
October 19, 2006 7:51:24 PM

Just so we are clear, my response was to you because of your:

"there's one thing thaty bugs me. how would dailytech know it has them specs. AFAIK all that crap, oh and by the way for now it is, was just taken from the same supposedly leaked specs that have circulated for a few weeks."

Here you directly state dailytech got there info "from the same supposedly leaked specs ". (repeated for emphasis)

So I'm like, no way dude... they have the g80 part, as they clearly state.

:) 
October 19, 2006 8:30:49 PM

they have to be leaked specs b/c how else would they know? you cannot look at a chip and tell if it has x pipelines or y shaders, let alone whether those shaders are unified... so unless they had a test for it (which stranger pointed out is non-existent right now), the specs must have been leaked.


seems to be straightforward logic to me... I am not condemning dailytech, mind you (they can get some wrong just as they can get some right), just saying that as of now it is unsubstantiated rumor and nothing more.
October 19, 2006 8:41:55 PM

Quote:
happy place time man. (think midgets, women in lingerie and beer... lol)


LOL! Glad I didn't even bother with this thread; I would've been cussin' sooner.

Quote:
seriously though, you are right. w/o a test you have to go on someone's word. Plus, engineering samples are not final, so all the pics showing water cooling and such are not "for sure" things. The chip otoh is kind of a nebulous thing. So far both Nv and ati have put all reviewers under NDA in the past w/ new reviews and then let them all off at the same time.


And the thing is, hands-on doesn't matter unless they have an X-ray machine and the ability to understand what each group of transistors is, and I doubt Dailytech has that on hand. So whatever information they are providing about the design is almost certainly nV-provided. Unless they've run tests to see fill rates, and tested certain pixel-specific and vertex-specific loads (not sure if you can do geometry-only tests, but you could likely weight them more/less). You'd need a lot of tests to know the min/max, and whether or not it was unified. Even then it would depend on whether it had a good internal dispatcher or if it needed to flag pixel, vertex, and geometry in software. So even now they might not be able to tell for sure.

It could still be a 'hybrid' and called unified by nV, and thus by anyone reading off of nV's launch info.

128 'unified shaders' could mean 96 dedicated pixel ALUs in 48 traditional 2-ALU 'pipes' and then 32 unified vertex/geometry shaders, equivalent to 16 traditional geometry units. We don't even know what constitutes a 'shader' to them (Full+Mini ALUs?).
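
To make the fixed-vs-unified difference concrete, here's a toy model I knocked up - purely illustrative numbers, nothing to do with actual G80 specs - of how long a frame takes when the ALU split is fixed versus pooled:

    #include <algorithm>
    #include <cstdio>

    // Toy model: V units of vertex work, P units of pixel work per frame.
    // A fixed design partitions its ALUs up front; a unified design (with
    // an ideal dispatcher) throws the whole pool at whatever the frame needs.
    float fixed_split(float V, float P, int vertex_alus, int pixel_alus)
    {
        // Frame is done when the slower partition finishes; the other idles.
        return std::max(V / vertex_alus, P / pixel_alus);
    }

    float unified_pool(float V, float P, int total_alus)
    {
        return (V + P) / total_alus;
    }

    int main()
    {
        // A pixel-heavy frame: the fixed design leaves its vertex ALUs idle.
        std::printf("fixed:   %.2f\n", fixed_split(10.0f, 500.0f, 32, 96));  // 5.21
        std::printf("unified: %.2f\n", unified_pool(10.0f, 500.0f, 128));    // 3.98
        return 0;
    }

Skew a synthetic load hard toward pixels or vertices and watch whether throughput scales with the whole pool or only a partition - that's about the only outside test that could tell 'unified' from 'hybrid'.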

Quote:
Like I said, it would be cool if it was unified... but I really doubt it to be true that it is. We shall see...


I agree. Right now I think it's likely still a hybrid based on their early statements, and I think the term 'unified' is an overestimation of its abilities to try and take away any PR sting of 'hybrid'.

Quote:
FUD


Maybe, but more likely fed by nV, or a 'guesstimation' by DailyTech on what limited info they had and what they couldn't possibly ever know without running a ton of tests (BTW, how does anyone think they'd know the 'true' number? I doubt RivaTuner or any other tool is anywhere near accurate at reading a chip that complex).
October 19, 2006 9:55:15 PM

Quote:
FUD


Maybe, but more likely fed by nV, or a 'guesstimation' by DailyTech on what limited info they had and what they couldn't possibly ever know without running a ton of tests (BTW, how does anyone think they'd know the 'true' number? I doubt RivaTuner or any other tool is anywhere near accurate at reading a chip that complex).

agreed. I only say "FUD" when I am more focused on the UD portion of it, which (like I mentioned earlier) can be (and has been) fueled by Nv (and ati... all of them do it). They need to keep the competition in the dark and create hype for sales.

oh, and I think I have a pair of x-ray goggles that I bought as a kid stuffed in a drawer somewhere... maybe that is what they used to see how the chip is set up? lol ;) 
October 20, 2006 1:01:42 AM

A very interesting post, and one that made me think more on it :) 

Reading on in the comments (g80 series) again, I find... dailytech "clarifies" the 128 unified shaders with:

"By KristopherKubicki on 10/5/2006 1:05:16 AM , Rating: 3

NVIDIA defines the 8800GTX with 128 "stream processors." Each shader has a clock of 1350MHz, but the core clock is 575MHz. We will detail more on the architecture later today. "
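
(Back-of-the-envelope from those figures, assuming each "stream processor" retires one multiply-add - 2 flops - per clock, which is my assumption, not anything dailytech stated: 128 x 2 x 1.35 GHz ≈ 345.6 GFLOPS of shader throughput, independent of that 575MHz core clock.)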

and of course, they never posted more as promised after that initial Oct 5th piece.

If NDA had caught up with them, then you would expect the Oct 5th stuff to be deleted as well.
October 20, 2006 5:17:29 AM

Quote:

agreed. I only say "FUD" when I am more focused on the UD portion of it, which (like I mentioned earlier) can be (and has been) fueled by Nv (and ati... all of them do it). They need to keep the competition in the dark and create hype for sales.


Yep, I mean how many reviews simply post the numbers ATi and nV provide, complete with the provided slides? I understand it's nigh on impossible for them to come up with their own figures (still troublesome getting access to that X-ray machine or tunneling electron microscope, and then figuring out what those gates & interconnects are supposed to do).

The thing is though, as we've seen in the past, the PR leading up to the launch is usually more misleading than even the speculation in the forums.

I don't doubt there'll be a ton of surprises, but the question comes up: which one was/is the misleading FUD, the stuff they said 6 months ago or the stuff they're saying now? I tend to believe the older statements about a hybrid chip, because that statement wasn't about to get ATi to change their architecture just because nV said it wouldn't be unified; calling the G80 unified instead of a hybrid, however, has a more realistic purpose IMO. But of course only time will tell, as always.

Quote:
NVIDIA defines the 8800GTX with 128 "stream processors." Each shader has a clock of 1350MHz, but the core clock is 575MHz. We will detail more on the architecture later today. "


Yeah, that sounds like a lot of double talk, since a current SIMD design would be a stream-type processor as well; it doesn't provide much additional information. As for the seemingly 'asynchronous' clocks, that will be something interesting to look at for sure.
October 24, 2006 9:22:43 PM

I just want to know when I can acquire a new DX10 video card so I can finally see what the fuss is about with Microsoft FSX :-)
October 24, 2006 10:33:39 PM

It looks really sweet at max settings at 1920x1200; wish I had a good joystick though D:
October 24, 2006 10:42:01 PM

Quote:
It looks really sweet at max settings at 1920x1200; wish I had a good joystick though D:



Still, I can't wait to see it on a DX10 card.