
Intel and DirectX 10 support--is ATI/AMD in serious trouble?

Tags:
  • Graphics Cards
  • AMD
  • Directx
  • Intel
  • Graphics
February 27, 2007 2:52:36 PM

http://www.tgdaily.com/2007/02/23/intel_crestline_graph...

Okay, so now Intel is stating that the integrated video they're releasing this May will be fully DirectX 10 compliant. Obviously, it is not supposed to be superior to, say, the GeForce 8800 series, but let's remember something: AMD announced that their new DirectX 10 video cards are coming out this May as well.

What would the consequences be if AMD didn't release its new cards until AFTER Intel has the new chipsets? Again, I realise there is a huge difference in specifications, etc., but come on, Intel is beating ATI to DirectX 10? That is embarrassing!!! It is time ATI moved forward, because this AMD acquisition looks like a dumber move with every day that passes, I feel. (sigh) I warned AMD in an email that I feared this would happen--I never expected them to listen or anything, but they responded and assured me that the transition would help them develop DirectX 10 faster than ATI could have on its own. I was skeptical, but of course felt they knew more than I do about these sorts of things. I was wrong, apparently.

AMD is in serious trouble, I hate it. I use NVIDIA and AMD together because to me they were practically made for each other, and I still feel AMD made a horrible mistake with ATI. I hope they prove me wrong, I really do, but right now, it is not looking good at all.


February 27, 2007 3:45:05 PM

Perhaps you haven't noticed, but Intel is the largest graphics manufacturer in the world (by unit volume, thanks to its integrated chipsets).

And yes, you are equating an IGC to a dedicated GPU card. Really, man, you need to take a deep breath and step away from the PC for a little while. ATI has had DX10 on one of those in the Xbox 360 since 2005.

ATI is far from being in any sort of serious trouble from Intel for years to come. In fact, the opposite is true: Intel is the one under pressure from ATI, because ATI is pushing into the IGC market, which Intel has had a stranglehold on for a very long time.

As far as dedicated GPU cards are concerned, Nvidia is the only direct competition for ATI, and from the looks of it, Nvidia's lead will only last until the release of the R600.
February 27, 2007 3:51:26 PM

A minor point, perhaps, but the 360 doesn't actually use DirectX10. It's more than DX9, but not full DX10.
February 27, 2007 4:02:29 PM

I'm not so sure. They say it's using a many-core architecture, which makes me think it's x86-based processors. And only 16 cores on a chip!?

They took a very general-purpose architecture and are trying to mould it into a graphics engine. This means they're paying latency, power, and die area for x86 decode. What a waste! And how can 16 cores perform 16x better than an 8800? An 8800 has 128 specialized cores running at several hundred MHz, within a specialized connection fabric. It's a stream processor.

Most graphics computations are 4-vector based, because they're working with transform matrices, point vectors and RGBA pixel values. Trying to do more work than one or two 4-vector operations per cycle in a generalized manner seems pretty complicated, but that's what they would have to do in order to compete. Each individual core would need a lot of FP processing power. I'm guessing it's also going to be running at a pretty high clock speed.
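To put numbers on that: one vertex transform is a 4x4 matrix times a homogeneous 4-vector, i.e. 16 multiply-adds. Here's a minimal C sketch (purely illustrative, not any vendor's code) of the operation a shader unit grinds through millions of times per frame:

    #include <stdio.h>

    /* One 4x4 transform applied to one homogeneous point (x, y, z, w).
       GPUs do millions of these per frame, which is why hardware built
       around 4-wide FP units maps onto graphics so naturally. */
    typedef struct { float v[4]; } vec4;
    typedef struct { float m[4][4]; } mat4;

    static vec4 transform(const mat4 *m, vec4 p) {
        vec4 out = {{0.0f, 0.0f, 0.0f, 0.0f}};
        for (int row = 0; row < 4; ++row)
            for (int col = 0; col < 4; ++col)
                out.v[row] += m->m[row][col] * p.v[col]; /* 16 mul-adds */
        return out;
    }

    int main(void) {
        /* Identity matrix: output should equal the input point. */
        mat4 id = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1}}};
        vec4 p = {{1.0f, 2.0f, 3.0f, 1.0f}};
        vec4 q = transform(&id, p);
        printf("%f %f %f %f\n", q.v[0], q.v[1], q.v[2], q.v[3]);
        return 0;
    }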

I dunno. I suppose they probably know more about it than I do.
February 27, 2007 4:06:36 PM

Agreed. The 360 is not DX10!

Best,

3Ball
February 27, 2007 4:16:06 PM

:? I know what you mean, but yeah, I'm gonna agree with sandman.

I really know how you feel :x , ATI is taking its time to release the R600, and time is what they don't have right now. But I predict ATI is gonna come out with a card that beats the 8800's performance by around 10-25%, and maybe, MAYBE, they'll come out with a physics technology that lets us use another card for physics.

It's gonna be good for ATI since they'll have the performance lead again (I don't know for how long), but they're losing a lot of buyers to NVIDIA by taking their time.

So we're gonna see how it goes; time will tell :roll:



____________________________________________________________
CPU Type - DualCore Intel Pentium D 820
Motherboard Chipset - Intel Lakeport i945P
System Memory - 1024 MB (DDR2-533 DDR2 SDRAM)
Video Adapter - NVIDIA GeForce 7800 GT (256 MB)
Sound Card - SB Audigy 2 ZS Audio
Monitor - Dell 1905FP 19" LCD
Disk Drive - (160 GB, 7200 RPM, SATA-II)
Optical Drive - PHILIPS DVD+-RW DVD8701 (DVD+R9:8x, DVD+RW:16x/8x, DVD-RW:16x/6x, DVD-ROM:16x, CD:48x/32x/48x DVD+RW/DVD-RW)
Optical Drive - TSSTcorp DVD-ROM TS-H352C (16x/48x DVD-ROM)
Keyboard - Microsoft USB Comfort Curve Keyboard 2000 (IntelliType Pro)
Mouse - Logitech G5 Laser Mouse
_____________________________________________________________
February 27, 2007 4:23:17 PM

Quote:
I'm not so sure. They say it's using a many-core architecture, which makes me think it's x86-based processors. And only 16 cores on a chip!?

They took a very general-purpose architecture and are trying to mould it into a graphics engine. This means they're paying latency, power, and die area for x86 decode. What a waste! And how can 16 cores perform 16x better than an 8800? An 8800 has 128 specialized cores running at several hundred MHz, within a specialized connection fabric. It's a stream processor.

Most graphics computations are 4-vector based, because they're working with transform matrices, point vectors and RGBA pixel values. Trying to do more work than one or two 4-vector operations per cycle in a generalized manner seems pretty complicated, but that's what they would have to do in order to compete. Each individual core would need a lot of FP processing power. I'm guessing it's also going to be running at a pretty high clock speed.

I dunno. I suppose they probably know more about it than I do.


Intel has been working on an 80-core vector processing project that can be applied to problems such as graphics processing. They aren't going to use a general-purpose x86 processor for graphics. Processor architecture is moving more toward a managed system of specialized processing cores. Don't be surprised when your video processing moves onto the motherboard and is directly managed in concert with the CPU.
February 27, 2007 4:29:35 PM

Hasn't this been covered about 50 times already?

You'd better hope ATI/AMD survives, because if Intel gained a monopoly, do you have any clue what Intel's prices would go to?

I guarantee it won't be good!!!!! :roll:
February 27, 2007 4:35:49 PM

What benefit is there for AMD/ATI in rushing the release of a DX10 card? There are no DX10 games, and the high-end video card segment accounts for less than 5% of their sales. Nvidia may own current bragging rights, but that does little to boost their sales. Midrange cards are where the money is. Right now AMD/ATI is doing just fine in that market segment.
February 27, 2007 4:48:43 PM

Perhaps you haven't noticed, but the $hit Intel makes as graphics chips is only DX9 as a technicality. They SUX. I'm sure DX will love having 32MB of shared system RAM, lol. (Sorry for the rampant exaggerations.)
-cm
February 27, 2007 5:08:00 PM

Quote:
What benefit is there for AMD/ATI in rushing the release of a DX10 card? There are no DX10 games, and the high-end video card segment accounts for less than 5% of their sales. Nvidia may own current bragging rights, but that does little to boost their sales. Midrange cards are where the money is. Right now AMD/ATI is doing just fine in that market segment.


I'm in agreement here.

Though there are rumors running around (started in forums) that AMD is to be bought out by IBM. *shrugs*
February 27, 2007 5:12:34 PM

I think the issue isn't that NVidia has high-end kick-ass cards; it's that they have DX10 cards. People often don't want high end, but they do want contemporary.
-cm
February 27, 2007 5:19:15 PM

Actually, the Intel GMA 965 (aka GMA 3000) is quite capable of playing the latest games, including (brace yourself) OBLIVION: http://www.intel.com/support/graphics/intelg965/sb/CS-0...

Yup, Intel claims that in a future driver release this GMA will be more than capable of running Oblivion--it'll likely be 11 FPS outdoors at low settings, though (heh)!

We are over-analyzing this. 90% of casual gamers look at DirectX compliance and video memory for their gaming needs, and nothing more. Intel's new DX10 chipset will sell BECAUSE it has the capability, not the speed. We are all intelligent enough to know the difference in this forum, but the reality is that most consumers are completely ignorant of this.

All Intel has to do is advertise the fire out of the DirectX 10 capability, and the ignorant users who had one bad nvidia card in the past will purchase Intel IGCs when they come out. You'd think this would be minuscule, but I've already had dozens of people ask me when the next DirectX 10 card from someone other than nvidia will come out. NVIDIA had the FX5200 card that Dell advertised as a "gaming card", and therefore most ignorant users assume nvidia are liars.

Again, innovation means nothing in the end--it all comes down to profits, and if AMD delays its graphics cards much longer it will seal its own fate.
February 27, 2007 5:22:56 PM

Have you read about all of the DX10 issues with NVIDIA?

IMHO, it's a very smart thing to hold off on DX10 cards until all of the issues are resolved. I suspect just as many of the issues are in the Vista code as in the drivers.

And yes, stating that Intel has DX10 coming out is so funny.
Sure, it can play HL-2, but only at 320x240 resolution using 16 colors. However, when set as such, the other DX10 features work nicely.
February 27, 2007 5:26:26 PM

Two things:
1: Your link indicates that the game is NOT playable. It's really clear.
2: Chip manufacturers often say a game is "playable". This means nothing--just that the chip can run the game within the game's settings. My friend has a Radeon 9000 that can run Oblivion, so it's "playable": he can run it at 640x320 in Oldblivion with no eye candy at 25 FPS. While technically "playable", in reality it is not at all enjoyable to play. Sorry, mate. The GMA950, though a powerful integrated solution, sucks at games.

Wiki: http://en.wikipedia.org/wiki/GMA950
It says the GMA950 is beaten by a Radeon Xpress 200. Would you play Oblivion on an X200? Hell, it only has 4 pixel pipes. You have to go back about 4 years to find a mainstream card with 4 pipes that was acceptable.
-cm
February 27, 2007 5:41:02 PM

Quote:
Though there are rumors running around (started in forums) that AMD is to be bought out by IBM. *shrugs*


http://www.xbitlabs.com/news/cpu/display/20070215235758...

I would prefer AMD to be taken over by IBM rather than a "private equity group." I trust IBM (as much as I trust any corporation).
February 27, 2007 6:18:19 PM

Quote:
GMA 950 and GMA 3000 are two different products


*DOH* You are correct...I meant the 965 GMA, not the 950 :roll:
February 27, 2007 6:22:41 PM

Quote:
Have you read about all of the DX10 issues with NVIDIA?


Actually, no, I hadn't. I'm busy and lazy at the moment and don't feel like digging around--you wouldn't happen to have any URLs, would you?

Ah, never mind, I'll just have to look it up later tonight...
February 27, 2007 6:58:53 PM

Well, I don't have a DirectX 10 card or Vista running, so I may not know the latest, but here is a tidbit.

http://crave.cnet.com/8301-1_105-9684886-1.html?tag=hea...

As of a few weeks ago, NVIDIA still did not have working DirectX 10 drivers. Hence, while they had the world's only DirectX 10 cards, there were no well-functioning DirectX 10 cards yet.
February 27, 2007 7:39:16 PM

For integrated, the current GMA X3000 isn't as bad as everyone makes out. From what I can read of it, it appears to use some sort of programmable, unified architecture. Probably not as advanced as nVidia's or what ATI's will be, but nonetheless not bad for integrated.

Link.

This is probably more of a plug to make Intel's latest integrated graphics chipset appear more Vista-capable. The average computer user doesn't know what DirectX is, but they know what Vista is.
February 27, 2007 8:09:12 PM

Quote:
intel is not using an x86 processor for graphics.
don't confuse a core with an EU
i am not sure if the 16 cores would be 16x better, but better nonetheless
also i think power requirements will be quite low compared to the nvidia/amd offerings

check this out---- a tflop @ 65-85W
http://news.com.com/Intel+pledges+80+cores+in+five+year...


The current graphics engines from Intel don't use x86 cores, I know. The EUs in the graphics engine are _completely_ different from an x86 core. There is no confusion there.

However, Larrabee up in Hillsboro is a whole different monster. That is what this article is talking about and that is what I am talking about. Only 16 EUs on a graphics chip at 45 or 32nm would be pretty pathetic--you would be able to fit 128-256 of those EUs on a chip at those manufacturing nodes.
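The rough scaling math behind that claim, for the curious: transistor density goes roughly as the inverse square of the process node, so a shrink from 90nm to 32nm buys about (90/32)^2, or roughly 8x, the logic in the same area. A back-of-envelope C sketch (the 16-EU baseline at 90nm is purely hypothetical):

    #include <stdio.h>

    /* EU count scaling with process node: density ~ 1 / (node^2).
       The baseline of 16 EUs at 90nm is a made-up illustration. */
    int main(void) {
        const double base_node = 90.0; /* nm */
        const double base_eus  = 16.0;
        const double nodes[]   = {65.0, 45.0, 32.0};

        for (int i = 0; i < 3; ++i) {
            double factor = (base_node / nodes[i]) * (base_node / nodes[i]);
            /* prints roughly 31 EUs at 65nm, 64 at 45nm, 127 at 32nm */
            printf("%2.0fnm: ~%3.0f EUs in the same area\n",
                   nodes[i], base_eus * factor);
        }
        return 0;
    }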

As far as I know, the Larrabee group is developing a many-core graphics solution that uses x86 cores with a large set of graphics extensions to accelerate graphics and other streaming FP calculations.

Larrabee is a completely different group from the one at Intel that is currently doing graphics.
February 27, 2007 8:15:23 PM

Quote:
Well, I don't have a DirectX 10 card or Vista running, so I may not know the latest, but here is a tidbit.

http://crave.cnet.com/8301-1_105-9684886-1.html?tag=hea...

As of a few weeks ago, NVIDIA still did not have working DirectX 10 drivers. Hence, while they had the world's only DirectX 10 cards, there were no well-functioning DirectX 10 cards yet.


Aha...okay, yes, I've heard these stupid, senseless complaints. For one, there is no reason to have "fully DirectX 10 compliant drivers" when nothing in the universe uses them yet--if the CARD ITSELF SUPPORTS DIRECTX 10, those stupid lawsuits should be shoved back down the whiners' throats.

Sorry, but whining about technology that hasn't even come out yet is stupid, and for some odd reason it is a huge pet peeve of mine (you likely noticed that from my harsh statements...). DirectX 10 is supported in the cards; the drivers just aren't there yet. For God's sake, give it a rest, people (speaking to the nvidiaclassaction.org people).
February 27, 2007 8:25:07 PM

Quote:
why can't one core have more than one EU?


Depends on the architecture. A regular, non-superscalar, in-order core can't really make use of multiple execution units for, say, FP or vector operations.

A superscalar in-order core would be able to make use of multiple execution units, though statically scheduling instructions in the compiler becomes increasingly difficult as you add more units.

A superscalar out-of-order core (like Core 2 Duo) would be much better at scheduling instructions and would make better use of its available execution units (integer, load/store, branch, FP, vector).

Another option is to increase the width of your vector unit from, say, 4 to something like 16. That increases your overall throughput; however, your data or algorithms may not be so conducive to 16-word vector operations.

Either way, x86 decode + superscalar + out-of-order execution costs a lot of power and die area--costs you probably don't want to pay if you want to do as many FP calculations as possible per watt and per dollar.
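For the curious, here's roughly what a 4-wide vector unit looks like from the software side, using the standard x86 SSE intrinsics (the intrinsics are real; the example itself is just an illustrative sketch). One instruction operates on four floats at once, and a 16-wide unit would need data laid out in groups of 16 to stay fed--exactly the 'conducive to 16-word vector operations' problem above:

    #include <stdio.h>
    #include <xmmintrin.h> /* SSE: 4-wide single-precision vector ops */

    int main(void) {
        /* a*b + c on four floats at once: the "one or two 4-vector
           operations per cycle" granularity discussed above. */
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(0.5f, 0.5f, 0.5f, 0.5f);
        __m128 c = _mm_set_ps(1.0f, 1.0f, 1.0f, 1.0f);
        __m128 r = _mm_add_ps(_mm_mul_ps(a, b), c);

        float out[4];
        _mm_storeu_ps(out, r); /* low lane first: 1.5 2.0 2.5 3.0 */
        printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }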
February 27, 2007 9:28:58 PM

Quote:
I don't know if it is AMD's fault or if they just woke up Intel.
Looks like Intel is planning on making the AMD/ATI thing a waste of money.

http://www.ntcompatible.com/Intel_Discrete_GPUs_Roadmap...

Do you think that within this decade, onboard graphics will have 16x the speed and memory of the 8800 GTX? That is about when Fusion appears.
February 27, 2007 9:36:48 PM

What makes you think DX10 is important yet? What DX10 application is grabbing huge market share and compelling people to upgrade?

...what DX10 applications are there at all?

Chill. DX10 doesn't matter. It will continue to not matter for about another year or more. It might not take off at all, since running DX10 games in Vista practically requires an entirely new system if yours is over a year old, and most gamers don't do full system upgrades every year--more like every 3 years.
February 28, 2007 1:58:46 AM

Quote:
i think once the cpu and gfx device share the same cache, this is a possibility


I keep on disagreeing with everything you say :wink:

Actually, the CPU and graphics engine don't need to communicate a whole lot once the data is loaded into graphics memory. After the vertex and geometry lists are loaded into graphics memory, the CPU really just needs to send state updates to the graphics card--things like transformation matrices for the different objects in the scene. These are pretty low-bandwidth and easy things to compute.
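To show just how low-bandwidth those state updates are, a back-of-envelope C sketch; all the scene numbers are made up purely for illustration:

    #include <stdio.h>

    /* Per-frame traffic for matrix state updates vs. re-sending all
       geometry every frame, under assumed (hypothetical) scene sizes. */
    int main(void) {
        const int objects      = 500;     /* objects in the scene        */
        const int matrix_bytes = 16 * 4;  /* one 4x4 float matrix        */
        const int verts        = 1000000; /* vertices kept in video RAM  */
        const int vert_bytes   = 32;      /* position + normal + UV, say */
        const int fps          = 60;

        double state_mb = (double)objects * matrix_bytes * fps / 1e6;
        double geom_mb  = (double)verts * vert_bytes * fps / 1e6;

        printf("state updates:   %.2f MB/s\n", state_mb); /* ~1.9 MB/s  */
        printf("resent geometry: %.2f MB/s\n", geom_mb);  /* ~1920 MB/s */
        return 0;
    }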

However, for other general-purpose FP-intensive tasks that require more communication, you may very well benefit from a shared cache or even just a "closer" connection between the CPU and graphics engine.

As far as I know, GPUs today are limited by their sheer FP processing capabilities. Memory hierarchy matters in GPUs, but it's not at the point where it would help more than additional FP resources would.
February 28, 2007 2:25:21 AM

Quote:
Have you read about all of the DX10 issues with NVIDIA?

IMHO, it's a very smart thing to hold off on DX10 cards until all of the issues are resolved. I suspect just as many of the issues are in the Vista code as in the drivers.

And yes, stating that Intel has DX10 coming out is so funny.
Sure, it can play HL-2, but only at 320x240 resolution using 16 colors. However, when set as such, the other DX10 features work nicely.

The only known DirectX 10 issues with Nvidia are with their drivers; there have been no reports of serious DirectX 10 issues like you're implying.
February 28, 2007 2:43:08 AM

Quote:
Actually, the Intel GMA 965 (aka GMA 3000) is quite capable of playing the latest games, including (brace yourself) OBLIVION: http://www.intel.com/support/graphics/intelg965/sb/CS-0...

Yup, Intel claims that in a future driver release this GMA will be more than capable of running Oblivion--it'll likely be 11 FPS outdoors at low settings, though (heh)!

We are over-analyzing this. 90% of casual gamers look at DirectX compliance and video memory for their gaming needs, and nothing more. Intel's new DX10 chipset will sell BECAUSE it has the capability, not the speed. We are all intelligent enough to know the difference in this forum, but the reality is that most consumers are completely ignorant of this.

All Intel has to do is advertise the fire out of the DirectX 10 capability, and the ignorant users who had one bad nvidia card in the past will purchase Intel IGCs when they come out. You'd think this would be minuscule, but I've already had dozens of people ask me when the next DirectX 10 card from someone other than nvidia will come out. NVIDIA had the FX5200 card that Dell advertised as a "gaming card", and therefore most ignorant users assume nvidia are liars.

Again, innovation means nothing in the end--it all comes down to profits, and if AMD delays its graphics cards much longer it will seal its own fate.


The GMA965 playing Oblivion? LOL, you should rather say slideshowing Oblivion. Heck, I could also have Oblivion running on my Voodoo 2 using emulation and DirectX call wrapping, still getting 0.0002 FPS. And then I could claim that Oblivion is "running" on a Voodoo!
February 28, 2007 3:04:34 AM

Quote:
Perhaps you haven't noticed, but the $hit Intel makes as graphics chips is only DX9 as a technicality. They SUX. I'm sure DX will love having 32MB of shared system RAM, lol. (Sorry for the rampant exaggerations.)
-cm

Yeah, a little exaggeration there.

Exactly where did I say they were any good?
February 28, 2007 3:11:00 AM

Quote:
Hasn't this been covered about 50 times already?

You'd better hope ATI/AMD survives, because if Intel gained a monopoly, do you have any clue what Intel's prices would go to?

I guarantee it won't be good!!!!! :roll:


I fully agree, and my second, older PC is an Intel Northwood!

Okay, The Inquirer isn't perfect, but they point out that everyone's coming out with DX10 integrated graphics in 2007. ATI should have theirs by the end of the year. I'm sure that both ATI's and Nvidia's integrated graphics will blow Intel's out of the water, just like in previous generations.

Intel can't do graphics. They just get by with what handles Word or Excel, and they've had to upgrade to run Vista. They don't waste any money on performance. Where Intel excels is in going back to an older CPU design and updating it to today's performance levels. Sadly, they also excel at skating so close to illegality in OEM agreements that they're being sued for anticompetitive tactics.

Here's The Inquirer's article. Take it with a grain of salt.

http://www.theinquirer.net/default.aspx?article=36451

Also, ATI's X700-based chipsets--the X1250 for C2D and the AM2 R690G--are late by about six months. They're coming out this month to compete against Intel's integrated DX10. I expect that after R600's value segment, we'll see DX10 integrated graphics arrive on time, now that AMD's integration of ATI has gotten over its bumpy start.

And Intel's integrated DX10 graphics will be fine for Excel in Aeroglass, but it won't run Alan Wake at 800x600; if it does so at playable framerates, I'll be surprised. ATI's and Nvidia's will--not that most gamers want integrated graphics, but it works for casual gamers.

If Microsoft ever merged with Intel, we'd all be in trouble. Now, who but stock owners and fanboys would want that?
February 28, 2007 3:14:58 AM

Quote:
you're assuming all graphics functions are hardware-accelerated in the graphics engine. but with the GMA graphics, not everything is done in the GPU. also imagine an nvidia card with no local memory, just using the system RAM as its only memory. wouldn't it be faster to be sharing the processor cache locally?


If there is a _ton_ of communication between the CPU and GPU and the current setup is bottlenecking performance, then it may help. But you also wouldn't want to saturate the cache with requests for textures, vertex lists, geometry lists, and the framebuffer.

The GMA is inside the MCH, so it's just an internal buffer away from main memory. Admittedly further from its memory than, say, an 8800 is from its own, but not too bad.

I still think that sheer FP processing power is much more of a bottleneck than memory hierarchy. Further down the road, memory hierarchy and cache coherency will become more important as the number of execution units scales up.
February 28, 2007 5:55:32 AM

Quote:
you're assuming all graphics functions are hardware-accelerated in the graphics engine. but with the GMA graphics, not everything is done in the GPU. also imagine an nvidia card with no local memory, just using the system RAM as its only memory. wouldn't it be faster to be sharing the processor cache locally?


If there is a _ton_ of communication between the CPU and GPU and the current setup is bottlenecking performance, then it may help. But you also wouldn't want to saturate the cache with requests for textures, vertex lists, geometry lists, and the framebuffer.

The GMA is inside the MCH, so it's just an internal buffer away from main memory. Admittedly further from its memory than, say, an 8800 is from its own, but not too bad.

I still think that sheer FP processing power is much more of a bottleneck than memory hierarchy. Further down the road, memory hierarchy and cache coherency will become more important as the number of execution units scales up.

Again, I was not saying Intel's GMA would, like, beat anyone else's integrated graphics (heh, except SiS's, I suppose, hahaha), just that they may have DirectX 10 capability before AMD does.

BTW, since Intel's GMA 900 series, their integrated graphics chipsets have drastically improved. Again, they're integrated graphics--nobody expects them to run games with maxed-out settings, but according to the preliminary reports from Intel and such, they claim it will run most if not all DirectX 9.0c games at playable frame rates at the lower resolutions. Heck, even my video card will play Oblivion at the lowest resolution, and it's what--over 3 years old now??

Don't underestimate Intel's integrated graphics...on that note, don't overestimate them either :) 
February 28, 2007 6:06:51 AM

People who bought 8800 series cards are stupid as far as I'm concerned anyways. Those cards came out in November; it's been 4 months already, and there are still no good games with DX10 support. And don't even bring up Company of Heroes--a patched DirectX 9c native game doesn't count as a DX10 title.

By the time people that bought their 8800GTX/GTS cards actually have games to play that utilize DX10, I guarantee even Nvidia will have something else on the market at a CHEAPER price that PERFORMS better.

I don't know what Nvidia was thinking with the paper launch of the G80, but in a way, I'm glad they did it, because I had no idea people were going to be stupid enough to bite on it.

The only game out that the current market cards couldn't handle was Oblivion, and Oblivion is poorly designed.

Pretty much all that people with 8-series cards succeeded in doing was building a system that needs a bigger power supply. I really doubt you'll notice the difference between playing Quake 4 on a 7950GT and playing it on an 8800GTX.

The market is fine, AMD/ATI is fine, Nvidia is fine and Intel is fine.

What you're all missing is that Intel's DirectX 10 support has nothing to do with game benchmarks and everything to do with Windows Vista, because some people don't give a shite about games, but they DO want Windows to look good. And what does Aero require? Good DirectX support.

The difference between ATI and Nvidia is just as clear now as it was 7 years ago. Nvidia puts out whatever they can, whenever they can because they play aggressive. They push the pace.

ATI is patient. If the cards they have out now are solid enough to handle today's games, that's all they care about. They don't mind if the R600 is a week late, as long as it's out and on the shelves when mainstream DirectX 10 titles are.

How is Crysis running on all your 8800 cards guys? OH WAIT IT'S NOT OUT YET LOL. If anything, the ones playing this game all wrong are the people that actually bought 8800 cards. Even more so if you didn't buy an EVGA version, because you can't even trade it in when the 8900 comes around.

And do you honestly think you can sell your 8800 on the market for even 75% of what you paid for it, when people could pay 50% for a better running card?

November was 4 months ago; March is tomorrow. March will be the dawn of the 8900 cards, and I can promise you they'll be cooler, cheaper, and at the very LEAST just as fast.

Good luck with those 8800's and arguing on message forums about how superior Nvidia is to ATI, because in all honesty, a message forum is the only place you can actually see the difference.
February 28, 2007 8:19:30 AM

Quote:
People who bought 8800 series cards are stupid as far as I'm concerned anyways. Those cards came out in November; it's been 4 months already, and there are still no good games with DX10 support. And don't even bring up Company of Heroes--a patched DirectX 9c native game doesn't count as a DX10 title.


The 8800 series cards, and the upcoming R600, will run the prior DX generation amazingly fast. The same thing happened with the Radeon 9700 Pro and DX8.1. Though I don't play Company of Heroes, I'd say it counts, because DX10 support is DX10 support, whether added through a patch or present at release. The CoH patch reminds me of the Far Cry shader model 3.0 patch.

Quote:

The only game out that the current market cards couldn't handle was Oblivion, and Oblivion is poorly designed.


Bethsoft's CRPG design was lacking in Oblivion--too much mainstreaming for an old-school Elder Scrolls fan like myself--but the actual game design was good. All Elder Scrolls games have pushed the envelope in their generation, from TES: Arena through Daggerfall and Morrowind. They've never gone the simple frames-per-second-equals-superior-game route of many generic FPS.

Quote:

Pretty much all that people with 8-series cards succeeded in doing was building a system that needs a bigger power supply. I really doubt you'll notice the difference between playing Quake 4 on a 7950GT and playing it on an 8800GTX.


For me, Quake 4 would be just as boring on both. I can't wait for Two Worlds and future generations of immersive CRPGs, even the next Elder Scrolls (though if Fallout 3 is as violent as Todd Howard promises, then I'll skip it).

As for bigger PSUs, I'll have to switch out the one in my barebones because, aside from the fact that it's the Apex 400 watt that came with it, any DX10 card, even the 8900 and R700 series, will need more power than a DX9 card. That goes with the territory and I don't see why switching now vs. switching when a DX10 title arrives makes much difference.

There will always be a card refresh that's a bit better six months down the line. Not everyone keeps their cards for three years. Most gamers get a new card every one or two years. As long as there's an advantage in DX9.0c, then a DX10 card this spring makes sense.

Though I'm an avid CRPG gamer, I don't see the point of SLI or Crossfire and the physics cards are just as problematic. All of that mishegoss showed up because Quake 4 and Unreal Tournament style gamers insisted on more frames per second over good game design as in FPS like Half Life 2 or Far Cry (I play few FPS, but I can at least acknowledge the ones where game design almost approaches that of the best CRPGs).

Me, I'll get an 8600 Ultra, because my chipset seems to lock me into Nvidia cards for my one-month-old build, but I'm not averse to building another PC with the X2600 down the line. Rendering another immersive world with style takes a better card than twitch games like Quake 4 require.

I'm sure Alan Wake will need much more than a 7950GT, just as it will be one of the first games to actually require a dual core. I bet you weren't criticizing those with an Athlon X2 this time last year. Were you criticizing early adopters of the Pentium D who couldn't wait for C2D?

Probably not, perhaps because they help in multitasking and held the promise of making Excel on Aeroglass look a bit better. Well, the mainstream might push the market, but it's gamers who push the technology.

Me, I put up with Microsoft on my desktop at home because consoles only go so far and there's no competition among operating systems for gamers like in the DR-DOS days. There is competition for CPUs and, while the C2D is the best today, the Athlon X2s are a close second and the Smithfield and Presler space heaters come in a distant third, like Intel, SiS, and VIA integrated graphics bring up the rear when compared to ATI and Nvidia.

I'm sure that when Intel tries to reintroduce a dedicated GPU, it will beat the SiS dedicated cards just like they beat SiS integrated graphics. Were it not for the people who buy a PC without regard to graphics, but who then decide they want to try a Civilization or Sim title, we'd see better graphics in more intensive titles today. Intel really is holding graphics back.

If I were just into Excel and Aeroglass, I wouldn't even have bothered with the 7600GS I got as a two month stopgap before the 8600 Ultra. I'm sure any DX9 integrated graphics will work fine in that market. After all, Aeroglass does not need the destructible environments that geometry shaders add to DX10.

That's why I'd love to see AMD's Fusion take the crown from Intel's integrated chipsets for the business and home non-gamer end of the market.

Well, I finally got my rant finalized. DX10 games might get delayed, but not by much. After all, DX10 is Microsoft's way of forcing Vista on the whole lot of us, from enthusiasts to budget gamers.
February 28, 2007 1:43:27 PM

Quote:
AMD is in serious trouble, I hate it. I use NVIDIA and AMD together because to me they were practically made for each other, and I still feel AMD made a horrible mistake with ATI. I hope they prove me wrong, I really do, but right now, it is not looking good at all.

Looks like your prediction was incorrect. AMD's integrated platform is ready to go. Intel just jumped the gun as usual, to stir up the hornets' nest before AMD made their announcement.

http://www.tgdaily.com/2007/02/28/amd_690_chipset/

Guess Intel figured that their platform didn't have a chance of holding up to an ATI integrated graphics setup, so they played their hand early to try to disrupt the AMD release. It's a shame.
February 28, 2007 2:48:18 PM

Well, this is good and bad. I hate that AMD is now making a "to-do" over their graphics being better than NVIDIA's at this point. I mean, duh, I knew this was coming, but like the author of the article said:

"It is the rather strange portion of this announcement, given the fact that AMD's primary target is Intel at this time and Nvidia could be an ally who would be able to make AMD platforms even more compelling."

I will certainly miss the AMD-NVIDIA combo, that's for sure. Hey...this may even lead to a split in the next-gen graphics interface (e.g., PCI Express "x32" may be implemented by NVIDIA and Intel, while AMD/ATI chooses their own thing or just sticks with the more-than-sufficient x16 slot right now). This could get interesting.
February 28, 2007 5:53:05 PM

That article said nothing about DX10 support for either version of the RS690. It only mentioned DX9 and SM2.0 support. The AVIVO function and separate DVI/HDMI outputs on the RS690G could be very attractive for HTPC people, though.
March 1, 2007 1:36:49 AM

Quote:
Well, this is good and bad. I hate that AMD is now making a "to-do" over their graphics being better than NVIDIA's at this point. I mean, duh, I knew this was coming, but like the author of the article said:

"It is the rather strange portion of this announcement, given the fact that AMD's primary target is Intel at this time and Nvidia could be an ally who would be able to make AMD platforms even more compelling."

I will certainly miss the AMD-NVIDIA combo, that's for sure. Hey...this may even lead to a split in the next-gen graphics interface (e.g., PCI Express "x32" may be implemented by NVIDIA and Intel, while AMD/ATI chooses their own thing or just sticks with the more-than-sufficient x16 slot right now). This could get interesting.


Who said that there won't be any more AMD-NVIDIA combos? AMD would lose out on a lot of CPU sales if they gave the shaft to NVIDIA. Just like Intel hasn't said no to ATI support.
March 1, 2007 2:04:25 AM

EETimes

Article about the R600 (and Barcelona and AMD's new chipset, all in one)

"Separately, AMD gave one of the first public demos of the R600, its next-generation graphics controller that uses 320 multiply-accumulate units. The company showed a Barcelona-based system using two 200W R600 graphics cards to hit a terabit/second benchmark.

Release of the R600 has been delayed "a few weeks" so that AMD can roll out a full suite of graphics chips covering multiple market segments for the latest Microsoft DirectX 10 applications programming interface."

"In addition, AMD announced a new desktop chip set, the first from the ATI division since the merger last fall. The AMD 690 sports an ATI Radeon X1250 graphics core and a new video decode block. It is also the former ATI's first chip set to support the HDMI video interface with HDCP copy protection for high definition video.

Ten motherboard makers said they will ship as many as 30 products with the chip. "

and so on and so forth
March 1, 2007 11:58:21 PM

Quote:
That article said nothing about DX10 support for either version of the RS690. It only mentioned DX9 and SM2.0 support. The AVIVO function and separate DVI/HDMI outputs on the RS690G could be very attractive for HTPC people, though.


It's not DX10; that's a mistake people are making on other threads. The X1200/X1250 are based on the X700 core, with 4 pixel shaders, but pawn vertex work off to the CPU. It's an improvement over the old X200 and has been compared to the X300 discrete card. ATI did add Avivo, so it shares some things with the X1xxx series.

The AMD/ATI integrated DX10 chipset will have the R780 core, and will compete head to head with Nvidia's upcoming chipset. Both will most likely beat Intel hands down in integrated graphics quality.

The R690 is six months overdue. It would have been more revolutionary in September 2006, but it's what's out right now. Personally, though I want a third PC with an ATI chipset, I'll wait till the R780G and native DX10 support.

After all, if discrete DX10 cards in the sub-$200 range will have 48 shaders (ATI) and 48 or 36 (Nvidia), then low-end discrete cards should have around 24. That leaves open the possibility that DX10 integrated graphics will have up to 12 shaders and be as powerful as my 7600GS or an X1600 Pro in framerates, while also providing geometry shader support along with pixel shaders. I'm just wondering if they will still pawn vertex work off to the CPU.

I'd love to have DX10 in a dedicated core alongside a dedicated physics core and two generalized K8L or K9 cores. That would be my ideal budget processor. We'll see in a couple of years if that's what Fusion ends up being.
March 2, 2007 12:37:22 AM

Quote:

AMD is in serious trouble, I hate it. I use NVIDIA and AMD together because to me they were practically made for each other, and I still feel AMD made a horrible mistake with ATI. I hope they prove me wrong, I really do, but right now, it is not looking good at all.


It totally sux!

A mere 9 months ago it was:

ATI + Intel or NVIDIA + AMD, and all things were good--then came AMTI (the merger) and NVIDIA telling Intel no SLI--now our options are really limited.
March 2, 2007 5:07:05 AM

Quote:


It totally sux!

A mere 9 months ago it was:

ATI + Intel or NVIDIA + AMD, and all things were good--then came AMTI (the merger) and NVIDIA telling Intel no SLI--now our options are really limited.


My favorite rumour was that AMD almost bought Nvidia, but Nvidia's CEO wanted to be the head of the company, which Ruiz nixed. Then there were rumours that Intel had to buy Nvidia, but they've always felt that their graphics were enough for those who don't play games.

Then, people said that Nvidia should do its own CPU, but if they do anything along that line, it's doubtful it will be for an x86 desktop. Nvidia can keep on doing chipsets for AMD and, now, for Intel. They just need to support AMD cards on their motherboards as a matter of fairness.

There's room for both AMD and Intel in the CPU market and for AMD/ATI and Nvidia in the GPU market. What I'd love to see is an open standard where an Nvidia and an ATI card could work in tandem (i.e., "SLFire"?).
March 2, 2007 5:40:33 AM

Hell...if no one bought cutting edge technology...citizens would still be riding horses in NYC. :wink:
March 2, 2007 6:20:27 AM

Quote:
People who bought 8800 series cards are stupid as far as I'm concerned anyways. Those cards came out in November; it's been 4 months already, and there are still no good games with DX10 support. And don't even bring up Company of Heroes--a patched DirectX 9c native game doesn't count as a DX10 title.


DX10 support is one selling point of the GeForce 8800 GTX. Wouldn't you agree that to an enthusiast, the prospect of having one video card as fast as an X1950 XTX Crossfire setup is appealing too? We're talking about a graphics card twice as fast as anything else on the market. The DX10 capabilities are just future-proofing.

I don't recall any 9700 Pro owners complaining about the performance of the card in the DX9 games that later ensued. Do you?
March 2, 2007 7:10:20 AM

Funny thread, sorry I missed its first day. :lol:

Quote:

Okay, so now Intel is stating that the integrated video they're releasing this May will be fully DirectX 10 compliant...

...[AMD/ATi theories deleted]...


So will this DX10 support be like the GMA900 series' DX9 support, with all the vertex work being done by the CPU? Is it true DX10 when something is done by the CPU essentially emulating VPU features? I don't think anyone really cares if it's out before ATi or S3 or SIS; what will matter is the price/performance at the time they buy, and likely few people will choose that solution because of the 'DX10' so much as because they wanted to buy the cheapest DELL/HP/etc and it just so happened to come with it.

I suppose the support list for the GMA965 was a typo on your part and you meant 'Oldblivion', which was created for the T&L-impaired GMA900 series.

Quote:
Well, this is good and bad. I hate that AMD is now making a "to-do" over their graphics being better than NVIDIA's at this point. I mean, duh, I knew this was coming,


Why?
Aren't you making a 'to-do' of the possibility of intel getting a steaming pile of DX10 poo to market first? It's PR, baby--nothing new, nor surprising really.

Quote:
Hey...this may even lead to a split in the next-gen graphics interface (e.g., PCI Express "x32" may be implemented by NVIDIA and Intel, while AMD/ATI chooses their own thing or just sticks with the more-than-sufficient x16 slot right now).


Why would that have anything to do with anything? PCIe 2.0 is backwards compatible, so there wouldn't be much differentiation based on that other than PR checkbox features, and when it comes to those, intel and AMD have them all over the place, so expect them to have the x32 slot and anything else they can give OEMs to throw at customers. nVidia will just play along with whatever AMD and intel decide to do; it's not like they control the market the way intel does, and AMD would be a distant second, but still following right along with the PCIe standard intel would set for everyone. This isn't like HT, where some curious innovative twist is a wise move. Everyone will stick to the plan for stuff like that, and as soon as anyone makes the transition, everyone else will be right there to follow.

Also, I don't know why you seem to think intel would be joined at the hip to nV; they don't need nV anywhere near as much as nV needs intel. intel actually backed off on its exclusivity by letting SiS make chipsets after dropping ATi, so it's not like intel wants to be any more dependent on or integrated with nV, and likely even less so once they move into more discrete graphics production.

I suspect intel's foray into DX10 will be like their intro into DX9: pretty much unusable in anything more than theoretical applications, and really no threat to anyone. The only thing it may do is bring FartCry and HL2 to the 'intel gamer' usually found on the bargain end of the OEM pre-built or laptop section. Nothing to crow about, or for anyone to cower from.
March 2, 2007 7:33:57 AM

Quote:
Then there were rumours that Intel had to buy Nvidia, but they've always felt that their graphics were enough for those who don't play games.


Also, there were/are big regulatory hurdles for an intel+nV deal, whereas AMD could court both, and intel would've had only the same regulatory concerns as AMD, so it too would've been an option. intel+nV would never have been allowed before the AMD+ATi deal, and really I'm surprised that intel didn't go with ATi, considering their focus on multi-media and their need to go into discrete for mainly that reason (thus buying the ATi graphics and AIW lines, and their industrial multi-media division [the Xilleon, etc.]).

Quote:
Then, people said that Nvidia should do its own CPU, but if they do anything along that line, it's doubtful it will be for an x86 desktop.


Unless they do the smart thing and buy VIA for its IP and licenses, because without x86, even building a desktop CPU would be pointless. If you thought the x86 market was trouble, just try the dedicated server market, where mid-sized players come and go because IBM and intel pummel the crap out of them once they get too big.

Quote:
Nvidia can keep on doing chipsets for AMD and, now, for Intel.


Yeah, but I think their best days for that are coming to an end once AMD takes over its own OEM designs. Once their in-house solutions start producing where they should, you will likely see notebooks and pre-builts ship less and less nV+AMD versus all-AMD solutions, in the same way that most cheap intel systems run intel graphics and chipsets (or even cheaper chipsets like SiS).

Quote:
They just need to support AMD cards on their motherboards as a matter of fairness.


They need to support them in order to sell. Just like AMD and intel supporting the opposition's graphics, it's about selling the most MoBos; crippling your solution to favour your own cards, thus cutting off the consumer from a potentially better card from someone else, creates a lot of ill-will towards a company. When nV was even rumoured to be doing that with the GF3 series, the backlash was pretty vocal, because at the time the FX was the nV candidate and it sucked. Why limit yourself? It's not like intel will reward nV by staying out of the discrete market if nV favours intel.

Quote:
There's room for both AMD and Intel in the CPU market and for AMD/ATI and Nvidia in the GPU market.


I agree with that, but the problem is that there will also be intel, SiS, and S3 in the profitable low-end; this is really going to mess with AMD's and nV's profitability numbers if any of the other players gain traction in that segment.

Quote:
What I'd love to see is an open standard where an Nvidia and an ATI card could work in tandem (i.e., "SLFire"?).


Well, the platform-agnostic Xfire seems like it may become official, although that's been possible for a long time with hacked drivers. I think the closest you'll get to mix-and-match cards is with physics, where it's likely that the VPU will not worry about what the physics GPU is doing or what it is. Well, at least that's how Havok and Microsoft have been selling the idea.
March 2, 2007 4:32:03 PM

Quote:
apparently intel has added hardware support in its x3000 chip for HW T&L,
pixel shader model 3, clear video and vertex shader 3.
read the article
http://www.dailytech.com/article.aspx?newsid=2837

i suspect that if the x3000 and the new amd690 are compared once the code is complete, intel will give amd/ati a run for its money.


Nah, unlikely. I have little doubt it will be 10 times better than their previous integrated graphics, but there's one major problem: most games 'prefer' drivers from either nvidia or ATI (er, AMD I mean...). How many games do you expect will be written specifically to run well on Intel solutions at this point?

Again, I predict Intel will write drivers better suited for that sort of thing, and most games will have to "support" Intel graphics chipsets (so they sell more games), but no game programmers that I can think of will start a new Intel graphics alliance or anything. However, AMD and NVIDIA will continue to get that treatment.

Just my thoughts...
March 2, 2007 8:16:30 PM

Quote:
apparently intel has added hardware support in its x3000 chip for HW T&L,
pixel shader model 3, clear video and vertex shader 3.


Yes, I already know that. However, when it was introduced, the GMA900 series touted SM2.0 support without making it clear to everyone that the vertex portion was handled by the CPU, and my question remains: how much of the intel chip's DX10.x support will be done by the host and not in hardware?

Quote:
i suspect that if the x3000 and the new amd690 are compared once the code is complete, intel will give amd/ati a run for its money.


Unlikely.
March 3, 2007 1:22:09 AM

Quote:
as far as i could find, the HW is doing all of the vertex and pixel shading.
i would assume that since the HW is doing those, it must also do all the HW T&L, as those are the last pieces they needed to add.


I think you're missing what I was saying: not that this next generation doesn't have T&L (BTW, vertex shader support replaced T&L; it's a superset), but that the new chips' 'DX10' support would be equally lacking in full hardware support, and likely a lot of the other features mentioned are CPU-assisted--we will see once it ships. I wonder about the true geometry portion of the chip/shader, and also its ability to change states on its own.