is ATI going to get their drivers in order?

Oh SoS

Distinguished
Nov 14, 2007
45
0
18,530
The 3870 should be stomping the competition: GDDR4, higher GPU frequencies... what the hell?

I'm leaning towards a 3870 so I can go CrossFire in 6 months to a year when the prices drop... but I'm scared they're never going to get their act together and really be competitive with the 8800s.

Which will happen first: SLI capability on X38, or ATI releasing a good set of drivers?
 

Slobogob

Distinguished
Aug 10, 2006
1,431
0
19,280
It's as good as it gets. The clock speeds are misleading; ATI uses a different architecture than Nvidia.
AMD GPUs are clocked higher than their competition's, but they run their shaders at the GPU clock, while Nvidia runs them a whole lot faster.
The 8800 GT runs its shaders at 1500-1620 MHz, while AMD's 3870 runs at only 775 MHz, which is significantly less. Thanks to their shader architecture, AMD cards are inherently more parallel though.
GDDR4 doesn't have to be an advantage. The biggest improvement from GDDR3 to GDDR4 was bandwidth, not latency. It should be better when it comes to swapping large textures around, but otherwise the advantages are rather slim.
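To put those clocks in perspective, here is a rough back-of-envelope throughput comparison in Python. The unit counts and clocks are the commonly quoted reference specs (112 shaders at 1500 MHz for the 8800 GT, 320 at 775 MHz for the HD 3870) and the 2-FLOPs-per-clock multiply-add assumption is a simplification, so treat the numbers as ballpark only:

```python
# Peak shader throughput ~ shader units * shader clock * FLOPs per clock.
# Assumes 2 FLOPs/clock (one multiply-add) per shader unit, which
# ignores the 8800's extra MUL issue and the 3870's VLIW packing.
def peak_gflops(shader_units, clock_mhz, flops_per_clock=2):
    return shader_units * clock_mhz * flops_per_clock / 1000.0

gt_8800 = peak_gflops(112, 1500)   # ~336 GFLOPS
hd_3870 = peak_gflops(320, 775)    # ~496 GFLOPS
print(f"8800 GT: {gt_8800:.0f} GFLOPS, HD 3870: {hd_3870:.0f} GFLOPS")
```

So despite the much lower shader clock, the 3870's wider array lands in the same ballpark on paper; the raw GPU frequency alone tells you very little.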
 

Mathos

Distinguished
Jun 17, 2007
584
0
18,980
Not to mention, a lot of games are optimized to run on Nvidia cards nowadays, not the other way around. The games I've seen that don't carry Nvidia's "The Way It's Meant to Be Played" logo seem to have the 3870 and 8800 GT at pretty much the same level.
 

randomizer

Champion
Moderator

GDDR4 is also more expensive IIRC; if they had used GDDR3 they could have sold them a bit cheaper and made them more competitive. In Australia the only things the HD 3870 has going for it are dual-slot cooling and lower power consumption, and there are some Galaxy 8800 GTs with dual-slot cooling too. It's more expensive than the 8800 GT by a few dollars, so it's not a very good buy.
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
I have been wondering when ATI cards are going to be able to run AA without a huge drop in frame rates; guess this is it. When the 8800 GTX first came out the drivers were not so good either, but with time they got much better. If ATI can't get the next two drivers running with good AA, then it's too late for this series of cards.
 

blotch

Distinguished
Dec 9, 2007
93
0
18,630
ATI's and Nvidia's cards have different architectures, like someone said, but they didn't explain what the difference is. Nvidia's cards have a great design for running DX9 games because those require tons of texture units. Thanks to those units they can pump out the frame rates in DX9 and will be fine with AA, since AA performance in DX9 is tied to the number of texture units. ATI's cards, in comparison, have tons and tons of shader units but relatively few texture units. The ATI cards are designed specifically for DX10, which does everything with shaders, including AA. I don't know what ATI was thinking by releasing a card this far ahead of the software, but it wasn't a good move.

And to all of you thinking "but Crysis and a couple of others have DX10 and the ATI cards don't perform well in those": it's pretty simple. ANY game that is not Vista-only is not a DX10 game; it will be a DX9 game with some "DX10-like" effects. To take advantage of the massive number of shader units in the ATI cards, the game has to be written from the ground up for DX10. The ATI cards should be far superior in true DX10 games, but seeing as none exist, they are currently losing to cards designed for today's games rather than tomorrow's.
 

dev1se

Distinguished
Oct 8, 2007
483
0
18,780


Both companies really need to look into this AA debacle, since no current games seem capable of running with it activated unless you enjoy a 30 FPS drop.
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780


I remember quite some time ago (couldn't find it) there was an article on whether AA should be done in hardware versus software, but some games (Oblivion) without AA looked as good as or better than some with AA. I played Oblivion for 500 hours on a 7800 GTX without AA and I never noticed any jagged edges or uneven lines.

Now Ghost Recon Advanced Warfighter doesn't even support AA; it has something Ubisoft calls "Edge Smooth," which makes the game blurry or out of focus. It seems Oblivion had AA built into the game engine, which is why it was so hard to run on most systems, including mine. I think this AA debacle needs to be addressed by the card makers and software companies, because a game that doesn't support AA can't use it even if the hardware can.
 
That's where we come to the unification of DX10.1, which was supposed to be DX10. 4xAA will be inherent in those games, as that will be the standard. So no freebies anymore for non-compliant cards. They'll still run DX9 etc., but will be more like the ATI 2xxx and 3xxx series. Really, except for these mid/schizo DX9/10 games out now, the DX9 games run fine with the newer cards, even the 2xxx and 3xxx series. So that leaves us with only better performance to come. And yes, it will come, as the GPU makers adapt their hardware to the new DX models.
 

pauldh

Illustrious

Seriously? Oblivion's jaggies jump out at me so much they drive me nuts. Every person you talked to had jaggies on their face, just for starters. I quickly upgraded an almost brand new 7800 GT to an X1800 XT that could run FSAA + HDR at once, at higher details, and still maintained better fps outdoors than the 7800 GT. I felt the game was so good I'd put in plenty of time to justify a new card rather than put up with no FSAA and lousy foliage framerates. Shoot, I upgraded that card for an X1950 XT when I got a new monitor and ran 16x10 instead of 12x10, and still needed FSAA IMO. Screenshots show the jaggies, but they are even more noticeable as you move around and watch them climb and crawl. I bet the jaggies would be very obvious to me at 19x12 even in that game.
 

Slobogob

Distinguished
Aug 10, 2006
1,431
0
19,280


I suspect that we will see more GDDR4 once graphics cards move beyond 512 MiB, since it is more power efficient than GDDR3. Architecturally, Nvidia will steer clear of GDDR4, as lower latencies are more important to them thanks to their highly clocked shaders. For a highly parallel GPU, like AMD's R6xx series, GDDR4 is a gift. With a nice, wide bus, GDDR4 should really work great for AMD as long as they can keep the GPU speed up.
I suspect that Nvidia won't jump on GDDR4 like AMD did, and that will keep it more expensive and ultimately not competitive price/performance-wise until GDDR5 arrives.
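For reference, peak memory bandwidth is just bus width times effective data rate, which makes the GDDR4 gain easy to quantify. A quick Python sketch using the commonly quoted reference-card numbers (256-bit GDDR4 at 2.25 GT/s on the HD 3870, 256-bit GDDR3 at 1.8 GT/s on the 8800 GT; treat both as approximate):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8 bits-per-byte)
# * effective data rate in GT/s. Ignores real-world efficiency losses.
def bandwidth_gbs(bus_bits, data_rate_gts):
    return bus_bits / 8 * data_rate_gts

hd_3870 = bandwidth_gbs(256, 2.25)  # 72.0 GB/s (GDDR4)
gt_8800 = bandwidth_gbs(256, 1.8)   # 57.6 GB/s (GDDR3)
print(f"HD 3870: {hd_3870} GB/s, 8800 GT: {gt_8800} GB/s")
```

On paper that's roughly a 25% bandwidth edge for the GDDR4 card at the same bus width, which is exactly the "swapping large textures" advantage mentioned earlier in the thread.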
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
Opticians anyone?


strangestranger said: "oh and can people like the OP please not make threads like this. If you are going to ask a question, ask one; do not make statement-like posts. Christ, when are people going to stop playing the 'ATI can't make good drivers' card? I have never known an ATI driver that has caused system instability or could be the isolated cause of in-game crashes, so what is with the ATI bashing?"

Don't even begin to tell me what you think I should be doing; the last time I checked, I don't live in a communist country where I have lost my freedom of speech. Show me where I bashed ATI, if you can. All I stated is that playing a game without AA didn't bother me; get a grip on yourself. FACT: Nvidia cards don't suffer as much of a frame drop when using AA compared to ATI's cards; if you can't handle the truth, then it seems like a personal problem to me. I NEVER said anything about ATI drivers causing crashes or instability, did I?

@strangestranger You seem to have missed everyone's intent; we're talking about standards for AA. For example:
"I remember quite some time ago (couldn't find it) there was an article on whether AA should be done in hardware versus software"


"That's where we come to the unification of DX10.1, which was supposed to be DX10. 4xAA will be inherent in those games, as that will be the standard. So no freebies anymore for non-compliant cards. They'll still run DX9 etc., but will be more like the ATI 2xxx and 3xxx series. Really, except for these mid/schizo DX9/10 games out now, the DX9 games run fine with the newer cards, even the 2xxx and 3xxx series. So that leaves us with only better performance to come. And yes, it will come, as the GPU makers adapt their hardware to the new DX models."

If you have a problem with where this thread has gone, then beat it! I can't stand whiners!!
 

Mathos

Distinguished
Jun 17, 2007
584
0
18,980
Nvidia's 8000 and 9000 series cards have fewer stream processor units and more texture units. ATI's 2000 and 3000 series cards have a ton of stream processors, but not as many texture units.

The reason Nvidia's performance doesn't drop much with AA is that the 8000/9000 cards simply have far more texture units and ROPs to absorb the extra work, whereas the ATI cards have relatively few texture units, all running at the core GPU clock.
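A quick ratio check makes the texturing gap concrete. Using the commonly quoted unit counts (56 texture units at a 600 MHz core on the 8800 GT versus 16 at 775 MHz on the HD 3870; treat as approximate reference specs), peak bilinear fillrate is just units times clock:

```python
# Peak bilinear texture fillrate in GTexels/s = texture units * core
# clock (MHz) / 1000. A paper number; real throughput depends on
# filtering mode and cache behaviour.
def fillrate_gtexels(tex_units, core_mhz):
    return tex_units * core_mhz / 1000.0

gt_8800 = fillrate_gtexels(56, 600)   # ~33.6 GTexels/s
hd_3870 = fillrate_gtexels(16, 775)   # ~12.4 GTexels/s
print(f"8800 GT: {gt_8800} GTexels/s, HD 3870: {hd_3870} GTexels/s")
```

The 3870's higher core clock can't come close to offsetting a 56-vs-16 unit count, which lines up with the AA penalty being so much worse on the ATI side in DX9 titles.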
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
strangestranger said, "oh and can people like the OP please not make threads like this. If you are going to ask a question, ask one; do not make statement-like posts."


I think it is extremely humorous that you think we all live in a communist country where people don't have freedom of speech. Your request is denied, because I live in the USA, and the last time I checked I still live here. :lol: :lol:


 

randomizer

Champion
Moderator

Also known as deferred shading, a real PITA in Stalker too :(.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780
I've always called it deferred rendering, but Wikipedia calls it deferred shading.

As someone mentioned before, DX10.1 is what DX10 should have been. Microsoft had to neuter DX10 because Nvidia couldn't quite make it work. As a result we have the so-called "Edge Smooth" effect in place of real AA. Cool, huh :p

In reality none of that really matters though since Vista hasn't exactly been the darling of the gaming community.
 

pauldh

Illustrious
systemlord, lol, take it easy. I think you read his comment wrong. I believe the only part of stranger's comment aimed at you was the first two words; the rest of it was a comment toward the OP. :)
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780


I guess it's that some of the posters (there were a few) were assuming my pictures of GPU-Z were fake; one poster stated, "you had it coming for posting fake pictures". If someone really wants to piss me off, accusing me of something I didn't do is the way to do it. Then I have many posters PM'ing me telling me what a dumba** I am, and if that doesn't stop soon...

My thread came under fire when the other post was made, "In response to Systemlord's Bullshyt". The PMs won't stop now, even though I have pleaded with them nicely to quit PM'ing me with bad language and insults. You thought the thread ended badly? Just have a look at the PMs.