I am kinda surprised by Nvidia not offering similar help. Like when D3 came out, ATI guys like Humus came out with tweaks and help to raise performance. Where is Nvidia doing the same now? Does anybody know if those guys even care about the community?
:tongue: <A HREF="http://www.geocities.com/priyajeet/fing.jpg" target="_new"><i><font color=red>Very funny, Scotty.</font color=red><font color=blue> Now beam down my clothes.</font color=blue></i></A> :tongue: <P ID="edit"><FONT SIZE=-1><EM>Edited by priyajeet on 11/17/04 10:56 AM.</EM></FONT></P>
Unlike some others, when we are done we will give you a complete review, and shortly thereafter benchmarks that we are confident you won't get anywhere else. We aren't going to hurry out the door with a benchmark handed to us by a card manufacturer.
Right from the front page. Perhaps Nvidia knows the damage isn't so bad, since CS: Source ran quite well; such a change in performance for both parties seems quite unusual at best.
<font color=red>Post created with being a dickhead in mind.</font color=red>
<font color=white>For all emotional and slanderous statements contact THG for all law suits.</font color=white>
such a change in performance for both parties seems quite unusual at best.
I don't remember you making those kinds of noises when D3 launched and suddenly the FX series was ahead of the R3xx, despite Carmack initially saying the inverse, and despite them dropping the NV30 path. Interesting that the optimizations were in the drivers as shader replacements, and that later ATI did the same and improved greatly across their lines. But no one was skeptical of the sudden effectiveness of the FXs back then. Interesting that the tides have changed; perhaps people are learning about launches, or else they flow (dance) with the current that brought them. I'm sure nV will improve their performance with driver updates, but for now, on launch date, ATI is the one to buy, just like on launch day of D3 nV was the card to buy. However, people had better only be playing these single games if that's all they're basing their purchases on, because like I said back then, neither of these games is representative of the abilities of either line over a wider variety of games.
Now of course people can start arguing about which will be the more adopted engine for future games, and extend that aspect of the debate.
- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK
The Ultra is at about 125 FPS at 1024x768 with 4xAA and everything maxed, while the XT is at 150 FPS at the same settings. But that difference is negligible anyhow, because 60 FPS is perfect! So a comparison is kinda pointless. Plus, ATI worked closer with Valve during the development of HL2 because of the FX series; now Nvidia has a card that is just as fast. Give both time and I'm pretty sure they will come pretty close to each other in performance, just like ATI improved in D3 after it was released — now they are neck and neck. The GF6 is proving to be just as good as the X800s. Either way you can play the game maxed out at perfect framerates. I have a GT in one computer and a 9800 Pro in the other, and both run the game really well.
All possible codecs you would want!
All are clean - no viruses, spyware, or adware.
Also includes QuickTime and Real Player alternatives - plugins to make them run under Media Player Classic (included) - very good.
:tongue: <A HREF="http://www.geocities.com/priyajeet/fing.jpg" target="_new"><i><font color=red>Very funny, Scotty.</font color=red><font color=blue> Now beam down my clothes.</font color=blue></i></A> :tongue: <P ID="edit"><FONT SIZE=-1><EM>Edited by priyajeet on 11/17/04 04:42 PM.</EM></FONT></P>
Is that so? I could swear I was being more conservative. Also, Carmack said it himself: the FX was slightly ahead of the 9xxx series.
Actually, that's not the case. Carmack indicated that the 9700s were more powerful than the FX cards in the standard path, but that since Doom3 didn't need the precision, the FX cards were faster running their proprietary path.
Since he later dropped the proprietary FX path when the 6X00 series was released, it's pretty obvious ID wasn't especially happy with the compromises they made to the FX path just for benchmarking's sake.