- NV Or Not NV? G70 Is The Question
- NVIDIA Gets Shady With CineFX 4.0
- NVIDIA Gets Shady With CineFX 4.0, Continued
- The Three H's
- More Hardware
- ForceWare 75 Driver
- Super And Multisampling
- Power Consumption - Boot And Idle
- Power Consumption - Games
- Test Setup
- 3DMark 2005
- Far Cry (32 Bit)
- Half-Life 2
- Halo: Combat Evolved
- Splinter Cell
- UT 2003 - Flyby
- UT 2003 - Inferno
- UT 2004
- Final Thoughts
- Looking Forward Or Back?
Looking Forward Or Back?
While we are not convinced that the NV47 truly went away, the G70 is an impressive chip. To outperform your competition while consuming less power is quite an achievement, and the total opposite of what we saw in the mobile space with the GeForce 6800 Ultra and the Mobility X800 XT.
The questions that remain concern the impact of parallelism on discrete graphics. Both AMD and Intel are merging cores onto a single die to raise performance and keep heat down. SLI and CrossFire are the graphics industry's answers in the same drive toward parallelism, but at what cost? Is the solution two graphics cards, or one large card with two GPUs? Are we destined to go back to prices of $1,000 or more, as with the old Obsidian single-card SLI boards?
If adding more shaders is the path toward parallelism, why not simply produce a chip with 32 pixel shaders and 16 vertex shaders, instead of building dual-card solutions with 16 and 8 each, which will probably cost more anyway? Are consumers so enthralled with graphics power that they will keep buying cards regardless of cost? I find it hard to believe that there is no limit to what consumers will pay.
It seems that graphics makers are moving in the same direction as AMD and Intel. Clearly, more instructions per clock was the correct answer: why do four when you can do six with less effort and power? Given the recent moves in the CPU market, I am sure it is only a matter of time before graphics follows in step; in fact, with multi-GPU setups, it already has.
It is my opinion that this is history repeating itself once again. Multiple graphics chips, each with dedicated RAM, mounted on one PCB and linked by some interface: all I can think of is 3dfx. I cannot say the approach is wrong, either, because I still have my Voodoo2 cards... somewhere.
BTW: Anyone looking to sell their Voodoo5 6000?