skittle :
How about you stop whining like a 6 year old girl, learn to use paragraphs, and come back when you can appreciate the value of a dollar.
Harsh, but made me laugh.
makotech222 :
I agree with the topic starter. Nvidia and ATI should be improving their hardware instead of just adding more cards and costing the customer more, rather than bringing innovations to the cards and staying in the sub-$400 range.
...Except that nVidia hasn't been in the sub-$400US range for their top-end cards since roughly the days of the GeForce4 Ti. One must accept that prices at the top will be dictated by what those with bottomless pockets will pay, but things are also governed by the law of diminishing returns: you don't need to buy the most expensive hardware to play; you could likely spend half as much and still get more than 75% of the performance.
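The diminishing-returns point can be sketched numerically; the prices and performance figures below are purely illustrative assumptions, not benchmarks of any real cards:

```python
# Illustrative sketch of diminishing returns in GPU pricing.
# All numbers are made up to show the shape of the curve,
# not actual prices or benchmark results.
cards = {
    "flagship": {"price": 600.0, "relative_perf": 100.0},
    "midrange": {"price": 300.0, "relative_perf": 80.0},
}

def perf_per_dollar(card):
    """Relative performance delivered per dollar spent."""
    return card["relative_perf"] / card["price"]

# The half-price card keeps most of the performance...
perf_ratio = cards["midrange"]["relative_perf"] / cards["flagship"]["relative_perf"]
# ...and delivers considerably more performance per dollar.
value_ratio = perf_per_dollar(cards["midrange"]) / perf_per_dollar(cards["flagship"])
```

Under these assumed numbers, spending half as much keeps 80% of the performance while delivering 1.6x the performance per dollar.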
dariushro :
Crysis is not optimized, really... blame Crytek and EA a little too, maybe?
"Not optimized?" With most of the games I see this claim made about, it's really just someone saying, "My hardware isn't good enough for fully maxed settings at a good framerate. Coincidentally, I have hardware that's at least a little outdated."
aevm :
Guys, keep in mind that the OP is in Europe and things are more expensive there.
I'd note that this isn't really the fault of the hardware makers so much as of the European governments, with their tariffs as well as VAT, which varies from country to country but is rather high. (In the USA, by contrast, sales tax generally doesn't apply to inter-state commerce, which tends to be the case when you shop online.)
metrazol :
High frames per second.
High resolutions.
Cheap.
Pick any two.
I actually don't strictly need any of them, save perhaps the first. I don't know why people insist on going to resolutions like 1680x1050; I've played at 1024x768 since the days of
Jedi Knight, and have been fine with it. Likewise, I'm willing to splash out a few hundred bucks when I actually buy my video cards, to make sure they'll be solid and last me a while. (My X800 XT is still going fairly strong, though
Crysis is clearly going to be its undoing.)
mactronix :
I think the OP is just getting frustrated at the way Nvidia keeps rehashing the same tech. They have obviously decided to milk the existing market with the existing GPUs, and why not? This is the price we pay for a lack of competition. While it's true that you could lay the blame at anybody's door for the fact that we can't play newer games at the res/fps we could with the last generation of DX9 games, you have to remember that if the software didn't challenge the hardware, we would stagnate that way too.
Personally, I feel that MS pushed Vista out about a year early; they knew the graphics companies were having problems with it even before launch, and should have given them more time to develop the hardware.
Actually, I've come to the conclusion that nVidia isn't trying to do that at all; they ARE producing new tech, after all... and it is working to lower prices. The GeForce 8800 GT comes to mind sharply: we suddenly get 8800 GTX-equivalent performance for what, around two-thirds the cost? It's been only a bit over a year (since late 2006) since the first GeForce 8800 came out, yet we've already seen the price slashed nearly in half. Likewise, we have attractive options below it in the form of AMD's Radeon 3850 and 3870 cards.
spotless :
Hmm, did you know that for electronic goods (cards, CPUs, etc.) the production cost is just a fraction of the selling price? They might call it "the manufacturing cost," but the truth is they only send out the reference design and let an already-established factory manufacture it.
Discrete graphics card production can't really be equated to stamped-sheet steel manufacturing. Obviously, the production cost of almost ANY product is less than the price it sells for; that's how companies make a profit and stay in business. And likewise, you could phrase that lower cost as "a fraction."
However, advanced semiconductor logic fabrication isn't a cheap process in the least; you're talking about a form of production that, unlike stamping sheet metal, has practically zero tolerance for error. And even when a chip appears physically perfect, many wind up as rejects for being unable to run at full spec and remain stable.
yipsl :
My opinion is that GPUs will become multicore like CPUs. The prices for the cards will remain the same at each price range because that's how the market works. Enthusiasts spend too much money to get the best graphics in their favorite games, while budget gamers go for affordable cards and medium settings. Occasional gamers go for low-cost cards and low settings.
I'm in the budget gamer category. Don't sweat DX10. Games will get optimized, drivers will get optimized, Vista will get fixed by SP2, and dual-core cards at all price ranges will arrive in the next couple of years. Alongside quad-core in the mainstream, that will make gaming DX10.1 titles quite enjoyable.
Heya, yipsl; long time no see.
At any rate, you're right: prices aren't going to change radically at any given level, and this also has to do with the fact that production costs are liable to remain about the same for each tier of video card. A high-end card will typically use a rather large GPU die and 8 or 16 chips of GDDR3 or GDDR4, and that will stay roughly the same cost from generation to generation; advances in process technology get parlayed into packing more transistors onto the same amount of silicon, which tends to lend itself to about the same production cost.
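To make that die-cost reasoning concrete, here's a rough sketch using the standard dies-per-wafer approximation; the wafer cost, die size, and yield numbers are assumptions for illustration, not real fab figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic approximation: wafer area over die area, minus an
    edge-loss term for the partial dies around the rim."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_rate):
    """Spread the wafer cost across only the dies that actually work."""
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate
    return wafer_cost / good_dies

# If a newer process packs more transistors into the SAME die area, the
# per-die cost stays put under this model (assumed $5000 wafer, 300 mm
# diameter, 400 mm^2 die, 60% yield):
cost = cost_per_good_die(5000, 300, 400, 0.6)
```

The takeaway matches the point above: as long as the silicon area per card stays constant, so does the cost, regardless of how many transistors fit on it.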
And I agree that DX10 is overrated. Given that
Crysis doesn't really seem to improve from it, I wouldn't worry about it. A simple edit of the game's config files will let you use the highest detail settings in DX9 anyway, without having to deal with quirky drivers.