GeForce 3: too expensive?

I have seen a ton of posts here and on other boards complaining about how much the GeForce 3 will cost, with most of them leveling criticism at NVIDIA for being greedy and/or Intel-like. Consider the following points:

*The only hard price I have seen on a GF 3 board is the 3D Prophet III, $529 on Hercules' website. I would assume the street price will be below $500.

*The GF 3 uses about 57 million transistors, making it more complicated than any desktop <i>processor</i> in existence, and MUCH more complicated than any other GPU (graphics processing unit).

*The price of a GF 3 based card includes several things besides the GPU, none of which NVIDIA makes a profit on.

*The GF 3 is the most advanced card on the market.

*The Pentium 4 (42 million transistors) has a street price of $480 in its most advanced version (1.5 GHz). This is for only the CPU, and obviously doesn't include motherboard or RAM.

*The video card is currently the most important link in producing 3D graphics, and it becomes more important (i.e., the rest of the computer has to do less work) with each new generation of card. This is a change that has been hard to get used to--video cards didn't use to be this important--but it is very true today.

*The RAM on graphics cards is very expensive. Again, NVIDIA has nothing to do with the RAM.

I would be quite interested to know how much the GF 3 GPU costs the card manufacturers, how much the RAM costs, and how much profit the card manufacturers make. If anybody knows any of this, please enlighten me. :)

--> Now considering all this, why is the GeForce 3 too expensive? Is NVIDIA really like Intel in their pricing schemes?

Post agreements, disagreements, flames, rants, etc. below. :wink:


*A note on transistors: the transistor count has a direct bearing on how technologically advanced the chip is, which would influence the cost of designing it, and on the die size, which greatly influences the cost of manufacturing it.
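To make that die-size point concrete, here is a back-of-the-envelope sketch in Python of how die area drives per-chip cost (gross dies per wafer, minus a simple Poisson yield model). Every number in it--wafer cost, defect density, die areas--is a hypothetical assumption for illustration, not an actual figure from NVIDIA or anyone else:

```python
import math

# Back-of-the-envelope die-cost model.  Every number below is a made-up
# illustration, NOT an actual NVIDIA, Intel, or fab figure.

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-die estimate: wafer area / die area, minus edge loss."""
    r = wafer_diameter_mm / 2.0
    gross = math.pi * r * r / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(wafer_cost: float, wafer_diameter_mm: float,
                      die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Poisson yield model: a bigger die catches more defects, so yield drops fast."""
    yield_frac = math.exp(-defect_density_per_mm2 * die_area_mm2)
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_frac
    return wafer_cost / good_dies

# A big high-transistor-count die vs. a die half its size, on the same
# (hypothetical) $3000, 200 mm wafer with 0.005 defects per mm^2:
big = cost_per_good_die(3000, 200, 180, 0.005)
small = cost_per_good_die(3000, 200, 90, 0.005)
print(big, small)  # the big die costs well over twice as much per good chip
```

The point of the sketch: doubling the die area both halves the dies per wafer and cuts yield, so manufacturing cost grows much faster than linearly with transistor count.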
  1. My question is whether it is really worth buying now, since there isn't much software out there that uses it. When titles arrive later that can candy our eyes with the GF3, the price will probably have dropped to an affordable level. Well, I personally won't bite for the time being!! :)
  2. people are dumb

    I just don't see why someone would need to buy something more than a GeForce 1 for christ's sake. I'm still with a Voodoo3 and will probably keep it for another few weeks...

    Anything over 150$ is just plain ridiculous. Especially when Nvidia pushes you to buy a brand new card every freakin' 3 months!

    I *may* go with a Radeon 32MB DDR soon. You know why? Because it's dirt cheap, and is MORE THAN ENOUGH for any game that I'll ever play in the next 2 years!!!

    My Voodoo3 lasted 2 years, so will this Radeon.

    Thanks to idiots that buy 300$+ video cards, I can get excellent cards for around 100$. :)


    Athlon rocks!skcor nolhtA
  3. How long until it becomes affordable is the question. The most this thing will drop to is maybe $399 by the end of the year. That's only about a $100 difference.

    Whether to wait or not is your decision, but I think it is wrong to bash only nVidia over prices. Every single company will debut an item at a high price, whether it is Microsoft, Intel, or nVidia. When they release a great new product, the price is extremely high. Why is that? Well, some of it is to pay for the development it took. nVidia put a lot of time and energy into this card.

    Despite the price, this is going to be a sweet card! If games were coming out to support the new features, would the price be worth it then? Some rumors say some game developers plan on putting out patches so current games can take advantage of the new capabilities. Will ATI or similar competitors release something better? I can't say for sure, but nVidia always seems to be at least one step ahead of the competition.

    Some people can afford this; some people save up money ahead of time so they can buy this type of thing when it comes out. And some people don't just play games on their computers. Some are programmers; some are doing advanced graphic design. And some just like to know they have the best card on the market.
  4. I still have my GeForce 256 DDR running on an Athlon A 650. I don't plan to upgrade either anytime soon. I've had the processor for over a year, I think, and the graphics card for even longer.

    But if you have the funds, I think there is nothing wrong with going for the card. I actually lied above and might get the GF3 when Doom 3 is released, and also if I decide to do some DX8 development on the GF3 platform (you know, to increase my future job prospects).

    <i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
  5. I'm glad to see someone who actually kept his original GeForce... although that card is still relatively new... maybe a year old, max...

    I may buy a GeForce3 in about a year... until then, it is simply stupidly too expensive... I prefer investing that kind of money (500$US) into a new computer instead... lol


    Athlon rocks!skcor nolhtA
  6. So what if it has 57 million transistors? IBM has a server chip with around 200 million transistors, in .15 micron SOI.

    !!! Leader of the Anti-via army !!!
  7. I bet it won't be able to beat the GeForce 3 chip at real-time rendering.

    <i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
  8. the GF3 is bullshit!

    Where are the bandwidth-saving techniques??? There are NONE!!!

    TBR (tile-based rendering) is the way to go!!

    It makes me laugh when a 150$ card (KyroII) beats a 400$ GF2Ultra! LOL!

    The KyroII looks better every day. And Nvidia NEVER looked good to my eyes (and my wallet)...


    Athlon rocks!skcor nolhtA
  9. !!! Leader of the Anti-via army !!!
  10. Let's see. Guess what the GeForce3 doesn't have:

    Hyper-Z

    That's all I will say.

    !!! Leader of the Anti-via army !!!
  11. But doesn't it have the new Hyper-VIA, that's much faster and looks cooler than Hyper-Z?
  12. Actually, all Pentium 4s are packaged with two 64 MB Rambus RIMMs.
  13. When new technology comes out, the demand is usually quite high, so they need to set the price where it balances with supply and demand. If they sold the GF3 at $300, everyone and their uncle would be running to the stores to pick it up. So there is more to it than just making a profit.

    Intel on the other hand I disagree with. Their prices are just so damn high that computer makers have to put in crappy components to make the system affordable. For example, I was in my local computer shop the other day browsing through some of the systems, and one that was marked "Great Value" and a "High End System" had a P4 1.5 GHz, but with an nVidia TNT card and a hard drive I'd never heard of before. It was $1,999.00. The next P4 up was $2,599.00 and had a GeForce 2 Ultra, but with a low amount of RAM and a small hard drive. You need to spend a good 3 grand or more for a GOOD high-end system that has the P4. That is one big reason AMD is such a strong competitor. At least with AMD, the computer manufacturers do not have to try and trick the public to make the system affordable. This, of course, is one reason I prefer to build my own computers, but some people are not able to do that.
  14. Quite a variety of responses! I'll go in order:

    Bighead111 —— I agree with you. Here is my upgrade philosophy: If your current card is a TNT2 or slower, upgrade now! And if you can afford an Ultra, then skip it and go for the GF3. If your current card is a GeForce 2 MX class, then an upgrade is debatable but not necessary. If you do upgrade, try to get a GF3 as it will last you the longest time. If your current card is a GeForce 2 GTS/Radeon 64MB, then don't bother with any upgrade for now, unless you just love to spend money.

    Shnak —— You may not need a GF3, but that doesn't mean none of the rest of us "need" one. You also may have slightly misunderstood my question here. I basically want to know if you think NVIDIA charges an excessive amount for their chip, not whether the money is worth it for you personally.

    Shnak —— There is a lot more "computer" in a GF3 than in any $500 desktop. Sometimes your computer does need upgrading worse than your video card--I am not denying that. But a $1000 desktop will not give anywhere close to the amount of 3D performance that a $500 desktop with a $500 graphics card will. The CPU is no longer the most important factor in 3D performance.

    rcf84 —— You missed the whole point of the transistor count. I obviously wasn't claiming that the GF3 was the highest count chip in existence, only higher than any DESKTOP chip, and my point was this: Transistor count indicates the expense to design and produce a chip, but the GF3 sells for substantially less than lower-count desktop CPUs. How then could NVIDIA be charging horribly excessive amounts for their chips, like many people say, when they are charging less than other companies?

    Shnak (again) —— Uh, there are many bandwidth-saving features in the GF3. Have you read any in-depth reviews of its architecture? If so, you slept through part of it.

    rcf84 —— HyperZ is one way to solve certain video acceleration problems, but it is not the only way, or probably even the best way (as in, something better is sure to be developed in the future, if it hasn't been already). HyperZ certainly doesn't make or break the effectiveness of a video card, as you imply. It also doesn't have anything to do with whether or not NVIDIA gouges people with their pricing, so I didn't really get your point in saying this. The fact that you don't LIKE the GF3 does not affect whether it is being unreasonably priced in comparison to its cost of development and manufacture.

    slayer255 —— 2 RIMMs ship with all P4 <i>systems</i> (which cost a lot more than $480) as the memory architecture demands that the RIMMs be loaded in pairs. P4 chips alone do not come with ANY memory. In fact, if you only pay $480 for your 1.5 GHz P4, it will be an OEM version that doesn't even include a heatsink or fan.

    Jerry557 —— In regards to your last post, well said! :cool:

    I am not going to be buying a GeForce 3 anytime soon. Whether you do or not depends on many factors that only you can decide, but this is discussed at length in several other threads here. My question really has to do with the business practices of NVIDIA. Many people like to say that NVIDIA is all greed, but I have pointed out several reasons why that <i>might</i> not be true. What do you think?

  15. I will laugh in Shnak's face. He says that we don't need to buy anything more than a GeForce 1. I would like to see his Voodoo3 run Giants for once, then get a GeForce 1 to run Giants and see what kind of problems he would get.
  16. Maybe. I don't care for Giants.

    I'm testing a Radeon 64MB DDR for a few days... not sure if I'll keep it or not though...


    Athlon rocks!skcor nolhtA
  17. What's wrong with Giants on a GeForce 1?

    <i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
  18. If it's a DDR GeForce 256 (which is what you have, isn't it, holygrenade?) then Giants should actually run OK. The GeForce 256 DDR is a perfectly viable card today, which is why it was a smart buy--6 months or more ago. It is too close to the end of its life (that is, its life as a high-performance card) to be a good buy now, IMO. But there is no real need to upgrade yet if you have one.

  19. Yes, the GeForce3 is expensive. Is it too expensive? It should be to those who can't afford it or would have to make other sacrifices to buy it. It's like every other product: autos, stereos, TVs, VCRs, etc. They have entry-level models, standard models, and top-of-the-line models. This is the top of the line, and to buy it you pay top-of-the-line prices. Every time any new electronic product comes out, the price is high. So, if you don't want to pay that amount, you can either buy something in the standard-model arena or wait till newer technology comes out and get it then. It is just the way marketing of electronics has always been.

    That's my $.02....

    It worked yesterday! :lol:
  20. Exactly... I can remember when a Pentium 75 MHz would cost $1,500! Now we have P4s and AMD T-Birds with speeds as high as 1.7 gigahertz!

    The GeForce 3 will actually debut CHEAPER than what the GF2 debuted at. If you have the money for a GF3... go for it! That is why it is out there. The price won't be dropping all that much until the end of the year, when it might drop $50-$100.
  21. It's strange... I am in agreement with a lot of these posts. What has surprised me is that we haven't heard from the NVIDIA haters saying how NVIDIA is making $450 on every GF 3 they sell (impossible) and all that kinda usual jazz. The couple NVIDIA haters that did post only wanted to talk about the features of the GF3 vs. the (insert favorite card here). No comments about NVIDIA's exorbitant profit margins or about how NVIDIA graphics cards have gone UP in price while other technology prices have gone DOWN.......

  22. When one thinks about it, some of the GPUs on video cards are even more impressive than CPUs in general. I have read many articles by tech experts stating that GPUs cost far more to make than CPUs. Part of this is that fixed costs (fabrication, R&D, etc.) get amortized by volume, and volume for GPUs is low while volume for CPUs is incredibly high, so CPUs benefit heavily.

    Video card prices would be way down if they weren't something only us gamers bought. Look at all the other component prices in computers: they have gone down and gotten better. What is the difference? Simple: they are sold in every computer. When VIA makes a chipset and 85% of the Athlon motherboard manufacturers use it, they can set a low price to secure that level of business and still enjoy massive profits. The video card is the one component that is almost never sold with a computer (okay, a few companies offer gaming machines). Basically, all these cheap bastards (Dell, Compaq, HP) use integrated video to offer the consumer a piece of [-peep-] video solution. The result is that discrete video cards sell in small ratios compared to the number of computers sold to home users, which means fixed costs are spread far less. A bigger chunk of the money that Nvidia and ATI make comes from selling "integrated video solutions," because they sell far more in that arena than they do chips that go on a video card.

    If you think about it, in the old days video cards used to be sold with the computer, and that was when the latest and greatest video technology was far more reasonably priced. It was after integrated video became mainstream that the prices increased dramatically. The lower sales volume of these GPU chips must be offset with a high price. Since I don't see integrated video ever going away, I believe the prices for new video card technology will come down only slowly over time. What would really help is 5 major competitors instead of 2 or 3. Until then we will pay top dollar.
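    The amortization argument above can be sketched in a few lines of Python. All dollar figures and volumes below are made-up assumptions for illustration, not real industry numbers:

```python
# Why volume drives unit price: fixed costs (R&D, mask sets, fab setup)
# are spread across every unit sold.  All figures are hypothetical.

def unit_cost(fixed_costs: float, marginal_cost: float, units_sold: int) -> float:
    """Per-unit cost = amortized fixed costs + cost to build one more unit."""
    return fixed_costs / units_sold + marginal_cost

# Same $50M in fixed costs, very different volumes:
gpu_cost = unit_cost(50e6, 40.0, 1_000_000)     # niche part bought mostly by gamers
cpu_cost = unit_cost(50e6, 40.0, 20_000_000)    # a chip that ships in every PC

print(gpu_cost)  # 90.0 -- fixed costs dominate at low volume
print(cpu_cost)  # 42.5 -- the same fixed costs nearly vanish at high volume
```

    At low volume the fixed costs dominate the unit price, which is exactly the poster's point about discrete GPUs versus high-volume CPUs and integrated video.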

    One final thought to help answer the question of whether you should sink $500 into a video card: how many hours of 3D games do you play a week? If you play just 10, that is 520 hours a year. In my opinion, that is like paying $1 per hour to take a decent video game experience to the level of AWESOME. That to me is worth it. Think about it like this: you pay $8 for a two-hour movie (movie entertainment, $4 per hour), or $15 to $50 for a nice to spectacular meal ($15 to $50 per hour), etc. When it all comes down to it, what are you willing to pay for entertainment?
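    That cost-per-hour arithmetic, made explicit in a trivial Python sketch (the $500 and 10 hours/week figures are the post's own):

```python
# Spread the price of a card over the hours of play it delivers.

def cost_per_hour(card_price: float, hours_per_week: float, years: float = 1.0) -> float:
    """Card price divided by total hours of use over its lifetime."""
    total_hours = hours_per_week * 52 * years
    return card_price / total_hours

# $500 card, 10 hours a week, kept for one year -> 520 hours:
print(round(cost_per_hour(500, 10), 2))  # prints 0.96
```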

    It worked yesterday! :lol:
  23. I sort of disagree with you there. If we had 4 or 5+ competitors for every computer component, we would have massive compatibility problems. We have seen within this industry many times that users will usually pick one company as a "standard" (i.e., Intel, Microsoft, Western Digital, nVidia). All the others seem to get pushed aside. Why? The industry has those companies as standards. When you read the requirements for a video game, 95% of the time it will say an "Intel or compatible processor." Even though an AMD will operate perfectly fine, if not better than the Intel, Intel gets the standard and AMD is stuck with it.

    Everyone keeps using the word "monopoly" in the computer industry, but you have to remember that it is the users of these machines that set the standards. If there were a million competitors, you might not even be able to get your system working. Imagine a huge competitor to Windows that was just as popular. Imagine the compatibility issues and the confusion over what software works on what machine.

    The GeForce 3 is priced right if you want something that just came out and is new technology. That is the price you pay for that. If you can afford it, get it. If not, no big deal. It seems like the only ones complaining about the price are the ones who can't afford it. And they come on to these boards saying how much nVidia is monopolizing the market and how much this card isn't worth it. Yet, the people who have $500 sitting in their savings account, are counting down the days.
  24. As it has been pointed out before, ATI is the ancient giant amongst the Graphics card manufacturers. nVidia is quite new compared to the players in the field. They've come this far because they took initiative and created technology superior to their competitors.

    ATI made a comeback with the Radeon, and Imagination Tech is starting to get its act together with the Kyro, though still with management problems.

    I don't think there will be compatibility issues. There are two unified APIs to handle that, OpenGL and DirectX. Developers have to develop for one of them.

    The only problem is that when new technology is released, the APIs for it take time to get into games, mainly because games these days take ages to develop. Another thing: even though the unified APIs work without problems with technology from different manufacturers, they're not too good at handling a mix of old and new technology. It is hard for developers to create something for both without difficulties. This is mainly because real-time 3D graphics is in its infancy. All the graphics functions are in a transition from having the CPU do everything (software rendering), to doing the rendering in hardware (3D accelerators), to processing entire 3D scenes in hardware (T&L GPUs).

    Each time one of those major steps has taken place, there has been a division in compatibility in games etc. But each time a step in the transition has been made things slowly settled down. Until the next step that is.

    <i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i><P ID="edit"><FONT SIZE=-1><EM>Edited by holygrenade on 04/08/01 09:26 PM.</EM></FONT></P>
  25. Jerry557,
    Like holygrenade said, compatibility shouldn't be a problem because of DirectX and OpenGL. This compatibility issue is, in fact, exactly why APIs are used.

    Also, I disagree with you on one point, but it is very small. :wink: You said, "you have to remember that it is the users of these machines that set the standards." Well, history has proven that standards get chosen not by which is technologically superior (which is what users should want) but by which company kisses up to the right people and produces the best BS. Over and over, the final standard "chosen" has been anything but the best.

  26. chrisojeda,
    Yes, very good points. Production volume is something I didn't even think to touch on, but is obviously a very big factor in price.
