
DDR1 & DDR2 Memory

Last response: in Graphics & Displays
November 11, 2003 12:07:07 PM

OK, I have been around computers for a long time, and I expect graphics cards released in later years to outperform the cards that preceded them.
It turns out I am just another fool for expecting that to be true.
I am around computers so much that I rarely, if ever, bother to read the details on the boxes any more when I put a new card into a computer. I read the BOLD PRINT ON THE BOX THAT SAYS the card is a newer release. Lame, I know, but true; I don't really have the patience unless I am getting a new card for my personal PC because a new software title just lags like a <insert comment here>

I was seriously considering purchasing a new Nvidia FX 5900 Ultra Pro graphics card.
That is, until I saw how poorly a friend's system ran the new Nvidia FX 5200 graphics card. His system is identical to mine in every way, since I built his computer and installed everything in it myself.
We both have Splinter Cell, and he installed his new card before I even knew he had it; as you can understand, he was in a hurry to play his games at the elite level of graphics performance.
Unfortunately I was up at the hunting camp blasting Daffy Duck and hooking Nemo the bass, so I was not available to install his new card for him.
His new card ran so poorly that, as soon as I was next to a cell phone with a working battery (they still can't make a decent battery for a cell phone!), he asked me to come over and tweak his system in whatever manner was necessary to get it working the way you would expect from a high-end release card like Nvidia's FX 5200.
Well, we spent most of the day reinstalling drivers and so on, and it was still not working well. We based this on the fact that my Nvidia Ti 4200 8X AGP card with 128 MB of DDR memory was trouncing his new FX 5200.
We went back to the store and said we thought the card's dual pipelines were not working properly. Well, you need a good story when exchanging a card, because what we both really suspected was that it was a born lemon.
OK, new card in hand, we limped back to his place and installed it. Same thing! The card ran like crap, and we were both disappointed: him because he had lost a small fortune that could have purchased a small house, and me because I was not going to break the bank on an FX 5900 card until we found out why Nvidia's flagship graphics card series stinks up the neighbourhood so badly.
I started to look into the MHz of the DDR memory chips used by ATI/Radeon and Nvidia, and found out that ATI is not only located in Markham, Ontario (a few years back I walked that plant floor to view the SMT lines), but that they do in fact use the best DDR memory you can get in all their new cards: DDR2 memory architecture. Meanwhile, it would seem from a few articles that Nvidia uses DDR1, a lower-standard RAM, in their flagship cards, but they use it in such a way that they claim to get higher MHz out of their design specs.
I can tell you from old experience in manufacturing that when SMT components are installed onto circuit boards, the robotics operators and the purchasing departments order SMT chips in batches so that (1) they do not mix manufacturers' brands, which is a big no-no because it affects the cards' operation at the test benches in very interesting ways, up to the point of not functioning at all; and (2) they buy the cheapest chips they can get, even if the chips are matched.
Here is some homework for anyone who bothered to read this whole post.
A) What is the main difference between DDR1 and DDR2 memory?
B) How can a flagship graphics card from Nvidia use DDR1 memory chips and design the board as a workaround, instead of using (what I believe is) the better, faster raw power of a real DDR2 memory chip? If they have to place 8 to 16 memory chips, they might as well use the good stuff, right?
C) What is the deal?
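For question A, here is a minimal sketch of the usual textbook answer, assuming generic JEDEC-style DDR/DDR2 behavior rather than any particular graphics card: the main architectural difference is prefetch depth. DDR1 prefetches 2 bits per data pin per core access, DDR2 prefetches 4, which lets DDR2's I/O bus run at twice the memory core's clock. The function name below is made up for illustration.

```python
# Sketch: DDR vs DDR2 effective transfer rate for the same memory-core clock.
# DDR1 has a 2n prefetch (I/O clock = core clock); DDR2 has a 4n prefetch
# (I/O clock = 2x core clock). Both transfer data on both clock edges.

def effective_rate_mtps(core_clock_mhz, prefetch):
    """Effective transfer rate in MT/s: I/O clock = core * prefetch / 2,
    with two transfers per I/O clock cycle."""
    io_clock = core_clock_mhz * prefetch // 2
    return io_clock * 2

# Same 100 MHz memory core:
print(effective_rate_mtps(100, 2))  # DDR1 -> 200 MT/s
print(effective_rate_mtps(100, 4))  # DDR2 -> 400 MT/s
```

So for an equal core clock, DDR2 doubles the data rate; the trade-off (relevant to question B) is higher latency in core-clock terms and, in this era, more heat at high speeds.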

Money, as they say, should not be an issue when dealing with this class of hardware. ATI seems to believe that, so why is Nvidia shortchanging the public?

»§øЫÑighthåwk™ Don't get mad at the player get mad at the game. Hackers drool and Skill's rule.


November 11, 2003 1:12:49 PM

A) not sure
B) probably because DDR-2 ran way too hot at those MHz

_____________
whompiedompie
November 11, 2003 1:14:43 PM

By the way, if I am correct, Nvidia did have DDR-2 in the FX-5800, but that card had such a horrid cooler that they probably abandoned DDR-2 over the heat issues and went back to DDR-1.

_____________
whompiedompie
November 12, 2003 8:47:44 PM

So how come a Canadian company ;) like ATI can build their flagship graphics cards with all-DDR2 memory, and seems to have been doing it for the last 4 or 5 design generations?
Is Nvidia shirking the electronics complications and simply going with what works, instead of providing the customer with the best raw clock speeds available?
The way they build now, they design the circuit board so that it can, in effect, drive the DDR1 memory hard enough to achieve or even exceed DDR2 performance.
If they tried a little harder, it seems to me they could exceed even their current designs. All they have to do is find a way to dissipate the heat. In fact, I have always wondered why graphics card manufacturers, and even CPU design companies, do not provide better cooling methods.
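The "drive DDR1 hard on a wide bus" workaround described above can be sketched with a quick bandwidth calculation: peak memory bandwidth is just bus width in bytes times the effective transfer rate. The numbers below are roughly the published FX 5900 (256-bit DDR1) and FX 5800 (128-bit DDR2) figures, used here only as an illustration, not as exact board specs.

```python
# Sketch: why a wide bus of slower DDR1 can beat a narrow bus of faster DDR2.

def bandwidth_gbps(bus_width_bits, effective_mtps):
    """Peak memory bandwidth in GB/s = (bus width in bytes) * (MT/s) / 1000."""
    return bus_width_bits / 8 * effective_mtps / 1000

# Approximate era figures, for illustration only:
print(bandwidth_gbps(256, 850))   # FX 5900-style: 256-bit DDR1  -> 27.2 GB/s
print(bandwidth_gbps(128, 1000))  # FX 5800-style: 128-bit DDR2  -> 16.0 GB/s
```

On these assumed numbers, the wider DDR1 bus wins on raw bandwidth even though each chip runs at a lower effective rate, which is one plausible reading of Nvidia's design choice.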

It would make perfect sense to me that properly dealing with heat dissipation would be a front-line combat task for the engineering crew.
We don't need to water-cool a computer to dissipate heat. If the graphics card makers are selling an Nvidia FX 5900 Pro card for $800.00 Canadian, then they could ship the card with a new side panel and cooling fans for hard-core gaming cases.
I mean, you have to buy SMT components in reels for the robotic placement machines, and those reels can cost up to six grand each for 5,000 components, so that is where their cost comes in initially; but that cost has dropped over the last 12 years, as those reels of components have come down to the hundreds of dollars.
If I were to pop off or de-solder the SMT chips, electrolytic capacitors, and so on from a fully populated graphics card, I would end up with five bucks' worth of parts in my hand, minus the GPU and memory chips.

»§øЫÑighthåwk™ Don't get mad at the player get mad at the game. Hackers drool and Skill's rule.
November 15, 2003 7:49:21 PM

Reading back through some other posts, it becomes evident that Nvidia does not like 3DMark testing of their graphics cards.
I installed 3DMark 2001 SE with the upgrade patch and got very good scores. The guy I was testing against was using a Radeon 9800 Pro and an Intel 2.3C overclocked to 3 GHz.
I ran an AMD 2800+ and an Nvidia GeForce Ti 4200 against his benchmarks.
We ran only the first 4 tests of the benchmark. The overclocked Intel CPU of course scored higher FPS than I got on a stock 2800+, but I did beat his FPS on the car chase in low detail.
This is a perfect example of how different "games" run on different setups regardless of which CPU or graphics card is used. Of course, we have to remember that the 3DMark benchmark is only a game emulation and not an actual game: it mimics game graphics rather than providing real-time FPS in a real game. If they clipped 40-second sequences out of released game titles and used those to render FPS, then we would all really see how the numbers play out.
Here are the scores I received on the Ti 4200 Graphics card.

3DMark Score: 10645 3DMarks
Game 1 Car Chase - Low Detail: 170.9 FPS
Game 1 Car Chase - High Detail: 61.6 FPS
Game 2 Dragothic - Low Detail: 172.2 FPS
Game 2 Dragothic - High Detail: 100.1 FPS
Game 3 Lobby - Low Detail: 146.4 FPS
Game 3 Lobby - High Detail: 69.5 FPS
Game 4 Nature: 56.4 FPS

In all cases the Intel 2.3C @ 3 GHz scored better FPS, except for the low-detail car chase, and his setup only benched about 20 FPS faster in the other tests.

If I set this 2800+ to 183/37, that forces the memory from 333 MHz to 366 MHz and brings the front-side bus up to nearly 200 MHz, so the CPU would go from the 2.08 GHz range to the 2.26 GHz range. I would then beat the Intel chip's and Radeon card's FPS with little problem.
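The clock math behind that claim is simple: an Athlon XP's core clock is the front-side-bus clock times a fixed multiplier. The 12.5x multiplier below is an assumption for a Barton-core 2800+ (it is not stated in the post), so treat the exact figures as an illustration.

```python
# Sketch: Athlon XP core clock from FSB and multiplier.

def cpu_clock_mhz(fsb_mhz, multiplier):
    """Core clock in MHz = front-side-bus clock * CPU multiplier."""
    return fsb_mhz * multiplier

MULT = 12.5  # assumed multiplier for an Athlon XP 2800+ (Barton core)
print(cpu_clock_mhz(166, MULT))  # stock 166 MHz FSB      -> 2075.0 MHz
print(cpu_clock_mhz(183, MULT))  # overclocked to 183 MHz -> 2287.5 MHz
```

That puts the overclocked chip around 2.29 GHz, in line with the "2.08 to 2.26 GHz range" quoted above, with the small difference coming from the assumed multiplier and nominal FSB values.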

However, I do not overclock my CPUs, because the best-case scenario is still a CPU burning out over time.
I am later going to move to the AMD 64 CPU line and run those at stock as well. It would be a much more interesting test if he could install a GeForce Ti 4200 graphics card into his system, to see whether he benched higher or lower than his posted scores.

»§øЫÑighthåwk™ Don't get mad at the player get mad at the game. Hackers drool and Skill's rule.