It's good to see that people are learning to spell in our public schools.
Rambus wasn't bad memory, but it was attached to a processor that sucked. First-gen P4s were horrible, and it wasn't until they hit 2.0GHz that they could even keep up with the Tualatin PIIIs. My 1.33GHz Athlon outperformed P4s up to 2.0GHz. Rambus was capable of extremely high speeds; the only problem was that latencies were extremely high as well.
Not long ago I saw some weird RAM that ran at high frequencies.
It was a Socket 423 Pentium on an MSI motherboard.
Where did that memory disappear to?
Can someone tell me something about it?
Why did DDR win the battle?
Unfortunately, RAMBUS as a company tried to force its standard on the industry, but it was too expensive and incompatible with the cheaper, more widely accepted DDR. RAMBUS also held its patent rights close to the vest and sued just about anyone who tried to use RAMBUS in a way other than the company wanted. As a technology, RAMBUS was "better" than DDR: it was dual-channel memory before DDR and, at the time, ran at higher effective frequencies of 600, 800, and 1066MHz. Once DDR went dual-channel, and given that the industry was already geared toward DDR, RAMBUS fell by the wayside. DDR proved to be faster, cheaper, and more widely accepted.
Generally speaking, a particular technology can be the best thing ever invented, but if it's too expensive, not widely accepted, and mismanaged, then no amount of logic argued for it, or money spent on it, can save it from failure. In a nutshell, that's what happened to RAMBUS.
Personally, I picked up a Skt 423 2GHz P4 with 4x 256MB PC800 RAMBUS and ran it on an ABIT TH7 with the Intel 850 chipset. That thing was super fast (for the time), super stable, and I shelled out major $'s to set it up. But after the RAMBUS and Intel fiasco, I kicked it to the curb and went with AMD and DDR. Actually, I gave the machine to my brother; he's still running it today, fast as it ever was and still stable as all hell.
Rambus used a very high frequency RAM technology, but over a very narrow bus: compared to SDRAM (its main competitor at the time), you had:
RAMBUS: 800 MHz effective, 16-bit wide => max. throughput: 1.6 GB/sec
SDRAM: 133 MHz, 64-bit wide => max. throughput: 1.066 GB/sec
DDR: 100 MHz, 64-bit wide, two transfers per clock => max. throughput: 1.6 GB/sec
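The arithmetic behind those peak figures is just clock rate times transfers per clock times bus width. A quick sketch in Python (the function name is mine, not anything from a real API):

```python
def peak_gb_per_s(clock_mhz, transfers_per_clock, bus_width_bits):
    """Peak memory throughput in GB/s (decimal): MHz x transfers x bytes per transfer."""
    return clock_mhz * transfers_per_clock * bus_width_bits / 8 / 1000

# Figures from the comparison above:
print(peak_gb_per_s(800, 1, 16))   # RAMBUS PC800 -> 1.6
print(peak_gb_per_s(133, 1, 64))   # PC133 SDRAM  -> 1.064 (~1.066 at the exact 133.33 MHz clock)
print(peak_gb_per_s(100, 2, 64))   # DDR-200      -> 1.6
```

This is why a 16-bit bus at 800 MHz and a 64-bit bus double-pumped at 100 MHz land on the same peak number.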
Now, RAMBUS clearly had the better throughput. Coupled with the original P4's quad-pumped FSB, high frequency, and very long pipeline, it allowed very good number crunching.
On the other hand, RAMBUS latency was much higher than SDRAM's, meaning workloads that jumped all over memory (i.e. not addressing RAM linearly) would perform better on SDRAM - which was, at the time, much cheaper than RAMBUS. You couldn't use SDRAM with a P4, but you could with an Athlon.
The next generations of RAMBUS merely increased frequency, at the price of much higher heat dissipation; you also needed to fill every RIMM slot (empty slots required dummy continuity modules), and it was extremely pricey.
On the other hand, DDR used technology very similar to SDRAM, with lower voltages but almost identical components - it thus got very cheap very fast, while compensating for the only disadvantage SDRAM had against RAMBUS: throughput.
highest clocked RAMBUS: 1.0 GHz, 16-bit wide => max throughput 2 GB/sec
highest clocked DDR(1): 250 MHz, dual pumped, 64-bit wide => max throughput 4 GB/sec
highest clocked DDR(1) in dual channel: same as above, x2 => max throughput 8 GB/sec
'High standard' DDR setups usually come as 2x512MB PC3200 (200 MHz, dual channel, dual pumped), with a throughput of 6.4 GB/sec and low (CL2/CL2.5) latency - the best you can get without overclocking. DDR2 is even better.
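Extending the same clock x transfers x width arithmetic with a channel count reproduces these later figures too (again, the function is just an illustration of the math, not any real API):

```python
def peak_gb_per_s(clock_mhz, transfers_per_clock, bus_width_bits, channels=1):
    """Peak memory throughput in GB/s (decimal), scaled by channel count."""
    return clock_mhz * transfers_per_clock * bus_width_bits / 8 * channels / 1000

print(peak_gb_per_s(1000, 1, 16))             # top RAMBUS, 1 GHz effective  -> 2.0
print(peak_gb_per_s(250, 2, 64))              # top DDR(1), single channel   -> 4.0
print(peak_gb_per_s(250, 2, 64, channels=2))  # top DDR(1), dual channel     -> 8.0
print(peak_gb_per_s(200, 2, 64, channels=2))  # PC3200 dual channel          -> 6.4
```

Dual channel simply doubles the effective bus width, which is how DDR leapfrogged RAMBUS on peak throughput despite much lower clocks.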
RAMBUS used a very innovative (at the time) idea: serial transfers to reach very high throughput. However, their solution was impractical and they charged an arm and a leg for it, so only Intel (and a few entertainment companies, for consoles) bought in, while other mobo makers pushed chipset companies such as Via and SiS forward, and some companies (such as Nvidia) used it as an opportunity to enter the market. Coupled with the K7, DDR soared so much that even Intel, which had tried to force the market away from DDR by bundling free RAMBUS with its processors, first created a translator chip (the infamous Memory Translator Hub) to allow SDRAM use on a RAMBUS-centric chipset (with little success), then finally offered SDRAM-based motherboards. Although these were bare-bones and slower than their previous SDRAM chipset (the incredibly successful 440BX of the P-II/P-III era), they sold so well that Intel decided to forget about RAMBUS altogether and go the DDR (then DDR2) way.