
Dual Channel To The Rescue

Parallel Processing, Part 2: RAM and HDD

If you want maximum memory performance, install two memory modules into two different memory channels so the controller runs in dual-channel mode. This doubles peak memory bandwidth by widening the effective data bus from 64 to 128 bits.

At a time when memory clock speeds could not be increased much further, the industry decided to widen the memory bus from 64 bits to 128 bits. With the AMD Athlon XP and Intel's second-generation Pentium 4 on Socket 478, dual-channel memory controllers saw the light of day on contemporary chipsets: Intel's 865/875 and Nvidia's nForce2. The technology behind it is rather simple: data is distributed across two separate memory channels to combine their total bandwidth. As a consequence, you need two memory modules, but the benefit is noticeably higher performance.
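The bandwidth arithmetic behind this is straightforward: peak transfer rate is the bus width in bytes multiplied by the effective transfer rate. As an illustrative sketch (in Python, not from the article), the figures below assume DDR-400 memory of that era:

```python
def peak_bandwidth_gb_s(bus_width_bits, transfers_per_second):
    """Theoretical peak: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * transfers_per_second / 1e9

# One DDR-400 channel: 64-bit bus, 400 million transfers per second
single = peak_bandwidth_gb_s(64, 400e6)   # 3.2 GB/s (PC3200)

# Two interleaved channels behave like a single 128-bit bus
dual = peak_bandwidth_gb_s(128, 400e6)    # 6.4 GB/s

print(f"single channel: {single} GB/s, dual channel: {dual} GB/s")
```

These are theoretical ceilings; real-world gains are smaller because the processor rarely saturates the bus.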

Dual-channel memory controllers were deployed on all subsequent performance chipsets for Intel processors, such as Intel's 915/925, the 955 and 975 chipsets (with DDR2), and the latest P35 and X38 models. Let's not forget about Nvidia's nForce 4 chipset family either. The only differences compared to past chipset families from a memory standpoint are the memory technology and clock speeds, as well as slight memory tweaks on the enthusiast models (Intel 975X, X38, Nvidia nForce 680i) to further reduce latency. AMD has been integrating the memory controller into all AMD64 processors. Its single-channel versions died together with Socket 754; all other architectures on Socket 939 and Socket AM2 are based on dual-channel RAM today.

Is there still a performance advantage today? Both memory performance and cache efficiency have increased considerably on today's processors; we'll find out in the benchmark section. We picked a Core 2 Duo system with low-latency DDR2-800 memory and benchmarked it in dual-channel mode as well as with only a single memory channel. In both cases we used two 1 GB DIMMs by Corsair.
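A rough feel for what such benchmarks measure can be had with a trivial copy-bandwidth microbenchmark. The sketch below (Python, not one of the tools used in the article) times large buffer copies; its absolute numbers will be well below what dedicated tools report, but the relative difference between memory configurations still shows up:

```python
import time

def copy_bandwidth_gb_s(size_mb=256, iterations=5):
    """Estimate sustained memory bandwidth by timing large buffer copies."""
    buf = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(iterations):
        start = time.perf_counter()
        dst = bytes(buf)          # full read of buf plus full write of dst
        best = min(best, time.perf_counter() - start)
        del dst
    # Each copy moves size_mb megabytes twice (one read, one write)
    return 2 * size_mb / 1024 / best

if __name__ == "__main__":
    print(f"~{copy_bandwidth_gb_s():.1f} GB/s sustained copy bandwidth")
```

Running this once with both DIMMs in the same channel and once with one DIMM per channel gives a crude single- vs. dual-channel comparison.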

Installing two memory modules into the same channel of the memory controller will force it into single-channel mode.

Every fast Core 2 Duo processor comes with 4 MB of L2 cache, which has a balancing impact on memory performance. In other words: memory speed still matters, but its impact is smaller than it was on older processors with less efficient caches.

  • perzy, August 8, 2008 8:11 AM
    This is a great article!
  • Anonymous, September 15, 2008 9:40 PM
    (First of all: excuse my poor English...)
    Hmm, your memory tests don't convince me. You should run, for example, WinRAR and LAME in parallel (i.e., multitasking); otherwise the caches aren't flushed, and that's when dual channel really matters. Note that this isn't a far-fetched situation; under normal use, a system commonly has several memory-hungry applications running concurrently (a word processor, a browser with a lot of tabs open, antivirus, etc.).
    el_bot
  • hellwig, November 21, 2008 6:33 PM
    I doubt anyone from Tom's will see this comment on such an old article, but it would have been interesting to see single- vs. dual-channel memory using an AMD processor. Since Tom's likes Intel, the new Core i7s would also be worth testing. The point is, the article acknowledges that the Core 2s have a tremendous amount of L2 cache to combat FSB (and consequently memory) latencies. How does the comparison look with an AMD or new Core i7, where there is NO FSB and the L2 cache is significantly reduced? I would imagine this is where dual/triple channel shows its strength. I hope we see a single vs. dual vs. triple channel comparison soon.
  • meodowla, September 15, 2009 9:27 AM
    Won't it be different when using an AMD processor with the memory controller inside the CPU?
  • junghm69, March 24, 2011 4:48 AM
    My Windows Experience Index 3D gaming graphics score goes up from 3.8 to 5.1 when I switch from dual channel to single channel. This makes absolutely no sense. I thought dual channel was supposed to be better than single channel. Can anyone explain this?

    I seriously doubt that this score is accurate. I am using the built in graphics controller on the motherboard which is an AMD 760G chipset (ATI HD 3000 or 3200 I think). I've used Radeon HD 5450 video cards on similar systems and they give me a score of 5.4. How can a built in graphics controller give me a 5.1?

    AMD Athlon II X3 435 Rana (2.9 GHz)
    Asus M4A78LT-M motherboard
    4 GB G.Skill DDR3-1333 (2x2GB) F3-10600CL9D-4GBNT CL9-9-9-24 1.5V
    Windows 7 Ultimate 64-bit
  • Anonymous, April 24, 2011 12:57 AM
    junghm69
    Because if you use two different memory modules, both will run at the speed of the slower module when you enable dual channel on your motherboard.