The Impact Of A DRAM Buffer
This might have gone unnoticed, but each drive in today's review employs a different capacity of DRAM buffer. Crucial's m4 has only 128 MB. Samsung's 830 sports 256 MB. And Plextor's M5S has 512 MB. When we disable the device cache, we can pinpoint the contribution that DRAM buffer makes to improving write performance.
Write performance on the M5S and Samsung's 830 suffers across the board when we disable the DRAM buffer. The m4's sequential write speeds are not impacted, but its 4 KB random performance is. Clearly, the on-board cache helps improve write speeds by consolidating written data before moving it over to the NAND. Not only can this help with performance, but it aids wear leveling, too.
By default, Windows enables the write cache on your SSD. The setting is accompanied by the following warning, though:
"Improves system performance by enabling write caching on the device, but a power outage or equipment failure might result in data loss or corruption."
That sounds like it could be pretty scary, right? What's the risk of leaving it enabled? The interaction between the volatile device cache and volatile file system cache is complex, and we're working on a separate article to fully explore this. But we can provide some insight using the graphs below, which are based on a large file transfer.
First, we transfer our file to a drive with its cache disabled. Where you see the arrow and black vertical line, Windows is reporting that the transfer is complete. All of the data has been transferred into the file system cache at this point. However, not all of the data has been moved to the device media. At the physical volume level, the transfer isn't complete for another 45 seconds. So, if the power goes out within that window, data loss or corruption could occur.
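The gap between what the operating system reports and what has actually reached the media is easy to observe yourself. The sketch below (a minimal illustration, not our benchmark methodology; the 64 MB payload and timings are arbitrary) times a buffered write, which returns as soon as the data lands in the file system cache, and then times the fsync() call that forces the data out to the device:

```python
import os
import tempfile
import time

# Illustrative only: a buffered write() "completes" once the data is in
# the file system cache; fsync() measures how long the commit to the
# physical device actually takes.
payload = b"\xAA" * (64 * 1024 * 1024)  # 64 MB of dummy data

with tempfile.NamedTemporaryFile() as f:
    t0 = time.perf_counter()
    f.write(payload)              # lands in the file system cache
    t_reported = time.perf_counter() - t0

    t1 = time.perf_counter()
    f.flush()                     # push Python's own buffer to the OS
    os.fsync(f.fileno())          # force the OS to commit to the device
    t_committed = time.perf_counter() - t1

print(f"write() returned after {t_reported:.3f} s (cached)")
print(f"fsync() took another {t_committed:.3f} s (committed to media)")
```

On a slow drive or a large enough file, the second number dwarfs the first, which is exactly the window in which an unplanned power loss can corrupt data.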
In the next chart, we transfer the same file with the drive's cache enabled. This time, there's an 85-second window after the operating system reports that the transfer is complete. There is also a time after that when data might still be stored in the device cache. We can't say exactly how much, but we know that there are various mechanisms that typically force the buffer to commit data to non-volatile media quite quickly.
As far as risk goes, the main threat to data integrity is the time between when the transfer completes within the file system and when it's committed to an SSD's NAND. In our example, we see that period lasts longer when the device cache is enabled.
We saw in our first graph that the file system cache remains a risk to data integrity in the event of an unplanned power loss, even with the DRAM cache disabled. This is because the file system still caches information before writing it to the drive itself. Applications can avoid this by using write-through flags, which force data to bypass the file system cache and minimize the risk of data loss.
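As a rough sketch of what a write-through request looks like in application code (the file path and record contents here are placeholders): on POSIX systems, opening a file with O_SYNC makes each write return only after the data has reached the device, and on Windows the analogous request is the FILE_FLAG_WRITE_THROUGH flag passed to CreateFile.

```python
import os
import tempfile

# Placeholder path and data for illustration.
path = os.path.join(tempfile.gettempdir(), "writethrough-demo.bin")

# O_SYNC is POSIX-only; where it's absent, fall back to an explicit fsync().
flags = os.O_WRONLY | os.O_CREAT | os.O_TRUNC | getattr(os, "O_SYNC", 0)
fd = os.open(path, flags, 0o600)
os.write(fd, b"critical record\n")  # synchronous when O_SYNC is in effect
if not hasattr(os, "O_SYNC"):
    os.fsync(fd)                    # fallback: flush the cached write
os.close(fd)

# Read back to confirm the record landed, then clean up.
with open(path, "rb") as f:
    data = f.read()
os.remove(path)
```

The trade-off is predictable: every write pays the full latency of the media, which is why this approach is reserved for data that genuinely cannot be lost, such as database transaction logs.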
In our last graph, we enable the device cache and use the "Turn off Windows write-cache buffer flushing on the device" option. This strips write-through flags from disk requests and removes flush-cache commands. This is potentially the highest-risk strategy. But, in our file transfer example, it also yields the fastest completion time at the physical volume level.
As mentioned, the interaction between the DRAM cache and the file system cache is complex. Clearly, though, disabling the device cache alone does not eliminate the risk of data loss in the event of a power failure. The extent of that danger depends on a number of variables. However, it has to be weighed against the benefits of improved write performance and reduced wear.