
Team Group T-Force Cardea SSD Review

Final Analysis & Verdict

By now, most enthusiasts have learned that proper case cooling is essential to stability and overclocking. Processors and video cards generate a lot of heat and have long required robust cooling, so solid-state drives are hardly the first components to use throttling algorithms to protect themselves from inadequate airflow. When I started building PCs, it was common to see one fan at the bottom of the case to bring in cool air and another on the back to exhaust warm air. I haven't seen a case with only two fans in a very long time. That doesn't mean they aren't out there; it's just rare. The more components you add, the higher the cooling requirement, but with the right cooling strategy, you can keep your CPU and video card cool enough to avoid throttling. When you run dual video cards, you also start to block airflow to components on the motherboard, which can be a problem for flush-mounted M.2 SSDs sitting under a video card.

We haven't encountered many SSD throttle issues during normal use. It hasn't been a problem even in a notebook with very limited airflow. This past Christmas I bought myself a Lenovo P70 with space for two NVMe SSDs and a single 2.5" SATA bay. The system runs cool under normal workloads with three SSDs, a high-performance CPU, and a beefy NVIDIA Quadro GPU, even during light gaming.

Our desktop test machines all use rackmount cases with three 120mm fans pushing air over the components. Those systems don't use video cards, just the CPU's onboard graphics. Even under heavy extended workloads, we don't experience a lot of throttling for most devices. The OCZ RD400 had a small issue with two tests, but both incidents occurred outside of normal desktop use.

Over the last two years, we've seen some third-party SSD cooling products come to market. Most have been in the form of add-on cards with large heatsinks like the Angelbird WINGS PX1 M.2 to PCIe adapter. Some motherboard companies have taken steps to move the M.2 slots. Asus has a vertical riser on some motherboards, and the new Apex uses a vertical card with M.2 SSDs next to the system memory. There is a lot of innovation going into third-party M.2 SSD coolers, but very few effective features coming from the SSD manufacturers.

The Team Group T-Force Cardea isn't the first SSD to use passive cooling, but it is the best we've seen so far. The drive uses every millimeter of available space to keep itself cool under a GPU. I would go as far as to say Team Group overbuilt the cooler for a component that only consumes roughly 7 watts of power. The E7 controller has the smallest surface area of any consumer SSD processor we've tested, but it can reach temperatures as high as 90C, so overbuilt is better than insufficient cooling capacity.

Pricing is one of the big issues. The Team Group T-Force Cardea 480GB costs $60 more than the MyDigitalSSD BPX 480GB with the same controller. The NAND shortage is at its peak, and companies are paying more for flash than they were just a few weeks ago. All SSD prices are going up, and as consumers, we just have to take it. The BPX is the overall value leader in NVMe SSDs, but we recently learned there would be a price increase. We don't know where BPX pricing will stabilize, but the T-Force Cardea, as a new product, should already have the NAND price increases factored in. That makes future price comparisons difficult.

The Cardea will not fit in a notebook with the heat sink, but it is possible to remove the cooler. We didn't find any warranty stickers warning users not to remove the heat sink, so your upgrade path isn't limited to desktops when it's time to migrate the Cardea out of your main system in a few years. You can move the drive to a notebook by simply pushing in a few tabs and sliding the heat sink off.

All things considered, we like this SSD. Team Group went outside the box to provide users with a better E7-based SSD. The drive isn't for everyone, but if your system has less-than-ideal cooling, or you simply want a great-looking component, this is a nice option to have available.

We can't complain about the Cardea's performance either. The drive provides excellent throughput and latency. There are some NVMe SSDs that are slightly faster, but the difference is small. Most users wouldn't notice the difference outside of running performance tests. We also really like the Toshiba 15nm MLC flash in this drive. We are starting to see TLC migrate into lower-cost NVMe products like the Intel 600p and Western Digital Black PCIe, but planar MLC is a better storage medium than IMFT and Flash Forward TLC. Get it while you can, because MLC has become difficult to source. Products like the Cardea are here today but will be gone within a year.


MORE: How We Test HDDs And SSDs

MORE: All SSD Content

  • damric
    Just put an AIO liquid cooler on it. You know you want it.

    Btw typo on last page: "The NAND shortage is at it speak" should be "its peak"
  • bit_user
    I've never seen thermal throttling as a significant issue for most users.
    Have you ever tried using a M.2 drive, in a laptop, for software development?

    Builds can be fairly I/O intensive. Especially debug builds, where the CPU is doing little optimization and the images are bloated by debug symbols.

    And laptops tend to be cramped and lack good cooling for their M.2 drives. Thus, we have a real world case for thermal throttling.

    Video editing on laptops is another real world case I'd expect to trigger thermal throttling.

    We test with a single thread because that's how most software addresses storage.
    In normal software (i.e. not disk benchmarks, databases, or server applications), you actually have two threads. The OS kernel transparently does read prefetching and write buffering. So, even if the application is coded with a single thread that's doing blocking I/O, you should expect to see some amount of QD >= 2 in any mixed-workload scenario. About the only time you really get strictly QD=1 is for random reads, where the kernel can't usefully prefetch ahead of the application.

    That said, I'd agree that desktop users (with the possible exception of people doing lots of software builds) should care mostly about QD=1 performance and not even look at performance above QD=4. In this sense, perhaps tech journalists delving into corner cases and the I/O stress tests designed to do just that have done us all a bit of a disservice.
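bit_user's point about queue depth can be sketched with a toy model. This is purely illustrative (simulated latency via `time.sleep`, hypothetical numbers, not a real disk benchmark): it shows why throughput scales when multiple requests are in flight, which is what a higher queue depth buys you.

```python
# Toy queue-depth model: each "I/O request" has a fixed simulated latency,
# and we compare throughput with 1 request in flight vs. 4.
# All numbers are illustrative assumptions, not measurements of any drive.
from concurrent.futures import ThreadPoolExecutor
import time

LATENCY_S = 0.005       # assumed 5 ms per request (hypothetical)
REQUEST_BYTES = 128 * 1024
REQUESTS = 40

def fake_read(_):
    time.sleep(LATENCY_S)   # stand-in for a blocking read
    return REQUEST_BYTES    # bytes "read"

def throughput(queue_depth):
    """Bytes/sec with `queue_depth` requests in flight at once."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=queue_depth) as pool:
        total = sum(pool.map(fake_read, range(REQUESTS)))
    return total / (time.perf_counter() - start)

qd1 = throughput(1)
qd4 = throughput(4)
print(f"QD1: {qd1 / 1e6:.1f} MB/s, QD4: {qd4 / 1e6:.1f} MB/s")
```

In this toy model QD4 finishes roughly four times faster than QD1, because the simulated latencies overlap; real drives scale less cleanly, but the mechanism (overlapping in-flight requests, whether from the application or from kernel prefetch/write-back) is the same.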
  • bit_user
    my grandmother always said they are not quite as audacious as RGB everything, but heat sinks do provide positive benefits ...
    How I first read this.

    Me: "Whoa, cool grandma."

    Yeah, I read fast, mostly in a hurry to reach the benchmarks.

    Seriously, you could spice up your articles with a few such devices. Maybe tech journalists would do well to cast some of their articles as short stories, in the same mold as historical fiction. You don't fictionalize the technical details - only the narrative around them.

    Consider that - as odd as it might sound - it still wouldn't be quite as far out there as the Night Before Christmas pieces. The trick would be not to make it seem too forced... again, with my thoughts turning towards The Night Before Christmas pieces (as charming as they were). So, no fan fiction or Fresh Prince, please.
  • bit_user
    When I started building PCs, it was common to see one fan at the bottom of the case to bring in cool air, and another on the back to exhaust warm air. I haven't seen a case with only two fans in a very long time. That doesn't mean they are not out there; it's just rare.
    My workstation has a 2-fan configuration with an air-cooled 130 W CPU, 275 W GPU, quad-channel memory, the enterprise version of the Intel SSD 750, and 2 SATA drives. Front fan is 140 mm and blows cool air over the SATA drives, while rear fan is a 120 mm behind the CPU.

    The 860 W PSU is bottom-mounted and only spins its fan under high load, which is rare. The graphics card has 2 axial fans. Everything stays pretty quiet, and I had no throttling issues when running multi-day CPU-intensive jobs or during the Folding @ Home contest.

    In addition to that, I have an old i7-2600K that just uses the boxed CPU cooler, integrated graphics, and a single 80 mm Noctua exhaust fan, in a cheap mini-tower case. Never throttles, and I don't even hear it unless something pegs all of the CPU cores (it sits about 4' from my feet, on the other side of my UPS). It does have a top-mounted PSU, which is a Seasonic G-Series semi-modular, that I think is designed to keep its fan running full-time. It started as an experiment, but I never found a reason to increase its airflow. I haven't even dusted it in a couple years.

    I'm left to conclude that you only need more than 2 big fans in cases with poor airflow, multi-GPU, or overclocking.
  • jaber2
    Huh? I hate bots
  • mapesdhs
    The Comparison Products list includes the 950 Pro, but the graphs don't have results for this. Likewise, the graphs have results for the SM961, but that model is not in the Comparison Products list. A mixup here? To which model is that data actually referring?