Can AMD salvage QFX with an in-house chipset?

Well, the QFX has been released, and while certain scenarios show incredible promise, other areas are saddled with too much baggage.

The fact that a dual-Opteron workstation can be outfitted with SLI and offer good performance without these power levels implies that the total package could have been done better.

The Opteron 285 runs at 2.6GHz, and this graph shows that without the additional SLI power the AMD system draws 322W while the dual 5160 system draws 267W (full load).

Looking at the various articles around the web, it seems as though only Anand managed to actually find suitable tasks that were reasonable for multi-tasking. In his case he used Blu-ray movies, which totally killed all the dual-core systems.

His power numbers were also at least 100W lower than other test sites'. He turned on CnQ and got the idle draw to within 4W of the C2Q system. Of course, this didn't dent the 456W the system drew at "full load" with an 8800GTX (which I believe draws 225W+).
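A quick back-of-the-envelope split of that 456W figure, keeping in mind that the ~225W GPU number is this post's own guess rather than a measured value, so the result is only an estimate:

```python
# Rough split of Anand's reported "full load" wall draw.
# The GPU figure is the post's own estimate, not a measurement.

SYSTEM_FULL_LOAD_W = 456  # measured at the wall (per the review)
GPU_ESTIMATE_W = 225      # assumed 8800GTX draw (guess from the post)

# What's left over for the two FX-74 CPUs, chipset(s), RAM and drives,
# ignoring PSU inefficiency (which would shrink this number further)
rest_of_system_w = SYSTEM_FULL_LOAD_W - GPU_ESTIMATE_W
print(f"non-GPU draw: {rest_of_system_w}W")  # → non-GPU draw: 231W
```

Even on that generous estimate, roughly 231W is left for two 125W-class CPUs plus the dual-chip 680a platform, which is why the chipset keeps coming up as a suspect below.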

And these are FX-74 numbers. The FX-70 is shown to use even less, so I believe OEMs can get reasonable workstation power levels out of it in the next few months, especially if AMD releases a new revision (they sorely need to drop power by at least 10%, perhaps more, and the lessons learned could help get Agena down below the reported 125W).

Hexus is also reporting that they can show a defect in the NUMA implementation of the Asus board's BIOS (this post was going to be called "Did Asus and nVidia drop the QFX ball?"), which may be why games are suffering so much from latency problems.

One review (most are posted at AMDZone) stated that AMD is reporting that memory interleave mode will need to be turned off for Vista and on for XP.

I believe AMD reported that they would release their own branded chipset, and hopefully it will be less power hungry than the 680a, which is reported to use more power than even the 975X. Having two of them surely doesn't help. nVidia does have a two-socket SLI chipset in the 3600, and ASUS' implementation is only $300. Even the $400 Asus 680i for Intel implements less PCIe and so needs less power.

The 7950GT and the probably forthcoming 8950GT only require two slots for Quad SLI. I can see the need for four low-end GPUs for certain content creators, but even three PCIe slots can't really be used right now, as no "Havok"-type apps or cards have been released, except in the server space (AMD Stream).

Only time will tell whether AMD had planned to create an entire reference system based on an ATI chipset while allowing nVidia to be the launch partner.

But the real judgement is that only expert builders will make QFX something that isn't too loud or hot, while Vista x64 may do wonders for it in multithreaded apps, so it is not yet ready for prime time.

Let's go AMD! Show your true potential.
4 answers
  1. The FX-74 is the pinnacle of AMD's 90nm process. There is nothing they can do to make it better. What's worse is that this platform is part of the problem: even with CPUs at half the power consumption, it would still use too much power.
  2. (keyboard-mash post, no content)
  3. QFX has plenty of potential. If AMD gives us cooler-running 65nm CPUs and an AMD/ATI chipset that draws less power than the nVidia one, and if DDR2 prices go down a little, QFX could be just what people are looking for. That, and once more apps start using multiple cores, the entire platform will show its true potential.
  4. Quote:
    His power numbers were also at least 100W lower than other test sites'. He turned on CnQ and got the idle draw to within 4W of the C2Q system. Of course, this didn't dent the 456W the system drew at "full load" with an 8800GTX (which I believe draws 225W+).
    The 8800GTX barely consumes more power than the X1900XTX, so that's not where the extra power was being used. 4x4 could have potential when the next K8 revision is released, but at the moment it can't keep up with Kentsfield.