People tend to generalize the problems with Crossfire as "microstuttering" or poor game compatibility, which are real and ongoing issues. They also tend to talk about these problems as if they were a few routine driver updates away from being fixed.
But the real problem with Crossfire is frame metering: the spacing between frames is so uneven that it undermines the accuracy of measuring Crossfire performance through conventional means. These unevenly spaced frames inflate the reported performance numbers while the gameplay still isn't smooth or fluid, despite the apparently high FPS.
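To see why an average-FPS number can hide this, here's a minimal sketch (not FRAPS's or AMD's actual code, and the frame times are hypothetical) comparing two runs that average out to the same FPS but pace their frames very differently:

```python
# Hypothetical frame times in milliseconds. The "uneven" run models
# alternate-frame rendering that delivers a frame almost immediately
# after the previous one, then waits a long time for the next pair.
uneven = [5.0, 28.3] * 30   # 60 frames, badly paced
even   = [16.65] * 60       # 60 frames, perfectly paced

def avg_fps(frame_times_ms):
    """The conventional metric: total frames over total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Both runs report ~60 FPS on average...
print(avg_fps(uneven))
print(avg_fps(even))

# ...but the longest gap the eye actually waits through differs a lot:
print(max(uneven))  # 28.3 ms between some frames in the uneven run
print(max(even))    # 16.65 ms, every time, in the even run
```

The averages are indistinguishable, yet the uneven run repeatedly makes you wait nearly twice as long between visible updates, which is exactly the stutter people feel.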
Here's what's going on, using Crysis 3 as an example. When you measure framerates using FRAPS or any other conventional FPS counter, Crossfire looks great.
However, when you look closely at what actually constitutes a "frame" when measuring "frames per second", it turns out Crossfire spaces its frames so poorly that a small sliver of a frame is often counted as a full frame. FRAPS will count, for example, a 10% slice of the screen as a 100% complete frame. This is a problem because it creates the illusion that Crossfire performance is better than it actually is. FCAT is able to filter out those small slices, called "runts", and count only full frames, producing a more accurate "Observed" FPS score.
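The filtering idea can be sketched in a few lines. This is not FCAT's actual implementation; the function name, the sample data, and the runt threshold are all assumptions for illustration (FCAT classifies frames by how many scanlines they occupy on screen before being replaced):

```python
# Each entry is how many scanlines of the display one "frame" occupied
# before the next frame overwrote it (hypothetical capture data).
RUNT_THRESHOLD = 21  # scanlines; the exact cutoff here is an assumption

def observed_fps(scanlines_per_frame, capture_seconds=1.0,
                 threshold=RUNT_THRESHOLD):
    """Count only frames tall enough to contribute to perceived motion."""
    full_frames = [h for h in scanlines_per_frame if h >= threshold]
    return len(full_frames) / capture_seconds

# Alternating full frames and tiny runt slices over one second:
frames = [700, 12, 650, 8, 720, 15, 690, 10]

print(len(frames))           # a FRAPS-style counter reports 8 FPS
print(observed_fps(frames))  # runt-filtered: 4.0 observed FPS
```

Half the "frames" in this capture are slices a handful of scanlines tall, so the conventional count is double what the viewer actually sees.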
And here's your microstuttering issue graphically illustrated.
Really, these issues are nothing to defend or be defensive about. They're cause for concern, major concern, and for an expectation that AMD will fix them. Essentially, what this new testing reveals is that adding a second card in Crossfire is largely a waste of money, and you don't want to waste your money, do you? The point is that, while you may be happy with Crossfire, you're not getting as much as you would with SLI in terms of smooth, fluid gameplay and value for your money.