Well, if you're eagerly awaiting Ghost Recon: Advanced Warfighter, don't hold your breath for DX10 cards, as it appears that both ATi and nVidia are going to make us wait until at least the fall. Heck, both of them still have more DirectX 9 cards planned for later this year (such as an X1700XT and a 7900GS).
As for the cards out there right now, it depends on the price you're willing to pay. No card actually comes with memory even APPROACHING 2.0GHz DDR. The 7900GT defaults to GDDR3 at 1.2GHz, and the 7900GTX runs at 1.6GHz.
As for the actual performance of each card, it seems little can best an X1900XTX Blizzard. Technically, some of the overclocked 7900GTX cards can trade blows with it, but the Blizzard seems to win the larger share of the rounds. Of course, it's also much more costly.
To be honest, I'm not sure how GRAW will play out here. I know that the makers claim to lean toward nVidia, but that has more to do with nVidia's marketing campaigns than the actual performance of their cards with the game. The last major Tom Clancy game, Splinter Cell 3: Chaos Theory, proved to perform a bit better on ATi's hardware, but this is a whole different series, with a different gameplay style, and hence a different emphasis on the graphics; there's little need for the insane attention to lighting detail seen in Chaos Theory.
However, I'm certain that whatever card you choose, be it the X1900XT, XTX, XTX Blizzard, or the GeForce 7900GT or 7900GTX, it will perform fine. Even at 1600x1200, the only games that seem to drop any of those cards below 60fps are F.E.A.R. and The Elder Scrolls IV: Oblivion (though some might rightfully point out that in those games, the X1900 cards can still pull close to 60fps, and leave the GeForce cards in the dust in the meanwhile).
I doubt that GRAW will be as draining, though.