These are the backbone of the U.S. military's air power in 2025, apparently.
Call of Duty: Black Ops II certainly isn't the most demanding game out there, but its most demanding sequences do warrant a decent GPU. We consider the Radeon HD 6670 DDR3 or GeForce GT 630 GDDR5 entry-level hardware for 1280x720 or 1280x1024 at low-quality settings. But who wants that? Using medium-quality settings at 1680x1050, you want at least a GeForce GTX 650 or Radeon HD 7750 to get through the tough parts. Stepping up to high-quality settings, 4x or 8x MSAA, and 1920x1080, a GeForce GTX 650 Ti or Radeon HD 7850 1 GB is the bare minimum.
The loadout option screen
As far as the game itself goes, it's quite fun. At the risk of repeating myself to make a point, when it comes to entertainment, the premise isn't as important as the execution. Saying Call Of Duty: Black Ops II is just another first-person-shooter is like saying that Joss Whedon's Avengers is just another superhero movie. This isn't the second coming of Half-Life, and it's not without flaws, but Treyarch's production team clearly went out of its way to craft a compelling experience. As a hardware guy, I don't have time to finish all of the games I benchmark. But I'll finish Call Of Duty: Black Ops II.

I saw the trailer for this game, and it looks like a DX9 game with decent textures. So, I'll pass, just as I have since MW1.
Going from medium to high settings, it would be in the readers' best interest to use the same amount of anti-aliasing. It's very hard to judge the impact of the graphics settings themselves on image quality if you add both AA and higher-resolution textures at the same time.
I am quite sure the game will be playable at full HD with no AA, and then adding SweetFX gives far better AA than MSAA does.
Doesn't make sense otherwise.
http://www.rockpapershotgun.com/2012/11/15/black-ops-2-pc-review/
We welcome the good storyline, but not the fact that this is clearly another port made just to cover consoles. That does nothing to differentiate the gaming experience on any reasonable PC from the crowd.
Even AC III is iterative, yes, but seeing that the 360 can't always hold 30 fps, it should be interesting.
I think we all know the answer to that one....
Um...?
If a game doesn't have creative gameplay, isn't a benchmark title like the old Crysis, and doesn't have decent graphics quality, what else makes it worth $60? More like $6.00.
P.S. FWIW, I got a steady 60 fps on my Phenom II 940 @ 3.2 GHz plus a 6950 2 GB with a 5% overclock, at max in-game settings with vsync, with occasional dips into the 50s. Campaign mode.
P.P.S. At least we do get improved resolution and full AA; the Xbox and PS3 only get 800x720, give or take, with 2x MSAA, I hear. You can tell the detail given to the important characters' faces, since they are played by actual movie stars/actors, even when not in a cutscene.
Hm... I wonder if it's because Treyarch or Activision consider having a wider FOV an unfair advantage.
I find it peculiar that you only used a 1 GB HD 7850 and not a 2 GB version. I'm guessing you didn't have one on hand, but I have a feeling it might have shown different numbers, especially in the 1600p test: with that high a resolution combined with 8x MSAA, if I'm not mistaken, the larger frame buffer (VRAM) might have helped. I'm not judging, though, just pointing it out.
Do you guys think you could do an Assassin's Creed 3 graphics performance review when the game comes out for the PC? Seeing as the game features a new engine, DX11, and a lot of patches and bug fixes for the PC, and the developers themselves have taken lots of feedback on their forums to make it even better, I think it should be interesting.