Our high-detail preset enables 8x multi-sample anti-aliasing (MSAA), with every other image quality setting at its most demanding level. Moreover, we're starting at 1920x1080, a resolution that shouldn't pose a problem for mid-range and high-end graphics hardware.


Unfortunately, AMD's Radeon HD 7770 is rendered unplayable, while the GeForce GTX 650 Ti is marginal, dipping down to a 27 FPS minimum. The rest of the cards easily chew through the highest detail settings this game offers.


Although the 1 GB Radeon HD 7850 makes it through 1920x1080, it really suffers at 2560x1600. The 2 GB Radeon HD 7870 and GeForce GTX 660 fare much better, but still can't sustain at least 30 FPS through the test run. It takes a GeForce GTX 660 Ti or Radeon HD 7950 Boost to get those numbers where we want them, while the Radeon HD 7970 and GeForce GTX 670 give us even more headroom.
When it comes to multi-card setups, two Radeon HD 7950s in CrossFire scale a bit better than a pair of GeForce GTX 660 Tis in SLI. This is somewhat surprising when you consider that this game tends to favor Nvidia's single-GPU cards over AMD's. Also, SLI doesn't work with the Call of Duty: Black Ops II-optimized 310.54 beta driver until you grab the 11/15/2012 SLI profile update.
I saw the trailer for this game, and it looks like a DX9 game with decent textures. So I'll pass, just as I have since MW1.
When going from the medium to the high preset, it would be in readers' best interest to apply the same amount of anti-aliasing. It is very hard to gauge the impact of the graphics settings themselves on image quality if you change both AA and texture detail at the same time.
I am quite sure the game will be playable at full HD with no AA, and that adding SweetFX AA will look far better than MSAA.
Doesn't make sense otherwise.
http://www.rockpapershotgun.com/2012/11/15/black-ops-2-pc-review/
We welcome the good storyline, but not the fact that this is clearly another port made to cover the consoles. It does nothing to differentiate the experience on any reasonable PC from the crowd.
Even AC III: iterative, yes, but given that the 360 can't always hold 30 FPS in it, it should be interesting.
I think we all know the answer to that one....
Um...?
If a game doesn't have creative gameplay, isn't a benchmark title like the old Crysis, and doesn't have decent graphics quality, what else makes it worth $60? More like $6.00.
P.S. FWIW, I got a steady 60 FPS on my Phenom II 940 @ 3.2 GHz plus a 6950 2GB with a 5% overclock, at max in-game settings with vsync, with occasional dips into the 50s. Campaign mode.
P.P.S. At least we do get improved resolution and full AA; the Xbox and PS only get 800x720, give or take, with 2x MSAA, I hear. You can tell extra detail was given to the important characters' faces, since they are played by actual movie stars/actors, even outside of cutscenes.
Hm... I wonder if it's because Treyarch or Activision consider a wider FOV an unfair advantage.
I find it peculiar that you only used a 1GB HD 7850 and not a 2GB version. I'm guessing you didn't have one on hand, but I have a feeling it might've shown different numbers, especially in the 1600p test (considering the high resolution combined with 8x MSAA, if I'm not mistaken, the larger frame buffer might've helped). I'm not judging, though, just pointing it out.
Do you guys think you could do an Assassin's Creed 3 graphics performance review when the game comes out for the PC? Seeing as the game features a new engine, DX11, and a lot of patches and bug fixes for the PC, plus the developers have taken lots of feedback on their forums to make it even better, I think it should be interesting.