BF4 better with Intel CPU and Nvidia GPU

leeb2013

Honorable
After all the hoo-ha about AMD's exclusive deal with DICE to work on BF4, and everyone saying it would work best with an AMD rig (particularly as it's ported from the AMD-built PS4 and Xbone, and supposedly needs 3GB of VRAM, thus ruling out most Nvidia cards), it seems from this review that the game actually runs better on Intel machines: a two-generations-old CPU and 1.5-year-old Nvidia GPUs.

http://www.tomshardware.com/reviews/battlefield-4-graphics-card-performance,3634-8.html

The 7950 Boost is an overclocked version of the 7950, yet it easily loses out to the GTX 670, which only has 2GB of VRAM and is even a match for the 7970.

http://www.tomshardware.com/reviews/battlefield-4-graphics-card-performance,3634-10.html

Here, a two-generations-old 4-core i5 at 3.4GHz easily beats the 8-core FX-8350 at 4GHz!


Not only that, but Nvidia released their first optimised drivers for BF4 2 days before AMD did!!

So much for multicore/AMD/exclusive/3GB optimisation.

I sold a perfectly good GTX 680 to get two HD7950s and I can't even get CrossFire to work with this game, let alone get decent frame rates with decent AA.

I had my initial suspicions when Nvidia graphics cards performed so well during the alpha tests:

http://www.bf4blog.com/battlefield-4-alpha-gpu-and-cpu-benchmarks/

The only good thing is that I've got an i5-3570K!!

What's going on AMD?!!!!
 

leeb2013

Honorable


LOL!!

The 7950s I have are great for the price, but it just seems AMD don't do anything properly, then try to trick customers with marketing hype and make up for poor performance with ever-decreasing prices.

I know Intel and Nvidia are generally more expensive, but their CPUs and GPUs are generally much more efficient, doing the same or a better job with fewer cores and/or less power.

I hope all the hype about Mantle isn't just fluff too and that it really transforms GPU power.
 

Tradesman1

Legenda in Aeternum

________________________________

+1

Exactly. The 'advertising' is, generally, almost as a rule, misleading and misrepresentative. I keep hoping they'll come through with something they promise, but they seem to fail with every campaign.
 

KoleTang

Distinguished
Sep 19, 2013
71
0
18,640
This is reassuring, although the whole Mantle thing hasn't been released yet.

Hopefully, if Mantle really is as revolutionary as people are saying, it will come to NVIDIA as well sooner rather than later.
 

creamtown

Honorable
Oct 9, 2013
13
0
10,510
To be honest, I still prefer playing this game on console rather than on PC. Besides, it's cheaper than high-spec'd gaming laptops/PCs.
 
As I've said many, many times: IPC is far more important than the number of cores. Intel's huge lead in that area keeps it ahead, even when you have tasks that scale well.

As for NVIDIA, they have always led in pure shader performance, so the next-gen consoles turning up the eye candy is probably going to benefit NVIDIA more than AMD.
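The IPC-vs-cores point can be sketched with a toy Amdahl-style model. The IPC values and parallel fraction below are illustrative assumptions, not measured figures:

```python
# Toy Amdahl-style model of game performance: part of each frame is serial
# (limited by per-core speed = clock x IPC), part scales across cores.
# All numbers here are illustrative guesses, not benchmarks.

def effective_speed(cores, clock_ghz, ipc, parallel_fraction=0.5):
    single_thread = clock_ghz * ipc
    # Amdahl's law: the serial portion gets no benefit from extra cores.
    return single_thread / ((1 - parallel_fraction) + parallel_fraction / cores)

i5 = effective_speed(cores=4, clock_ghz=3.4, ipc=2.0)  # hypothetical high-IPC quad
fx = effective_speed(cores=8, clock_ghz=4.0, ipc=1.0)  # hypothetical low-IPC octo

print(round(i5, 2), round(fx, 2))  # 10.88 vs 7.11: twice the cores don't
                                   # make up for half the per-core speed
```

Push `parallel_fraction` toward 1.0 and the core count wins instead, which is why well-threaded workloads are where the FX chips look their best.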
 

MEC-777

Honorable
Jun 27, 2013
342
0
10,860
I have an i5 and 7950. Runs perfectly smooth at 50-60fps @ 1080p on ultra settings and in 64 player servers. Don't know what everyone's complaining about. Very happy with my single AMD GPU's performance. :)

leeb2013, it's nobody's fault but your own. A single GTX 680 is still a very powerful GPU and there was really no need to change it. I would have at least waited until the beta was released and then tried it out first before doing something that drastic based on preliminary data.
 

Devastating Dave

Honorable
Oct 15, 2013
1
0
10,510
I have CrossFired 7970s and it's as smooth as silk maxed out at 1080p. Since the frame-pacing drivers came on the scene it's been great for me.
I don't seem to get all these issues people go on about. All my games run at excellent frame rates without issues.
 

jason41987

Distinguished
Jan 1, 2012
188
0
18,680
Dave, I have a single HD7950 and it's just as smooth as yours, playing at 1080p on ultra settings. I'm only playing at 60fps due to v-sync, which I see little point in turning off.

I'm thinking the OP is a butt-hurt Intel/Nvidia fanboy? In fact, before I built my computer 8 months ago, I spent about 4 weeks trying to decide whether to go FX-8350 or i5 3570K, and whether to go 660 Ti or HD7950. Comparisons were by price, and for the price the AMD products offered much better performance, so the choice was completely and totally unbiased.

In fact I was close at times to flipping a coin to make my decision. At the time, the HD7950s had a microstutter issue making the less powerful 660 appear cleaner and smoother, but literally during this decision-making process AMD released new drivers that solved the issue, and it was a clear decision after that.
 
I doubt you guys understand how this works...
It's quite obvious that both AMD and Nvidia already have the technology and the capacity to produce far more powerful GPUs (look at the Tesla, for example).

The reason AMD and Nvidia are almost always competing at a similar level of performance is because they are not actually fighting for market share, but sharing the market.

Neither AMD nor Nvidia wants to give us their best products and then get no revenue for the next 6 years because no one needs to upgrade their GPU.

It's better to make 5-10% performance increments each generation and bleed customers every year (those that do upgrade), and that way have a guaranteed income each year.

It's business 101, not some hardcore conspiracy theory.

Same goes for Intel, televisions, MP3 players, flash memory, etc.

Hell, this is even true for games (Call of Duty, anyone?), as we need to buy DLC to get the game's full content, and we need to pay fees for some online games.

So, "What's going on AMD?!!!!"

Nothing, really.
 

ImPain

Distinguished


Are you sure about this? I have a 680 and on all ultra (MSAA, etc.) I'm around there as well.

 

MEC-777

Honorable
Jun 27, 2013
342
0
10,860


Yes, I'm sure. That is 50-60fps average. At times it's higher and at times it's lower. But most of the time it's hovering around 55.

I actually took 9 random benchmark samples the other day with fraps and made a little graph from the data. I'll post it up here later today.
 

jason41987

Distinguished
Jan 1, 2012
188
0
18,680
Hmm, I hover at about 60fps perfectly with my 7950... wonder why ImPain's is a bit lower? Hm, maybe he's running other stuff in the background?
 

ImPain

Distinguished
Even my GeForce Experience optimizes my game to all ultra except for effects, which are on medium, and then it is perfectly smooth (60). But with everything on ultra I'm at around 45-60 fps.

I don't understand why I get such low fps with a card that is higher than the recommended specs... I hope it's just a beta problem.
 

MEC-777

Honorable
Jun 27, 2013
342
0
10,860
Here's the graph I mentioned earlier. Numbers extrapolated from 9 one-minute samples taken with fraps during gameplay (with V-sync off).

BF4Benchmarkstockclocks_zps1fb75884.jpg
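For anyone wanting to reproduce this kind of summary, per-second FPS samples (the kind fraps logs) reduce to min/avg/max in a few lines. The sample values below are made up for illustration, not my actual data:

```python
# Summarize per-second FPS samples (e.g. from a fraps FPS log) into the
# min / average / max figures quoted in benchmark posts.
# The sample data here is illustrative only.

def summarize(fps_samples):
    avg = sum(fps_samples) / len(fps_samples)
    return min(fps_samples), round(avg, 1), max(fps_samples)

one_minute_run = [52, 55, 58, 60, 49, 54, 57, 61, 53, 56]
lo, avg, hi = summarize(one_minute_run)
print(f"min {lo}, avg {avg}, max {hi}")  # min 49, avg 55.5, max 61
```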
 

scirishman76

Distinguished
Mar 20, 2009
209
0
18,680


Most people are overlooking the resolution scale slider, which adds supersampling and defaults to 100 even with the ultra preset. It is by far the best eye-candy setting, but a big FPS hit. To be honest, a mix of medium-high settings with a res scale of 135 or higher will look far better than all-ultra at the default 100 res scale. This is why your graphics cards all seem OP: a 7950 will only push about 25-35 FPS at all ultra and 140 res scale, and the slider maxes out at 200. This is also where Nvidia pulls ahead (not by a lot), as it handles this a bit better.
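The cost of the slider can be estimated from pixel counts: a resolution scale of 140 renders each axis at 1.4x, so roughly 1.96x the pixels of native 1080p. A rough sketch, assuming FPS falls about linearly with rendered pixels (a simplification, not a measurement):

```python
# Rough estimate of the supersampling cost of BF4's resolution scale slider.
# Assumes fill-rate-bound rendering, i.e. FPS scales inversely with pixel
# count -- a simplifying assumption, not a measured fact.

def scaled_pixels(width, height, res_scale):
    """Pixels actually rendered at a given resolution scale (percent per axis)."""
    w = width * res_scale // 100
    h = height * res_scale // 100
    return w * h

def estimated_fps(base_fps, res_scale):
    """Naive linear model: FPS drops with the square of the axis scale."""
    return base_fps * (100.0 / res_scale) ** 2

native = scaled_pixels(1920, 1080, 100)  # 2,073,600 pixels at native 1080p
scaled = scaled_pixels(1920, 1080, 140)  # 4,064,256 pixels -- 1.96x as many

print(scaled / native)         # 1.96
print(estimated_fps(55, 140))  # ~28 fps, in line with the 25-35 quoted above
```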
 

Donovan Wiering

Honorable
Jan 19, 2014
1
0
10,510
I personally think AMD should spend some more cash on their products and see what comes of it. I would support them fully, even if they cost more or less the same as the other brands, with their higher clock rates and all the key stuff that makes AMD, AMD, but with all the basics working well.

We need stats from AMD, not cheap stuff that only works for every third person who wants a cheap PC.