Your question
Solved

AMD A6 3400M vs A6 3410MX

Last response: in Laptops & Notebooks
July 24, 2011 6:59:50 AM

How much better will the 3410MX be for gaming than the A6 3400M? Maybe an FPS estimate, please? :sol: 

P.S. I selected an upgrade to a 1GB 6750M with dual graphics on this laptop: http://www.shopping.hp.com/webapp/shopping/computer_ser...
with an A6 so it can do dual graphics. This means they add a separate 6750M, so there are TWO SEPARATE CARDS (one in the CPU and one discrete card), right? Two cards?


July 24, 2011 7:12:28 AM

Are you so lazy that you can't even google some benchmarks? Try notebookcheck first to see if they have any reviews/benchmarks done.
July 24, 2011 7:15:23 AM

wintermint said:
Are you so lazy that you can't even google some benchmarks? Try notebookcheck first to see if they have any reviews/benchmarks done.


I've checked Notebookcheck and they have two benches to compare them, and I was wondering if I could get some GAMING benches/estimates instead of a Super Pi bench. So no, I'm not lazy, thank you very much. :kaola: 
July 24, 2011 8:45:20 AM

Well, I think the 3410MX runs at a higher CPU clock, but everything else should be the same. You should be able to check the website regarding your other question; see if there's a link that says "Help me choose".

Best solution

July 24, 2011 1:47:06 PM

The A6-3410MX's CPU is clocked at 1.6GHz while the A6-3400M is clocked at 1.4GHz. The A6-3410MX's HD 6520G graphics core uses DDR3 RAM @ 1666MHz, while in the A6-3400M the HD 6520G graphics core uses DDR3 @ 1333MHz.

Gaming performance will only be marginally better in the A6-3410MX. Generally speaking, the HD 6520G is somewhat close to the performance of a desktop Radeon HD 5550. The HD 6620G graphics core in the A8 Llano series is somewhat close to the performance of a desktop Radeon HD 5570. As a comparison, the Intel HD 3000 graphics core is slightly better than the desktop Radeon HD 5450.
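If you want rough numbers from those specs, here is a quick back-of-the-envelope calculation (not an FPS estimate, since games rarely scale linearly with clock or memory speed):

```python
# Percentage differences between the two APUs, using the specs above
cpu_diff = (1.6 - 1.4) / 1.4 * 100      # CPU base clock: 1.6GHz vs 1.4GHz
mem_diff = (1666 - 1333) / 1333 * 100   # GPU memory speed: 1666MHz vs 1333MHz
print(f"CPU clock: +{cpu_diff:.0f}%, memory speed: +{mem_diff:.0f}%")
# prints "CPU clock: +14%, memory speed: +25%"
```

In practice that headroom translates into only a few extra FPS in most games, which is why the difference is called marginal above.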
July 24, 2011 6:03:59 PM

Best answer selected by Jeteroll.
m
0
l
July 24, 2011 6:06:42 PM

jaguarskx said:
The A6-3410MX's CPU is clocked at 1.6GHz while the A6-3400M is clocked at 1.4GHz. The A6-3410MX's HD 6520G graphics core uses DDR3 RAM @ 1666MHz, while in the A6-3400M the HD 6520G graphics core uses DDR3 @ 1333MHz.

Gaming performance will only be marginally better in the A6-3410MX. Generally speaking, the HD 6520G is somewhat close to the performance of a desktop Radeon HD 5550. The HD 6620G graphics core in the A8 Llano series is somewhat close to the performance of a desktop Radeon HD 5570. As a comparison, the Intel HD 3000 graphics core is slightly better than the desktop Radeon HD 5450.


Thanks, so if I upgrade and add the 6750 there will be the graphics core on the CPU and a separate discrete 6750, right? So it can do dual graphics?
July 24, 2011 6:37:09 PM

I am not sure if the HD 6620G can do a hybrid CrossFire with a HD 6750. I'll assume that the HD 6750 will be used on its own.

If the laptop is capable of "GPU switching", then on light loads it will use the HD 6520G to save power and keep the laptop cooler. When playing a game, the laptop will switch over to the HD 6750. If there is no "GPU switching" ability, then the laptop will always use the HD 6750.
July 24, 2011 6:54:38 PM

jaguarskx said:
I am not sure if the HD 6620G can do a hybrid CrossFire with a HD 6750. I'll assume that the HD 6750 will be used on its own.

If the laptop is capable of "GPU switching", then on light loads it will use the HD 6520G to save power and keep the laptop cooler. When playing a game, the laptop will switch over to the HD 6750. If there is no "GPU switching" ability, then the laptop will always use the HD 6750.


Hmm, Notebookcheck says there is a dual-graphics config for these two: http://www.notebookcheck.net/AMD-Radeon-HD-6755G2.57278...
But tell me one thing, please: there will be the graphics core in the CPU PLUS ONE DISCRETE SEPARATE CARD (the 6750), so two separate cards, right? Because HP kept saying there's "only one card".
July 24, 2011 7:13:36 PM

Jeteroll said:
Hmm, Notebookcheck says there is a dual-graphics config for these two: http://www.notebookcheck.net/AMD-Radeon-HD-6755G2.57278...


Okay, they would know more than I would, so it's a dual graphics solution.

Jeteroll said:

But tell me one thing, please: there will be the graphics core in the CPU PLUS ONE DISCRETE SEPARATE CARD (the 6750), so two separate cards, right? Because HP kept saying there's "only one card".


Well, HP is correct. There is only one graphics card: the HD 6750.

The HD 6620G or HD 6520G is a graphics core inside the Llano APU, much like the Intel HD 2000 / 3000 graphics core inside all Sandy Bridge i3/i5/i7 CPUs. Therefore, it is not a "video card".
July 24, 2011 7:44:28 PM

jaguarskx said:
Okay, they would know more than I would, so it's a dual graphics solution.



Well, HP is correct. There is only one graphics card: the HD 6750.

The HD 6620G or HD 6520G is a graphics core inside the Llano APU, much like the Intel HD 2000 / 3000 graphics core inside all Sandy Bridge i3/i5/i7 CPUs. Therefore, it is not a "video card".


Oh ok thanks a bunch! :sol: 
September 19, 2011 3:37:06 AM

Oddly enough, PassMark (www.passmark.com; warning: arbitrary numbers based on computations) ranks the 3400M much higher than the 3410MX even though the 3410MX is clocked slightly higher (1.6GHz vs 1.4GHz for the 3400M). My guess is the laptop manufacturers are pairing the 3410MX with 1333MHz system RAM (and not 1866MHz), and that bottleneck is killing the benchmark. Someone said the APU's 6520 GPU ran off 1666MHz RAM, but I believe that's a typo and it's 1866MHz RAM. I could be wrong, though; that's just what I remember, and I may be confusing it with the A8-3850 Llano desktop chip. Either way, it's been a long day and I'm too tired to check. :kaola: 

January 15, 2012 11:12:55 PM

The AMD A6-3400M APU with the HD 6520G uses whichever RAM is currently installed.
I upgraded my stock 4GB DDR3 with an 8GB Crucial kit (DDR3 PC3-10600 • CL=9 • Unbuffered • NON-ECC • DDR3-1333 • 1.5V • 512Meg x 64), which increases my HD 6520G performance by allowing the system to use more RAM dynamically, with a faster transfer rate than the stock rate.

Comparing Intel HD 2000/3000 series graphics to the ATI HD 6520G is apples to oranges; not to start any flames, just a simple statement.
"The Radeon HD 6520G is, without any doubt, better than the Intel HD 3000 graphics. In the two tables below we will compare some of the features and the benchmarking that will establish the superiority of the Radeon HD 6520G." @ http://compare-processors.com/amd-radeon-hd-6520g-vsint...
March 18, 2012 2:31:32 AM

mauser1891 said:
The AMD A6-3400M APU with the HD 6520G uses whichever RAM is currently installed.
I upgraded my stock 4GB DDR3 with an 8GB Crucial kit (DDR3 PC3-10600 • CL=9 • Unbuffered • NON-ECC • DDR3-1333 • 1.5V • 512Meg x 64), which increases my HD 6520G performance by allowing the system to use more RAM dynamically, with a faster transfer rate than the stock rate.

Comparing Intel HD 2000/3000 series graphics to the ATI HD 6520G is apples to oranges; not to start any flames, just a simple statement.
"The Radeon HD 6520G is, without any doubt, better than the Intel HD 3000 graphics. In the two tables below we will compare some of the features and the benchmarking that will establish the superiority of the Radeon HD 6520G." @ http://compare-processors.com/amd-radeon-hd-6520g-vsint...



Technically yes, but the on-chip video processor for the A-series chips can run at a max 1866MHz memory clock for the GPU, and from what I hear, to get the best system performance it's best to match it with 1866MHz system/board memory. Several posts out there mention this, and some claim up to a 40% increase in performance, but I can't seem to find that review now and I'm not sure if that's correct. I did find this review, though, confirming 1866MHz system memory works fastest with the A8-3850.

PS: Agreed, Intel HD 3000 graphics blow chunks. The A-series on-die GPUs destroy Intel's on-chip graphics. Intel should either partner with Nvidia (lol, that's not likely after the war over PCI-E they had) or just stop making on-chip graphics altogether. lol

http://www.legitreviews.com/article/1652/1/
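To put rough numbers on the memory-speed argument, here is a sketch of theoretical peak bandwidth for dual-channel 64-bit DDR3 (a best-case figure; real-world gains for the integrated GPU will be smaller):

```python
def ddr3_peak_bandwidth_gbs(mts, channels=2, bus_bytes=8):
    """Theoretical peak: transfers/sec * 8 bytes per 64-bit transfer * channels."""
    return mts * bus_bytes * channels / 1000  # MT/s -> GB/s

for speed in (1333, 1600, 1866):
    print(f"DDR3-{speed}: {ddr3_peak_bandwidth_gbs(speed):.1f} GB/s peak")
# prints:
# DDR3-1333: 21.3 GB/s peak
# DDR3-1600: 25.6 GB/s peak
# DDR3-1866: 29.9 GB/s peak
```

Interestingly, DDR3-1866 offers about 40% more peak bandwidth than DDR3-1333, which lines up with the performance claims mentioned above; since the integrated GPU shares that bus with the CPU, it tends to be bandwidth-starved and benefits more from faster RAM than the CPU cores do.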