Media bias against AMD?

January 9, 2013 5:42:42 PM

My favorite brand was Cyrix but they have not been available for years.

Not long ago I came across a famous site, which shall remain nameless, that stated the little AMD E-450 could only manage 11 fps while running Skyrim. Many gameplay videos seem to indicate that Skyrim will actually run at 18-20 fps on that platform. That is a significant difference.

It is true that many E-450s were shipped from the factory with memory set below its rated speed, but the "experts" on that site should have known this and corrected the problem before testing.

Is there some sort of bias at work here?

January 9, 2013 5:51:53 PM

More importantly, what's the point? The E-450 isn't a gaming platform.

It's obvious, though, that as long as reporters and journalists are human, pure 100% objectivity is impossible.
January 9, 2013 5:59:44 PM

There's no bias; that E-450 is not meant for gaming. See here:

http://www.notebookcheck.net/AMD-Radeon-HD-6320.54746.0...

It's the on-die GPU of that CPU, and it reaches 11 fps in Skyrim at the lowest settings at 720p.
Some videos could be misleading; on YouTube you never know whether the fps figures are accurate, and some videos may have been recorded at a much lower resolution such as 800x600 or below.
January 9, 2013 7:16:28 PM

FinneousPJ said:
More importantly what's the point...

The point is that bad advice like this kills the sale of millions of copies of hardware and software. It could also encourage someone like a student to buy a lower-priced machine without a discrete GPU, since almost any cheap machine can do schoolwork. That decision could then kill the sale of a half dozen more software titles over the next couple of years.

Exaggerated system requirements for most of these games already exclude 80% or 90% of the systems out there. Many developers are no longer covering their costs and are laying off staff by the thousands because they can't get enough volume. Is there a relationship between the exclusions and the lack of volume?

Here is the truth about Skyrim posted by HTWingNut on notebookreview.com, "Skyrim works on my AMD E-350 netbook with IGP. Runs at 800x450 low settings at about 20fps, depending on what's going on. No real extensive gameplay yet, but it runs and is playable."

The layoffs are sad considering the vast numbers of E-350 and E-450 owners who could have bought the software and been happy with it if only they knew the truth. That site has no excuse for understating performance the way they did.
djangoringo said:
There's no bias, that E-450 is not meant for gaming see here...

That is the site that shall remain nameless, and they do not seem to have a clue. A gaming machine would not have separate CPU and GPU memory; it would always be choking while moving data from one to the other. Gaming consoles put the CPU on the other side of the memory controller and use the same high-speed memory channel for both, so CPU-to-GPU memory transfers are no longer required. It is most PCs that are not meant for gaming, but they do seem to manage if they have enough power.

Bad advice has consequences that can affect us all.
January 9, 2013 7:28:05 PM

That site is one of the best for checking laptop GPUs, APUs, etc.; it has credibility.
When you say a gaming machine would not have separate CPU and GPU memory, I hope you're trolling. Have you seen any decent gaming PC that shares memory between them? The only good ones are the APUs from AMD like the A8 or A10 series (FM1 or FM2), and even those are still not good enough for gaming. When we say gaming, I mean playing at 1080p at more than 30 fps minimum, and that takes a separate graphics card.
January 9, 2013 7:34:45 PM

I don't know if you're trolling or not, but nobody said Skyrim doesn't work. They said it runs at 11 FPS at 720p, which is a lower resolution than I'd like and an FPS I consider unplayable. If some dude posts that it runs better at a lower resolution, then big surprise: it's still not enough.
January 9, 2013 7:37:51 PM

It is fair to say though that a lot of reviews are poorly done. Not the same as bias, but it can seem that way sometimes.
January 9, 2013 8:00:06 PM

Quote:
Many gameplay videos seem to indicate that Skyrim will actually run at 18 - 20 fps on that platform.


Dude. Seriously? If you think 18 frames per second at VGA resolution and 1990 graphics is 'gaming' or even fun, then how about you buy my old Atari? I don't even have a TV with analog inputs anymore...
January 9, 2013 9:18:55 PM

FALC0N said:
It is fair to say though that a lot of reviews are poorly done. Not the same as bias, but it can seem that way sometimes.

The 11.4 figure has been posted for over a year, since before it was known that most of these machines came with underclocked memory from the factory. Just because a module is rated for a 667 MHz clock does not mean it is running at that speed. Perhaps it was an honest mistake, but they should have confirmed the correct speed before publishing the article.

What surprises me is that the site has kept the same old figure even after several users warned them that the published fps value could be much lower than it should be. The value has even remained constant through several Catalyst driver revisions.

It is one thing for them to make a mistake but quite another to fail to correct it after so much time. That indicates bias of some sort.

So does anyone have a properly configured one of these machines running Skyrim to let us know what they have experienced?

By the way, with hardware scaling, 800x480 output probably doesn't look like that resolution on screen but more like a slightly fuzzy 1360x768.
January 9, 2013 11:00:30 PM

I see your point, but you're giving the websites too much credit. Some of them are good and some of them are not so good. It's easier to be lazy than to correct a mistake that they probably think is of no consequence, since you wouldn't game at either frame rate anyway. Remember, just because it's on the internet doesn't mean it's good info.
January 10, 2013 12:18:41 AM

Murray B said:

Here is the truth about Skyrim posted by HTWingNut on notebookreview.com, "Skyrim works on my AMD E-350 netbook with IGP. Runs at 800x450 low settings at about 20fps, depending on what's going on. No real extensive gameplay yet, but it runs and is playable."



You should provide a link to your source of information when possible...

Anyway, you need to realize that acceptable performance is subjective. That means you will get different opinions from people based on their preferences. I would speculate that a very large percentage of gamers would consider a PC or laptop that can provide an average of 60 FPS in games to have acceptable performance. At an average of 50 FPS a large percentage of people would still call that acceptable, but not as many as at 60 FPS. As you go down in average FPS, the percentage of gamers saying the PC or laptop provides acceptable performance goes down as well, until below a certain level the performance becomes unacceptable. The question is how low average FPS can go before you consider the performance unacceptable, and that varies from person to person. Just because you think it is acceptable doesn't mean most other people will as well.

Here's a link to HTWingNut's thread called AMD E-350 / E-450 and Intel i3-2367 Compared:

http://forum.notebookreview.com/gaming-software-graphic...

There are many game benchmarks regarding the performance (or lack thereof) of the integrated graphics in those three CPUs. His very last sentence is about Skyrim:

Quote:
Skyrim actually plays decently on both machines at 1024x600 with low detail. I wouldn't want extended gaming sessions with it, but it will allow you to get a fix now and again while on the go.




Note that the title says Skyrim Intro, which generally means the opening sequence. That is different from actual gameplay, which can be more taxing on the CPU and iGPU; for example, combat or large, crowded cities. Also note his quoted remark that while the game is playable, HTWingNut would not want to play an extended session of Skyrim on any of those CPUs' integrated GPUs, since the benchmark scores are so close.

What constitutes "extended session"? Well, again, that is a matter of opinion and varies from person to person.
January 10, 2013 12:46:14 AM

Yes, jaguar, you are exactly right. I would consider 60 FPS fine for gaming. The minimum limit for me is 60 FPS; anything less than that (i.e. 30 or 40 FPS or lower) is unacceptable.
January 10, 2013 5:01:34 AM

The industry norm is that 35+ FPS counts as playable, though most budget systems are capable of producing 40+ on single-screen HD setups. When you start talking big-league six-screen Eyefinity, 60 FPS is just a pipe dream.
January 10, 2013 5:02:29 AM

FALC0N said:
...Since you wouldn't game at either frame rate anyway...

That website has probably killed a lot of hardware and software sales because it is referenced all over. They have little credibility with me because it looks like they tested a machine with underclocked memory and the wrong video driver and then did nothing to update the score. Not only that, but I noticed they quoted graphics benchmark results for virtual GPUs as if they were comparable to actual GPUs and did not even bother to mention which GPUs were virtual. That kind of bad advice that favours one company over another does not seem to be accidental.

As far as what I would do, it is better to ask than to assume. As a matter of fact, modern frame rate standards seem insanely high to me. Twentieth-century cartoons shot each cel twice to save money, so they were shown in theatres at 24 frames per second but only 12 unique frames per second. They were shown at 12 unique fps on television as well.

Right now I play "Star Wars: The Old Republic" at 1024x600 on an E-450 at 60 fps but only 18-20 unique fps, and it's fine. I'm level 15 now and just managed to jump all the way up to the Datacron in the Black Sun Territory, which is supposed to be hard to do. The sad thing is that every site I could find indicated the E-450 could not possibly run SWTOR, but since the download was free I tried it anyway. It runs well, and any RPG that can manage more than 12 unique fps is probably going to be fast enough for me.

This might be true for other people too and I suspect there may be many others who would buy something like Skyrim if they only knew that it would run okay on the system. 60 fps would be great but 12+ may be good enough for some especially when the alternative is nothing.
January 10, 2013 5:29:44 AM

That, my friends, is the problem with PC gaming. :p
January 10, 2013 6:39:10 AM

I know for a fact 12 FPS is unplayable for me.
January 11, 2013 3:55:13 PM

FinneousPJ said:
... the E-450 isn't a gaming platform?...

djangoringo said:
...E-450 is not meant for gaming...

Anyone doing research on a laptop for school will soon find the vast majority of media sources agree that the E-450 and the later E2-1800 are not good enough for modern gaming.

In most cases, since their funds are limited, they choose a cheaper machine without a discrete GPU, one that certainly cannot play most modern games. It follows that they will not be buying any more game software until they get a better machine.
Few students have the time to dig deep in the forums for posts by people who own an E-450 and are satisfied with its gaming performance. I'm satisfied with SWTOR at 18 to 20 unique fps on my machine, but then cartoons at 12 unique fps look fluid to me too.

To help illustrate why I believe there is bias, here are some figures for some bona fide gaming platforms. These machines all have a high-speed memory channel shared by the GPU and CPU:

Xbox   NV2A    932 MP/s   1864 MT/s    64 MB  -  6.4 GB/s peak
PS3    G70    2000 MP/s   4400 MT/s   256 MB  - 20.8 GB/s peak
Wii U  Latte  2232 MP/s   4423 MT/s  1024 MB  - 12.8 GB/s peak
XB360  Xenos  4000 MP/s   8000 MT/s   512 MB  - 22.4 GB/s peak

Here is the E-450 with 4 GB of shared DDR3-1333 memory:
E-450  6320   2000 MP/s   4100 MT/s  1961 MB  - 10.6 GB/s peak


[Note that I am quoting theoretical peak memory performance, which is readily available, and not the more important average sustained memory bandwidth, which is much harder to find. Most of these figures are taken from Wikipedia, which I don't consider authoritative, but they should be good enough for a discussion like this.]
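
For anyone who wants to sanity-check the peak column, the arithmetic is just transfers per second times bus width in bytes. Here is a rough sketch of that calculation; it is a back-of-the-envelope check, not a benchmark, the bus widths are the commonly quoted 64-bit and 128-bit figures, and I have only included the two entries I could verify myself:

/* peak_bw.c - theoretical peak bandwidth = transfers/s * bus width in bytes */
#include <stdio.h>

static double peak_gbs(double mega_transfers, int bus_bits)
{
    return mega_transfers * 1e6 * (bus_bits / 8.0) / 1e9;
}

int main(void)
{
    /* E-450: single-channel DDR3-1333 on a 64-bit bus. */
    printf("E-450 DDR3-1333: %.2f GB/s\n", peak_gbs(1333.0, 64));   /* ~10.66 */
    /* Xbox 360: GDDR3 at 1400 MT/s on a 128-bit bus. */
    printf("XB360 GDDR3:     %.2f GB/s\n", peak_gbs(1400.0, 128));  /* ~22.40 */
    return 0;
}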

Given that more than 90% of the GFLOPS a modern game needs come from the GPU, the performance of the E-450 should be approaching that of the PS3. As I understand it, several million users are happily running Skyrim on that platform.

So, does anyone know how fast Skyrim runs on the PS3?
January 11, 2013 4:06:42 PM

I believe PS3 games run 30 FPS.
January 11, 2013 4:20:08 PM

FinneousPJ said:
I believe PS3 games run 30 FPS.


Thanks, but I expect they are capped at 30 and could run slower. Unless the game has a built-in fps counter, it would probably be necessary to capture the video to an external device and count the unique frames that way.
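
For what it's worth, the counting step itself is simple once the capture device gives you something like a checksum for every captured frame. The sketch below uses simulated data standing in for those checksums; it is only meant to illustrate the idea, not a real capture pipeline:

/* count_unique.c - count unique rendered frames in one second of 60 Hz capture */
#include <stdio.h>

#define CAPTURE_HZ 60

int main(void)
{
    /* Simulated data: a game rendering ~20 fps captured at 60 Hz, so each
       rendered frame appears in three consecutive captured frames.  In a
       real run these values would be per-frame checksums from the capture. */
    unsigned frame_id[CAPTURE_HZ];
    for (int i = 0; i < CAPTURE_HZ; i++)
        frame_id[i] = i / 3;

    /* A captured frame that differs from the previous one is a new
       rendered frame, so counting the changes gives unique fps. */
    int unique = 1;
    for (int i = 1; i < CAPTURE_HZ; i++)
        if (frame_id[i] != frame_id[i - 1])
            unique++;

    printf("unique frames per second: %d\n", unique);   /* prints 20 */
    return 0;
}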

Best solution

March 26, 2013 7:50:20 PM

Murray B said:
My favorite brand was Cyrix but they have not been available for years.

Not long ago I came across a famous site, which shall remain nameless, that stated the little AMD E-450 could only manage 11 fps while running Skyrim. Many gameplay videos seem to indicate that Skyrim will actually run at 18-20 fps on that platform. That is a significant difference.

It is true that many E-450s were shipped from the factory with memory set below its rated speed, but the "experts" on that site should have known this and corrected the problem before testing.

Is there some sort of bias at work here?


Of course, there is a lot of bias against AMD! I have seen reviews from popular sites that compare the power consumption of AMD's E-350 with Intel Atoms plus Nvidia Ion but forget to mention that AMD's TDP includes the GPU, whereas the Atom's TDP does not include the Ion chip's consumption. I have seen sites claim that the E-450 has half the cache of the E-350 (which is false) and report poorer performance for the E-450 than for the E-350 (which is ridiculous) when the E-450 is compared against Intel chips such as Atoms and Celerons.

One of the most impressive cases of bias is found in a review of the AMD Trinity A10-5800K made by a famous site. For instance, the reviewers give SYSmark 2012 results where the A10 performs worse than an Intel i3, except that they forgot to mention that SYSmark 2012 is a biased suite that artificially inflates the performance of Intel chips:

http://news.softpedia.com/news/AMD-Nvidia-and-VIA-Quit-...

(Note: the news was published months before the review.)

The reviewers used version 12.8 of the AMD Catalyst driver when a better driver was available. But the more interesting part is that they used 1600 MHz memory for all the chips, when the memory bus of the A10-5800K is rated for 1866 MHz (vs. only 1600 MHz for the Intel chips), and as several sites have noticed, the AMD chip benefits a lot from faster memory.

And even after all this bias against the AMD chip, it still competed well and surprised the reviewers: "AMD does surprisingly well here in SYSMark 2012. The Core i3 3220 manages a 12% advantage over the 5800K, but that's not as much as we'd normally expect given the significant single threaded performance deficit we pointed out earlier."

Of course, if you use unbiased benchmarks, up-to-date drivers (recent AMD drivers can give up to 23% more FPS on the same hardware than older drivers), and the correct memory modules, then the AMD chip will be faster... but lots of ignorant people and fanboys still believe that their more expensive i3 or i5 beats the cheaper A10.
March 29, 2013 2:28:38 AM

juanrga said:
Of course, there is a lot of bias against AMD!...

Thank you, Juanrga, for confirming the bias is not something I have imagined.

Benchmarks have always been a problem and it gets worse when companies pay to have the results improved for their products.

Most video benchmarks give inaccurate results when it comes to comparing integrated parts. I have yet to see a benchmark that mentions the increased CPU loading of Intel's EUs compared to the more independent cores of other graphics coprocessors. In at least one case, when running an HD video task, the EUs loaded the CPU to 25% compared to 7% for an Nvidia card in the same class. The benchmark scores were better in the first instance, but the actual real-world performance was better in the second. The benchmarks were almost useless for this comparison because they test the CPU and GPU separately, whereas many real-world tasks load both at the same time.

There is still a lot of bias, but maybe enough posts on enough forums will help to reduce the problem.

March 29, 2013 6:51:30 AM

Murray B said:
Given that more than 90% of the GFLOPS a modern game needs come from the GPU, the performance of the E-450 should be approaching that of the PS3. As I understand it, several million users are happily running Skyrim on that platform.

So, does anyone know how fast Skyrim runs on the PS3?


They are coded totally differently. PS3 games are coded at a VERY low level; that's how you get decent FPS out of a GPU derived from the old GeForce 7800 (the RSX). Never mind that you have a totally different CPU architecture (the PowerPC-based Cell versus x86), a different memory architecture, and a different OS running in the background.
March 29, 2013 1:16:17 PM

Murray B said:

Thank you, Juanrga, for confirming the bias is not something I have imagined.


Thank you!

Yes, the existence of biased benchmarks may go unnoticed by the general public, but it has been well known in the industry for years:

http://semiaccurate.com/2011/06/20/nvidia-amd-and-via-q...

Moreover, the general public is probably also unaware that the Intel compiler cheats when it detects a non-Intel CPU, generating the slowest possible version of the code even if the non-Intel CPU is fully compatible with a better version! It has been shown that this dishonest practice gives Intel chips up to 47% more performance in some popular benchmarks:

http://www.osnews.com/story/22683/Intel_Forced_to_Remov...

This is still happening today; when you use other compilers you can see a big boost in the performance of AMD chips. Using GCC you can see an AMD FX-8350 beat an Intel i7-3770K in several tests.
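
To show the kind of check we are talking about, here is an illustrative sketch of vendor-string dispatch; this is not Intel's actual dispatcher (which is not public), and the function names are made up for the example:

/* dispatch_sketch.c - vendor-string dispatch, illustrative only.
   Build on x86 with GCC or Clang: gcc -O2 dispatch_sketch.c */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static void vectorized_path(void) { puts("fast vectorized code path"); }
static void baseline_path(void)   { puts("generic baseline code path"); }

int main(void)
{
    unsigned eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* CPUID leaf 0 returns the 12-byte vendor string in EBX, EDX, ECX. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* Vendor-based dispatch: a non-Intel CPU gets the baseline path
       even if it supports exactly the same SSE/AVX features. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        vectorized_path();
    else
        baseline_path();

    /* A fair dispatcher would instead test the feature bits from
       CPUID leaf 1 (e.g. EDX bit 25 for SSE), regardless of vendor. */
    return 0;
}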

Murray B said:

Benchmarks have always been a problem and it gets worse when companies pay to have the results improved for their products.

Most video benchmarks give inaccurate results when it comes to comparing integrated parts. I have yet to see a benchmark that mentions the increased CPU loading of Intel's EUs compared to the more independent cores of other graphics coprocessors. In at least one case, when running an HD video task, the EUs loaded the CPU to 25% compared to 7% for an Nvidia card in the same class. The benchmark scores were better in the first instance, but the actual real-world performance was better in the second. The benchmarks were almost useless for this comparison because they test the CPU and GPU separately, whereas many real-world tasks load both at the same time.

There is still a lot of bias, but maybe enough posts on enough forums will help to reduce the problem.


One possible solution is the use of open-source benchmarks. Since the code is open, anyone can check whether or not the benchmark is cheating.
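
As a toy illustration of what I mean (a sketch only, not any published benchmark suite), even a few lines of timed code make it obvious exactly what is being measured, and anyone can read, audit, and recompile it:

/* copy_bw.c - toy open benchmark: sustained memory-copy bandwidth.
   Build on Linux: gcc -O2 copy_bw.c -o copy_bw */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define MB        (1024UL * 1024UL)
#define BUF_BYTES (128UL * MB)
#define PASSES    10

int main(void)
{
    char *src = malloc(BUF_BYTES);
    char *dst = malloc(BUF_BYTES);
    if (!src || !dst)
        return 1;
    memset(src, 1, BUF_BYTES);   /* touch the pages before timing */
    memset(dst, 0, BUF_BYTES);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < PASSES; i++)
        memcpy(dst, src, BUF_BYTES);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    /* Each pass reads and writes the whole buffer, so count the bytes twice. */
    double gbps = 2.0 * PASSES * (double)BUF_BYTES / secs / 1e9;
    printf("sustained copy bandwidth: %.1f GB/s\n", gbps);
    printf("check byte: %d\n", dst[0]);   /* keep the copies from being optimized away */

    free(src);
    free(dst);
    return 0;
}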