jennyh

Splendid
http://www.theinquirer.net/inquirer/news/1558372/intel-caught-dodgy-gpu-drivers

http://techreport.com/articles.x/17732/1

Intel prove that graphically they are years behind ATI and Nvidia by cheating at 3DMark. Both of those companies stopped doing it six years ago...funnily enough, that's how far behind intel's graphics are today.

When you have nothing, intel believe it's better to cheat and lie instead of throwing some of their $billions at fixing the problem properly. Remember that next time you are thinking about buying anything with intel inside.
 
Intel's response:

We have engineered intelligence into our 4 series graphics driver such that when a workload saturates the graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMark Vantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hardware available in the system. Our driver is currently in the certification process with Futuremark and we fully expect it will pass their certification as did our previous DX9 drivers.

But why can't the driver do this regardless of what's being run on it? Why can't the driver recognize, regardless of the executable's name, that certain tasks can be offloaded to the CPU?
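To make the distinction concrete, here is a purely hypothetical C sketch (nobody outside Intel has seen the real driver code; the function names and the game list are made up) of the difference between keying the offload off the executable's name and keying it off the measured workload:

/* Hypothetical illustration only - not Intel's driver code. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* What the TechReport rename test points to: a whitelist of .exe names. */
static bool offload_by_app_name(const char *exe_name)
{
    static const char *whitelist[] = {            /* illustrative list */
        "3DMarkVantage.exe", "Crysis.exe", "CoJ_DX10.exe"
    };
    for (size_t i = 0; i < sizeof whitelist / sizeof whitelist[0]; i++)
        if (strcmp(exe_name, whitelist[i]) == 0)
            return true;      /* rename the .exe and this quietly returns false */
    return false;
}

/* What posters here are asking for: decide from the load itself, so any
 * game that saturates the graphics engine gets the same help. */
static bool offload_by_workload(float gpu_load, float cpu_idle)
{
    return gpu_load > 0.95f && cpu_idle > 0.40f;
}

int main(void)
{
    printf("3DMarkVantage.exe -> offload? %d\n", offload_by_app_name("3DMarkVantage.exe"));
    printf("3DMarkVintage.exe -> offload? %d\n", offload_by_app_name("3DMarkVintage.exe"));
    printf("saturated GPU, idle CPU -> offload? %d\n", offload_by_workload(0.99f, 0.70f));
    return 0;
}

Only the second kind of check would survive TechReport's rename test, which is exactly the point jdaven makes in the quote below.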

I agree with TechReport's conclusion that Intel's graphics driver isn't compliant with 3DMark's guidelines for approval. My huge beef with this (which may surprise people who think I'm an Intel fanboy) is that it does NOT translate into real-world performance, it just increases benchmark scores.
 

jennyh

Splendid
A good response on the TechReport from jdaven:

Now you might say, well, the customer can just look at real gaming benchmarks. Intel thought of this too and made the optimization work for the other major games benchmarked by review sites - not for every game. Ask yourself this: why can't Intel offload work to the CPU in every game and let the user check this in the driver control panel? They are only offloading in the most-benchmarked games. Also, why can you just change the .exe name and lose the optimization? That doesn't make sense.
 

I read his/her response as well and I thought it was very level-headed. He also addressed the optimizations targeting major games such as Crysis. He noted that while the gain might be real in Crysis, it still isn't kosher because, after all, Crysis is probably being targeted in part because it's used as a benchmark.

What happens when I load up a less popular game such as SimCity 4, and all of a sudden the GPU can't handle it as well?




Oh noes... I just opened up a thread... made by JennyH... that was called "Intel is Cheating...".... and agreed with her...

SERENITY NOW!
 

werxen

Distinguished
a company cheating at something to seem more profitable?!?!? NOOO!!! WTFF!! NEVER BEEN DONE IN THE HISTORY OF HISTORY!!! AMD WOULD NEVER DO SOMETHING LIKE THAT BECAUSE THEY ARE AMD AND THEY ARE ALWAYS THE UNDERDOG!

on a serious note, if this surprises you then you need to get out of your cave.
 

jennyh

Splendid
It's not the fact that intel are cheating again - believe me, nobody is less surprised by that than I am - it's that they do this sort of thing and don't expect to get caught, which is quite disturbing.
 


Is.

Intel IS cheating.




But that aside, it does call the legitimacy of their product reviews into question, which is why Intel should NOT do this kind of stuff. Make the better product and let it shine. It worked with Core 2. In this case (IGPs), AMD does have the better product and Intel is trying to hide that fact, and is doing so rather blatantly.

Is this illegal? No. But it does attack the foundation of something I love: the ability to benchmark and compare computer hardware objectively and consistently.
 

keithlm

Distinguished
Interesting how some people agree that doing this with GPU optimizations is a bad thing and is even "cheating".

But then they maintain that doing the same with CPU optimizations is acceptable.

And this is not "biased".




Just like "TC are again ranting about the word is."
 



Interesting how, when someone finds common ground with a group of people he usually disagrees with, he's attacked for not blindly agreeing with all the other points that group has tried to make in the past.
 
Well ...

If yah ain't cheating, yah ain't trying ;)


I hope this isn't a return to the bad ol' days when you couldn't trust any bench. What comes to mind is the old Diamond cheat where they would drop pixels to appear faster on a screen 'refresh'.




 

keithlm

Distinguished


They are both wrong or they are both not wrong.

 


You're not comparing apples to apples. The situations are different.


Show me some solid evidence. Here, I'll start you off:

http://arstechnica.com/hardware/reviews/2008/07/atom-nano-review.ars/6


However, if the above link implicates Intel in "cheating", it also implicates AMD, so I'm sure you'll either reject it or somehow say that AMD is innocent and Intel isn't.
 


But their Windows OSes don't come with browsers, how could they ever get onto the internet? (<--lame joke)


I also found that saying "Intel are cheating/evil/monopoly/bk/copying/evil/very evil" is common among AMD fanboys.
 

keithlm

Distinguished


Nothing new here.

We already knew that using the CPUID vendor string to decide which optimization pathways to allow is not entirely honest. What you posted only confirms that even more. (Thanks for the support!)
 


So you're saying that AMD ARE CHEATING!?!?!?!?!?

OMG!!!
 

keithlm

Distinguished


Your attempt at sarcasm doesn't diminish that truth.

The brand should not be used to decide which optimizations to apply when you can ask the hardware directly which optimizations it supports. This is true of video cards as well as processors.
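For what it's worth, here is a rough C sketch of that difference on the CPU side (using GCC's <cpuid.h> helper; this is only an illustration, not code from PCMark or any driver):

#include <cpuid.h>    /* GCC/Clang wrapper around the x86 CPUID instruction */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* The dubious way: branch on the vendor string ("GenuineIntel",
 * "AuthenticAMD", "CentaurHauls", ...). */
static bool vendor_is_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return false;
    memcpy(vendor + 0, &ebx, 4);   /* vendor string is packed into EBX/EDX/ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

/* The honest way: ask the hardware which features it actually has
 * (SSE2 is bit 26 of EDX in CPUID leaf 1) and pick code paths from that. */
static bool cpu_has_sse2(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return false;
    return (edx & (1u << 26)) != 0;
}

int main(void)
{
    printf("vendor says Intel: %d, SSE2 supported: %d\n",
           vendor_is_intel(), cpu_has_sse2());
    return 0;
}

Spoof the vendor string (as Ars did with the Nano) and the first check lies to you; the second one keeps giving the right answer.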


 


It wasn't sarcasm, it was mockery!

But in THIS CASE, where PCMark 2005 changes results based on CPUID, I agree with you. However, it seems to be more of an issue with the benchmark than with AMD or Intel (from my somewhat limited understanding of the situation).

However, the original topic is 100% pure Intel's fault.
 

keithlm

Distinguished


Oh... I see. TC are mocking peoples. (AMD is more popular than Intel in many foreign countries, so it is probable that you are seeing posts from people who don't use English as their native language. Either way, we know it is one of your pet peeves.)

And the article you keep linking claims that Futuremark might have questionable code quality and created three "paths" for code optimization. What they don't hypothesize is the much more likely possibility that the optimization paths were not explicitly hand-coded but were left to the compiler to generate.
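To be concrete about what "left to the compiler" can look like (a hypothetical C sketch using GCC's runtime-dispatch built-ins; nobody here knows which compiler Futuremark actually used):

#include <stdio.h>

/* A compiler's auto-dispatched code does checks like these at run time. */
void run_workload(void)
{
    __builtin_cpu_init();                       /* populate CPU feature data */

    if (__builtin_cpu_supports("sse4.2"))       /* feature-based: vendor-neutral */
        printf("taking the SSE4.2 path\n");
    else if (__builtin_cpu_supports("sse2"))
        printf("taking the SSE2 path\n");
    else
        printf("taking the plain C path\n");

    /* By contrast, code (hand-written or compiler-generated) that branches on
     * something like __builtin_cpu_is("intel") is doing the vendor check that
     * the Ars Nano article exposed. */
}

int main(void) { run_workload(); return 0; }

Three feature-based paths like these would be unremarkable; the ethics question only shows up if the path selection keys off the vendor rather than the features.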

Either way, the practice raises ethical questions when it is done, or relied upon, by a company that claims to offer professional benchmarking software for unbiased comparisons.
 


TC, give up. It's keith. He's not exactly open-minded when it comes to this, as he's shown in the past, and with him it's pretty much "AMD is perfect, Intel is evil."

Would you expect anything else from shady Intel?
How many lawsuits are out there against Intel? Can you even count them? How many against AMD? Right.

Their motto is: steal the technology from someone else, push it into the mainstream with all their engineering resources, make billions, and pay a measly fine.
Works for them.

The smaller the company, the easier it is to hide cheating/corruption or for people to look past their mistakes. Look how many people forgot about Hector and his ability to sit on K8 while Intel created Core 2 to take back the performance crown.

BTW, you realize that AMD bought DEC Alpha technology for their IMC, and that IBM assisted them with x86-64 and the HTT, right? Those weren't exactly their own ideas. But that's the way the market works. Most people have an idea and sell it to a larger company to retire nicely. And DEC Alpha was not the first with an IMC. Many other companies before them had made one; even Intel did, and there were others before Intel as well. It's just that since those companies don't exist anymore, they get no credit and nobody remembers them.

Either way, it's a possibility that the drivers themselves are being coded specifically with LRB in mind. It's kinda funny how people say Intel is cheating when an LRB GPU gives a better performance boost with an Intel CPU, yet when AMD's platform is involved and CrossFire gives better performance with an AMD CPU and chipset, people don't get up in arms. Or did people forget about that?

I don't see the problem with it, since a company's goal is to sell its products; that's why AMD would also make those optimizations for ATI GPUs paired with AMD CPUs.

Also, most of the benchmark games are the ones that are heavy on pixel and vertex shaders, which is why they are used to benchmark new CPUs and GPUs: they stress both to the limit. Crysis is very GPU-bound and Left 4 Dead is very CPU-bound. Both games use pixel and vertex shaders heavily, but one leans more on the CPU and the other more on the GPU. If an older game doesn't stress anything that much (like TC's SimCity 4), then the optimizations won't be needed. But future games will need them.

This is much like how newer drivers tend to optimize performance only for the newest set of GPUs (and maybe one generation behind) and for the newest and most popular titles. When was the last time you saw your drivers give an across-the-board performance boost, unless it was for a just-released card? I see it all the time for the newest GPU from ATI, but that's life. In fact, the HD2900 I used to have stopped getting performance improvements about 10 driver releases ago. Kinda sad, but meh.

So in the end, optimizing for your own hardware is not a big deal. Everyone does it. I am sure that nVidia does it, as does AMD. We just don't hear about it as much since, well, Intel is evil and all.

Also here is the thread I read about AMD optimizing their GPUs for their platform for Crossfire setups:

http://www.tomshardware.com/forum/265155-28-multi-card-setup-myth

There are starting to appear hints that the upcoming new drivers from ATI will benefit AMD CPUs in CrossFire much more than the i7s.
Now, this could be a platform thing. Or it could be a graphics card wall, and i7 won't benefit like the P2 will, which may have more cards hitting the GPU limitations.
Look for the Cat 9.8s; some amazing claims are being leaked, up to 50% improvements on AMD-based hardware, less with Intel.

As I said, don't get pissed at one company and not the other.
 
This wouldn't even be needed except for the poor IGPs Intel has.
Let's put it this way: they aren't any "better" than anyone else.
The promised increase isn't happening. Their IGPs were supposed to be competitive by now, and it's not happening.
This could be a bad omen for the future, and it doesn't do anyone any good but Intel. If that's OK with people, then I guess every company should do this type of thing, and we'll never have anything really decent, because we can just alter things, make claims, and only go halfway there.
There are a lot of games out there that could make use of this, again because of their poor IGPs; otherwise it wouldn't be needed at all.
Now, if ATI or nVidia only made decent drivers for certain games, that'd be OK too, right?
 

randomizer

Champion
Moderator
I still don't think this is cheating, because the consumer is losing nothing. They are either gaining performance or not gaining performance, but they are not losing performance or losing IQ (image quality). Graphics driver optimisations sometimes result in a loss of IQ in some way. As I said in the other thread where this was brought up: Misleading? Yes. Cheating? I don't think so.
 

jennyh

Splendid
It is cheating, Random, and I'll tell you why. For an intel IGP to score higher in Vantage or any benchmark than any ATI IGP is a joke. There is a reason the Vantage executable shouldn't be targeted for optimisations, and that reason is this: what's the point of the benchmark if nvidia and ati start doing the same thing?

Is it ok for intel to break a very clear rule that futuremark made to prevent exactly this? It is supposed to be an unbiased benchmark, yet some intel IGPs are scoring higher than ATI's, even though in real games the intel parts aren't even half as good.

You say the consumer is losing nothing, but what if the consumer is multitasking and losing cpu power while running a game? And what if that same user had seen the multitasking performance of this intel part and made their purchasing decision based on that, only to find that work was being offloaded to the cpu while playing *some* games - and not the ones they saw being benched?

Yes, it's cheating.

"Cheating is an act of lying, deception, fraud, trickery, imposture, or imposition. Cheating characteristically is employed to create an unfair advantage, usually in one's own interest, and often at the expense of others, [1] Cheating implies the breaking of rules."

If futuremark set rules for their software, and intel break those rules, they are cheating.