Intel Chairman Says Company Had Lost Its Way

Status
Not open for further replies.
They're still making CPUs, so they haven't lost their way. I think Intel tried to bite off more than it could chew. IMO the issue wasn't the size of their mouth, but the timing of when they started chewing.

Oh well, I don't think they'll have any issues putting their house in order and going after the markets they want.

Cheers!
 

stevejnb

Honorable
May 6, 2013
Eh, in the short term the numbers agree with him, but... tablets and smartphones are moving past their infancy and reaching a point where they need more power and capability. Intel is proving it can be competitive on size and efficiency, but do ARM processors have it in them to compete on capability while still offering small sizes and high efficiency? I have yet to see evidence of that.

The way I see it, Intel is doing so-so in the mobile market right now, but as demand for higher-powered tablets and phones comes around, Intel is the best show in town, and this will be reflected in the CPU choices of various companies in the near future.

There is always the looming cloud revolution, which could drive hardware requirements to rock bottom and make size and battery life pretty much everything, but I think that is far enough off that we'll see Intel's powerful yet small and efficient CPUs pick up big time.
 

NightLight

Distinguished
Dec 7, 2004
I still prefer Intel over AMD, just because I've had so much bad luck with AMD hardware. Fact is, they have the best processors out there at the moment; however, that level of performance is being held down artificially by the lack of competition... Time for a new, mind-blowing age of computing.
 


You mean the compilers that aren't even used by the majority (90%) of software? The same compilers that are designed to work best with Intel's hardware, much like CUDA/PhysX with NVidia or Mantle/TressFX with AMD?

If Intel does that then NVidia and AMD should open up their proprietary software/hardware as well, no?

Guess what: it won't happen, because that's how companies one-up each other. Ford has Microsoft SYNC. Other companies have their own equivalents, but SYNC has advantages.

Without them everyone would have the same hardware and there would be no reason to pick one over the other.
 

MajinCry

Distinguished
Dec 8, 2011

I doubt that figure. Would you please provide evidence for it?



Read this.



Quite fallacious.

That comparison doesn't even work. Those are technologies tailored to each card. If you're going to use THAT method, you'd best compare an API, such as DirectX. And that doesn't hold water either, since NVidia (for example) can't do an "if != NVidia then Cripple()" check at runtime.



So if Intel removes the "Cripple AMD()" function from its compilers, NVidia and AMD should make their software/hardware open source?

Ladies and gentlemen, I quote you a non sequitur!



Another completely, and utterly, fallacious comparison.

A more accurate one would be: Toyota has gained 80% of the oil market share and implemented a way to check whether your engine is manufactured by Toyota. If your engine is not Toyota-certified, it will use twice as much oil as a Toyota engine. However, if you fool the check into thinking you have a Toyota engine, it uses just as much as a Toyota engine, with no problems at all.

If Toyota were to do such a thing, you'd probably support it too. :pfff:



Without "them"? You mean different hardware manufacturers, or anti-competitive practices?
 

iamtheking123

Distinguished
Sep 2, 2010
The whole "Intel-optimized compiler" thing is a joke. It's a product written BY INTEL. If they want to make it not work at all with AMD CPUs, it's their right to do so. You don't like it? Use a generic compiler.
 

ddpruitt

Honorable
Jun 4, 2012


The problem is that the Intel compiler is often used to compile benchmark apps; see the problem? On top of that, the Intel compiler is known to use undocumented features to provide better performance than any non-Intel compiler can, so it tends to be used fairly often. The biggest issue is that, given the same feature set, Intel put effort into deliberately making the optimized code path depend on which chip it runs on. For example, code that would optimize well to a particular SSE instruction would only be optimized that way on Intel chips; on non-Intel chips the slowest instruction would be chosen. This is a no-no for a number of reasons. The problem is large enough that benchmark results favoring Intel chips can't be trusted anymore. See:

http://www.theregister.co.uk/2013/07/12/intel_atom_didnt_beat_arm
 

SteelCity1981

Distinguished
Sep 16, 2010
They got on the smartphone and tablet bandwagon way too late. For years they kept their snobbish attitude towards ARM CPUs, thinking they weren't any sort of threat, when they were looking at ARM completely backwards. ARM was never a desktop or laptop CPU threat; where it was a threat was in ultra-portable devices. Now ARM is in almost every tablet and smartphone, not to mention the highly popular 3DS. AMD took the "if you can't beat them, join them" approach, which was smart: they have helped bring some of their own instructions, like their 64-bit extensions, to ARM's architecture, and because of this AMD and ARM are now implementing each other's technologies.
 

JOSHSKORN

Distinguished
Oct 26, 2009
Not only did they get into the smartphone CPU game too late, but I'm also not impressed that 8-core desktop CPUs aren't mainstream yet. Yeah, I get it: many apps still only use one core. That has to change as well.
 

ZolaIII

Distinguished
Sep 26, 2013
Design wins? Data center and cloud servers? With what? Even if the new generation of Atoms could deliver better performance per watt, they would still have a 10x silicon footprint over the ARM competition, making chips of similar performance at least 3x the price.
Now consider this: if Intel can achieve better efficiency with an x86 core 6-10x the size, what could they produce with ARM's A53 or MIPS proAptiv design similarly optimized? The future of computing is most definitely based on many-core designs, and the whole push for x86 is just holding us back. So, OK: you have the best manufacturing process, the biggest manufacturing and production facilities, the best engineering team and a really good team of programmers, and a huge reserve of financial resources. And what do you do? You let others design chips and sell them, while all of your huge advantages go to waste. And for what? A little pride and a cheap license? Well, Intel, swallow your pride and be a man! If I had all of that, not only would I produce the most effective commercially available architecture on my best manufacturing process, but also home and industrial goods (controllers) on my older processes, on whatever architecture is needed and adopted as standard (MIPS-based), under my own brand and name rather than somebody else's. Really, if I were that almighty, I would even go beyond that and try to create a completely new architecture (a future, more optimized instruction set based on the best one known).
 

wdmfiber

Honorable
Dec 7, 2012

There is no conspiracy; the Bulldozer architecture was terrible and almost destroyed AMD. Steamroller isn't coming to the FX line, only the APUs. The high-end Radeon GPUs are going to need Intel CPUs: Haswell today and Broadwell in the future.
http://www.pcgamer.com/2013/11/21/amd-sadness-steamroller-wont-come-to-fx-cpus-in-2014/
They should badge the next-gen (20nm) GPUs R9-3XXs ... ATI Radeon.
 

soldier44

Honorable
May 30, 2013
It's starting to seem like they don't care about desktop CPUs anymore, leaving those of us who still build our own and upgrade every 2 years in the dust. I'm still using my 2600K from 2 years ago, since the 4700 series just wasn't worth the upgrade to save a few degrees of temperature...
 

ptmmac

Distinguished
Dec 3, 2011
I do not believe Intel is ready to compete with $25-30 processors. The death of Dennard scaling is what is killing Intel. If leading by one node = a 50% faster processor, then you can command more money! If it means 10% faster, then why spend more money? Saving power helps sell a product, but it doesn't raise the price. Intel needs a material that can scale to THz, not GHz. Silicon hits a wall at 3 GHz and simply becomes a better heater at any higher frequency. The reason mobile chips have caught up is that they run at lower frequencies to save power. Now that ARM designs are hitting 2 GHz at the high end, we will see less obvious gains, except where less legacy code frees ARM to innovate. Apple will continue to lead because they optimize software and hardware together. The platform control they have kept will let them bring innovations that other players will struggle to match. When Apple increases their core count, the software support to rewrite a code base will be there. If specialized silicon can speed up a key function, it will be added to the system. Intel et al. cannot plan this out 3 or 4 years in advance because they don't know if they can get the software support. YMMV
 

legokangpalla

Honorable
Feb 28, 2013
I'm of two minds here. One half wants to take x86 hostage and yell, "Don't move! I'm going to shoot him!" The other half wants to jump between the gunpoint and the ISA.
 

legokangpalla

Honorable
Feb 28, 2013
I mean, personal workstations and servers are saturated with Intel processors, but commercial racks and high-scaling computing are mostly done by others. I'm scratching my head here. Do Intel processors have shit for scalability? Or is it just cost/power usage?
 

tobalaz

Honorable
Jun 26, 2012
Funny how AMD wrote the x86-64 instruction set and Intel performs better than AMD now.
Intel will do anything to keep AMD down.
AMD cutting the cache on their chips didn't help either. It was all downhill after Socket 939.
 

jurassic512

Distinguished
Jun 5, 2010
"It's starting to seem like they don't care about desktop CPUs anymore, leaving those of us who still build our own and upgrade every 2 years in the dust. I'm still using my 2600K from 2 years ago, since the 4700 series just wasn't worth the upgrade to save a few degrees of temperature..."

Intel is more than just desktop CPUs. Give them time.
 

falchard

Distinguished
Jun 13, 2008
I've actually stopped bothering with the x86 compiler. It's not useful anymore as we move to 64-bit. I don't know many companies moving forward with 32-bit only, outside of mobile.
Right now the best tablet chip is an AMD. It's like an Athlon versus a Pentium, not a Phenom versus a Core. Realistically, there will be more Windows tablets in the future, outpacing Android development. The thing to remember is that Windows tablets are tools; Android tablets are toys. As the Windows tablet becomes more widely adopted, it will also gain the functionality of a toy. On the other hand, Android's backbone isn't good enough for me to want to use it for productivity. I don't think that will change until there is more C++-focused development on Android instead of Java.
 

kartu

Distinguished
Mar 3, 2009
What about... margins, dear Intel?
What about the fact that the total income of all ARM CPU manufacturers is laughable by your books?
 