Is AMD trying to go bankrupt?

squareenixx

Distinguished
Feb 6, 2011
137
0
18,690
Well, AMD's release of the FX series was a total failure; the Phenom II X6 1100T performs better than most of them, and so do even some of the Phenom II quads.

So they released a CPU that basically has 8 cores (I know, modules).

So Intel beat the crap out of the FX series in most software and all games. The main reason is that hardly anything supports 8 cores. Maybe a couple of 3D programs and a couple of video editors, but that's about it.

My CPU is a Phenom II X4 965 @ 4GHz, and there is no real upgrade option for me.

Then I heard that AMD is releasing a revised CPU sometime this year called Piledriver.

Well, I looked it up, and this model now has 10 cores. I mean, is AMD stupid? Who is going to be able to use a 10-core CPU? It seems the last trick AMD has left is just adding more cores. Pointless, and it needs more power. So what's next after that, a 12-core?

I see AMD going bankrupt pretty soon unless they make something halfway decent.

Looks like I will have to go with an Intel i7-2600K for an upgrade, since AMD is only adding more cores, which doesn't help anyone.
 
Time will tell if AMD has gotten the future plan right with more cores over speed. This will also require Windows 8 and later versions to use the cores correctly, and apps to support them. After all, there is no reason why a 12-core CPU running at a max of 2GHz cannot play the most demanding game, if that game has been written correctly to offload sections of its code across cores. It makes more sense to use 4 cores at 2GHz than 2 cores at 4GHz; I bet you would get a smoother gaming experience.
 
OP: Ivy Bridge will be out in a few months, and PD this summer or fall, depending on which rumors you read.

I thought AMD decided not to go with the 10-core version of PD, bringing forward the Trinity version with the improved GPU instead.
 
After all, there is no reason why a 12-core CPU running at a max of 2GHz cannot play the most demanding game, if that game has been written correctly to offload sections of its code across cores

But again, that's near impossible to do. What you are asking for is 12 separate parallel tasks happening at the same time, none dependent on any other task, avoiding I/O locking, and a ton of other under-the-hood problems.

Games simply will not scale well, ever. The easiest thing to parallelize was rasterization, and we already offloaded that to the GPU. Aside from that, the majority of leftover tasks are either easy to accomplish [sound, user input] or have a deadlock with some other part of the program [AI and physics are at least partially dependent on the outcome of the rendering equation], which limits how easily the software scales.

More cores help with running more applications at the same time. As most people only have one heavy application going at a time, there is a point where more cores don't add any extra benefit. BD is more optimal as a server architecture, which is what AMD has admitted BD is.
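To put rough numbers on that, here is a minimal sketch of the standard Amdahl's-law argument (the 30% serial fraction is a made-up assumption, not a measured figure): effective throughput is per-core speed times the parallel speedup.

```python
# Amdahl's-law sketch: effective throughput for a workload with a
# serial fraction s, on n cores clocked at f GHz (illustrative only).
def throughput(n_cores: int, clock_ghz: float, serial_fraction: float) -> float:
    speedup = 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)
    return clock_ghz * speedup  # per-core speed times parallel speedup

# Assume 30% of a game's frame work is inherently serial:
s = 0.30
print(throughput(2, 4.0, s))    # 2 cores @ 4 GHz  -> ~6.15
print(throughput(4, 2.0, s))    # 4 cores @ 2 GHz  -> ~4.21
print(throughput(12, 2.0, s))   # 12 cores @ 2 GHz -> ~5.58
```

Under that assumption, 2 cores at 4GHz beat even the 12-core at 2GHz; only a game with a tiny serial fraction would favor the slow, wide chip.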
 

Vettedude

Distinguished
Apr 10, 2009
661
0
19,060
AI and physics (which, with PhysX, can be offloaded) and of course rendering are the biggest CPU time eaters in games. Sure, you can make the game run 30 threads, but most of the time a lot of the cores would sit idle, since many of the tasks games process on the CPU are very light (sound, input). I think we'll hit a wall for a while at 8 cores. I mean 8 true cores, not "Bulldozer module" garbage.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
The number one problem with the BD release was the marketing stupidity (hence why they got fired) of bragging about how great it was going to be and how much faster than Intel it would be, before they even had a chip.

If you looked beyond the marketing idiots, you'd find articles like this:

For the record, AMD says that a single Bulldozer module has around 80 per cent of the performance of two conventional CPU cores. In other words, a four-module Bulldozer core should be at least as quick as a six-core processor.
But which six-core processor?
http://www.techradar.com/news/computing-components/processors/can-amd-s-eight-core-bulldozer-crush-intel--921019

So 80% × 8 cores = 640%, while a 6-core CPU = 600%.

Of course BD is going to be close to Thuban; their ENGINEERS said so, not the idiots in marketing who got fired.

How does theoretical scaling relate to the real world? Interesting find:

http://atenra.blog.com/2012/02/01/amd%E2%80%99s-bulldozer-cmt-scaling/

POV-Ray:
1100T = 639% (interesting that it's over 100% per core)
FX = 760%

Cinebench:
1100T = 621%
FX = 761%

7-Zip:
1100T = 541%
FX = 639%

x264:
1100T = 576%
FX = 599%


Pretty close to what the ENGINEERS stated.

What's interesting to note is how 4 threads on 4 modules and 2-by-2 (unshared BD resources) scale over 100% per core (435% for POV-Ray and 455% for Cinebench), suggesting that single-thread tests are not that accurate on actual IPC.
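For what it's worth, here is a quick sketch (using the totals quoted above; the helper name is mine, not from the article) of how those scaling percentages reduce to per-core efficiency:

```python
# Scaling % = (all-cores score / single-thread score) * 100, as in the link.
# Dividing by the core count gives per-core efficiency; >100% is superlinear.
def per_core_efficiency(total_scaling_pct: float, cores: int) -> float:
    return total_scaling_pct / cores

print(per_core_efficiency(761, 8))  # FX Cinebench: ~95% per thread
print(per_core_efficiency(639, 6))  # 1100T POV-Ray: ~107% per core
```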
 

majorgibly

Distinguished
Jan 17, 2011
928
0
19,010
It was a punt at a new architecture. I would not say they failed, but I wish they would have tested their CPUs before they started building this hype. A 12-core CPU sounds interesting, but it needs to be clocked correctly, and they need to make sure they are proper cores. Also, they need to get pricing right; I feel that Bulldozer was let down by its price.
 
Highly unlikely, considering Intel pays AMD to even have the right to make 64-bit chips.

Besides, why do people say Bulldozer is bad? I'm not dropping 300 bucks on a 2600K. I'll buy an FX-4100 for 100 bucks with a water cooler, overclock the crap out of it, and get the same performance for less money. And I guarantee there are a million other people out there like me who take the same view of it.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
Bulldozer was bad because the chip draws too much power past the 3.8GHz mark. Sure, you can overclock and put in a massive cooler and PSU, but every month you're paying more to the electric company to keep that running than you would with an i5-2500K.

Within a year, the TCO (total cost of ownership) of the Bulldozer system is greater than that of a comparable Intel system.
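A rough break-even sketch (every number below is an assumption for illustration, not a measurement) makes the point concrete: the cheaper chip's price advantage gets eaten by the extra wattage, at a rate that depends heavily on hours of load per day.

```python
# Months until the extra electricity cost of a cheaper, hungrier chip
# eats its price advantage (all inputs are assumed, illustrative values).
def breakeven_months(price_gap_usd: float, extra_watts: float,
                     load_hours_per_day: float, rate_per_kwh: float) -> float:
    extra_kwh_per_month = extra_watts * load_hours_per_day * 30 / 1000
    return price_gap_usd / (extra_kwh_per_month * rate_per_kwh)

# Assume: $100 price gap, 80 W extra under load, $0.14/kWh.
print(breakeven_months(100, 80, 24, 0.14))  # 24/7 load -> ~12 months
print(breakeven_months(100, 80, 8, 0.14))   # 8 h/day   -> ~37 months
```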
 



I'd have to see statistics to substantiate that, honestly. Most of my electric bill goes into the refrigerator and my air conditioning in the summer, not the computer. Also, in my case, I haven't watched nor owned a television in about 6 years. Well, I do own a few-years-old 27-inch Philips; suffice to say I'm not even sure if it still works. As far as I know, I don't care what my computer is pulling in terms of electricity; it's not gonna come close to what my bigger household appliances use.

As far as computer-specific power use, most of your electric draw is going to be on the video card(s). Yes, I'm aware the FX processors can be power hungry at high clock speeds, but that's still not substantial compared to high-end graphics cards. And let's face it, who is going to overclock a processor with a cheapo vid card?

[chart: typical household electricity use by appliance]




Don't get me wrong: I'm not going to pick a side in the Intel vs. AMD war. I built a system that exceeds my needs, and in this case felt that AMD was the better deal.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810
You can see the overclock power consumption here:

http://www.tomshardware.com/reviews/bulldozer-efficiency-overclock-undervolt,3083-14.html

There was a 152-Watt difference from 3.8 to 4.6GHz.

If run continuously (say you run Folding@home), that adds up to a good amount:

152 W * 24 hours * 365 days / 1000 = 1,331 kWh

Power rates vary by city/state. Mine's at $0.14 per kWh.

1,331 kWh * ($0.14 / kWh) = $186
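The same arithmetic as a sketch, with the usage hours made an explicit knob (the 8-hour figure is just an assumption for contrast):

```python
# Annual cost of a sustained wattage delta (152 W and $0.14/kWh from above).
def annual_cost_usd(extra_watts: float, rate_per_kwh: float,
                    hours_per_day: float = 24.0) -> float:
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

print(annual_cost_usd(152, 0.14))       # 24/7 load -> ~$186/year
print(annual_cost_usd(152, 0.14, 8.0))  # 8 h/day   -> ~$62/year
```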


PS: That chart is hugely out of date and doesn't match my usage at all. I never have more than an effective 20 W light on at a time, and I don't run AC either. You can get fridges that cost only $80/year in electricity. My computers are easily 75% of my electric bill.
 
If you want to run Folding@home, that's your thing, but normal people don't have their computer at load 24 hours a day.

Bulldozer sucks, BTW. And Piledriver isn't going to have 10 cores. AMD is probably just going to drop the performance race and focus on the mobile and low-end money. Not a bad market, since enthusiasts don't actually occupy much of it.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810



It's just an example showing that power consumption is more significant than people think when it comes to the cost of a computer.

Anyhow, AMD has dropped the performance race; they specifically said so. AMD's problem with focusing on the low end is ARM.

ARM is growing fast in speed, and Intel is aggressively pushing down its power consumption. Windows 8 will run on ARM and x86.

AMD's market window is shrinking by the month. And by the looks of it, neither AMD nor ARM will have a competitor for Haswell (2013).

I just hope they remain competitive in the discrete video market, or we're stuck with just Nvidia.
 
ARM performance is still far, far behind even AMD's. The in-order execution used by ARM CPUs is going to make any true multitasking lag, especially with more software running in the background. ARM performance is still behind that of the first-gen Atoms, so AMD still has the ability to produce chips above the ARM power-consumption window and give enough performance for the average user. ARM is only moving to 64-bit this year, so AMD is quite far ahead in any performance race. Power consumption is another story, but the low-power AMD APUs aren't too bad; we will see what they can do.

With ATI's graphics technology and an x86 license, AMD can do a lot of things no other company can, so I don't think they will be going bankrupt any time soon.
 


As already pointed out, who runs their computer 24/7? On a day off with nothing better to do I may run it 12 hours a day (let's be honest, we're never honest when faced with giving away how much of a "no-life gamer" we can be) :eek:

BTW, I'm running the quad-core Bulldozer, not the big monster 8-core (which kinda reminds me of a super bulked-up bodybuilder who can't even make his elbows touch because he has so much bulk).

As far as refrigerators go, not all of us can afford that fancy super-expensive new model at Sears that runs on $80 a year. Sure, the cheap one might cost us more in the long run (BTW, I googled it and found one that supposedly costs only $45 a year in electricity), but the cheapest efficient one I can find is about 1,900 bucks. Yikes, I wish I could throw that down in one lump sum! My credit's shot from this economy and prior layoffs. But I digress.

And I have no clue what my electricity costs anyway; sorry if my previous post was misleading in any way. But I may as well clear it up: electricity is covered in my utilities. The landlady ain't bothered me, so I assume it's not breakin' the bank. Then again, I'd love to see what that huge plasma TV she has is pullin'.

BTW, I hope AMD doesn't go bankrupt (not that I think they will). Let's face it, the Microsoft monopoly is still there despite all the years of lawsuits trying to bring it down. Not that I'm particularly political, but if we have just Intel running the processor market, who says they can't charge whatever they darn well please for their chips? Last time I checked, Microsoft Office 2012 was something to the tune of $400+, and here we're already talking about having to buy the next edition. Are you kidding me?
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


The A15 introduces out-of-order execution, BTW.

That's true, the performance numbers are not entirely known yet, but a 2.5GHz quad-core ARM Cortex-A15 will pack a decent punch. And there's just the sheer volume of money being invested in ARM.

You have Apple, Samsung, Nvidia (Tegra 4), TI (OMAP 5), Qualcomm, Applied Micro, Calxeda, etc. SOOOO much money is being pumped into ARM designs right now. It looks like several A15 chips will be shipping in 2012, including a couple of the new 64-bit variants.

I hope AMD can pull off a surprise with Trinity, but I'm doubtful. Intel is narrowing the GPU gap with Ivy as well.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Mobile market sales surpassed PC sales this year, so Intel doesn't have anything close to a monopoly on the processor market anymore.

With the ARM competition, Intel could buy AMD and get away with it. They need the GPU boost. How many processors do we need?

There's already quite a list (in order of CPU power and scalability):
MIPS (China, low-end mobile)
ARM (soon 3GHz and 64-bit)
AMD
Intel
Oracle (SPARC series)
IBM (POWER series)


Processor (CPU) wise, AMD isn't really necessary anymore to maintain a balance. Imagine what Intel could do with the Radeon design if they made it on 22nm Tri-Gate.

Every market consolidates from time to time.

PS: I still use Office 2003. What's the big feature to upgrade for in 2012? lol
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860

I can tell you one thing right now: that article is flawed. I am sorry to say it, but if any of the testers at Tom's would like to ask me, I would be more than happy to give them the rundown on my entire BIOS settings.

First is this:

Clock Frequency: 4.5 GHz, Multiplier: 22.5x, CPU Voltage: 1.428 V

[screenshot: CPU-Z under load at the 22.5x multiplier]


Notice anything? Yeah, that's right: at the time of the screenshot it's running at 1.38V, but the title says 1.428V.

That's because BD is extremely aggressive in AMD's power gating. For stock it's perfect; for overclocking, not so much. Here is why. The CPU voltage ranges from 1.38V to 1.428V while running at 4.5GHz. Now here is where the problem is: the CPU is perfectly stable at 1.38V, but because of the early BIOS and its lack of settings, in order to obtain a minimum voltage of 1.38V, they had to crank it up to 1.428V. What if they just set it to 1.38V? Then the low voltage becomes 1.332V, which is unstable with their particular chip.

Newer or more advanced BIOSes than the one used for this test have an option called LLC, load-line calibration. This setting changes the effect of what I just described: if you set it to maximum, or 100%, 1.38V stays 1.38V; there is no variance. So now, instead of ranging from 1.38V to 1.428V, you're running at 1.38V, period. Power consumption goes down, stability goes up.

There are also some other minor tweaks to other settings that will increase stability at a lower voltage. Essentially, I'm getting the most out of my 8120 with the smallest voltage. Here is the result:

[screenshot: CPU-Z validation of the 8120 result]


That's right: Tom's got 4.6GHz with 1.5V, and I managed to squeeze out 4.7GHz at 1.344V... flawed power-consumption numbers from the Tom's article, for the win.
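To put rough numbers on that voltage difference: by the standard CMOS dynamic-power approximation (my addition, not anything from the article), switching power scales roughly with V² × f, so a sketch like this shows how much the lower voltage buys:

```python
# Standard CMOS dynamic-power approximation: P ~ C * V^2 * f.
# Ratio of two operating points, assuming equal switched capacitance.
def relative_power(v_new: float, f_new: float,
                   v_old: float, f_old: float) -> float:
    return (v_new ** 2 * f_new) / (v_old ** 2 * f_old)

# 4.5 GHz held at 1.38 V vs. peaking at 1.428 V (values from this post):
print(relative_power(1.38, 4.5, 1.428, 4.5))  # ~0.93 -> ~7% less power
# My 4.7 GHz @ 1.344 V vs. Tom's 4.6 GHz @ 1.5 V:
print(relative_power(1.344, 4.7, 1.5, 4.6))   # ~0.82 -> ~18% less power
```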

As I said, if any of the testers wants to duplicate my settings, let me know.
 



The other CPU companies today really don't provide any serious competition for AMD and Intel. As of now, I don't think there are many ATX mobos floating around for anything but those two; at least, I've never heard of anyone building their own system in the past couple of years with anything but AMD or Intel. Be patient with me, I'm a first-year computer tech major; I'm still learning here.

How many processors do we need? As many as the local corner store offers candy bars: some people like variety. Even if we all like Hershey bars, I still like the option to buy a Clark bar if I decide one day I'm in the mood for it. From a business standpoint, lack of competition promotes not only overpricing but laziness in innovation. Is it a concern for computer technology in the 21st century? Probably not. But one never knows.

As far as Office 2010 goes, thank God I got it free through my college. There's no way I'd buy it. As for what's new in it vs. 2003? Not a whole lot, honestly; the features being a lot easier to navigate is one thing I'd say it has going for it. Left to my own devices, I'm satisfied with OpenOffice. I miss old Clippy from the old Office, but man, was he a patronizing little you-know-what...
 

Chad Boga

Distinguished
Dec 30, 2009
1,095
0
19,290

They lost money last quarter on GPUs.


Did you miss all those quarters from only a few years ago where AMD racked up billions in losses?

Besides, why do people say Bulldozer is bad? I'm not dropping 300 bucks on a 2600K. I'll buy an FX-4100 for 100 bucks with a water cooler, overclock the crap out of it, and get the same performance for less money. And I guarantee there are a million other people out there like me who take the same view of it.
Considering it is widely held that only 5% of computer buyers overclock their machines, and because the FX-4100 is so unimpressive, I very much doubt many people would do what you are espousing.

What's more, even a madly overclocked FX-4100 is going to struggle to beat the 2600K; plus you then get the heat, electricity cost, and quite possibly extra noise to deal with.

 
@chad
Want to post something to support your claims that AMD lost money in the GPU department? Or that AMD had "billions" in losses? Or something to back up the claim that 5% overclock? Or compare the FX-4100 to a CPU in the right price range?