Is AMD trying to go bankrupt

February 9, 2012 11:44:01 AM

Well, AMD's release of the FX series was a total failure; even the Phenom II X6 1100T performs better than most of them, and so do some of the Phenom II quads.

So they released a CPU that basically has 8 cores (I know, modules).

So Intel beat the crap out of the FX series in most software and all games. The main reason is that not much at all supports 8 cores. Maybe a couple of 3D programs and a couple of video editors, but that's about it.

My CPU is the Phenom II X4 965 @ 4GHz, and there is no real upgrade option for me.

Then I heard that AMD is releasing a revised CPU sometime this year, called Piledriver.

Well, I looked it up, and this model now has 10 cores. I mean, is AMD stupid? Who is going to be able to use a 10-core CPU? It seems the last trick AMD has left is just adding more cores. Pointless, and it needs more power. So what's next after that, a 12-core?

I see AMD going bankrupt pretty soon unless they make something halfway good.

Looks like I will have to go with an Intel i7-2600K for an upgrade, since AMD is only adding more cores, which doesn't help anyone.


February 9, 2012 12:10:11 PM

AMD will not go bankrupt; they still make a fortune from their GPUs.
February 9, 2012 12:19:40 PM

Time will tell if AMD has gotten the future plan right of more cores over speed. This will also require Windows 8 and later versions using the cores correctly, and apps supporting them. After all, there is no reason why a 12-core CPU running at a max of 2GHz cannot play the most demanding game, if that game has been written correctly to offload sections of its code across cores. It makes more sense to use 4 cores at 2GHz than 2 cores at 4GHz; I bet you would get a smoother gaming experience.
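That 4-cores-at-2GHz vs. 2-cores-at-4GHz comparison can be sanity-checked with a toy Amdahl's-law model. To be clear, the clock speeds, core counts, and 90% parallel fraction below are made-up illustration values, not benchmarks of any real chip:

```python
def effective_perf(cores: int, clock_ghz: float, parallel_fraction: float) -> float:
    """Rough single-number 'performance' estimate: clock speed scaled by
    the Amdahl's-law speedup for the given core count."""
    serial = 1.0 - parallel_fraction
    speedup = 1.0 / (serial + parallel_fraction / cores)
    return clock_ghz * speedup

# A game that is 90% parallelizable (optimistic for a 2012-era engine):
quad_slow = effective_perf(4, 2.0, 0.90)   # 4 cores @ 2 GHz
dual_fast = effective_perf(2, 4.0, 0.90)   # 2 cores @ 4 GHz
print(f"4c @ 2GHz: {quad_slow:.2f}, 2c @ 4GHz: {dual_fast:.2f}")
```

Under this toy model the two setups only break even when the code is essentially 100% parallel; with any serial portion at all, the fewer-but-faster cores come out ahead, which is why the claim depends so heavily on how well the game is written.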
February 9, 2012 12:19:50 PM

OP: Ivy Bridge will be out in a few months, and PD this summer or fall, depending on which rumors you read.

I thought AMD decided not to go with the 10-core version of PD, but is bringing forward the Trinity version with the improved GPU instead.
February 9, 2012 2:58:18 PM

Quote:
After all, there is no reason why a 12-core CPU running at a max of 2GHz cannot play the most demanding game, if that game has been written correctly to offload sections of its code across cores


But again, that's near impossible to do. What you are asking for is 12 separate parallel tasks happening at the same time, none being dependent on any other task, avoiding I/O locking, and a ton of other under-the-hood problems.

Games simply will not scale well, ever. The easiest thing to parallelize was rasterization, and we already offloaded that to the GPU. Aside from that, the majority of leftover tasks are either easy to accomplish [sound, user input] or deadlock with some other part of the program [AI and physics are at least partially dependent on the outcome of the rendering equation], which limits how easily the software scales.

More cores help with running more applications at the same time. As most people only have one heavy application going at a time, there is a point where more cores don't add any extra benefit. BD is more optimal as a server architecture, which is what AMD has admitted BD is.
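That plateau can be sketched with Amdahl's law; the 30% serial fraction below is an assumed illustration value, not a measurement of any real game:

```python
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Amdahl's law: best-case speedup with a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With 30% of the work stuck on one thread, extra cores flatten out fast:
for n in (1, 2, 4, 8, 12):
    print(f"{n:2d} cores -> {amdahl_speedup(n, 0.30):.2f}x")
# Even with infinite cores the ceiling is 1/0.30, about 3.33x.
```

Going from 8 to 12 cores in this sketch buys roughly 8% more speed, which is the "more cores doesn't add any extra benefit" point in concrete numbers.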
February 9, 2012 7:42:03 PM

AI and physics (which, with PhysX, can be offloaded) and of course rendering are the biggest CPU-time eaters in games. Sure, you can make the game run 30 threads, but most of the time a lot of the cores would sit idle, since many of the tasks games process on the CPU are very light (sound, input). I think we'll hit a wall for a while at 8 cores. I mean 8 true cores, not "Bulldozer module" garbage.
February 9, 2012 7:50:58 PM

Quote:
Is AMD trying to go bankrupt
Of course not.
They're just trying to help Intel maximize their profits and market share. ;) 
February 9, 2012 7:54:11 PM

It's not easy to program a game to utilize more than 2 cores. I doubt we'll ever see one use 6 cores, much less 10 or 12, in anything other than rendering-type tasks.
February 9, 2012 8:20:47 PM

The number one problem with the BD release was the marketing stupidity (hence why they got fired) of bragging about how great it was going to be and how much faster than Intel, before they even had a chip.

If you looked beyond the marketing, you would find articles like this:

Quote:
For the record, AMD says that a single Bulldozer module has around 80 per cent of the performance of two conventional CPU cores. In other words, a four-module Bulldozer core should be at least as quick as a six-core processor.
But which six-core processor?

http://www.techradar.com/news/computing-components/proc...

So 80% * 8 = 640%.
A 6-core CPU = 600%.

Of course BD is going to be close to Thuban; their ENGINEERS said so, not the marketing people who got fired.

How does theoretical scaling relate to the real world? Interesting find:

http://atenra.blog.com/2012/02/01/amd%E2%80%99s-bulldoz...

POV-Ray

1100T = 639% (interesting that it's over 100% per core)
FX = 760%

Cinebench
1100T = 621%
FX = 761%

7-Zip
1100T = 541%
FX = 639%

x264
1100T = 576%
FX = 599%


Pretty close to what the ENGINEERS stated.

What's interesting to note is how 4 threads across 4 modules, i.e. 2+2 (unshared BD resources), scale to over 100% per core (435% for POV-Ray and 455% for Cinebench), suggesting that single-thread tests are not that accurate on actual IPC.
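The quoted scaling figures can be checked against the 80%-per-module claim in a couple of lines (the benchmark percentages are the ones listed above):

```python
# AMD's claim: one module is roughly 80% of two full cores, so a
# 4-module FX presented as "8 cores" should scale to about 0.8 * 8 = 640%.
predicted = 0.80 * 8 * 100

measured_fx = {"POV-Ray": 760, "Cinebench": 761, "7-Zip": 639, "x264": 599}
for bench, pct in measured_fx.items():
    print(f"{bench:9s}: measured {pct}% vs predicted {predicted:.0f}%")
```

Two of the four results listed actually land above the 640% prediction, which is the poster's point: the multi-threaded scaling matches or beats what the engineers stated.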
February 9, 2012 8:37:17 PM

It was a punt at a new architecture. I would not say they failed, but I wish they would have tested their CPUs before they started building this hype. A 12-core CPU sounds interesting, but it needs to be clocked correctly, and they need to make sure they are proper cores. They also need to get pricing right; I feel that Bulldozer was let down by its price.
February 9, 2012 9:17:34 PM

Highly unlikely, considering Intel pays AMD to even have the right to make 64-bit chips.

Besides, why do people say Bulldozer is bad? I'm not dropping 300 bucks on a 2600K. I'll buy an FX-4100 for 100 bucks with a water cooler, overclock the crap out of it, and get the same performance for less money. And I guarantee there are a million other people out there like me who take the same view of it.
February 10, 2012 2:32:46 AM

Bulldozer was bad because the chip draws too much power past the 3.8GHz mark. Sure, you can overclock and put in a massive cooler and PSU, but monthly you're paying more to the electric company to keep that running than an i5-2500K.

Within a year, your TCO (total cost of ownership) for the Bulldozer system is greater than for a comparable Intel system.
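Whether that crossover really happens within a year depends almost entirely on duty cycle and the local power rate. A quick sketch; the 150 W load delta, $0.14/kWh rate, and hours-per-day figures below are assumptions for illustration, not measurements:

```python
def yearly_power_cost(extra_watts: float, hours_per_day: float,
                      rate_per_kwh: float) -> float:
    """Extra electricity cost per year for a given load-power delta."""
    kwh = extra_watts * hours_per_day * 365 / 1000
    return kwh * rate_per_kwh

# Assumed 150 W extra draw under load at $0.14/kWh:
print(f"4 h/day gaming: ${yearly_power_cost(150, 4, 0.14):.2f}/year")
print(f"24/7 at load:   ${yearly_power_cost(150, 24, 0.14):.2f}/year")
```

At a few hours a day the delta is tens of dollars a year; the within-a-year TCO crossover really only appears under sustained near-24/7 load.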
February 10, 2012 2:56:01 AM

Cazalan said:
Bulldozer was bad because the chip draws too much power past the 3.8GHz mark. Sure, you can overclock and put in a massive cooler and PSU, but monthly you're paying more to the electric company to keep that running than an i5-2500K.

Within a year, your TCO (total cost of ownership) for the Bulldozer system is greater than for a comparable Intel system.



I'd have to see statistics to substantiate that, honestly. Most of my electric bill goes to the refrigerator and, in the summer, the air conditioning, not the computer. Also, in my case, I haven't watched nor owned a television in about 6 years. Well, I do own a few-years-old 27-inch Philips; suffice to say I'm not even sure if it still works. As far as I know, whatever my computer is pulling in terms of electricity, it's not gonna come close to what my bigger household appliances use.

As far as computer-specific power use goes, most of your electric draw is going to be the video card(s). Yes, I'm aware the FX processors can be power hungry at high clock speeds, but that's still not substantial compared to high-end graphics cards. And let's face it, who is going to overclock a processor with a cheapo vid card?





Don't get me wrong, I'm not going to pick a side on the Intel vs. AMD war, I built a system that exceeds my needs, and in this case, felt that AMD was the better deal.
February 10, 2012 4:55:48 AM

You can see the overclocked power consumption here:

http://www.tomshardware.com/reviews/bulldozer-efficienc...

There was a 152-watt difference from 3.8 to 4.6GHz.

If run continuously (say you run FAH), that adds up to a good amount.

152 W * 24 hours * 365 days / 1000 ≈ 1,331 kWh

Power rates vary by city/state. Mine's at $0.14/kWh.

1,331 kWh * ($0.14 / kWh) ≈ $186
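The arithmetic above can be reproduced directly; the 152 W delta and $0.14/kWh rate are the figures quoted in this post:

```python
# Energy used by a constant 152 W extra draw over a full year:
extra_watts = 152
kwh_per_year = extra_watts * 24 * 365 / 1000   # 1331.52 kWh
cost = kwh_per_year * 0.14                     # at $0.14 per kWh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost:.0f}")
```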


PS: That chart is hugely out of date and doesn't match my usage at all. I never have more than an effective 20-watt light on at a time, and I don't run AC either. You can get fridges that only cost $80/year in electricity. My computers are easily 75% of my electric bill.
February 10, 2012 5:09:41 AM

If you want to Fold@home, that's your thing, but normal people don't have their computer at load 24 hours a day.

Bulldozer sucks, BTW. And Piledriver isn't going to have 10 cores. AMD is probably going to just drop the performance race and focus on going after the mobile and low-end money. Not a bad market, since enthusiasts don't actually occupy much of the market.
February 10, 2012 5:42:19 AM

esrever said:
If you want to Fold@home, that's your thing, but normal people don't have their computer at load 24 hours a day.

Bulldozer sucks, BTW. And Piledriver isn't going to have 10 cores. AMD is probably going to just drop the performance race and focus on going after the mobile and low-end money. Not a bad market, since enthusiasts don't actually occupy much of the market.



It's just an example that power consumption is more significant than people think when it comes to the cost of a computer.

Anyhow, AMD has dropped the performance race; they specifically said so. AMD's problem with focusing on the low end is ARM.

ARM is growing fast in speed. Intel is aggressively pushing down power consumption. Windows 8 will run on ARM and x86.

AMD's market window is shrinking by the month. And by the looks of it, neither AMD nor ARM will have a competitor for Haswell (2013).

I just hope they remain competitive in the discrete video market, or we're stuck with just NVidia.
February 10, 2012 5:51:44 AM

ARM performance is still far, far behind even AMD's. The in-order execution used by ARM CPUs is going to make any true multitasking lag, especially with more software running in the background. ARM performance is still behind that of the first-gen Atoms, so AMD still has the ability to produce chips above the ARM power-consumption window that give enough performance for the average user. ARM is only moving to 64-bit this year, so AMD is quite far ahead in any performance race. Power consumption is another story, but the low-power AMD APUs aren't too bad; we will see what they can do.

With ATI's graphics technology and an x86 license, AMD can do a lot of things no other company can, so I don't think they will be going bankrupt any time soon.
February 10, 2012 5:54:25 AM

Cazalan said:
You can see the overclocked power consumption here:

http://www.tomshardware.com/reviews/bulldozer-efficienc...

There was a 152-watt difference from 3.8 to 4.6GHz.

If run continuously (say you run FAH), that adds up to a good amount.

152 W * 24 hours * 365 days / 1000 ≈ 1,331 kWh

Power rates vary by city/state. Mine's at $0.14/kWh.

1,331 kWh * ($0.14 / kWh) ≈ $186


PS: That chart is hugely out of date and doesn't match my usage at all. I never have more than an effective 20-watt light on at a time, and I don't run AC either. You can get fridges that only cost $80/year in electricity. My computers are easily 75% of my electric bill.


As already pointed out, who runs their computer 24/7? On a day off with nothing better to do I may run it 12 hours a day (let's be honest, we're never honest when faced with giving away how much of a "no-life gamer" we can be). :o

BTW, I'm running the quad-core Bulldozer, not the big monster 8-core (which kinda reminds me of a super-bulked-up bodybuilder who can't even make his elbows touch each other because he has so much bulk).

As far as refrigerators go, not all of us can afford that fancy super-expensive new model at Sears that runs on $80 a year. Sure, sticking with a cheap one might cost us more in the long run (BTW, I googled it and found one that supposedly costs only $45 a year in electricity), but the cheapest efficient one I can find is about 1900 bucks, yikes. I wish I could throw that down in one lump sum! My credit's shot from this economy and prior layoffs. But I digress.

And I have no clue what my electricity costs anyway; sorry if my previous post was misleading in any way. But I may as well clear it up: electricity is covered in my utilities. The landlady ain't bothered me, so I don't assume it's breakin' the bank; then again, I'd love to see what that huge plasma TV she has is pullin'.

BTW, I hope AMD doesn't go bankrupt (not that I think they will). Let's face it, the Microsoft monopoly is still there despite all the years of lawsuits trying to bring them down. Not that I'm particularly political, but if we have just Intel running the processor market, well, who says they can't charge whatever they darn well please for their chips? Last time I checked, Microsoft Office 2012 was something to the tune of $400+, and here we're already talking about having to buy the next edition. Are you kidding me?
February 10, 2012 7:18:07 AM

esrever said:
ARM performance is still far, far behind even AMD's. The in-order execution used by ARM CPUs is going to make any true multitasking lag, especially with more software running in the background. ARM performance is still behind that of the first-gen Atoms, so AMD still has the ability to produce chips above the ARM power-consumption window that give enough performance for the average user. ARM is only moving to 64-bit this year, so AMD is quite far ahead in any performance race. Power consumption is another story, but the low-power AMD APUs aren't too bad; we will see what they can do.

With ATI's graphics technology and an x86 license, AMD can do a lot of things no other company can, so I don't think they will be going bankrupt any time soon.


A15 introduces out-of-order execution, BTW.

That's true, the performance numbers are not entirely known yet, but a 2.5GHz quad-core ARM Cortex-A15 will pack a decent punch. It's the sheer volume of money being invested in ARM.

You have Apple, Samsung, NVidia (Tegra 4), TI (OMAP 5), Qualcomm, Applied Micro, Calxeda, etc. SOOOO much money is being pumped into ARM designs right now. It looks like several A15 chips will be shipping in 2012, including a couple of the new 64-bit variants.

I hope AMD can pull off a surprise with Trinity, but I'm doubtful. Intel is narrowing the GPU gap with Ivy as well.
February 10, 2012 8:24:15 AM

nekulturny said:


BTW, I hope AMD doesn't go bankrupt (not that I think they will). Let's face it, the Microsoft monopoly is still there despite all the years of lawsuits trying to bring them down. Not that I'm particularly political, but if we have just Intel running the processor market, well, who says they can't charge whatever they darn well please for their chips? Last time I checked, Microsoft Office 2012 was something to the tune of $400+, and here we're already talking about having to buy the next edition. Are you kidding me?


Mobile market sales surpassed PC sales this year, so Intel doesn't have close to a monopoly on the processor market anymore.

With the ARM competition, Intel could buy AMD and get away with it. They need the GPU boost. How many processors do we need?

There's already quite a list (in order of CPU power and scalability):
MIPS (China - low-end mobile)
ARM (soon 3GHz and 64-bit)
AMD
Intel
Oracle (SPARC series)
IBM (POWER series)


AMD, CPU-wise, isn't really necessary anymore to maintain a balance. Imagine what Intel could do with the Radeon design if they made it at 22nm Tri-Gate.

Every market consolidates from time to time.

PS: I still use Office 2003. What's the big feature to upgrade for in 2012? lol
February 10, 2012 8:40:20 AM

Cazalan said:
You can see the overclocked power consumption here:

http://www.tomshardware.com/reviews/bulldozer-efficienc...

There was a 152-watt difference from 3.8 to 4.6GHz.

If run continuously (say you run FAH), that adds up to a good amount.

152 W * 24 hours * 365 days / 1000 ≈ 1,331 kWh

Power rates vary by city/state. Mine's at $0.14/kWh.

1,331 kWh * ($0.14 / kWh) ≈ $186


PS: That chart is hugely out of date and doesn't match my usage at all. I never have more than an effective 20-watt light on at a time, and I don't run AC either. You can get fridges that only cost $80/year in electricity. My computers are easily 75% of my electric bill.

I can tell you one thing right now: that article is flawed. I am sorry to say, but if any of the testers at Tom's would like to ask me, I would be more than happy to give them the rundown on my entire BIOS settings.

First is this:

Clock Frequency: 4.5 GHz, Multiplier: 22.5x, CPU Voltage: 1.428 V



Notice anything? Yeah, that's right: at the time of the screenshot it's running at 1.38V, but the title says 1.428V.

That's because BD is extremely aggressive with AMD's power gating. For stock it's perfect; for overclocking, not so much. Here is why. The CPU voltage ranges from 1.38V to 1.428V while running at 4.5GHz. Now here is where the problem is: the CPU is perfectly stable at 1.38V, but because of the early BIOS and its lack of settings, in order to obtain a minimum voltage of 1.38V, they had to crank it up to 1.428V. What if they just set it at 1.38V? Then the low voltage becomes 1.332V, which is unstable with their particular chip.

Newer or more advanced BIOSes than the one used for this test have an option called LLC, load-line calibration. This setting changes the effect of what I just described: if you set it to maximum, or 100%, then 1.38V stays 1.38V; there is no variance. So now, instead of running from 1.38V to 1.428V, you're running at 1.38V, period. Power consumption goes down, stability goes up.

There are also some other minor tweaks to other parts that will increase stability at a lower voltage. Essentially, I got the most out of my 8120 with the smallest voltage; here is the result.



That's right: Tom's got 4.6GHz with 1.5V, while I managed to squeeze out 4.7GHz at 1.344V. Flawed power consumption in Tom's article, for the win.

As I said, if any of the testers wants to duplicate my settings, let me know.
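A rough way to see why the lower voltage matters: dynamic CPU power scales approximately with frequency times voltage squared (P ~ f * V^2). Comparing Tom's reported 4.6GHz @ 1.5V against the 4.7GHz @ 1.344V claimed above, in a simplified model that ignores leakage and other effects:

```python
def relative_dynamic_power(f1_ghz: float, v1: float,
                           f2_ghz: float, v2: float) -> float:
    """Ratio of dynamic power of setup 2 vs setup 1, using P ~ f * V^2."""
    return (f2_ghz * v2 ** 2) / (f1_ghz * v1 ** 2)

ratio = relative_dynamic_power(4.6, 1.500, 4.7, 1.344)
print(f"4.7GHz @ 1.344V draws ~{ratio:.0%} of 4.6GHz @ 1.5V dynamic power")
```

Under this approximation the lower-voltage overclock draws roughly a fifth less dynamic power despite the slightly higher clock, which is the poster's point about the review's voltage settings skewing the consumption numbers.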
February 10, 2012 9:05:30 AM

Cazalan said:
Mobile market sales surpassed PC sales this year, so Intel doesn't have close to a monopoly on the processor market anymore.

With the ARM competition, Intel could buy AMD and get away with it. They need the GPU boost. How many processors do we need?

There's already quite a list (in order of CPU power and scalability):
MIPS (China - low-end mobile)
ARM (soon 3GHz and 64-bit)
AMD
Intel
Oracle (SPARC series)
IBM (POWER series)


AMD, CPU-wise, isn't really necessary anymore to maintain a balance. Imagine what Intel could do with the Radeon design if they made it at 22nm Tri-Gate.

Every market consolidates from time to time.

PS: I still use Office 2003. What's the big feature to upgrade for in 2012? lol



The other CPU companies today really don't provide any serious competition to AMD and Intel; as of now, I don't think there are too many ATX mobos floating around for anything but those two. At least I've never heard of anyone building their own system in the past couple of years with anything but AMD or Intel. Be patient with me, I'm a first-year computer tech major; I'm still learning here.

How many processors do we need? About as many as the candy bars the local corner store offers; some people like variety. Even if we all like Hershey bars, I still like the option to buy a Clark bar if I decide one day I'm in the mood for it. From a business standpoint, lack of competition promotes not only overpricing but laziness in innovation. Is it a concern for computer technology in the 21st century? Probably not. But one never knows.

As far as Office 2010 goes, thank God I got it free through my college; there's no way I'd buy it. As for what's new in it vs. 2003? Not a whole lot, honestly. The features are a lot easier to navigate, which is one thing I'd say it has going for it. Left to my own devices, I'm satisfied with OpenOffice. I miss old Clippy from the old Office, but man, was he a patronizing little you-know-what...
February 10, 2012 8:47:47 PM

seumas_beathan said:
Amd will not go bankrupt, they still make a fortune from their gpu's

They lost money last quarter on GPUs.

nekulturny said:
Highly unlikely considering Intel pays AMD to even have the right to make 64 bit chips.

Did you miss all those quarters from only a few years ago where AMD racked up billions in losses?

Quote:
Besides, why do people say the Bulldozer is bad? I'm not dropping 300 bucks on a 2600k. I'll buy an FX-4100 for 100 bucks, with a watercooler and overclock the crap out of it and get the same performance for less money. And I guarantee there are a million other people out there like me who take the same view of it.

Considering that it is widely held that only 5% of computer buyers overclock their machines, and because the FX-4100 is so unimpressive, I very much doubt that many people would be doing what you are espousing.

What's more, even a madly overclocked FX-4100 is going to struggle to beat the 2600K, plus you then get the heat, electricity cost, and quite possibly extra noise issues to deal with.

February 10, 2012 9:08:32 PM

Exactly what I was thinking, Chad. Many people are probably not going to be doing what he is doing. Also, an overclocked Bulldozer has to be overclocked to around 5GHz just to be competitive with a stock 3.3GHz Intel CPU.
February 10, 2012 9:10:17 PM

@chad
Want to post something to support your claims that AMD lost money in the GPU department? Or that AMD had "billions" in losses? Or something about the 5% overclockers claim? Or compare the FX-4100 to a CPU in the right price range?
February 10, 2012 9:11:11 PM

Chad,

Quote:
Did you miss all those qtr's from only a few years ago, where AMD racked up billions in losses.


What company wasn't losing billions in the current economy 'a few years ago'?
AMD Stock


Intel Stock


Looks to me like they both had some rough times.

Quote:

Considering that it is widely held that only 5% of computer buyers overclock their machines, and because the FX-4100 is so unimpressive, I very much doubt that many people would be doing what you are espousing.

What's more, even a madly overclocked FX-4100 is going to struggle to beat the 2600K, plus you then get the heat, electricity cost, and quite possibly extra noise issues to deal with.


Again: a 300-dollar chip vs. a 100-dollar chip. At that price, the FX-4100 is as powerful as it needs to be. You're trying to compare a VW Rabbit to a Ferrari; no wonder you're disappointed.

If you want to compare the two, at least keep it proportionate. As far as noise goes, I'm not particularly concerned about that either; I've never seen fan noise treated as a practical concern in hardware reviews.

For two reasons (my personal reasons):
1. I'm the kind of person who runs a fan in my bedroom in the dead of winter for the ambient noise.
2. When I'm working on the computer for homework, or gaming, I've got music going anyway.

And as far as power consumption goes, well, a "noob" has already challenged Tom's review on it. I admit that is beyond my technical knowledge, so I won't defend or contradict it either way; I'll only point out that it has been disputed.
February 10, 2012 9:29:49 PM

rds1220 said:
Exactly what I was thinking, Chad. Many people are probably not going to be doing what he is doing. Also, an overclocked Bulldozer has to be overclocked to around 5GHz just to be competitive with a stock 3.3GHz Intel CPU.

In what, wPrime? 80% of the time the 8150 is within 5% of the 2500K; how does that even come close to fanboy claims of 33%, other than in programs that don't mean squat like wPrime?

Edit: sorry, Super Pi, not wPrime
February 10, 2012 9:31:46 PM

wPrime runs well on the FX, I thought :o 
February 10, 2012 9:31:47 PM

nekulturny said:
The other CPU companies today really don't provide any serious competition to AMD and Intel; as of now, I don't think there are too many ATX mobos floating around for anything but those two. At least I've never heard of anyone building their own system in the past couple of years with anything but AMD or Intel. Be patient with me, I'm a first-year computer tech major; I'm still learning here.

How many processors do we need? About as many as the candy bars the local corner store offers; some people like variety. Even if we all like Hershey bars, I still like the option to buy a Clark bar if I decide one day I'm in the mood for it. From a business standpoint, lack of competition promotes not only overpricing but laziness in innovation. Is it a concern for computer technology in the 21st century? Probably not. But one never knows.



PC sales are lagging, and it's not due to a lack of competition; just the opposite. The mobile market is getting powerful enough to displace PC upgrade cycles.

On the desktop today there is little competition for Intel, but in 1-2 years that will be a very different story. It's rumored that Apple has already dropped Intel for their laptops in the 2013 time frame. That signifies that Apple (the largest buyer of components) is satisfied enough with the ARM roadmap to switch processors yet again (PowerPC -> x86 -> ARM).

If you're starting out in tech today, you want to go where the money is heading, not where it used to be. Variety is good to a degree, but you want skills that are transferable; you can't learn the nuances of 20 different platforms with much success. Apple and Android are successful today because tons of cash was pumped into those ecosystems.
February 10, 2012 9:36:52 PM

Cazalan said:
PC sales are lagging, and it's not due to a lack of competition; just the opposite. The mobile market is getting powerful enough to displace PC upgrade cycles.

On the desktop today there is little competition for Intel, but in 1-2 years that will be a very different story. It's rumored that Apple has already dropped Intel for their laptops in the 2013 time frame. That signifies that Apple (the largest buyer of components) is satisfied enough with the ARM roadmap to switch processors yet again (PowerPC -> x86 -> ARM).

If you're starting out in tech today, you want to go where the money is heading, not where it used to be. Variety is good to a degree, but you want skills that are transferable; you can't learn the nuances of 20 different platforms with much success. Apple and Android are successful today because tons of cash was pumped into those ecosystems.



Sounds reasonable to me; I can accept that. I really lack the mathematical skills to go into heavy programming, and largely I'm undecided about which field concentrations I want to be in. I just know that the IT industry seems to be the only one that's actually growing. At 27, I have a strong background in logistics management; I was fortunate enough to become the youngest person running an entire state's freight operations for a company. Alas, I was laid off from that job. I'd be content to work in a warehouse the rest of my life, even with a BS degree in network admin or *insert computer tech field here*.
February 10, 2012 11:34:43 PM

esrever said:
@chad
Want to post something to support your claims that AMD lost money in the GPU department?

Actually, my mistake: in the last two quarters AMD made a whopping $27 million and $12 million, respectively, from their GPU division.

http://ir.amd.com/phoenix.zhtml?c=74093&p=quarterlyearn...
http://phx.corporate-ir.net/External.File?item=UGFyZW50...

A somewhat different perspective on the claim that "AMD will not go bankrupt, they still make a fortune from their GPUs".

Quote:
or that AMD had "billions" in losses?


From the above link to AMD's financial results, their Q4 2008 report shows that the loss for the 2007 calendar year was $3.379 billion, and for the 2008 calendar year it was $3.098 billion.

Quote:
Or post something about the claim 5% overclockers?


This 5% figure is quoted regularly by industry insiders.

Even your high priest John Fruehe made the 5% overclocking claim.

Quote:
Or compare the FX4100 to a cpu in the right price range?


What is the right price range?

Why are you comparing the FX-4100 only to the 2600K? Why don't you compare it to the cheaper 2500K?

Overclocked, the 2500K absolutely destroys any similarly overclocked FX-4100 by embarrassing margins.



nekulturny said:
Chad,

Quote:
Did you miss all those quarters from only a few years ago where AMD racked up billions in losses?


What company wasn't losing billions in the current economy 'a few years ago'?

Intel, Apple, IBM, etc.

And I am talking about declared profits or losses, not declines in market capitalisation.

Quote:

Again: a 300-dollar chip vs. a 100-dollar chip. At that price, the FX-4100 is as powerful as it needs to be. You're trying to compare a VW Rabbit to a Ferrari; no wonder you're disappointed.

So compare the FX-4100 to an i5-2500K then.

I can't even begin to imagine why anyone would buy an FX-4100 over an i5-2500K.

If you are so poverty-stricken that you're considering taking the performance hit to save a few pennies, then you would probably be better off buying a second-hand computer.

February 10, 2012 11:40:45 PM

Chad Boga said:
[...]

So compare the FX-4100 to an i5-2500K then.

I can't even begin to imagine why anyone would buy an FX-4100 over an i5-2500K.

If you are so poverty-stricken that you're considering taking the performance hit to save a few pennies, then you would probably be better off buying a second-hand computer.


The i5 isn't in the same range as the 4100, though... Isn't the i5-2500K's MSRP $229, and the 4100's $109? Doesn't seem fair to me ;_;
February 10, 2012 11:45:32 PM

Cynasnow, I'm not even going to give a thoughtful reply to your post; it was heavily laced with sarcasm and disrespect over something as arbitrary as a computer chip.

Addendum: Looks like someone edited their post, and on second glance I'm not sure whether it was Chad or Cyna who made the comments I deemed disrespectful. Looking at the posts, it seems it was Chad; my apologies to Cyna. Chad, I don't know what planet you came from, but you will not speak to me disrespectfully like that and expect to have a serious discussion. My advice to you is to grow up.


As far as the cost of the FX-4100 goes, I paid $99.99 for it at Microcenter last week.
February 11, 2012 12:34:02 AM

nekulturny said:
Cynasnow, I'm not even going to give a thoughtful reply to your post, it was heavily laced with sarcasm and disrespect over something so arbitrary as a computer chip.

Addendum: Looks like someone edited their post, and on second glance I'm not sure if it was Chad or Cyna who made the comments I deemed disrespectful. Looking at the posts, it seems it was Chad, my apologies to Cyna. Chad, I don't know what planet you came from, but you will not speak to me disrespectfully like that and expect to have a serious discussion, my advice to you is to grow up.

What exactly are you referring to here?

You seem to be very sensitive and delicate. :heink: 
a c 78 à CPUs
February 11, 2012 1:09:31 AM

I'm not sure any more, you know. Somebody edited something; somebody sarcastically said "genius" somewhere. As far as being overly sensitive, not really. I just have better things to do with my time than have a discussion with someone who is going to show a lack of respect.

To give the benefit of the doubt, I'll add my reply to your previous post in this one.

Quote:
Overclocked, the 2500K absolutely destroys any similarly overclocked FX4100 by embarrassing margins.


Overclocked, the 2500k absolutely costs a lot more if you fry it. As far as anything else, I don't see the problem. Seriously, I'm not a heavy gamer; I play frickin' RuneScape, a browser-based Java MMORPG. I don't have high needs for a system; there's no justification for more than a 100-dollar chip to do that. My significant other is the real gamer; he does the Skyrim thing lately, and I haven't heard him complain once.

Quote:
And I am talking about declared Profits or Losses, not declines in Market Capitalisation.


No comment on that, as it goes beyond my realm of knowledge. As to what this thread originally started out as, I don't see AMD going bankrupt. For every mistake AMD has ever made, they still hold the patents on x86-64 (AMD64), not Intel. Anyone who wants to use it (at least as I understand it) is going to have to pay the toll.

Quote:
So compare the FX-4100 to a i5 2500K then.


The 2500K is 225 bucks from Walmart per Google; we still aren't in the same ballpark. Care to step down to the i3?

Quote:
I can't even begin to imagine why anyone would buy an FX-4100 over a i5-2500K.


I've given my reasons in this thread; you're entitled to think it was foolish. I have no regrets, and that's all that really matters.

Quote:

If you are that poverty-stricken as to be considering taking the performance hit to save a few pennies, then you would probably be better off buying a 2nd-hand computer.


And what would I buy? LOL! I wanted to build a computer to replace my aging Dell XPS M1530 (a pretty ballsy laptop back in the day) and by George, I've done it! (who the hell is "George" anyway?)
February 11, 2012 2:11:54 AM

nekulturny said:
I'm not sure any more, you know. Somebody edited something; somebody sarcastically said "genius" somewhere. As far as being overly sensitive, not really. I just have better things to do with my time than have a discussion with someone who is going to show a lack of respect.

To give the benefit of the doubt, I'll add my reply to your previous post in this one.


There is no need for benefit of the doubt.

Whilst it annoys me that posts one edits here on Tom's always get the "edited by . . . ." byline (even if you edit within seconds of posting), in this instance it works quite well in my favour.

None of my posts in this thread have the "edited by . . . ." byline. It is not hard to work out the rest.


Quote:
Overclocked, the 2500k absolutely costs a lot more if you fry it.

So one should settle for the cheaper FX-4100, in case you end up frying your CPU? :pfff: 

Quote:
The 2500K is 225 bucks from Walmart per Google; we still aren't in the same ballpark. Care to step down to the i3?

No. My argument is that the extra performance you get from the 2500K over the FX-4100 makes it the much more compelling buy.



Quote:
I've given my reasons in this thread; you're entitled to think it was foolish. I have no regrets, and that's all that really matters.

Your cat also thinks you are foolish.
a c 78 à CPUs
February 11, 2012 2:44:35 AM

You leave my cat out of it, thanks.

There are 2 places you don't go with me, lest you incur the wrath of a fairly large, strong person such as I am. :pt1cable: 

1. My significant other
2. My cat


Either way, we agree to disagree. We live in a capitalistic society where people are free to choose whatever product they want. No need to make it personal; it's not your computer to worry about.

BTW, am I to assume we're not going to compare the i3 to the 4100? Yes, the 4100 knocks the stuffing out of it. And guess what, they're about evenly priced.
February 11, 2012 3:09:45 AM

nekulturny said:
You leave my cat out of it, thanks.

There are 2 places you don't go with me, lest you incur the wrath of a fairly large, strong person such as I am. :pt1cable: 

1. My significant other
2. My cat

I would like to kidnap your cat and give him/her a better life. :p  :D 

Quote:
Either way, we agree to disagree. We live in a capitalistic society where people are free to choose whatever product they want. No need to make it personal; it's not your computer to worry about.

I thank you for your sacrifice.

No one who cares about buying CPUs at a reasonable price wants to see AMD disappear.

Quote:
BTW, am I to assume we're not going to compare the i3 to the 4100? Yes, the 4100 knocks the stuffing out of it. And guess what, they're about evenly priced.

The i3 beats the FX-4100 in gaming and the FX-4100 does better in video encoding.

a c 78 à CPUs
February 11, 2012 3:26:28 AM

Chad Boga said:
I would like to kidnap your cat and give him/her a better life. :p  :D 

Quote:
Either way, we agree to disagree. We live in a capitalistic society where people are free to choose whatever product they want. No need to make it personal; it's not your computer to worry about.

I thank you for your sacrifice.

No one who cares about buying CPUs at a reasonable price wants to see AMD disappear.

Quote:
BTW, am I to assume we're not going to compare the i3 to the 4100? Yes, the 4100 knocks the stuffing out of it. And guess what, they're about evenly priced.

The i3 beats the FX-4100 in gaming and the FX-4100 does better in video encoding.


She (the cat) is no longer with me; that's why it's a touchy subject. Yes, she was quite a sweetheart. As far as the significant other, he's 5'7", 140 lbs, and he can take care of himself, but I'm still the protective type at 6'1", 240. What can I say? :kaola: 
February 11, 2012 3:54:43 AM

nekulturny said:
She (the cat) is no longer with me; that's why it's a touchy subject. Yes, she was quite a sweetheart.

A shame. She looks like a very sweet kitty indeed.
a c 112 à CPUs
a b À AMD
February 11, 2012 4:50:33 AM

AMD took a gamble with the new architecture and it hasn't paid off. It may well pay off in future as programs are specifically written to make better use of it; the problem is getting programmers to completely rethink their approach to the new CPU. Windows 8 will help a little, but in reality it will be Linux and other open-source projects that adopt it faster. After all, FX is basically a server part, and the majority of servers run on Linux.

AMD can still remain competitive in the CPU market by die-shrinking the Phenom and increasing its functionality by adding more code paths, but it looks like that's not where their road map is pointing. It looks like they are heading towards the fusion of CPU and GPU.
I think they see their future more in the tablet and mobile sector, where the competition, although intense, isn't so far ahead in development, meaning less pressure on the CPU manufacturer to make the best CPUs. As it's a more even field, there's more of a chance they can distinguish themselves by bringing what they've learned from the desktop/server market to bear on what is a fledgling industry. I guess they hope they can hit the ground running and steal a lead...
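The "offload sections of its code across cores" idea that keeps coming up in this thread can be sketched in a few lines. This is purely my own toy Python illustration (the `partial_sum`/`parallel_sum` names and the chunking scheme are made up for demonstration, nothing from AMD or any game engine): a CPU-bound sum split across a pool of worker processes, one chunk per core.

```python
# Toy example: split a CPU-bound computation across all available cores.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """Sum of squares over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum(n, workers=None):
    """Split range(n) into one chunk per worker and sum the pieces."""
    workers = workers or cpu_count()
    step = max(1, n // workers)
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial version, just spread across cores.
    assert parallel_sum(100_000) == sum(i * i for i in range(100_000))
```

Whether this actually runs faster depends on how much of the job is parallelizable and on per-process overhead, which is exactly the catch with games: the restructuring, not the core count, is the hard part.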
a c 146 à CPUs
a b À AMD
February 11, 2012 5:34:45 AM

noob2222 said:
In what, wPrime? 80% of the time the 8150 is within 5% of the 2500K; how does that even come close to fanboy claims of 33%, other than in programs that don't mean squat, like wPrime?

Edit: sorry, Super Pi, not wPrime


It's been repeatedly cited that Bulldozer needs to be overclocked just to be competitive with Intel CPUs.
August 29, 2012 11:28:42 PM

I own three AMD systems I built; all were very affordable compared to Intel setups, and with 4 cores and a good card I can run Skyrim on high with a decent frame rate. For people's pockets these days, they offer a great all-purpose system that can run some of the most demanding modern games. Intel makes great processors too, but they tend to cost more and add very little performance for the price difference versus what AMD offers. Plus, their onboard GPUs work well alongside other PCI-E cards (I haven't had one conflict or lockup using 2 GPUs together in years of building AMD- and Intel-based systems). These systems are run for many years and replaced when the new PlayStation or Xbox comes out, to coincide with gaming improvements.
a c 471 à CPUs
a c 118 À AMD
August 30, 2012 1:01:56 AM

Just to let you know, Skyrim does not use 4 cores, only 2 cores.

Dual-core Intel CPUs tend to perform just as well as or better than more expensive AMD quad-core CPUs in games, even in some games that can actually use more than 2 cores. I recall benchmarks where a Core i3-2100 gave the same gaming performance, +/- 3 FPS, as a Phenom II X4 975 using the same graphics card.

I am not aware of any conflicts between Intel integrated graphics and using 2 GPUs together. However, I only use a single graphics card.

The main advantage of AMD is the lower price for quad-core CPUs that can be overclocked. Unfortunately, that "affordable" price is directly related to their performance in comparison to Intel CPUs. Generally speaking, I recommend a Phenom II X4 965 to people who want to go with AMD instead of Intel. The X4 965 is slightly cheaper than a Core i3-2100, and it is a quad-core CPU. While at stock speed the X4 965 does not beat a Core i3-2100 in games, it is an overall better-performing CPU if you do other CPU-intensive tasks like encoding video. Of course, when OC'ed, the X4 965 should provide the same performance in games as the Core i3-2100, if not better, depending on the OC.

I don't recommend the FX series because its general gaming performance is marginally worse than a Phenom II's. You'll need benchmarks to see the difference in most cases, but why pay more money for marginally less performance and higher power consumption?
a c 78 à CPUs
August 30, 2012 2:32:21 AM

Quote:
Just to let you know, Skyrim does not use 4 cores, only 2 cores.


You are incorrect sir.

http://www.techspot.com/review/467-skyrim-performance/p...

Quote:
The game appears to only be optimized for quad-core processors, as just four threads of the Core i7-2600K were active when playing. However, of those four threads, only one reached between 90–100% while the other three worked at around 30–50%. This means a decent dual-core processor should have no trouble playing Skyrim.


Skyrim is a console port as well. The Xbox 360 uses a tri-core CPU, and the PlayStation 3's Cell pairs one general-purpose core with several SPEs.
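That utilization pattern in the TechSpot quote (one thread near 100%, the others at 30-50%) is textbook Amdahl's law: the serial portion of the work caps the speedup extra cores can deliver. A quick back-of-the-envelope sketch; the 0.4 parallel fraction below is my own illustrative number, not anything measured from Skyrim:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup when only parallel_fraction of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 40% parallel work, speedup caps below 1/0.6 ~= 1.67x no matter the core count.
for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.4, cores):.2f}x")
```

Doubling cores from 4 to 8 here buys well under 10% extra speed, which is why a mostly serial game sees so little from a hex- or octo-core chip.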
a c 141 à CPUs
a b À AMD
August 30, 2012 2:42:30 AM

Hmm, not many hex-core people chiming in, I see. I haven't played Skyrim in a while, but I seem to recall it using all 6 cores.
a c 78 à CPUs
August 30, 2012 2:45:22 AM

Yea, lol. BTW, this thread is an old one. I was less educated 6 months ago than I am now, and that's all I have to say about that (preemptively, before anyone brings it up). :lol: 

But from my experience with Skyrim, it definitely uses 4 cores. This isn't a screenshot of mine, mainly because I refuse to play Skyrim again 'til I ditch my 550 Ti, but it looked exactly like this (found this on Google).




August 30, 2012 3:05:10 AM

LMAO, AMD is quite well off, seeing as they compete with Intel in other components as well.
a c 78 à CPUs
August 30, 2012 3:26:38 AM

Yea... well, I'm by no means a stock expert, so don't take this as gospel, but I bet now is a good time to buy, right before Piledriver comes out. Even if it turns out to suck, I bet the stock will get a nice bump for a bit. You could sell then.