
Another Long-time Graphics Leader Leaves AMD

April 3, 2012 1:38:27 AM

So the graphics guys who came up with good products are leaving, yet the CPU guys who made failure after failure are still there.
Score: 42
April 3, 2012 1:45:21 AM

For one moment, I thought the title of the article was referring to XFX leaving ATI/AMD. Sad man, now I(
Score: 9
April 3, 2012 1:45:40 AM

AMD's ship is sinking. Abandon ship! Women and Mogs first!
Score: 14
April 3, 2012 2:09:50 AM

I hope AMD/ATI will not completely sink out... Or else we will have Intel and Nvidia monopolizing the market and it will be BAD news for consumers... and I mean BAD news!!!
Score: 25
April 3, 2012 2:13:41 AM

Another one bites the dust
Another one bites the dust
And another one gone, and another one gone
Another one bites the dust
Hey, I(ntels) gonna get you too
Another one bites the dust
Score: 8
April 3, 2012 2:13:46 AM

wowww what are you smokinggg

ATi started doing PHENOMENALLY about a year after they were acquired... they slumped on the HD 2xxx series, but the HD 3xxx fixed it in less than a year... and ever since the HD 4xxx series they've been doing GREAT. I just RECENTLY replaced my HD 4850... I paid like $100 for it.

And before that, they were getting their asses handed to them regularly. At least now they put up a good fight.
Score: 26
April 3, 2012 2:18:55 AM

"AMD believes its New Zealand and 'Sea Islands' architectures will be a potent challenge for Nvidia's GK104 and noted that the Radeon HD 7000 family competes very well against Kepler's reduced GPGPU performance."

GK110 is all I have to say....
Score: -8
April 3, 2012 2:21:15 AM

ATI and Nvidia have been going back and forth since the beginning of time.

I think it's alright that these guys are leaving; bring in new blood and shake things up for once.
Score: 14
April 3, 2012 2:26:02 AM

Does surprise me, AMD will always be #2 to Nvidia as long as they keep playing catch up. "AMD believes its New Zealand and 'Sea Islands' architectures will be a potent challenge for Nvidia's GK104" Always playing catch up. Their next cards come out just barely better than Nvidia's year-old cards, then Nvidia releases their next gen and puts AMD way behind again.
Score: -16
April 3, 2012 2:27:38 AM

If this keeps up, AMD is going under for sure. I hate to even think about what will result from it.
Score: -9
April 3, 2012 2:40:10 AM

Well, I hope AMD can find something to make up for the lost talent. Losing your top employees is a pain for any company, especially in groups.
Score: 4
April 3, 2012 2:42:04 AM

AMD graphics are going strong; the only thing ruining the company is their FAILDOZER division...
Maybe AMD needs to do something bold for a change, like building fabs and trying to compete with Intel.
Score: 12
Anonymous
April 3, 2012 2:52:22 AM

Bulldozer has its place. It powers the 13 HPC machines for molecular dynamics we have on campus, each with 64 cores and 256GB of 1600MHz RAM. You can't beat the low price per core, even with the shared FPU.
Score: 8
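
To make the price-per-core argument concrete, here is a minimal sketch; the dollar figures below are invented placeholders purely for illustration, and only the 13-node, 64-core cluster layout comes from the comment above.

    # Hypothetical cost-per-core comparison for the cluster described above.
    # Node prices are made-up placeholders, not real quotes.
    nodes = 13
    opteron_node = {"cores": 64, "price_usd": 6000}   # hypothetical 4-socket Opteron box
    xeon_node    = {"cores": 16, "price_usd": 7000}   # hypothetical 2-socket Xeon box

    for name, node in [("Opteron node", opteron_node), ("Xeon node", xeon_node)]:
        per_core = node["price_usd"] / node["cores"]
        print(f"{name}: ${per_core:.0f} per core")

    print(f"Cluster total: {nodes * opteron_node['cores']} cores")

Even accounting for the shared FPU, the cost-per-core gap is the point being made above.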
April 3, 2012 2:54:04 AM

Sad day for the industry... AMD seems to be heading under... bad for consumers when Intel and Nvidia dominate the market... hmm... maybe Samsung can take over AMD... Samsung has manufacturing capability (and $$), but not the technical design... or maybe Qualcomm can take over, since they're doing pretty well in the mobile market, and expand into the desktop arena...
Score: -2
April 3, 2012 2:54:44 AM

sp0nger: "Does surprise me, AMD will always be #2 to Nvidia"

You should quit smoking whatever it is you're smoking; it clouds your comments.
Score: 17
April 3, 2012 3:02:49 AM

He has headed a project at ATI since 1998 and only just left the company, yet he only worked there for 12 years? The article also vaguely implies that he may have worked there even before 1998. Hmmm....
Score: 5
April 3, 2012 3:03:08 AM

AMD seems to have more dumbasses at the top of the company than I imagined. They have a huge advantage over Intel in the graphics department; if AMD were smart, they could have integrated GCN into every Opteron long before Nvidia came up with the Tesla card. With a combined x86 + GPU compute architecture in one package, AMD could pretty much set up a stronghold in the highly profitable server market and give Intel a headache, but sadly they didn't do it. Even today we don't have an Opteron with GCN integrated into it. Such a fail of a company.
Score: -2
April 3, 2012 3:13:36 AM

Tomfreak: AMD seems to have more dumbasses at the top of the company than I imagined. They have a huge advantage over Intel in the graphics department; if AMD were smart, they could have integrated GCN into every Opteron long before Nvidia came up with the Tesla card. With a combined x86 + GPU compute architecture in one package, AMD could pretty much set up a stronghold in the highly profitable server market and give Intel a headache, but sadly they didn't do it. Even today we don't have an Opteron with GCN integrated into it. Such a fail of a company.

Question: why would you want the graphics chip integrated with your main CPU on one huge chip? Server chips are built with reliability in mind, and the reliability of integrated CPU and GPU chips all in one package still hasn't been fully tested yet. Those types of APUs seem more of a fit for consumer devices.
Score: 8
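
As an aside on what "x86 + GPU compute in one package" would look like from the software side: with OpenCL, the CPU cores and an on-die GPU simply show up as separate devices that the same code can target. A minimal enumeration sketch, assuming the pyopencl package and at least one installed OpenCL runtime (nothing here is specific to any actual AMD product):

    import pyopencl as cl

    # List every OpenCL platform and device the installed drivers expose.
    # On an APU-style part, the x86 cores and the on-die GPU each appear
    # as a device under the same roof.
    for platform in cl.get_platforms():
        print(f"Platform: {platform.name}")
        for device in platform.get_devices():
            kind = cl.device_type.to_string(device.type)
            print(f"  {kind}: {device.name} ({device.max_compute_units} compute units)")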
April 3, 2012 3:23:50 AM

yezster: I hope AMD/ATI will not completely sink out... Or else we will have Intel and Nvidia monopolizing the market and it will be BAD news for consumers... and I mean BAD news!!!

Maybe not. If PC gamers have just one GPU architecture, maybe programmers will be able to optimize games and the PC can get the same optimization that the weak console gets.
Score: -7
April 3, 2012 3:51:07 AM

Swolern: Maybe not. If PC gamers have just one GPU architecture, maybe programmers will be able to optimize games and the PC can get the same optimization that the weak console gets.

and the same (over)price as weak a$$ consoles
Score: 1
April 3, 2012 4:33:06 AM

Restructuring is good if the management has become old and stale, but this guy seems to have the cred... why AMD let him go boggles me.
Score: 0
April 3, 2012 4:33:56 AM

The PC has a standard; it's called OpenGL. It works across dozens of CPUs/GPUs, but the problem is everyone keeps buying into MS. Hate to break this news to everyone, but: 1) MS doesn't use its own OS to manage its company (it uses IBM's AS/400 (i5) series). 2) MS didn't write NTFS; it came from IBM along with the NT KERNEL (formerly known as OS/2 Warp). 3) There hasn't been a major upgrade to the kernel since Windows NT... seriously. 4) The Windows 7 kernel is a lightweight Windows Vista kernel (same drivers, same registry, blah blah). 5) The XBOX doesn't run Windows.

Stop buying games that are Direct3D only and maybe, just maybe, there is a future that is enjoyable. Otherwise, all this AMD vs NVDA is pure crap.

PS: AMD/ATI still can't write drivers; even after 10 years they still suck b*lls at it. That isn't going to change, which is why NVDA will ALWAYS be the first choice of developers.
Score: -1
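
For anyone curious what the "works across dozens of CPUs/GPUs" claim means in practice, here is a minimal vendor-neutral OpenGL sketch using PyOpenGL's GLUT bindings; it assumes PyOpenGL and a GLUT/freeglut library are installed, and the window title and clear color are arbitrary:

    # A minimal cross-vendor OpenGL window. The same code path runs on
    # Nvidia, AMD, and Intel drivers alike, which is the portability point.
    from OpenGL.GL import glClear, glClearColor, GL_COLOR_BUFFER_BIT
    from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutCreateWindow,
                             glutDisplayFunc, glutSwapBuffers, glutMainLoop,
                             GLUT_DOUBLE, GLUT_RGB)

    def display():
        glClearColor(0.1, 0.1, 0.1, 1.0)   # dark grey background
        glClear(GL_COLOR_BUFFER_BIT)
        glutSwapBuffers()

    glutInit()
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)
    glutCreateWindow(b"Vendor-neutral OpenGL")
    glutDisplayFunc(display)
    glutMainLoop()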
April 3, 2012 4:36:18 AM

Oh well, this sucks... I wish ATI had never sold to AMD... it would have been better to sell 50 percent of the company to nVidia, keep the other 50, and concentrate on the mobile market to compete with nVidia's mobile graphics.
Score: -3
April 3, 2012 4:52:22 AM

I hate Nvidia's business practices, and I won't go into detail, but in the past they royally screwed me and I won't forgive them.

This is just me, but if AMD's GPU division died, I would go Intel before I go Nvidia; that's how deep my hate goes.
Score: 2
April 3, 2012 6:10:54 AM

alidan: I hate Nvidia's business practices, and I won't go into detail, but in the past they royally screwed me and I won't forgive them. This is just me, but if AMD's GPU division died, I would go Intel before I go Nvidia; that's how deep my hate goes.

Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the reseller. But hey, why not hate an entire company.
Score: 3
April 3, 2012 6:16:51 AM

sp0nger: Does surprise me, AMD will always be #2 to Nvidia as long as they keep playing catch up. "AMD believes its New Zealand and 'Sea Islands' architectures will be a potent challenge for Nvidia's GK104" Always playing catch up. Their next cards come out just barely better than Nvidia's year-old cards, then Nvidia releases their next gen and puts AMD way behind again.

The 7970 is right behind the GTX 680 most of the time and it beats the 580 handily (by more than it loses to the 680), so no, AMD isn't a generation behind like you imply. Besides that, the only reason Nvidia is winning with GK104 like this is that AMD made GCN with compute in mind (expending very large fractions of the die size and power consumption to accomplish this) and AMD also clocked their cards way too low.

Also, the only reason the GTX 580 was so much faster than the 6970 (the difference was actually fairly small) is that its GPU was almost twice as large. Yeah, 530mm2 compared to 375mm2 or so; what do you think will win if they're on the same process node? Nvidia, for whatever reason, decided to abandon their compute performance (the 680 loses in DP compute to even the GTX 470 and 560 Ti) so they could focus on gaming throughput only.

The problem with this approach is that more and more games are becoming reliant on compute performance, so the 7970 is a more future-proofed option. The 7970 also has more memory and more memory bandwidth, so it's a more future-proofed option for that reason too. In fact, the GTX 680's memory bandwidth bottleneck, coupled with its poor compute performance, is what causes it to lose to the 7970 in some games despite supposedly being the faster card. Of course, games often favor Nvidia or AMD, so this is nothing new, but all Nvidia had to do to fix this was improve compute a little (just going from a 1:24 to a 1:12 DP-to-SP ratio would have let the 680 match the 580 in DP performance) and give the 680 a 384-bit bus.

AMD wanted to hammer Fermi in compute, and that they do. The 7970 can run over three times faster in DP than the 580, and I don't remember how many times faster in SP (the 680 is okay for SP compute, but most compute work is DP, and it still loses by a huge margin in SP anyway).
Score: 8
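
To put rough numbers on those DP-to-SP ratios, here is a back-of-the-envelope sketch. The shader counts, clocks, and DP rates are the commonly cited reference specs and are included only as illustrative assumptions; theoretical peaks ignore real-world limits such as memory bandwidth.

    # Peak throughput estimate: 2 FLOPs per shader per clock (fused multiply-add),
    # scaled by each card's DP:SP rate. Reference specs are approximate.
    cards = {
        #           (shaders, core clock in GHz, DP:SP rate)
        "GTX 580": (512,  1.544, 1 / 8),    # hot-clocked shader domain
        "GTX 680": (1536, 1.006, 1 / 24),
        "HD 7970": (2048, 0.925, 1 / 4),
    }

    for name, (shaders, clock_ghz, dp_rate) in cards.items():
        sp_gflops = 2 * shaders * clock_ghz    # single-precision peak
        dp_gflops = sp_gflops * dp_rate        # double-precision peak
        print(f"{name}: ~{sp_gflops:.0f} SP GFLOPS, ~{dp_gflops:.0f} DP GFLOPS")

On those rough figures, GK104's 1:24 rate leaves the GTX 680 behind even the GTX 580 in double precision, while Tahiti's 1:4 rate puts the 7970 several times ahead, which is the gap the comment above describes.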
April 3, 2012 7:19:28 AM

He's probably going to work for nVidia, looking for a better salary or something else... o.O
Score: 0
April 3, 2012 7:44:18 AM

mister g: Question: why would you want the graphics chip integrated with your main CPU on one huge chip? Server chips are built with reliability in mind, and the reliability of integrated CPU and GPU chips all in one package still hasn't been fully tested yet. Those types of APUs seem more of a fit for consumer devices.

Well, if reliability is your question, it still doesn't explain why AMD didn't bundle Opteron and GCN together as a stronghold for the server market. Now we've got Nvidia laughing their asses off with Tesla, and we've got Intel smiling with 90% CPU market share. Tell me, what good is AMD if they've got a gun they didn't put to good use?
Score: 1
April 3, 2012 8:15:40 AM

AMD worked out that integer performance was far more important in the server world than floating-point performance, hence the significant boost in integer resources with Bulldozer, which does actually pay off where it's required. Intel CPUs are so damned good at integer calculations, and they happen to own the vast majority of the server market without having any onboard GPUs. So, why would AMD want to throw a GPU into its server CPUs?

Notwithstanding the fact that this wouldn't be good whatsoever for power usage... something AMD is struggling with thanks to the first-gen 32nm process at GF.
Score: 4
April 3, 2012 8:51:01 AM

stm1185: So the graphics guys who came up with good products are leaving, yet the CPU guys who made failure after failure are still there.

This world's awesome logic :L
Score: 0
April 3, 2012 10:03:53 AM

Ok, it sounds like AMD lost a manager, not an engineer or developer; a coordinator (and maybe a good one), but not a creator. Minor hitch maybe, but I see no great loss here. Moving right along...
Score: 4
April 3, 2012 1:17:37 PM

vrumor: Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the reseller. But hey, why not hate an entire company.

Before Windows 7 came around, Nvidia had 3D Vision... I was willing to spend as much money as I had to to play games in 3D; I played once with a head mount and a game built for it and have always wanted to play that way again...

So I'm ready to buy a new monitor, a top-end GPU, and a 3D kit from them... double-check everything... wait... what's this... Vista only... no XP version... they managed to crush my hopes, my dreams... in one fell swoop.

Then there are the business practices: in TWIMTBP games, they lock code and force ATI to get a workaround going, artificially making the game look worse on their cards. Not sure if this still goes on.

Making PhysX an Nvidia-only thing: they could easily have ported the code to x86 and allowed it to be better on computers, and sold the PhysX code as the best physics tool available (currently it is). But no, to make their cards look better they gimp the code to x87, it runs like crap on the CPU, makes their cards look better, and screws over half the gaming market on the PC.

Let's also not forget the drivers that burnt out cards.

I'll say it again: many reasons for me to despise Nvidia.
Score: 0
April 3, 2012 1:25:41 PM

vrumor: Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the reseller. But hey, why not hate an entire company.

Not true if his beef is with the mobile Nvidia cards. The debacle with the mobile 8600GT that led to a major class action against Nvidia still leaves a bad taste in my mouth, especially since only a few of the big-name companies were actually part of that class action and others just got screwed over.
Score: 2
April 3, 2012 1:28:19 PM

stm1185: So the graphics guys who came up with good products are leaving, yet the CPU guys who made failure after failure are still there.

I don't know if I should cry or laugh about this comment, it's so true.

I really hope AMD does something!!!!!! And by something, I mean SOMETHING GOOD!
Score: 0
April 3, 2012 1:34:23 PM

If the Intel fanboys get their way and AMD goes under (not going to happen), they'll complain that prices are too high. Hey, don't get me wrong, I would love to have an i5-2500, but I'm not going to pay 2x as much for a 10-20% performance increase.
Score: 1
April 3, 2012 2:13:33 PM

antilycus: 2) MS didn't write NTFS; it came from IBM along with the NT KERNEL (formerly known as OS/2 Warp).

Must set this one straight: NT most certainly did not come from OS/2 Warp, or any other OS/2. That becomes apparent if you have ever used both. If memory serves, NT has its roots in System V.
Score: 1
April 3, 2012 3:46:16 PM

Sounds like poor management that doesn't listen to those who know their stuff. Pretty sad that they're so delusional that they let their people quit before conceding.

Let's see how their future reflects these decisions.
Score: 1
April 3, 2012 4:47:43 PM

These guys were in charge; all the talent that actually made things work is still there. They'll be fine.
Score: 0
April 3, 2012 5:27:42 PM

Yep, 'cause he knows it's all geared towards shit consoles that don't need high-end GPUs; this market is doomed.
Score: 0
April 3, 2012 6:23:04 PM

Get rid of all the former ATI people who had anything to do with the drivers; they are all worthless.
Score: -1
April 3, 2012 6:31:54 PM

alidan: Before Windows 7 came around, Nvidia had 3D Vision... I was willing to spend as much money as I had to to play games in 3D; I played once with a head mount and a game built for it and have always wanted to play that way again... So I'm ready to buy a new monitor, a top-end GPU, and a 3D kit from them... double-check everything... wait... what's this... Vista only... no XP version... they managed to crush my hopes, my dreams... in one fell swoop. Then there are the business practices: in TWIMTBP games, they lock code and force ATI to get a workaround going, artificially making the game look worse on their cards. Not sure if this still goes on. Making PhysX an Nvidia-only thing: they could easily have ported the code to x86 and allowed it to be better on computers, and sold the PhysX code as the best physics tool available (currently it is), but no, to make their cards look better they gimp the code to x87, it runs like crap on the CPU, makes their cards look better, and screws over half the gaming market on the PC. Let's also not forget the drivers that burnt out cards. I'll say it again: many reasons for me to despise Nvidia.

Your reasons for hating Nvidia over bad drivers are well founded. This ATI guy that got axed was one of the guys who helped release all the terrible drivers for ATI before AMD bought ATI up and straightened out most of the mess ATI was.
As far as your 3-D argument goes, I'm laughing in your face because NOBODY has been able to produce a real working 3-D imaging solution for gaming or watching movies. Those red-blue glasses or shuttering are not 3-D; if it's displayed on a 2-D screen it's still 2-D.
Any more of this 3-D rubbish and I am liable to buy a steamroller, run over everything and anyone saying 3-D, and put them in a picture frame just to show them what this image they claim to be 3-D really is.
Score: 0
Anonymous
April 3, 2012 8:18:41 PM

Today AMD needs to make waves by entering the mobile graphics market with a killer low-watt APU. For all the talk of Bulldozer, there is Tegra 3!!! Hello! McFly!!!
Score: 0
April 3, 2012 10:03:18 PM

olaf: and the same (over)price as weak a$$ consoles

Yes, but then the weakest CPUs and graphics cards will cost more than the Core i5-2500K or the GTX 580, since they would have a monopoly. You NEED competition to keep prices down.
Score: 0
April 4, 2012 12:27:39 AM

f-14: Your reasons for hating Nvidia over bad drivers are well founded. This ATI guy that got axed was one of the guys who helped release all the terrible drivers for ATI before AMD bought ATI up and straightened out most of the mess ATI was. As far as your 3-D argument goes, I'm laughing in your face because NOBODY has been able to produce a real working 3-D imaging solution for gaming or watching movies. Those red-blue glasses or shuttering are not 3-D; if it's displayed on a 2-D screen it's still 2-D. Any more of this 3-D rubbish and I am liable to buy a steamroller, run over everything and anyone saying 3-D, and put them in a picture frame just to show them what this image they claim to be 3-D really is.

You've never played a game with a head-mounted display, or you've never played a game in 3D that was optimized for 3D.

Let me put it another way: in real life I have no depth perception, don't know why, but I can't judge distance at all. Yet with any style of 3D, be it red-blue, magenta-green, shutter, passive, or head mount (two monitors, one for each eye), I can see it, and I would pay damn near anything for that. After the Nvidia thing though... I decided to hold off until either a head mount or a universal standard emerges.
Score: 0
April 4, 2012 6:19:52 AM

blazorthon: The 7970 is right behind the GTX 680 most of the time and it beats the 580 handily (by more than it loses to the 680), so no, AMD isn't a generation behind like you imply. Besides that, the only reason Nvidia is winning with GK104 like this is that AMD made GCN with compute in mind (expending very large fractions of the die size and power consumption to accomplish this) and AMD also clocked their cards way too low. Also, the only reason the GTX 580 was so much faster than the 6970 (the difference was actually fairly small) is that its GPU was almost twice as large. Yeah, 530mm2 compared to 375mm2 or so; what do you think will win if they're on the same process node? Nvidia, for whatever reason, decided to abandon their compute performance (the 680 loses in DP compute to even the GTX 470 and 560 Ti) so they could focus on gaming throughput only. The problem with this approach is that more and more games are becoming reliant on compute performance, so the 7970 is a more future-proofed option. The 7970 also has more memory and more memory bandwidth, so it's a more future-proofed option for that reason too. In fact, the GTX 680's memory bandwidth bottleneck, coupled with its poor compute performance, is what causes it to lose to the 7970 in some games despite supposedly being the faster card. Of course, games often favor Nvidia or AMD, so this is nothing new, but all Nvidia had to do to fix this was improve compute a little (just going from a 1:24 to a 1:12 DP-to-SP ratio would have let the 680 match the 580 in DP performance) and give the 680 a 384-bit bus. AMD wanted to hammer Fermi in compute, and that they do. The 7970 can run over three times faster in DP than the 580, and I don't remember how many times faster in SP (the 680 is okay for SP compute, but most compute work is DP, and it still loses by a huge margin in SP anyway).

Can you elaborate on why you think the 7970 is better than the 680 in more detail, please?
What's compute performance? Is this where the GPU is also used for work that a CPU would usually have to do?
How would this apply to games? What games are using this now?

I'm saving up for a graphics card and my candidates are either the 7970 or the 680.
My current card, a 9600 GT, has lasted me 4 years, and I spent $160 on it.
I'm hoping that if I drop $550 on one of those cards I can get another 4 years out of my new rig.
I plan to be playing a lot of first-person shooters, including BF3.

So what's the scoop?
Score: 0
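
Since the question above asks what "compute performance" actually is: it means running general-purpose work on the GPU's shaders instead of the CPU. Here is a minimal sketch with PyOpenCL, assuming the pyopencl and numpy packages and a working OpenCL driver; the kernel just adds two arrays, the classic hello-world of GPGPU:

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # Element-wise vector add executed on the GPU's shader cores.
    program = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)

Games apply the same idea to things like physics, particle simulation, and post-processing effects, which is why the compute throughput discussed earlier in the thread matters beyond pure rendering.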
April 9, 2012 7:40:24 AM

horaciopz: I don't know if I should cry or laugh about this comment, it's so true. I really hope AMD does something!!!!!! And by something, I mean SOMETHING GOOD!

I just can't take a person seriously when he/she writes like this: "It's so truuee!!!!!" To this day, Tom's has yet to bring up any facts showing that Bulldozer is a "failure".

Quote:
Features & Benefits
Experience the world’s first native 8-core desktop processor.
Overclock for a big boost in performance and speed1.
Perform mega-tasking and get pure core performance with new CPU architecture.
Get an extra burst of raw speed when you need it most with AMD Turbo CORE Technology.
Push your performance with tuning controls in the easy-to-use AMD OverDrive™ software1.
Enjoy stable, smooth performance with impressive energy efficiency thanks to a 32nm die.


Where does it say it'll be any match for Intel in gaming? Nowhere. Tom's is the largest site that hyped the FX before it was out, with all the b*lls*it about it being twice as powerful when AMD never actually said that, so stop the hate towards it. Bulldozer is a server CPU brought to the consumer market, and they are fixing the flaws with Piledriver. It's just funny to see how many of you think you know it all but actually just make fools of yourselves.
Score: 0
April 10, 2012 6:40:02 PM

You do have to smirk when you read about the "impressive energy efficiency", though.
Score: 0