Talking Heads: VGA Manager Edition, September 2010


Guest

Guest
Who knows? By 2020, AMD will have purchased Nvidia and renamed GeForce to GeRadeon... and there'll be talk about integrating the RAM, processor, graphics, and hard drive into a single chip and naming it "MegaFusion"... But there will still be Apple selling the Apple TV without 1080p support, and yeah, free bumpers for your iPods (which won't play songs if touched by hands!!!)
 

Kelavarus

Distinguished
Sep 7, 2009
510
0
18,980
That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss the fact that, most likely, AMD will already be rolling out its new line by the holidays. Won't that have any effect on Nvidia?
 

Guest

Guest
The problem I see is that while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. As the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program that allows it to be used when not doing 3D work, such as a RAM drive or pooling it in with system RAM? (A rough sketch of what I mean is at the end of this post.) Similarly with GPUs: we were promised years ago that PhysX would lead to amazing advances in AI and game realism, yet it simply hasn't appeared.

The anger that people showed towards Vista and its horrible bloat should be directed at all the major software companies. None of them have achieved anything worthwhile in a very long time.
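
Just to make the idea concrete, here is a minimal sketch, assuming an Nvidia card and the CUDA runtime C API (the 256MB figure is an arbitrary example). It only parks a buffer in video memory and reads it back; a real VRAM "RAM drive" would need an actual driver, which nobody has shipped:

[code]
/* Illustrative sketch only: parking data in otherwise-idle video RAM through
 * the CUDA runtime C API (link against libcudart). Not a real RAM drive, just
 * proof that the round trip is possible. The 256 MB size is an example value. */
#include <cuda_runtime.h>
#include <stdio.h>

int main(void)
{
    const size_t size = 256u * 1024u * 1024u;   /* 256 MB of "spare" VRAM */
    void *vram = NULL;

    if (cudaMalloc(&vram, size) != cudaSuccess) {
        fprintf(stderr, "could not reserve VRAM\n");
        return 1;
    }

    char msg[64] = "data parked in video memory";
    char back[64] = {0};

    /* Push host data into the GPU's memory, then pull it back later. */
    cudaMemcpy(vram, msg, sizeof msg, cudaMemcpyHostToDevice);
    cudaMemcpy(back, vram, sizeof back, cudaMemcpyDeviceToHost);

    printf("read back: %s\n", back);
    cudaFree(vram);
    return 0;
}
[/code]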
 

corser

Distinguished
Sep 3, 2010
2
0
18,510
I do not think that including an IGP on the processor die and connecting them means that discrete graphics vendors are dead. Some people will have graphics requirements that overwhelm the IGP and will connect an 'EGP' (External Graphics Processor). Hmmmm... maybe I just created a whole new acronym.

Since the start of that idea, I have believed that an IGP on the processor die could serve to offload math operations and complex transformations from the CPU to the IGP, freeing CPU cycles for doing what the CPU is intended to do (see the sketch at the end of this post).

Many years ago, Apple did something similar to this with their Quadra models, which sported a dedicated DSP to offload some tasks from the main processor.

My personal view on all this hype is that we're moving to a different computing model: from a point where all the work was directed to the CPU, we are taking small steps so that specialized processors around the CPU do part of the CPU's work (think of the first fixed-function graphics accelerators, sound cards that off-load the CPU, PhysX, and others).

From a standalone CPU -> SMP -> A-SMP (asymmetric SMP).
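
A minimal sketch of the kind of offload I mean, assuming an Nvidia GPU and their cuBLAS library (any GPGPU route would illustrate the same point; the array size and values are arbitrary examples):

[code]
/* Illustrative sketch only: "offloading" a math operation (y = a*x + y over a
 * large array) from the CPU to the graphics processor via Nvidia's cuBLAS.
 * Build roughly as: nvcc offload.c -lcublas */
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const int n = 1 << 20;                       /* one million elements */
    const float alpha = 2.0f;
    float *hx = (float *)malloc(n * sizeof *hx);
    float *hy = (float *)malloc(n * sizeof *hy);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 3.0f; }

    /* Copy the inputs into GPU memory. */
    float *dx, *dy;
    cudaMalloc((void **)&dx, n * sizeof *dx);
    cudaMalloc((void **)&dy, n * sizeof *dy);
    cudaMemcpy(dx, hx, n * sizeof *dx, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, n * sizeof *dy, cudaMemcpyHostToDevice);

    /* The actual math runs on the GPU, leaving the CPU free. */
    cublasHandle_t handle;
    cublasCreate(&handle);
    cublasSaxpy(handle, n, &alpha, dx, 1, dy, 1);
    cublasDestroy(handle);

    cudaMemcpy(hy, dy, n * sizeof *dy, cudaMemcpyDeviceToHost);
    printf("hy[0] = %f (expected 5.0)\n", hy[0]);   /* 2*1 + 3 */

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
[/code]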
 

silky salamandr

Distinguished
Sep 16, 2009
277
0
18,810
I agree with Scort. We have all this firepower sitting on our desks, and it means nothing if there's no software to utilize it. While I love the advancement in technology, I really would like devs to catch up with the hardware side of things. I think everybody is going crazy adding more cores, in an arms race for a marketing tick mark, but there are no devs stepping up to write for it. We have all invested so much money into what we love, but very few of us (not me at all) can actually code. With that being said, most of our machines are "held hostage" in what they can and cannot do.

But great read.
 

corser

Distinguished
Sep 3, 2010
2
0
18,510
Hardware always arrives well before software starts to take advantage of it. It has been like this since the start of computing.
 

jestersage

Distinguished
Jul 19, 2007
62
0
18,630
Awesome start to a very ambitious series. I hope we get more insights, and soon.

I agree with Snort and silky salamandr: we are held back by developments on the software side. Maybe that is because developers need to take backwards compatibility into consideration. Just take games, for example: developers like to keep the minimum and even the recommended specs down so that they can get more customers to buy. So we see games built for top-end hardware that, through tweaks and reduced detail, can still be played on a six-year-old Pentium 4 with a 128MB AGP card.

From a business consumer standpoint, and given that I work for a company that still uses tens of thousands of Pentium 4s for productivity purposes, I figure that adoption of the GPU/CPU in the business space will not happen until 5-7 years AFTER they launch. There is simply no need for an i3 if a Core 2-derived Pentium Dual-Core or Athlon X2 can still handle spreadsheets, word processing, email, research, etc. Pricing will definitely play into the timelines as the technology ages (or matures), but both companies will have to get the money to pay for all that R&D from somewhere, right?
 

smile9999

Distinguished
May 16, 2010
137
0
18,680
Great article, BTW. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major player in that field, and they are great at it. And if that wasn't enough, there are still the Nvidia and ATI offerings, so I don't think it will really shake the waters as much as they predict.
 

smile9999

Distinguished
May 16, 2010
137
0
18,680
[citation][nom]smile9999[/nom]Great article, BTW. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major player in that field, and they are great at it. And if that wasn't enough, there are still the Nvidia and ATI offerings, so I don't think it will really shake the waters as much as they predict.[/citation]

I meant the dedicated low-end GPUs from Nvidia and ATI.
 
I also agree with scort. Part of the cause, however, I believe, is Nvidia's unwillingness to coexist. Whether it's their x86/chipset issues or disallowing PhysX (minus 3rd-party hacks) if there's an AMD card in the system, I suspect developers are averse to putting a lot of work into something that may never be standardized. I think GPGPU could give us the AI needed for vastly more immersive experiences, but it won't happen until the hardware vendors agree on (and/or accept) some standards.
 

NotYetRated

Distinguished
Aug 30, 2010
68
0
18,630
Death of the graphics card business? No. Way. Not for a long, long time. The sound card only began to die because modern processors are in no way fazed by the hit sound processing puts on the system; sound cards have become almost obsolete for everyone except the audio enthusiast. (Though I am no enthusiast and still have an X-Fi...) We are nowhere near the point where CPUs can handle graphics without taking a very big hit. In fact, graphics cards themselves cannot keep up with the pace that software is pushing them to. I think it will be a long, long time before we begin to see the phase-out of dedicated graphics cards, at least for the gamers/engineers/creative professionals out there... if we ever even do see the phase-out.
 

hundredislandsboy

Distinguished
If I can still get the same gaming experience, the same visual quality with the same smooth, fast framerates from a CPU/GPU, then go ahead and pour huge investments into R&D to make this happen. Why? Because it's a win-win for everyone: users pay less, cases get less heat, electric bills go down, etc., etc...
 

Aerobernardo

Distinguished
Apr 2, 2006
135
0
18,680
I have two ideas for this kind of topic:

1 - Tom's could start a poll on which questions we want to ask.
2 - I kinda hoped to hear something about what's next. We know about NDAs, but it would be nice to hear whether AMD intends to adopt new features like a proprietary 3D solution, or what both AMD and Nvidia think a next-generation VGA should do (or which architecture it should have) to be successful from today's perspective.
 

LORD_ORION

Distinguished
Sep 12, 2007
814
0
18,980
If Nvidia can bring out the best $150 card by Black Friday, I think they will be fine.

Basically, they need something that blows the 5770 out of the water and costs the same or less... because AMD will lower the 5770 prices when that happens, so it can't perform on par.
 

rocky1234

Distinguished
Sep 9, 2008
130
0
18,680
I do not think they will be getting rid of standalone graphics cards any time soon. The reason is that if they did, it would set us back at least 10 to 12 years in performance. Maybe by 2017 they will have the tech that can do it, but right now it is not going to happen. Well, it could, I guess, if all the save-the-planet people have their way; look at what the automakers are starting to do, rolling out four-bangers in mid-sized cars. GM just said that is what they plan to do, so I guess if enough pressure is applied, anything can happen. Which is too bad, because all it does is affect consumers: we get less and pay more for that less.
 

porksmuggler

Distinguished
Apr 17, 2008
146
0
18,680
Very good premise for an article, and an excellent write-up on the companies involved. I haven't read any articles from Andrew Ku before; is Tom's keeping the one guy who can spell hidden away? Beware, he may get criticism from the TL;DR crowd, though. Those were great questions, but let's be honest (*cough* NDAs), the answers were generic PR we already know. Without Ku's analysis this series would sink, so thanks for putting the legwork into this.

oh, and the 100% pie chart for question 1 is too funny.
 

kilthas_th

Distinguished
Mar 31, 2010
40
0
18,530
High phone bill? If only there were some way to have our conversations or even conferences routed effectively and cheaply over the internet ;)
 

bin1127

Distinguished
Dec 5, 2008
736
0
18,980
They talked about power restrictions on a single chip preventing the gap with discrete from closing. Isn't that just an engineering or design problem, due to a lack of intent to integrate everything on a single chip?

The whole motherboard could be redesigned to deliver 800W to the CPU socket if necessary.
 
[citation][nom]bin1127[/nom]They talked about power restrictions on a single chip preventing the gap with discrete from closing. Isn't that just an engineering or design problem, due to a lack of intent to integrate everything on a single chip? The whole motherboard could be redesigned to deliver 800W to the CPU socket if necessary.[/citation]
The problem is that a given piece of silicon can only dissipate so much heat, regardless of which functions are creating it, and this remains true no matter how much you shrink the process. To spread that heat over a larger area, you either grow the die to the point where the likelihood of manufacturing defects rises dramatically (e.g., look at Fermi's yields), or you use separate dies and/or chips; i.e., you break some functions out, in this case into a discrete graphics solution. Unless and until heat dissipation becomes a non-issue, that is why the hybrid solution will always remain at least one generation behind.
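
A back-of-envelope illustration, using rough figures that are only assumptions (roughly 250W over about 500mm² for a big discrete GPU, roughly 130W over about 300mm² for a quad-core CPU):

[code]
/* Back-of-envelope illustration of the power-density argument above. The TDP
 * and die-size figures are rough, assumed numbers, not official specs. */
#include <stdio.h>

int main(void)
{
    const double gpu_watts = 250.0, gpu_area_mm2 = 500.0;  /* big discrete GPU (approx.) */
    const double cpu_watts = 130.0, cpu_area_mm2 = 300.0;  /* quad-core CPU (approx.)    */

    double gpu_density = gpu_watts / gpu_area_mm2;          /* ~0.5 W per mm^2 */
    double fused_watts = gpu_watts + cpu_watts;             /* 380 W on one die */
    double needed_area = fused_watts / gpu_density;         /* area at the same density */

    printf("GPU power density : %.2f W/mm^2\n", gpu_density);
    printf("Fused CPU+GPU load: %.0f W\n", fused_watts);
    printf("Die needed at that density: ~%.0f mm^2\n", needed_area);
    /* ~760 mm^2: far beyond what yields economically, so the fused part must
     * shed power (and performance) instead, hence the generation gap. */
    return 0;
}
[/code]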
 

Tijok

Distinguished
Jun 28, 2010
24
0
18,520
Great article, and I love the idea for this as a continuing series. I totally agree with many of the above posters; dedicated graphics aren't going anywhere.

If I were to use this information to cast further into the future, I think that corser had it about right: our computers are becoming more and more discretized, and software will put this to use much more readily, allowing for a single processor that works well for Photoshop using only the IGP but that, when paired with an appropriate discrete card, splits processing between the units and allows a smooth gaming experience.

I'm all for this; more parts means easier and more modular upgrades instead of the one-size-fits-all approach, and it also allows for cheaper and easier repairs!
 