
Miracles happen, GT300 tapes out!

Posted in Graphics & Displays
July 31, 2009 12:48:17 AM

http://n4g.com/industrynews/News-368685.aspx

It made me laugh too :) Nice read. I've been busy with work, so I don't know if someone else already put it up, but here it is.

Enjoy!
July 31, 2009 3:29:07 AM

Sorry guys, but you really are wishing, kind of like wishing the tooth fairy was real when you knew it was all a lie.

There is nothing on G300 coming out of Nvidia. No comment, with Windows 7 just around the corner and inquisitive stockholders asking what the lineup will be. AMD, on the other hand, is coming right out and saying it: the competition is nowhere to be seen on DX11.

There will be no Nvidia DX11 part this year.
July 31, 2009 3:33:24 AM

Jenny, you and Charlie have something in common at least ;)

I'm not dreaming about anything. Frankly, I don't care as it is; my next upgrade probably won't even be PC based, since I'm on the go most of the time. It'll probably be a laptop.

I mainly posted it for the Charlie mini-bash.

I think people need to stop both overhyping and underestimating. Look what happened with the 4000 series: people were expecting another 2900 XT incident, and then ATI surprised us and made us eat our words.

Frankly, I think both companies will be too close to make a choice on performance alone; it'll just be a choice between A) brand color and B) system setup.
July 31, 2009 7:36:06 AM

Yeah, it seems the 4890 does handle the 285 in the high-end models.
The current hype is that this gen may remind us of the 2900, as they're doing it again. One thing is certain: both were delayed by TSMC's inability to get decent yields at 40nm. ATI got their foot in the door early with a very promising card, only to suffer the 40nm woes.
What ATI gleaned from that experience will move them ahead with their new gen. nVidia appears to be two months behind. Traditionally, nVidia doesn't launch its high end on a new node, so the delay of their 40nm parts hurts. Add to that their new arch, which will itself take more time (adding tessellation, full DX10 plus DX11 capability into their cards could take weeks), plus their overall comfort zone before they feel it's ready to go, and it could slip to three months out.
So I could conceivably see nVidia with a handful of new-gen parts by the end of the year, with the ramp just starting to happen and availability in January 2010.
July 31, 2009 7:49:32 AM

I agree with Liquid.
I realise it's quite a wait until new hardware hits the shelves, and everyone, myself included, is clamoring for info on what exactly is coming.
However, I think we may be getting to the point where we are starting to muddy the waters with all the info/speculation coming out.
I know it's hard, but I think it's wait-and-see time.

As far as this Charlie and Theo issue is concerned, I'm still of the same opinion as I was the other night when JD posted about it.

As Theo has said, you don't air your dirty laundry in public. If Theo is guilty of what's alleged, then I fully understand where Charlie is coming from, but surely this can be dealt with internally?
It seems to me the whole basis of the thing comes down to Charlie believing it's more likely that Theo would rip off his work than that his source would double-deal him; that, and that two different people are unlikely to mishear or mistranscribe an article the same way. It all sounds very dodgy, granted, but it's hardly a solid case either way.

Mactronix
July 31, 2009 8:18:20 AM

To have put up the same misinformation someone else got wrong could be coincidence. As to the "internally" part: Charlie did chase down all avenues and leads, and they all pointed back to Theo. I don't see anyone at the Inq or at TR disputing these things.
Theo did change his misinformation back on his page, which could be correcting himself or covering himself. But Charlie went to Theo, talked to him, and explained how it'd go, so you'd have thought Theo would keep Charlie abreast of what was going on when he made those changes, for the simple fact that having Charlie for corroboration and support would help Theo. It would basically put Charlie back on Theo's side of the court.
These people aren't bulletproof because of their profession, and I say, once fired, let the arrow land where it will, as that just may make for better journalism.
I'm tired of the respect the media has for itself when "one of theirs" stumbles, yet has no problem with any number of scenarios: a raped teen and family, a family dealing with the loss of a child kidnapped or even found murdered. Theo deserves nothing more than you or me, or else the press needs to lighten up on everyone, not just themselves, when there's blood in the water.
As for wait-and-see time, I look at it this way: the info I've gathered makes a certain sense to me, there are always more leaks coming, and as those leaks come they shape that info overall in a certain direction, being careful to weed out what doesn't fit.
There's plausibility in a lot of things, rumor-wise, but having knowledge of certain things helps separate what's rumor only from what's potential.
I've heard this direction, whatever it is, from ATI is a big leap. I posted it in my R800 pic thread. This comes from ATI itself. If that means multi-chip, that fits. If it's simply inter-chip communication that's very helpful, that also fits. If it's simply the compute shaders and how they do things overall (and there's a lot of potential in them), that also fits.
At this point there is no certainty, only possibilities, with more info coming in, eliminating one possibility at a time, leading to a true direction.
July 31, 2009 12:49:04 PM

You're giving Demerjian way too much credit by calling him a journalist. Is that article linked above news? Or is it opinion? Blatant advocacy? Facts or just speculation? I would no more trust Demerjian than I would trust Newegg reviews, especially where it concerns his own paychecks or another "journalist's" accuracy.
July 31, 2009 1:30:38 PM

Whether it's speculation or fact, it's still a good read, especially since it's an anticipated release from Nvidia, just like anyone would jump on an ATI release of info.

Since all the cards are going to boast the same technology (DX11), I'll assume the winner won't be decided on highest performance, but on P/P (price/performance), especially during this recession. So, in my opinion, I would say ATI is better known for creating such a card, but who knows, new cards can give hope to both companies.

Remember, ATI's 2000 series failure made it work extra hard on its next series; maybe Nvidia can do the same.

Who knows, but any little information/speculation at the moment is welcome, especially since those cards are quite a while away.
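The price/performance argument above can be made concrete with a toy comparison. All the card names, prices, and frame rates below are made up for illustration; they are not benchmark results.

```python
# Toy price/performance (P/P) comparison. Names and numbers are
# hypothetical, not benchmarks.
cards = {
    "Vendor A high-end": {"price": 400.0, "fps": 100.0},
    "Vendor B mid-high": {"price": 250.0, "fps": 80.0},
}

for name, c in cards.items():
    pp = c["fps"] / c["price"]  # frames per second per dollar spent
    print(f"{name}: {pp:.3f} fps/$")
```

With these made-up numbers, the slower card still wins on P/P: 80/250 = 0.32 fps/$ beats 100/400 = 0.25 fps/$, which is the whole point of competing on value rather than the absolute crown.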
July 31, 2009 2:08:32 PM

Well, it appears TSMC has doubled their yields on 40nm. Funny how the 2900 is maligned to this day as ATI's fault. There was the M$ DX10 switchover, and TSMC's inability to make silicon that wouldn't need 1000 watts run through it while it heated your house.
Funny how it all changed when it hit 65nm, using basically the same arch, and again using an even larger die on a smaller process of the same arch at 55nm, where it became the first arch to hit 1 GHz.
To me, not knowing how it happened and blaming ATI anyway is worse than any tabloid writing, 'cause if Charlie's doing it and he's totally wrong, at least he's getting paid heheh
July 31, 2009 2:28:59 PM

Well, that site you linked, Liquid, called CD a "major ATI/AMD fanboy", which is complete garbage. He isn't an ATI/AMD fanboy just because he hates Nvidia. If anything, he's more of an Intel fanboy.
July 31, 2009 2:55:56 PM

What I find interesting is people who haven't followed along with these things, yet have set their minds as to who did what and why, who's to blame, who's biased, etc.
The guy that just recently left nVidia? Well, Charlie wrote a story on him and his departure from nVidia. It wasn't a bad story, and he did mention a few unfortunate incidents, like the other guys ignoring him, though he was entertaining other people at the time, and Charlie wrote that up as a possible reason why. But it's the same guy who got into a physical scuffle with Charlie, and Charlie would have the right to disagree with him to an extent after such an event, yet that wasn't shown in what he'd written, and especially not in his own forums, where he made it clear from the beginning that no one was to malign this person.
If you think renaming parts doesn't affect people, that's fine; it's your opinion and nothing more, just like Charlie's opinion that it's wrong for nVidia to do such things, making your opinion just as valid as Charlie's.
If you think Assassin's Creed removing DX10.1 from the game was OK, while Anandtech disagrees with what happened, fine.
If you think the way the 285/295 was paper launched was OK, while again Anand prefers cards be available to the public before writing about them, again fine.
If this scenario happens again with the G300, and that's OK in your book, others might not find it OK.
If you can ignore the faulty chips in notebooks, not being one of those affected, and feel it was OK, those affected by it might not agree with your opinion.
Maybe some people's opinions and Charlie's aren't the same, but adding all that up and calling it OK without wincing at least a little, while declaring who is biased, is tantamount to the pot calling the kettle black. Or maybe not.
August 1, 2009 3:59:13 AM

Chill, jaydee man. I just said the series was bad; I didn't say it was because of ATI. Whether it's their fault or not, I don't care. The fact is it wasn't all that great.

It's like saying I won the lottery and either the ticket was stolen or I blew all the money on pointless stuff: either way the money is gone, and that's what I was pointing out.

Geez, ATI people are really sensitive :p :D

Frankly, at this point I don't care if ATI and Nvidia stroke each other's egos.

Come DX11, what card will I buy? Probably ATI. Why, you say?

A) Because I've been with Nvidia for 2 years, and since both are on pretty much an equal playing field (as far as we can see), I would love to make a change.

B) Morally, I like innovation over greed.

C) Well, I started off as an ATI person, so I might as well switch back to my roots.

What do I see in the future? If patterns hold, Nvidia will have the highest performance at the highest price, while ATI will match or come in slightly lower for a much lower price.

We'll have to see.

Oh, and point D) now I can have both Nvidia and ATI cards in my system ;) two ATI mains and an Nvidia PhysX card. LOL, I have no idea if that's possible; I've been out of the loop for a looooong time.

Anyways, good night, gents and gentles.

August 1, 2009 4:12:16 AM

What's to chill about? I just pointed to most of the reasons someone, or representatives of a larger entity (à la Anandtech), would find displeasure with nVidia's conduct recently, while forgetting about the past, which also has its own story. I just think it's simple to understand why some people feel this way.
I think you're wrong about the halo this time. I have my theories, derived from rumor, and if they're true, we will see a sort of repeat of the 280 vs the 4870x2, but this time there may be a surprise in store.
August 1, 2009 12:44:02 PM

So, another dual-card-faster-than-a-single-card situation? If ATI does that, I'm not buying it, I can tell you right now. I'm done with dual-GPU cards.

I want regular single-slot, single-GPU cards. What I'm worried about at the same time is that this is probably officially an AMD card, seeing as the 4000s were the last designed by ATI itself. So I hope they know what they are doing.
August 1, 2009 1:07:38 PM

So, the 4890 as tested here http://www.techpowerup.com/reviews/ATI/HD_4890_CrossFir...
shows you're willing to give up maybe 30% more perf because it loses 8% in scaling?
A lot of the better ideas have come from the same people who are still there running the show, the ones who helped form the 4xxx series, which is AMD as we know it.
Scaling has gone through the roof, and the best CF/SLI perf seen is the 4890 with 1 GHz cores as the top multi-card performer, beating out the 295 and the 285 in SLI.
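The trade-off being argued here can be sketched with toy numbers. These are made up for illustration, not benchmark data: even with imperfect scaling, two cheaper cards can come out well ahead of one top single-GPU card.

```python
# Toy numbers only; not benchmark results.
single_top = 100.0   # fastest single-GPU card, arbitrary perf units
mid_card = 71.0      # one mid/high card on its own
scaling = 0.92       # dual-card efficiency: "loses 8% in scaling"

dual_total = mid_card * 2 * scaling   # effective dual-card performance
gain = dual_total / single_top - 1    # gain over the single card
print(f"dual: {dual_total:.1f} vs single: {single_top:.0f} "
      f"({gain:.0%} faster despite the scaling loss)")
```

The point of the post: an 8% scaling penalty is a small price when doubling up still lands you roughly 30% ahead of the fastest single card.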
August 1, 2009 1:11:02 PM

Well, when I read in the newspaper about ATI (a Canadian company) being bought out by AMD, they had massive layoffs of many of the engineers, and what's even sadder, one of them was one of the heads (a friend of my friend). He always gave me massive discounts on cards; I remember the X1900 AIW was around 350, and he grabbed it for around 125-130, I can't remember exactly.

I still drive by the HQ in Canada; the logo is still there. :p

Oh, and I'm sure they won't have their CPU buffs doing it, but at the same time it won't be the reliable people we had for the X1800/1900/9800s, blah blah.

I remember TGGA telling me that the 4000s were the last official design done by the former ATI. Don't hold me to that, though; it's what I remember. Hopefully he sees the thread and comments.

@jaydee: No, I'm giving up the 30% because I don't want driver dependency and awkward minimum frame rates. I remember when I had quad (and single) 9800 GX2s, CSS would dip into the 20s... for no reason, I may add.
I'm not saying it's a bad move for ATI, since it could be cheaper to get that 30%, as we've seen in the past, but it's just not for me.
August 1, 2009 1:18:02 PM

You can't compare SLI, which doesn't show the same scaling, and a G92, which had poor memory management and couldn't buffer worth a hoot, to the 4xxx series or the 200 series.
Here, if you think you can do better, great opportunity, and you may be able to walk to work too:
https://www.amd.apply2jobs.com/index.cfm?fuseaction=mEx...

PS: No wait, it's at corporate.
August 1, 2009 10:47:20 PM

L1qu1d said:
I remember TGGA telling me that the 4000s were the last official design done by the former ATI [...] I'm not saying it's a bad move for ATI, since it could be cheaper to get that 30%, as we've seen in the past, but it's just not for me.

Do this: read (or reread) this
http://www.anandtech.com/video/showdoc.aspx?i=3469&p=7
Those people are still there, and they are the same ones who brought us the 1900s et al.

One telling tidbit:
"Within 30 hours we had our first preview up and made it already clear that ATI was on to something. The GeForce 9800 GTX got an abrupt price drop to remain competitive and even then it wasn’t enough, the Radeon HD 4850 was the card to get at $199.

The last hiccup in ATI’s launch ended up not being bad at all, ATI got some extra PR, drummed up some added excitement and in the end did justice to a product that deserved it.

Recon from Taiwan
One thing I wondered was how well ATI knew NVIDIA’s plans and vice versa, so I asked the obvious: where do you guys get your information from? The answer was pretty much as expected: Taiwan. All of the board makers know one another and are generally open with sharing information, once information hits Taiwan it’s up for grabs. Then there’s a bit of guesswork that’s done."
If nVidia knew about AMD's plans, as AMD knew about nVidia's, why was nVidia's pricing so high until the last minute? These are questions people asked then and still ask now.
When you read the article, you'll start to see that eventually all GPUs will be made the way you don't want them: multi-core. It's the future, and ATI got there first.
August 2, 2009 1:19:05 PM

Everything looks good on paper ;)

Like I said, we'll see what happens. I personally think going dual is basically cheating; that's why I don't like either the 4870 X2 or the GTX 295.

Why watch one boxer face two boxers when you can see a fair one-on-one? :p

I just don't want driver dependency. Then again, I always wanted to go back to the Omega drivers, but I doubt that will happen either.

We'll see. Oh, and P.S. I can probably work for ATI when I'm done, but not as a card designer :p :D
August 2, 2009 1:24:48 PM

If you want to use the boxer analogy, one would be a 20-stone heavyweight while the other would be a lean middleweight.

ATI could very easily make 1.5-billion-transistor GPUs, same as nVidia; they just decided not to a long while ago, because such chips don't scale down so well, or cheaply.
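The "big dies don't scale cheaply" point can be illustrated with a toy yield model. The Poisson yield formula used here is a standard textbook approximation, but every number below (wafer cost, defect density, die areas) is made up for illustration only.

```python
import math

# Toy die-cost illustration using a Poisson yield model:
#   yield = exp(-defect_density * die_area)
# All constants are hypothetical, not foundry data.
def cost_per_good_die(area_mm2, wafer_cost=5000.0,
                      wafer_area=70686.0,     # ~300mm wafer, pi * 150^2
                      defect_density=0.004):  # defects per mm^2, made up
    dies_per_wafer = wafer_area / area_mm2            # ignoring edge losses
    yield_frac = math.exp(-defect_density * area_mm2) # fraction of good dies
    return wafer_cost / (dies_per_wafer * yield_frac)

small = cost_per_good_die(260.0)  # a smaller, RV770-class die area
big = cost_per_good_die(576.0)    # a larger, GT200-class die area
print(f"small: ${small:.0f}, big: ${big:.0f}, ratio: {big / small:.1f}x")
```

Because the big die both cuts fewer candidates per wafer and yields a smaller fraction of good ones, its cost per good chip grows much faster than linearly with area, which is the economic pressure behind the smaller-die, multi-chip strategy discussed in this thread.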
August 2, 2009 2:46:11 PM

The strategy is already going full tilt. Look how quickly AMD brought their hex-core to market; same for Intel. Finding the right interconnects, and then making those interconnects faster, is the key. LRB will be much the same way. nVidia is the only holdout, as the old single monster die is dying out.
Getting throughput through such huge dice can cost more power in the easy areas (which have lower power requirements) than in other critical yet smaller elements of the die that use far more power, and no longer just due to die size alone.
Then you have the mid and lower sections of the market to appease, which didn't happen at all with nVidia in the G200 series, and is why nVidia kept up their renaming scheme, which no one liked and which is seen as dishonest, especially the way they've done it. Having huge dice is even affecting marketing.
Like I said, it's going to happen, even with nVidia, and the longer they stay away from innovation, which has been their theme as of late, the harder it's going to be to successfully adapt down the road, given their competition's experience.
August 2, 2009 3:23:07 PM

LOL. Yeah, and as the practice of downsizing to create new abilities continues, the theory that bigger is better may just disappear, and yes, we're talking gfx cards heheh.
If you look at the on-die integration of IGPs as an example, or memory controllers, etc., it's happening now, as it's always been happening.
nVidia is going the other direction, creating more than just a gfx card, thus adding more instead of adapting.
August 2, 2009 3:27:13 PM

The single biggest thing we will see doing this soonest is LRB. It'll be multifunction, but here's the thing: it'll also be able to be done in halved or quartered solutions, as we see with ATI.
nVidia's approach is to add more die space for DP and things of that nature to compete with LRB, while having no solution for the lower end of the market in GPGPU. They've added and not adapted.