AMD has started shipping 90nm products

August 13, 2004 1:01:31 AM

<A HREF="http://www.xbitlabs.com/news/cpu/display/20040812170423..." target="_new">There you go</A>, AMD is apparently pulling off a seamless, well-executed transition to 90nm.

Intel had better watch out. Their products look extremely uninteresting compared to a 2.4/2.6/2.8Ghz A64 lineup, which will probably be quite feasible on 90nm (heck, maybe even 3.0Ghz and above!)


August 13, 2004 2:39:09 AM

omg

Athlon 2700xp+ (oc: 3200xp+ with 200fsb)
Radeon 9800pro (oc: 410/360)
1024mb pc3200 (5-3-3-2)
Asus A7N8X-X
August 13, 2004 3:12:34 AM

Honestly I have no idea what Intel is going to do about all this... they couldn't be much further behind right now and nothing ever seems to go right when they attempt to catch up.

AMD 64 3400+
MSI K8N Neo Platinum
1 GB Kingston HyperX PC3200
ATI AIW 9600XT
WD Raptor 74GB
August 13, 2004 3:17:28 AM

Oh well, I guess things could be worse for them... Not much, but things could be worse.

Their new processor doesn't scale nearly as much as they expected it to, it runs much too hot, and it's probably useless in the long run. They'll have to rely on Smithfield being any good; they'll have to rush it out the door and still make it good... If they have any wits, they'll go with Dothan and make us all happy. If not, we'll be ranting about dual NetBurst cores all the way to 2006!

At least the Sonoma platform has received a green light now...
August 13, 2004 3:22:27 AM

BTW Meph, I'm kinda new here, but I find your posts probably the most helpful/informative of anyone's. Thanks for all the info you post here every day.

AMD 64 3400+
MSI K8N Neo Platinum
1 GB Kingston HyperX PC3200
ATI AIW 9600XT
WD Raptor 74GB
August 13, 2004 3:38:10 AM

It's funny how you guys are talking out of your asses.

You all think you know better than Intel.

You all seem to know what Intel needs to do and the only ones not knowing are Intel themselves.



===========================
<A HREF="http://www.clan-chaos.com" target="_new">clan CHAOS</A>
August 13, 2004 3:55:15 AM

You are the one who is really funny. How many of Meph's posts have you read?
It may be that I haven't called him an Intel fanboy for a couple of months, but that isn't because it wasn't true.
Meph has been an Intel cheerleader for too long. It's nice to see that he has looked past an old prejudice.
August 13, 2004 4:00:32 AM

Thanks for the informative post, Pete... nice website you have there too, by the way. Anyway, the numbers/history don't lie, man... they ARE falling further behind, and every time they try to catch up it's a fruitless effort. We know they are a big company and you probably believe they know what they are doing, but even big companies with all their market share and "power" can make mistakes. We aren't saying they are down for the count, just that things don't look so good right now. BTW, I can still feel the Intel fan inside Meph's posts; that's why I asked him what his rig setup was the other day.

AMD 64 3400+
MSI K8N Neo Platinum
1 GB Kingston HyperX PC3200
ATI AIW 9600XT
WD Raptor 74GB
August 13, 2004 4:04:27 AM

Quote:
You all think you know better than Intel.

You all seem to know what Intel needs to do and the only ones not knowing are Intel themselves.

Well, Intel certainly isn't doing much to prove us wrong lately. :wink:

<i>"Intel's ICH6R SouthBridge, now featuring RAID -1"

"RAID-minus-one?"

"Yeah. You have two hard drives, neither of which can actually boot."</i>
August 13, 2004 5:19:38 AM

>Meph has been an Intel cheerleader for too long.

Dunno. To me it sounded more like he was buying too much into company PR and expecting miracles (especially from Intel) where there weren't any to be expected (think EM64T, higher bus speeds, DDR2, PCI-E,..). I guess he learned the hard way not to always trust PR statements, put too much faith in new acronyms, or believe Intel's roadmaps :) 

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 5:19:40 AM

Seems you haven't caught up on current events. Intel's CEO doesn't BLATANTLY and PUBLICLY call out the troops unless something is FUBAR. This is more than a "cracked" crystal ball here!

Abit IS7 - 2.8C @ 3.5ghz - Mushkin PC4000 (2 X 512) - Sapphire 9800Pro - TT 420 watt Pure Power
Samsung 120gb ATA-100 - Maxtor 40gb ATA - 100
Sony DRU-510A - Yellowtail Merlot
August 13, 2004 5:27:39 AM

It's still a bit early to call it "seamless", but it's good news nevertheless. One has to credit AMD for executing extremely well these last 12+ months; it's especially ironic since Intel hasn't managed to meet a single deadline on any product in the same time frame. Times, they are chaaa-aanging :) 

Oh, just a small word of caution; I may have said this before, but don't expect clock scaling miracles from AMD's initial 90nm products. My guess is they will be geared towards low power (mobile A64, Opteron EE) and high volume (cheap 3000+ class mass production). It may take until early next year, when they move to their 11-layer process, before we see what it's capable of in terms of scaling. OTOH, I am really looking forward to seeing what AMD can do in the mobile area. They made it their second priority after servers, and so far we haven't seen a really credible attempt to counter Dothan/Centrino in anything but DTR, while they have been well ahead in the server market (top priority) and the desktop (lowest priority). Maybe with 90nm they will make something competitive for the thin & light mobile market as well? Sure could use some competition there; Dothan-based products are still expensive as hell, especially considering how cheap the CPU must be to produce.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 6:32:46 AM

Here is the full transcript: <A HREF="http://www.investorshub.com/boards/read_msg.asp?message..." target="_new">link</A>

Some interesting quotes:
Quote:
AMD appears to be executing well on its AMD64 roadmap. <b>AMD is one of the few companies on
90nm that does not seem to have had significant delays or defect issue</b>. Revenue shipments of
AMD64 notebooks on 90nm started this week, well within the planned schedule for shipments
prior to the end of Q3. Desktop AMD64 shipments on 90nm will commence a month later,
followed by servers.


Defect issues? I've not heard of anyone having trouble with that part of 90nm (well, maybe UMC).

Quote:
AMD is tracking particularly well with Opteron. It is on target to reach 10% market
share by year-end, up from 4% today, based on design wins already in place. <b>Given the 12-month
or more design cycle in servers, AMD has some visibility into 2005, when it expects to further
increase its server share, ultimately toward 20%.</b> It has very strong customer support from Sun, HP
and IBM, and is continuing to work on securing a design win with Dell. The catalyst for the latter
would likely be market share growth to up to 20%.


and here is the real kicker:
Quote:
AMD's roadmap in microprocessors includes launching a dual-core processor next year, and potentially a <b>quad-core processor in 2007</b> on the 65nm/300mm



= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 11:09:32 AM

I think AMD will show Intel how to use 90 nano in a decent way.
What Intel is doing with 90 nano we just can't take seriously, now can we?
As a joke, okay, but seriously selling those things?


Tom's Hardware Site is a joke!
Looks like Intel spent more on bribing reviewers to cover up that it ain't that great than they did on R&D. You know what I'm talking about, Tom!
August 13, 2004 2:35:17 PM

Quote:
I think AMD will show Intel haw to use 90 nano on a decent way.

Funny, since IBM showed AMD in the first place. :o  But in a way you might be right. Intel has denied needing SOI. We've seen how well that denial has helped them. Strained silicon just isn't enough. Hopefully Intel's next core revision will just include SOI in the manufacturing process and we can be done with this nonsense. Unfortunately Intel doesn't look to be prepared to go to these extremes until their 65nm process.

<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>
August 13, 2004 2:45:39 PM

NO ONE HAD BETTER SAY that AMD is doing it well because they let Intel iron out the bugs.

There's no way Intel would give AMD beneficial information like that.

-------
<A HREF="http://www.albinoblacksheep.com/flash/you.html" target="_new">please dont click here! </A>

Brand name whores are stupid!
August 13, 2004 2:54:19 PM

Lots of valuable lessons can be learned simply by observing. Of course Intel aren't going to actively <i>help</i> AMD, but that doesn't mean AMD haven't learned anything from Intel's 90nm tech.

---
Epox 8RDA+ V1.1 w/ Custom NB HS
XP1700+ @200x10 (~2Ghz), 1.4 Vcore
2x256Mb Corsair PC3200LL/1x512Mb Corsair XMS PC4000 2.5-3-3-7
Sapphire 9800Pro @412/740
August 13, 2004 2:57:11 PM

How would they? How much can you actually learn from just buying a retail Prescott and looking at it? Honestly, I'm asking because I don't know.

-------
<A HREF="http://www.albinoblacksheep.com/flash/you.html" target="_new">please dont click here! </A>

Brand name whores are stupid!
August 13, 2004 3:30:28 PM

Quote:
How would they? How much can you actually learn from just buying a retail Prescott and looking at it? Honestly, I'm asking because I don't know.

AMD wouldn't have learned from buying a retail Scotty. AMD would have learned from reading Intel's publicly available whitepapers. But that doesn't really matter as it wasn't Intel that AMD was doing most of its learning from.

<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>
August 13, 2004 3:36:03 PM

>how much can you actually learn from just buying a retail
>Prescott and looking at it?

Depends what you are looking for. If you look at it like Hans de Vries, you can learn a surprising amount of things :)  But AFAICS, not about the process itself. Besides, Intel's and AMD's processes are far too different (beyond even SOI/SS differences), to the point where I doubt AMD would learn a whole lot of useful things (for them) from Intel, even if they were allowed to sit next to Intel's process engineers for a couple of weeks :) 

That being said, rest assured Intel and AMD buy each other's products and test them thoroughly. To the point of slicing them up and putting them under X-ray microscopes or whatever. I used to subcontract for a small company that makes CMOS imaging sensors, and I once saw an X-rayed picture of one of their products... done by a competitor trying to figure out how they did it.. :) 

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 3:53:58 PM

What would 2.4, 2.6, 2.8Ghz lineups be rated... 4000+? Also, what cache size should I be drooling over here? 1MB?

How high should AMD be able to push 90nm? I know people said 3Ghz easily... what about 3.5 or 4?

What timeframe will the higher end (i.e. 3Ghz+) hit the market? Any projections?

4Ghz A64...that would make me blow a load in my pants.

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 3:59:31 PM

I think you're kind of daydreaming when thinking about 4Ghz; after all, AMD would then run into power dissipation problems too, in all likelihood.

Not that I wouldn't be interested in an A64 @ 4Ghz, but I'm thinking AMD won't make it much past 3Ghz on 90nm. Could be wrong, though; I just wouldn't expect a full 4Ghz. If Intel stays stalled and 90nm does enable 4Ghz Athlons, AMD might reach Intel in clock rates... which would of course mean hell for Intel.
August 13, 2004 4:16:37 PM

Everyone gives AMD 3Ghz easily; I was just picking something that is very high but MAYBE on the outskirts of a possibility. I know dual cores are coming soon and single-CPU will slowly be phased out over 3, maybe 5, years.

RIGHT NOW I would pay the $800+ for an A64 FX chip if it were 3.5Ghz. Hell, I would pay $1000 for a 3.5Ghz A64.
That would be like an A64 4800+ or something... *Drools*

But anyway, back to the real point: is AMD going to 65nm like Intel? Also, when dual core comes out and AMD has dual 2.2Ghz cores... will it be called a 4.4Ghz processor?

Sorry I am asking so much; I am illiterate and also I don't know how to read.

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 4:33:16 PM

>Everyone gives AMD 3Ghz easily

3 Ghz seems a reasonable target for K8 on 90nm, but I wouldn't say "easy". 4 GHz is daydreaming for sure. If you want a number "MAYBE on the outskirts of a possibility", I'd go with 3.4... and that would definitely be the outer edge of those outskirts IMO :) 

> I know dual cores are coming soon and single-CPU will
>slowly be phased out over 3, maybe 5, years.

Says who? Honestly, I don't believe single core will be phased out over the next ~10 years.

> Is AMD going to 65nm like Intel?

Of course.

> also when Dual core comes out and AMD has dual 2.2Ghz
>cores....will it be called a 4.4Ghz processor?

No.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 4:38:50 PM

Quote:

RIGHT NOW I would pay the $800+ for an A64 FX chip if it were 3.5Ghz. Hell, I would pay $1000 for a 3.5Ghz A64.
That would be like an A64 4800+ or something... *Drools*

Heck, I'd even pay that too for such a kickass product. :smile:
Quote:

But anyway, back to the real point: is AMD going to 65nm like Intel? Also, when dual core comes out and AMD has dual 2.2Ghz cores... will it be called a 4.4Ghz processor?

Intel is already transitioning one of its fabs to 65nm; they seem to be ahead of AMD again. In theory, they should be able to mass-produce 65nm-based products as early as very late 2005... theoretically. If they don't flop that transition like they did with the 90nm one, that might be their strike back against A64 technology. If well executed, that is. And it seems that's asking a lot of Intel nowadays. :eek: 

As for calling a dual 2.2Ghz a 4.4Ghz, that would be a blatant lie of course, but I'd expect the marketing freaks to like that idea. Personally, I think that's so wrong it's offensive; they'd better call it "<b>Gemini 2.2Ghz</b> - two twin processors in one" or whatever... really catchy and quite accurate.

BTW, what is it with chip names? Why do they always have to be so boring? For instance, you could codename a chip "Chimera", then its successor "Bellerophon"... Chimera for the mythological beast with three heads (one lion, one dragon and one goat, on a lion's body), and Bellerophon for the beast's slayer...

And dual-core ones: Gemini. Personally, I liked the sound of "Tanglewood" for Intel's multi-core (up to 16) Itanium, 'cause it made me feel as if it would be a great processor to deal with lots and lots of threads - "entangled threads"... But then they renamed it Tukwila, which makes me kind of go "huh?"...

Hammer was ok, even if slightly dull and uninspired... Claw/Sledgehammer was great, but did they have to come up with the new, boring names? Paris, Athens, Newcastle, San Diego... Not really inspiring.
Quote:

Sorry i am asking so much

That is not something to be sorry about. Great things happen when people come up with unexpected questions. The most basic questions are the ones we never quite seem to ask ourselves. So keep asking! :wink:

<P ID="edit"><FONT SIZE=-1><EM>Edited by Mephistopheles on 08/13/04 03:41 PM.</EM></FONT></P>
August 13, 2004 4:40:43 PM

Really? I would figure once the dual cores hit the market in mass... not just initial samples... that AMD and Intel would stop innovating on single-core CPUs.
What would be the benefits of single core in 5 years? How much further can they push the limits of single core?
Wouldn't that be like AMD spending money and time innovating a 32-bit chip 10 years from now? Maybe for a while because of the cost of 64-bit, but in 5 years, let alone 10, will AMD be making 32-bit chips?

I must be so lost on what dual core is and the benefits of it. How will the dual cores be labeled if not the total of the two? Something like A64 3200++? Or 3200+*2?

Do you have a link that can point out the direction of dual core and also explain the benefits of it?

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 4:55:39 PM

Quote:

Really? I would figure once the dual cores hit the market in mass... not just initial samples... that AMD and Intel would stop innovating on single-core CPUs.

I'd imagine that budget-minded CPUs will indeed be single core for a long time to come, too... I agree with P4Man.
Quote:

I must be so lost on what dual core is and the benefits of it. How will the dual cores be labeled if not the total of the two? Something like A64 3200++? Or 3200+*2?

I guess they'll probably come up with something, but we don't know what exactly the marketing department will do; AMD hasn't divulged anything. I wouldn't be surprised if they came up with a whole new processor name or high-end lineup for dual-core.
Quote:
Do you have a link that can point out the direction of dual core and also explain the benifits of it?

I think a good place to start, if I may be so bold, would be 2cpu's <A HREF="http://www.2cpu.com/articles/6_1.html" target="_new">FAQ</A>. Granted, it's really about two-processor systems and SMP, but it shows you the general direction things are going. Dual-core chips are essentially a more efficient version of dual-processor systems. They're more interesting from a technical point of view, and they're more sophisticated; processor-to-processor communication is more streamlined, and cache is usually shared.

Therefore, dual-core systems also share dual-processor setups' limitations and advantages. In order to truly reap the benefits of going multicore, though, you must use multithreaded code. If you're not, then multicore will only make multitasking smoother - much smoother and trouble-free, but it won't speed up individual, single-threaded apps at all. Of course, ideally, a dual-core chip would perform twice as well as a single-core one, but that will almost certainly never be the case.

That is the main limitation of going dual-core. If indeed AMD needs to somewhat reduce core clock to keep power dissipation at acceptable levels, then single-thread performance will be penalized - and keep in mind that a huge fraction of software nowadays is single-threaded, not multithreaded.

So, if AMD releases, say, a shiny dual-core processor with the cores at 2.4Ghz, and you can get, say, a single 2.8Ghz processor for less cash, then there is a big chance that, for many things, the single 2.8Ghz will be faster.

(Which is also what makes a theoretical dual-core Dothan a fantastic idea: it wouldn't need to be scaled down to keep power dissipation low; it's already very, very low with a single core!)
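
The multithreading point above can be sketched in miniature. This is an illustrative Python sketch, not anything from the thread; `busy` and the job sizes are invented stand-ins for CPU-bound work. The structure that lets a second core help is a set of independent jobs a pool can hand out; one monolithic single-threaded call has nothing to split.

```python
# Sketch only: the shape of work a second core can actually speed up.
# "busy" is an invented stand-in for any CPU-bound task.
from concurrent.futures import ThreadPoolExecutor

def busy(n):
    # A single-threaded task: one core does all of this, start to finish.
    return sum(i * i for i in range(n))

jobs = [30_000, 40_000, 50_000]

# Single-threaded: the jobs run one after another on one core.
serial = [busy(n) for n in jobs]

# Multithreaded: independent jobs can be handed out to separate cores.
with ThreadPoolExecutor(max_workers=2) as pool:
    threaded = list(pool.map(busy, jobs))

assert serial == threaded  # same answers, however the work is split
```

(In CPython the GIL limits true parallelism for pure-Python threads; swapping in `ProcessPoolExecutor` gives each job its own core with the same structure, which is the situation a dual-core chip creates.)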
August 13, 2004 4:55:47 PM

>Really? i would figure once the dual cores hit the market
>in mass....not just initial samples, that AMD and Intel
>would stop innovating for single core CPU's.

I wouldn't bet on it. Not all software can benefit substantially enough from multithreading for anyone to completely ignore single-threaded performance. There may be a shift in focus from single-threaded to multithreaded, but it will take a while for software to catch on at such a scale that there is no longer a point in improving single-threaded CPU performance. Maybe that will never even happen.

>What would the benifits of single core in 5years? how much
>further can the push the limits of single core?

They've pushed it quite a way over the last 30-40 years; I'm sure they will push it a bit further over the next 10 :) 

>maybe for a while because of the cost of 64bit

Nothing to do with it.

>but in 5yrs let alone 10 will AMD be making 32bit chips?

Probably not for anything but the embedded market, but again, that has nothing to do with multicore. You could build a 32-bit dual/quad core just as well, or a 128-bit single core if you wanted to.

>I must be so lost on what dual core is and the benifits of
>it.
Dual core is just two CPUs in a single package ('die'). Its performance characteristics are nearly identical to a dual-CPU machine's, i.e. ranging from roughly the same as a single-CPU machine up to ~80% faster in the best cases (100% in theory). In reality, only multithreaded apps will gain (or running more than one single-threaded app simultaneously), and performance increases rarely exceed ~20% for most desktop-oriented tasks. For games, currently the speedup is zero. Rendering, Photoshop and encoding tasks can make better use of it, ranging from 20 to ~70% in some cases.

> How will the dual cores be labled if not the total of the
>two?

Simply as dual core ?

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 5:09:03 PM

Wouldn't developing dual-core be the same thing as developing one core + connectivity? Thus rendering the discussion on the development of one over the other moot?

Selling is another matter.

<font color=blue>The day <font color=green>Microsoft</font color=green> will make something that doesn't suck is the day they'll start making vacuum cleaners.</font color=blue>
August 13, 2004 5:31:16 PM

I see your point: if you expand single-core speed, you expand the dual core too, because it's two single cores.

I was making the point about 64/32-bit as in: once you have made a leap in technology, why would you research old and useless tech? But I see now that 64-bit single-CPU will be important.

If games won't produce any benefits, what about this:
A dual-core machine like an A64, 2.6Ghz dual core with two 6800 Ultras on a dual PCI-E board... would that enable me to play something like UT2004 with my friend ON THE SAME MACHINE?
Or would there be some bottleneck somewhere that wouldn't support this? Providing there is a lot of memory, like 1.5-2GB.

Because in games, not D3, where you don't need SLI to make it play at highest quality, you might be able to do that. It would be cool if we could.

Also, one thing you said about software: is it impossible or just unlikely that M$ or Linux or some program can make use of the dual CPU like a RAID-0 and utilize it as if it were one giant CPU? (Understandably it wouldn't be a 100% conversion, but maybe a dual core of 2Ghz would act more like a 3.6Ghz.)

I don't know much about the technical part of CPU design, and I know even less about programming.

The possibilities that can arise from dual core are interesting to me.

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 6:21:08 PM

Your ignorance (and I don't mean that insultingly at all) allows you to have some truly refreshing ideas :)  And again, I don't mean this sarcastically, but both the SLI and CPU-RAID ideas are terrific in their own way :)  Unfortunately, they're also not quite realistic.

Let's start with two CPUs "in RAID-0". The big difference between storage and executing code is that there is no dependency with storage, but there is with code. In a disk/RAID setup, one bit on disk 0 has absolutely no relation with the other bits on disk 1. You can simply put the odd bits on one disk and the even ones on another, and that's it.

With code, if you were to randomly chop it into pieces and feed different pieces to different cores (CPUs), you'd have enormous problems with dependency (one piece of code being dependent on the output of another, running on another core), cache thrashing and other issues that would make performance an order of magnitude lower than running on a single core; both CPUs would be completely stalled and bottlenecked by inter-CPU/core communication.

If you make the CPUs a bit more intelligent in determining which "pieces" can be executed in parallel, you end up with what every modern CPU already does: being "superscalar" by having more than one pipeline and several execution units. There are limits to how far you can push this, though, and stretching it over two cores doesn't give any improvement.

No, in order to make proper use of more than one CPU, you need software that allows it by creating more than one "thread", each of which can be processed with as few dependencies as possible on the other threads. Since even compilers can't properly do this automatically yet today, you cannot reasonably expect CPUs to ever do it automatically/on the fly.
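
The storage-versus-code contrast above can be made concrete. In this illustrative sketch (invented data, not from the thread), the map step has no dependencies between elements, so any core can take any piece in any order, while the accumulating loop needs each previous result before the next step can start.

```python
from concurrent.futures import ThreadPoolExecutor

data = [3, 1, 4, 1, 5]

# Independent work (the "RAID-0" case): each element produces its output
# on its own, so pieces can go to different cores in any order.
with ThreadPoolExecutor(max_workers=2) as pool:
    squared = list(pool.map(lambda x: x * x, data))

# Dependent work: step N consumes step N-1's output, so chopping this
# chain across cores would just stall them waiting on each other.
acc = 0
for x in data:
    acc = acc * 2 + x  # each iteration depends on the previous one
```

The first pattern is what multithreaded software exposes to a dual core; the second is the single-threaded chain that no amount of extra cores can split.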

As for using your PC to play 2 copies of the game at once... well, frankly, some games already allow this through split screen. Add a second monitor and support for it, and you could do it properly. There is no strict need for dual-core CPUs or dual SLI GPUs to achieve this. You'd need 2 keyboards/mice (which might already be a problem under Windows, I think), a powerful enough computer and, mostly, a game that supports this. If properly written, I also don't think it would require twice the CPU power; a lot of things could be shared between both game instances (AI, physics,..). At least I assume so.

What you are suggesting, however, isn't crazy at all, and a lot of companies, including MS and Intel, are working hard at making it possible/easier to create several virtual instances on one PC (or server). Some of this technology will even allow you to have multiple different operating systems loaded at once, rebooting one instance while keeping the rest running, etc. Such technology should also allow you to play several games at once, or the same game, obviously. But dual-core chips or SLI videocards on their own will not enable this, nor will they really be required for it.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 6:25:32 PM

Quote:
Intel is already transitioning one of its fabs to 65nm

Yeah, but is that a <i>processor</i> fab?

Quote:
As for calling a dual 2.2Ghz a 4.4Ghz, that would be a blatant lie of course, but I'd suspect the marketing freaks to like that idea.

If AMD doesn't get rid of their rating system then I <i>would</i> expect them to call it an Athlon64 DC 6500+ or some such nonsense.

Quote:
BTW, what is it with chip names? Why do they always have to be so boring? For instance, you could codename a chip "Chimera", then its predecessor "Bellerophon", for instance...

By any chance have you been watching Mission Impossible 2 lately?

<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>
August 13, 2004 6:36:39 PM

YAY! I knew my ignorance would pay off some day... lol.
I used to talk about physics with my physics teacher and my friend who wasn't familiar with it... he would bring fresh ideas and new ways of thinking about it. I don't see my not knowing all of this as a weakness. I am asking because I don't know and I am curious. I think the people who are content to sit in ignorance are stupid, not the people who ask questions... so I take no offense at all at anything you said.

I am thinking about some of this stuff. I guess it's all in the software people's hands. In theory a game could use CPU1 for physics and CPU2 for AI, therefore allowing more complex code for both.

I know SOME games already allow the dual gaming to be done. But you would think that with a simple program... (seems simple to me, but I know I don't know SH!T about programming and code) EVERY game could be played this way even if it wasn't designed to.
With a dual core and dual GPU, you should be able to EASILY set up a virtual desktop or something like that to run the 2nd game from. So ALL games could be played like that.
Also, I didn't think Windows had a problem with multiple keyboards and mice... with USB ports this shouldn't be a problem.

I guess I am dreaming, but I don't think it would take super-complex code to get two OSes running from the same machine. With dual core, two GPUs, enough RAM (maybe even sectioned off for each core when this is enabled) and multiple input devices... you should be able to do it.

But then again this is a moot point, it's all theory.

One more Q: if dual core doesn't speed up gaming, and is limited in other apps... why would companies spend hundreds of millions on something JUST for encoders?
Where is dual-core technology going to help gamers, graphic designers and other people who will only see a minor change in performance?

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 7:01:39 PM

> In theory a game could use CPU1 for physics and CPU2 for
>AI...therefor making more complex code for both.

Not just in theory; indeed, spinning off physics, geometry and AI into different threads is probably the easiest way, if it's not even done to some extent now. Not sure what you mean by 'more complex code', but if you mean that you'd therefore have more processing power using multicore/multi-CPU, then yes, indeed.
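
A minimal sketch of that one-thread-per-subsystem idea (all names invented for illustration; a real engine would share state far more carefully): physics and AI each get a worker thread, and the frame joins both before the results are used.

```python
import threading

# Invented stand-ins for a game's per-frame subsystem work.
state = {}

def physics_step():
    state["physics"] = "positions updated"

def ai_step():
    state["ai"] = "decisions made"

def frame():
    # Spin each subsystem off into its own thread...
    t1 = threading.Thread(target=physics_step)
    t2 = threading.Thread(target=ai_step)
    t1.start()
    t2.start()
    # ...and wait for both before the renderer may use their results.
    t1.join()
    t2.join()
    return state

result = frame()
```

On a dual-core chip the two threads can genuinely run at the same time, which is exactly the "physics on CPU1, AI on CPU2" split described above.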

> EVERY game could be played this way even if it wasn't
>designed to.
>With a dual core and dual GPU, you should be able to EASILY
>set up a virtual desktop or something like that to run the
>2nd game from. So ALL games could be played like that.


Yeah, if the OS allows both virtual computers to use 3D-accelerated graphics (on 1 or 2 videocards), 2 different audio cards/outputs (that already works now for sure) and 2 different input devices (which is a must anyhow, not just for games), then yes, your plan would work. We're not there yet, though.

>Also I didn't think Windows had a problem with multiple
>keyboards and mice... with USB ports this shouldn't be a
>problem.

Just tried it... plugged in a second USB mouse, and it works... well, not if you expected 2 pointers, though, LOL... both mice fight for control like I expected. But hey, things like that should be easy to change; maybe an application could even already detect the 2 different mice and redirect their inputs...

>One more Q...if dual cores doesn't bring gaming up, and
>limited in other apps...why would companies spend hundreds
>of millions on something JUST for encoders?

Yeah, why did Intel ever release hyperthreading NetBurst cores that suck at anything except encoding ;)  Seriously though, dual core (or dual CPU) makes little sense on the average desktop today, but that doesn't mean software won't be rewritten to take advantage of it. I'm sure Intel will spend some fortunes on subsidizing companies to make their software SMT-friendly (like they did for MMX, SSE, SSE2,..). In fact, Intel may already have laid the groundwork for this with Hyperthreading... which also requires threaded software. So it's not like there won't be a benefit at all; if nothing else, at least running 2 different CPU-intensive tasks at once should benefit hugely. It's just not as if 2 CPUs would be better than a single CPU at twice the speed. I wouldn't want to give up more than ~20% clockspeed for a second CPU/core.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 7:14:52 PM

It seems to me, after finding this out, that 65nm is much more exciting than dual core. It seems like dual core is best for servers and encoders,
the faster and more powerful single core for other things.
I guess that might be another choice people will have to make in the future.
I wish I could get hold of the companies that design software, give them ideas of what I want to do with my dual core, and have them write code for it. I think dual core has so many options, but I have a feeling it will be so rushed to market that none of them will be utilized.

What are the odds of ANYTHING I have mentioned making it to market?

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 7:44:20 PM

>It seems to me like after finding this out....65nm is much
>more exciting then dual core.

Don't get overexcited by either; it's just progress, evolution. There is no revolution (or if there is one, it's a continuous one).

> It seems like dual core is best for servers and encoders.
>The faster and more powerful single core for other things.

Yes, servers and workstations will benefit the most (and first) from multicore, but I wouldn't completely ignore it for the desktop either. There is already quite a bit of multithreaded software out there; just don't expect dual core to double performance for everything (or even anything).
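For software to see any benefit from a second core, it has to be structured as threads in the first place. Here's a minimal sketch (Python used purely for illustration; the pattern is the same in any language) of splitting one CPU-bound job across two workers:

```python
import threading

# A toy "encoding" job: sum a range of numbers. On a dual core, each
# thread can be scheduled on its own core; single-threaded software
# would leave the second core idle no matter how fast it is.
def partial_sum(lo, hi, out, idx):
    out[idx] = sum(range(lo, hi))

n = 1_000_000
results = [0, 0]
# Split the work in half, one thread per (hoped-for) core.
t1 = threading.Thread(target=partial_sum, args=(0, n // 2, results, 0))
t2 = threading.Thread(target=partial_sum, args=(n // 2, n, results, 1))
t1.start(); t2.start()
t1.join(); t2.join()

total = sum(results)
print(total == sum(range(n)))  # True: same answer, work split in two
```

(Caveat: in CPython specifically, an interpreter lock keeps CPU-bound threads from truly running in parallel, so you'd use processes there for a real speedup; the point here is the structure, which is what the rewriting effort is about.)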

>What are the odds of ANYTHING i have mentioned making it to
>market?

Like what? 4 GHz K8s? No. 4 GHz "K9", K10, K11 or whatever? Definitely. Dual core, dual GPU, machines capable of supporting two games/gamers at once? Yes, definitely, and it won't take that long either. Multicore CPUs that automatically benefit even single-threaded software? I doubt it in the near future, but I wouldn't rule it out.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
August 13, 2004 8:03:56 PM

With the problems Intel had with 90nm, I figured that the transition to 65nm would be put on the shelf for a long time and ultimately not yield that great results.

But knowing that 65nm now looks much more likely, because AMD is producing quality 90nm products, it's more exciting that 65nm might extend the limits of the CPU as I know it today.

What's after 65nm? Atomic transistors? I have heard about this, but isn't it super expensive?

What do you call a group of midgets? A pack, gaggle, a pride, or maybe just a murder?
August 13, 2004 9:20:42 PM

>With the problems Intel had with 90nm i figured that the
>transition to 65nm would be put on the shelf for a long
>time and ultimatly not yeild that great of results.

Impossible for us to tell, really. First, I'm not sure Intel is having any trouble at all with 90nm; they are having trouble with *Prescott* (and not really with Dothan, which is also 90nm). For all I know, Prescott on 130nm could be a 200W monster.

Secondly, Intel will be moving to FD-SOI at 65nm, while AMD (AFAIK) will stick with PD-SOI as now, so it's even harder to guess how good or bad the 65nm products will be.

>Whats after 65nm?

45nm :) 
I don't know how much further they will be able to shrink after that (I think I read about some technology in the works that would make it possible to scale down to 10nm without radical changes; anything beyond that is science fiction), but it's certain it will get increasingly hard and expensive, and it won't last forever. At some point quantum mechanical effects just take over, and you can't design traces smaller than a couple of atoms anyway. But by that time, who knows what they will come up with? 3D layouts, quantum computers, diamond or optical CPUs? Use your imagination; reality is likely even more absurd :)  There is very little point in trying to look more than ~5, maximum 10, years ahead; even Intel and AMD don't really know what they'll do then (just like they didn't 10, 20, 30 years ago).

In this context, this <A HREF="http://www.byte.com/art/9612/sec6/art7.htm#" target="_new"> old Byte article </A> might interest you. It's from 1996 and asked top executives at Intel, AMD, Cyrix, DEC, etc. to look into their crystal ball. Note how they are all pretty much at a loss about what would happen 10 years down the line, as I'm sure they are now. Some predictions are pretty much spot on, though; Andy Grove and Jerry Sanders did especially well, if you ask me.

= The views stated herein are my personal views, and not necessarily the views of my wife. =