apesoccer

Distinguished
Jun 11, 2004
1,020
0
19,310
There is no such thing as a dual 3500.

Current X2s: the 3800+, 4200+, 4400+, 4600+ and 4800+.

You can run X2s on any nF4 chipset (you can check whether your current BIOS supports X2s by checking your BIOS version 10xx against Asus's website).


edit: Would it be worth it? Well, that's up to you. If you have a lot of background tasks eating up processing power, then maybe. If you run programs that make use of multiple threads, then yes. Otherwise...probably not, at least not for another year or so...I hear there might be an OS out there that makes better use of multiple processors...In the next couple of years there will be more and more multithreaded programs, but the change will be slow. The development time for a multithreaded app is usually more than double that of a single-threaded one.

F@H:
AMD: [64 3000+][2500+][2400+][2000+][1.3][366]
Intel: [X 3.0x4][X 2.8x2][P4 3.0x2][P4-M 2.4][P4 1.3]

"...and i'm not gay" RX8 -Greatest Quote of ALL Time<P ID="edit"><FONT SIZE=-1><EM>Edited by apesoccer on 09/15/05 11:13 AM.</EM></FONT></P>
 

K8MAN

Distinguished
Apr 1, 2005
839
0
18,980
I would recommend spending a little more and getting dual 3600+'s...

The know-most-of-it-all formerly known as BOBSHACK
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
????

<font color=purple> "Stop Whining!"
Ed Stroligo - 9/15/05

Computers haven't gotten much faster the last few years. It doesn't look like they're going to get a whole lot faster for most purposes the next few years, either.

Yes, the CPU makers have real problems; I know that all too well. However, no one else seems willing to pick up the ball and run with it.

Whenever programmers open their mouths these days, it seems like all they do is whine about their multithreaded fate. For once, the burden is on them to get their programs running faster, rather than let Intel or AMD do it for them, and what are they doing?

Are they saying, "We look forward to the challenge," or showing enthusiasm about new opportunities? I've seen little of that. No, instead I see moaning and groaning because the old dogs will have to learn some new tricks.

I'm not trying to minimize the difficulties; I'm complaining about the attitude. It sounds more like politicians from a certain waterlogged place talking than Geek Valhalla.

Even some of the hardware people are getting into the act, saying that more cores rather than faster cores will bottleneck their video cards.

Well, gee, isn't that too bad! Guess you'd better go out of business.

That's what it boils down to: Adapt or die. The world is going to change to multicores, multithreading for all but the mundane stuff; probably change for good. If you don't do it, somebody else will, and soon after that, you'll become a trivia question.

You know, it's very unbecoming of geek gods to gripe so greatly. Makes them seem all too . . . human.

Perhaps more importantly for the overall industry, how can one expect mere enthusiasts to run out and buy these technologies when those who'll have to make it go have such a "can't do" attitude?

If the hardware doesn't get much better, and the programming doesn't get much better, why buy?

Ed</font color=purple>
What examples?

And I'm sorry, but not only does Ed sound like an arse, he also doesn't seem to have a clue. Programmers aren't whining because they don't want to adapt. Look at how many have adapted to optimizing for unique architectures like NetBurst, K7, K8, etc. Look at how many have adapted to new environments such as .NET, Qt, etc.

What Ed seems to be oblivious to is twofold:
1) Multithreading code is easy. Multithreading code without creating timing bugs, memory leaks, etc. is f'ing hard. You need a multitude of different platforms (since you need different timings to even find timing bugs), which incurs a nasty expense. You'll have a hell of a time debugging, because with multiple threads, stepping through code becomes downright difficult to impossible (compared to the walk in the park that it used to be). And you'll generally need to completely redesign your code from the ground up to even make it possible, which is a hell of a lot more than a simple port or adaptation and will create numerous new bugs in the redesign (since no one is perfect). There's a minimal sketch of the kind of timing bug I mean right after this list.

2) Even after all of that great expense and hassle to get <i>good</i> multithreaded code, chances are that you'll maybe gain a 5-20% performance boost from your second CPU and multithreaded code, as most of the time you won't really be able to run multiple simultaneous heavy-use threads anyway. And on single-CPU computers you'll often see a performance <i>drop</i> from all of your thread-interaction and timing code.
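
To make point 1 concrete, here's a minimal sketch of the classic timing bug (my own toy example in modern C++ with std::thread, not code from any real program): two threads bump a shared counter with no lock, and the total comes out different nearly every run; the locked version is correct but serializes the hot path.

// Minimal illustration of a data race (a "timing bug").
// Hypothetical example, not taken from any real codebase.
#include <iostream>
#include <mutex>
#include <thread>

int counter = 0;            // shared state, unprotected
std::mutex counter_mutex;   // used only by the "fixed" version

void bump_unsafe(int times) {
    for (int i = 0; i < times; ++i)
        ++counter;          // read-modify-write: two threads can interleave here
}

void bump_safe(int times) {
    for (int i = 0; i < times; ++i) {
        std::lock_guard<std::mutex> lock(counter_mutex);
        ++counter;          // only one thread at a time touches counter
    }
}

int main() {
    // Unsafe: the total is usually short of 2,000,000 and varies run to run.
    std::thread a(bump_unsafe, 1000000), b(bump_unsafe, 1000000);
    a.join(); b.join();
    std::cout << "unsafe total: " << counter << "\n";

    // Safe: always 2,000,000, but the lock costs you on the hot path.
    counter = 0;
    std::thread c(bump_safe, 1000000), d(bump_safe, 1000000);
    c.join(); d.join();
    std::cout << "safe total:   " << counter << "\n";
}

The nasty part is that on some machines the unsafe version will appear to work for thousands of runs, which is exactly why you need a pile of different hardware just to shake these bugs out.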

What Ed doesn't seem to have a clue about is that in the vast majority of cases multithreading <i>isn't</i> a useful answer. There are a considerable number of programs that simply <i>can't</i> gain from it, and of those that can, the gain definitely won't be even close to a 1:1 scale for each new proc/core. And that's after spending well over <i>four times</i> the resources in time, money, manpower, etc. to multithread your code. And even then there is always the chance of yet another timing bug just waiting to be found in one of the many configurations that you didn't use in testing.
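
Amdahl's law is the back-of-the-envelope way to see why it's nowhere near 1:1: overall speedup = 1 / ((1 - p) + p / n), where p is the fraction of the run time that actually parallelizes and n is the number of cores. A rough sketch (the parallel fractions below are made up purely for illustration):

// Back-of-the-envelope Amdahl's law numbers.
// The parallel fractions here are hypothetical, just for illustration.
#include <cstdio>

// Overall speedup when only a fraction p of the work parallelizes across n cores.
double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double fractions[] = {0.25, 0.50, 0.90};  // made-up parallel fractions
    const int core_counts[] = {2, 4};
    for (double p : fractions)
        for (int n : core_counts)
            std::printf("p = %.2f, %d cores -> %.2fx speedup\n",
                        p, n, amdahl_speedup(p, n));
}

If only a quarter of the program parallelizes, the second core buys you about 1.14x, right in that 5-20% range; even a program that's half parallel caps out at 1.33x on two cores.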

Other than the few highly parallel programs out there, the only real advantage to multiple procs/cores is that you can run multiple high-usage independent programs at the same time without them strangling each other. Whoop-de-doo.

At least with Intel's HT concept, your 'real' core can access all of the execution units that your 'fake' core isn't using, thereby maximizing the CPU's productivity, whereas dual-core quite often just leaves one processor sitting around doing almost nothing. Even if a dual-core's two cores' CPU usages are high, that (to my knowledge) just indicates high instruction throughput, not high execution-unit usage. Not that HT is any ultimate answer either, but technically speaking, a processor would be a lot better off with a large bank of execution units and cache shared by several separate instruction handlers than with wholly separate cores. You're really a lot better off improving instruction-level parallelism than you are improving thread-level parallelism. Multicore CPUs are definitely no panacea.
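
For what it's worth, here's a small sketch of what I mean by instruction-level parallelism (again a toy example of my own, nothing from a real app): in the first loop every add depends on the previous one, so the core's spare execution units sit idle; the second loop splits the sum across four independent accumulators, and one core can overlap those adds on its own, no second thread or core required.

// Toy illustration of instruction-level parallelism on a single core.
// Hypothetical example; actual timings depend on the CPU and compiler flags.
#include <cstddef>
#include <cstdio>
#include <vector>

// One long dependency chain: each add must wait for the previous one.
double sum_serial_chain(const std::vector<double>& v) {
    double s = 0.0;
    for (double x : v) s += x;
    return s;
}

// Four independent accumulators: the core can overlap these adds
// across its execution units without any extra threads.
double sum_with_ilp(const std::vector<double>& v) {
    double s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    std::size_t i = 0;
    for (; i + 4 <= v.size(); i += 4) {
        s0 += v[i];
        s1 += v[i + 1];
        s2 += v[i + 2];
        s3 += v[i + 3];
    }
    for (; i < v.size(); ++i) s0 += v[i];  // leftover elements
    return (s0 + s1) + (s2 + s3);
}

int main() {
    std::vector<double> v(1 << 20, 1.0);
    std::printf("serial chain: %.0f\n", sum_serial_chain(v));
    std::printf("with ILP:     %.0f\n", sum_with_ilp(v));
}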

:evil: یί∫υєг ρђœŋίχ :evil:
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
 

apesoccer

Distinguished
Jun 11, 2004
1,020
0
19,310
Heh, nice... I wonder, though... I suppose, with all the complaining, whether that will raise prices on software in general for a while (at least until it becomes second nature for the programmers). =/

F@H:
AMD: [64 3000+][2500+][2400+][2000+][1.3][366]
Intel: [X 3.0x4][X 2.8x2][P4 3.0x2][P4-M 2.4][P4 1.3]

"...and i'm not gay" RX8 -Greatest Quote of ALL Time
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
Are you kidding? It'll <i>definitely</i> raise prices on software. Multithreaded software takes a heck of a lot more to test and debug, both in hardware and in time, and that means more cost if the software house doesn't want to go bankrupt. **LOL**

The only other option is crappy code with a lot more bugs in it on release, which will likely give people hell on any platform it wasn't tested under. And I'm sure no one wants that. :O

:evil: یί∫υєг ρђœŋίχ :evil:
<font color=red><i>Deal with the Devil. He buys in bulk.</i></font color=red>
@ 197K of 200K!
 

apesoccer

Distinguished
Jun 11, 2004
1,020
0
19,310
Well...<blushes>...heh.

Do you think it'll cost more down the road? Or, like most things, once we learn to think multithreaded and code multithreaded, will it just be like writing single-threaded? You'll just be thinking about coding differently...

<me likes taking the professor's point of view...if I'm wrong or being obtuse about something, just say that the other person didn't understand what I meant>

F@H:
AMD: [64 3000+][2500+][2400+][2000+][1.3][366]
Intel: [X 3.0x4][X 2.8x2][P4 3.0x2][P4-M 2.4][P4 1.3]

"...and i'm not gay" RX8 -Greatest Quote of ALL Time
 

Atolsammeek

Distinguished
Dec 31, 2007
1,112
0
19,280
I see it this way. I had to reprogram Doom 3 so I could use more memory from the graphics card.

Then, with computers that have 1 or 2 GB of RAM, why hasn't Windows made a standby mode that uses the RAM, not the hard drive?

One thing I will say is that it's dumb to change to dual core right now; it's too new to be useful. It's like when DOS and Windows 95 went from 16-bit to 32-bit.

Then there's the other thing: Microsoft will not get off their rear and start using Windows 64.

One small fact, slvr_phoenix: everything goes up in price. Take gas. In the 1950s it was like $0.25 a gallon. Now it's $3.00 a gallon.
 
LOL

Then there's the other thing: Microsoft will not get off their rear and start using Windows 64.

Start using XP64? What does it matter what OS they use? Anyway, they offered Pro users a free upgrade to the 64-bit version. The problem with 64-bit is the drivers; I still can't find 64-bit ones for my Belkin WLAN card.

The rest of your post is just a mess of bad English, apart from:
One small fact, slvr_phoenix: everything goes up in price. Take gas. In the 1950s it was like $0.25 a gallon. Now it's $3.00 a gallon.

Keeping things computer-related, everything goes down in price! How much can you pick up an Athlon 64 for now compared to its initial release price?

(\__/)
(='.'=) <A HREF="http://snipurl.com/fxwr" target="_new">Welcome to the House of Horrors, welcome to the House of a 1000 Corpses</A>
(")_(")