Athlon64 X2 3800+ feedback needed: I WANT one, but is it good??

Anonymous
August 2, 2005 5:20:42 AM

Archived from groups: comp.sys.ibm.pc.games.action

Athlon64 X2 3800+ feedback needed: I WANT one, but is it good??


What's the difference from the 4200? That's the one I wanted, but it was too
expensive. Why is the 3800 X2 less expensive?


Thanks.
August 2, 2005 5:20:43 AM

If cost is such an issue for you, just wait until the low-cost version is
released, which I believe is in September.

Dual core won't be all that useful in games until multi-threaded game
engines and multi-threaded graphics card drivers start appearing around Q2
2006 anyway. That's a full year away. By then Longhorn will be looming
also, probably forcing a major system upgrade cycle.

Save your money.

rms
Anonymous
August 2, 2005 5:20:43 AM

"No One Realy" <No one.com> wrote in message
news:6efse15ua60h1213i5mnl3c88ugji8oj59@4ax.com...
> Athlon64 X2 3800+ Feedback need I WANT one but is it good ??
>
>
> Whats the difference to the 4200. Which was the one i wanted but to
> expensive. Why is the 3800 X2 less expensive ?
>


It's apparently a very good CPU and very fast in single threaded apps too.
There's a good multi-page review here:

http://www.techreport.com/onearticle.x/8616

AMD came out with this processor in response to complaints that their other
dual-core offerings were too expensive.
Anonymous
August 2, 2005 9:17:56 AM

On Mon, 1 Aug 2005 18:30:32 -0600, "NightSky 421"
<nightsky421@yahoo.ca> wrote:

>"No One Realy" <No one.com> wrote in message
>news:6efse15ua60h1213i5mnl3c88ugji8oj59@4ax.com...
>> Athlon64 X2 3800+ Feedback need I WANT one but is it good ??
>>
>>
>> Whats the difference to the 4200. Which was the one i wanted but to
>> expensive. Why is the 3800 X2 less expensive ?
>>
>
>
>It's apparently a very good CPU and very fast in single threaded apps too.
>There's a good multi-page review here:
>
>http://www.techreport.com/onearticle.x/8616
>
>AMD came out with this processor in response to pressure that other dual
>core offerings coming from them were too expensive.
>
>


See also the extensive new review here:-

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...

A performance comparison between the X2 3800+ and the
Pentium D 830 ( not 820 !). Not at all flattering for the Intel
processor !!

And the X2 3800+ should overclock quite nicely with very reasonable
air-cooling. Plus it is a plug-in upgrade to most existing Socket 939
motherboards, with a BIOS change. Unlike the Pentium D, which
requires a new 775-pin motherboard, DDR-2 memory ( zero
speed advantage over DDR1 at the current memory-clock speeds)
and 50 to 100 watts of extra power for inferior dual-core-CPU
performance.

John Lewis

- Technology early-birds are flying guinea-pigs.
Anonymous
August 2, 2005 9:42:46 AM

The 3800 X2 looks slower in many games than the regular Athlon 64 3800.
If you are buying a dual core CPU, you are paying more money for power that
current games cannot use, "power" being the literal term here. The
processor is going to put out more heat, require more cooling, and require a
bigger power supply.

Developers had plenty of time to write code for hyperthreading with Intel
CPUs, and yet almost none did. Even the current Battlefield 2 doesn't use
multi-core CPUs or hyperthreading.
Anonymous
August 2, 2005 7:08:57 PM

Magnulus wrote:
> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
> If you are buying a dual core CPU, you are paying more money for power that
> current games cannot use- and "power" being the literal term here. The
> processor is going to put out more heat, require more cooling, and require a
> bigger power supply.
>
> Developers had plenty of time to write code for hypertheading with Intel
> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't use
> multi-core CPU's or hyperthreading.
>
>
So is all this 64-bit, multi-core stuff just a marketing gimmick? It seems
like it when it comes to games.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...
I'm looking at the benchmarks here but I don't understand what those
numbers are. In any case the dual core smokes the non-dual core in ICC
SYSMark 2004. But I'm rather skeptical; points are meaningless to me.
I need some other scale, like time for example.
Anonymous
August 2, 2005 7:39:28 PM

"Butterpants" <admin@gmail.com> wrote in message
news:bxPHe.4107$z91.338355@news20.bellglobal.com...
> So is all this 64-bit, multi-core just a marketing gimmick? It seems like
> it when it comes to games.

At this point only a few games support 64-bit processors. Not many. I can't
think of any games that really benefit from dual cores. An Athlon 3800
with dual cores is actually slower in games than an Athlon 3800 with a
single core.

>
> http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...
> I'm looking at the benchmarks here but I don't understand what those
> numbers are. In any case the dual core smokes the non-dual core in ICC
> SYSMark 2004.

For content creation, dual core is faster because those applications often
can use hyperthreading and multiple threads. But games don't run the same
way as those applications. The first games that use multi-threading might
not be out until well into next year. A dual-core game engine might take 2-3
times as long to program. Also, the gain is going to be limited
(i.e., the effect will not be 2x): reading Tim Sweeney's description of Unreal
3 on Anandtech, the second core gets used for animation and physics, whereas
the main bulk of the game still runs on one core, because you cannot
easily divide game logic between two cores. So the second core is
basically a glorified DSP performing all these other things, only there's no
API for developers to control it; they have to write the instructions
themselves. Maybe middleware will make this task easier in the future, and
perhaps this programming will primarily only have to be done at the engine
level.
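That division of labor can be sketched in a few lines. This is a hypothetical Python illustration of the pattern described, not code from any actual engine: the main thread keeps running the hard-to-divide game logic while a worker handles the physics step, and the two join before the frame is drawn.

```python
# Hypothetical sketch of the "second core as helper" pattern described
# above: sequential game logic stays on the main thread, while the
# easily parallel physics step is farmed out to a worker.
from concurrent.futures import ThreadPoolExecutor

def game_logic(state):
    # Sequential logic: hard to divide between two cores.
    return {"score": state["score"] + 1}

def physics_step(positions, velocities, dt):
    # Independent per-object math: easy to offload to a second core.
    return [p + v * dt for p, v in zip(positions, velocities)]

def run_frame(state, positions, velocities, dt=0.016):
    with ThreadPoolExecutor(max_workers=1) as pool:
        physics = pool.submit(physics_step, positions, velocities, dt)
        new_state = game_logic(state)     # main thread keeps working
        new_positions = physics.result()  # join before rendering
    return new_state, new_positions

state, pos = run_frame({"score": 0}, [0.0, 2.0], [1.0, -1.0], dt=0.5)
# state -> {'score': 1}; pos -> [0.5, 1.5]
```

(In CPython the two threads share one interpreter lock, so this only shows the structure; a real engine would use native threads on separate cores.)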
Anonymous
August 2, 2005 11:26:45 PM

On Tue, 2 Aug 2005 05:42:46 -0400, "Magnulus" <magnulus@bellsouth.net>
wrote:

> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
>If you are buying a dual core CPU, you are paying more money for power that
>current games cannot use- and "power" being the literal term here.


> The
>processor is going to put out more heat,

Wrong

> require more cooling,

Wrong

> and require a
>bigger power supply.
>

Wrong,

.........in the case of the AMD X2 processors (90nm, SOI process)
when compared to the current (130nm) version of the single-core
equivalent running at the same clock speed.

Plug-in replacement (after a BIOS update) on most Socket 939
motherboards. No need to upgrade the power supply or cooling.
------------------------------------
However, your assertion is perfectly true in the case of all the Intel
dual-core processors, besides their requiring a new motherboard and
DDR2 memory: ~60 watts of extra power for the P-D 820, with up
to 110 watts extra for the Extreme 840 --- with 80% of the extra power
being demanded from the +12V rail.
----------------------------------
See the X2 3800+ reviews starting with:-

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...

> Developers had plenty of time to write code for hypertheading with Intel
>CPU's, and yet almost none did.

HT is fake dual-core, with limited processor resources. HT only accelerates
multithreaded applications by 10-20% MAX, and in some cases actually
reduces performance.

> Even the current Battlefield 2 doesn't use
>multi-core CPU's or hyperthreading.
>
>

Hmmm, nVidia is in the process of upgrading their unified video driver
to take advantage of the multithread capability of the dual-core
processors. An excellent start. And some popular games may offer
multithread-upgrade patches, (in a similar way to the 64-bit patch
for Far Cry) -- an opportunity to try out multithread improvements
to their game-engines on a wide audience. I would not be at all
surprised to see a BF2 multi-thread patch towards the end of the year,
by which time a substantial number of dual-core systems will have
been deployed. For example, carving out a chunk of multithread for
the Single-player AI processing in BF2 should help alleviate the
intermittent stutters when the (single-core) CPU bogs down
trying to simultaneously handle the graphics data-setup and the
Singleplayer AI.

Any so-called 'enthusiast' building a 'performance' PC system today
would be an idiot if he/she did not actively consider dual-core ---
specifically one of AMD's offerings.

John Lewis
Anonymous
August 3, 2005 12:12:20 AM

"Magnulus" <magnulus@bellsouth.net> wrote in message
news:%cHHe.6165$hp.4358@bignews4.bellsouth.net...
> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
> If you are buying a dual core CPU, you are paying more money for power
> that current games cannot use- and "power" being the literal term here.
> The processor is going to put out more heat, require more cooling, and
> require a bigger power supply.
>
> Developers had plenty of time to write code for hypertheading with Intel
> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't
> use multi-core CPU's or hyperthreading.
>

Exactly. I don't know why everyone thinks they *NEED* an X2 processor. I
agree it's the "next big thing", but until there are multi-threaded apps and
games, there's no point really. Even if you are ready for a new system now,
you're better off buying a powerhouse Athlon64 FX chip and then bumping up
to an X2 proc in a couple of years, when support should be commonplace.
Anonymous
August 3, 2005 4:07:14 AM

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42efbff5.4789544@news.verizon.net...
> ........in the case of the AMD X2 processors ( 90nm, SOI process)
> when compared to the current (130nm) version of the single-core
> equivalent running at the same clock-speed.

That's apples to oranges; you aren't comparing chips with the same
process. And an Athlon 3800 is not exactly a low-power chip anyway.

>
> Plug-in replacement ( after BIOS update ) on most Socket-939
> motherboards. No need to upgrade power-supply or cooling.

So you think a 430 watt power supply will hack it? Somehow, I don't
think so. Especially when you factor in that every other component in a PC is
consuming more power now, too. The Creative X-Fi is going to use a lot more
power than most ordinary sound cards, for instance. But I'd rather have my
power going to something I will use, rather than something I won't. And I
don't see dual core processors being at all useful for a year or more,
not for games anyway, though they might have productivity benefits.

>
> Any so-called 'enthusiast' building a 'performance' PC system today
> would be an idiot if he/she did not actively consider dual-core ---
> specifically one of AMD's offerings.

I considered dual core, but I decided to go with the Socket 939
motherboard and a regular Athlon 64 just for the potential to upgrade to
dual core and dual channel memory. At the moment I think dual core is a
non-feature, though, even worse than 64-bit support. If you are into
productivity apps, you might benefit from dual core, but even then most
people running typical office apps aren't going to see any real improvement.
Anonymous
August 3, 2005 6:19:07 AM

On Tue, 2 Aug 2005 20:12:20 -0400, "HockeyTownUSA" <cyberpilot at
gmail dot com> wrote:

>
>"Magnulus" <magnulus@bellsouth.net> wrote in message
>news:%cHHe.6165$hp.4358@bignews4.bellsouth.net...
>> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
>> If you are buying a dual core CPU, you are paying more money for power
>> that current games cannot use- and "power" being the literal term here.
>> The processor is going to put out more heat, require more cooling, and
>> require a bigger power supply.
>>
>> Developers had plenty of time to write code for hypertheading with Intel
>> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't
>> use multi-core CPU's or hyperthreading.
>>
>
>Exactly. I don't know why everyone thinks they *NEED* an x2 processor. I
>agree its the "next big thing" but until there are multi-threaded apps and
>games, there's no point really. Even if you are ready for a new system now
>you're better off buying a powerhouse Athlon64 FX chip and then bumping up
>to an x2 proc in a couple years when support should be commonplace.
>

You are going to be upset when Nvidia releases their multithreaded
video driver currently in development, and several multithread PC
patches become available for popular action games... (including BF2,
maybe)..... as a real live test for upgrading the game engines for the
new multithreaded consoles. Sinking many $$$ into an FX chip as an
upgrade now or in the immediate future is a really bad idea, even for
the enthusiast gamer... IMHO.

John Lewis
Anonymous
August 3, 2005 11:02:22 AM

Most people aren't doing video editing, though. Most people are using
Office or Outlook Express, or maybe ripping an MP3. I don't see how dual
cores will really benefit those people.

About as close as I get to actual work on a PC, working with graphics, is
creating textures in Paint Shop Pro. And currently a single core is more
than fast enough for that.
Anonymous
August 3, 2005 12:44:08 PM

On Wed, 3 Aug 2005 00:07:14 -0400, "Magnulus" <magnulus@bellsouth.net>
wrote:

>
>"John Lewis" <john.dsl@verizon.net> wrote in message
>news:42efbff5.4789544@news.verizon.net...
>> ........in the case of the AMD X2 processors ( 90nm, SOI process)
>> when compared to the current (130nm) version of the single-core
>> equivalent running at the same clock-speed.
>
> That's apples to oranges, you aren't comparing chips with the same
>process. And an Athlon 3800 is not exactly a low power chip anyways.
>
>>
>> Plug-in replacement ( after BIOS update ) on most Socket-939
>> motherboards. No need to upgrade power-supply or cooling.
>
> So you think a 430 watt power supply will hack it?

Yes, of course, if it's an AMD dual-core, you are not running SLI, and the
total +12V PS capacity is 22 amps or greater, as expected from a
quality 430 watt supply.

> Somehow, I don't
>think so. Especially when you factor in every other component is consuming
>more power in a PC now, too. The Creative X-Fi is going to use alot more
>power than most ordinary sound cards, for instance.

Really ?? More than 10 watts extra and I would be very surprised
indeed. Audio processing takes little power, regardless of complexity.

> But I'd rather have my
>power going to something I will use, rather than something I won't. And I
>don't see dual core processors at all being useful for over a year or more,
>not for games anyways, though they might have productivity benefits.
>

Lots..........

>>
>> Any so-called 'enthusiast' building a 'performance' PC system today
>> would be an idiot if he/she did not actively consider dual-core ---
>> specifically one of AMD's offerings.
>
> I considered dual core, but I decided to go with the Socket 939
>motherboard and a regular Athlon 64 just for the potential to upgrade to
>dual core and dual channel memory.

Don't you have a dual-channel pair of DIMMs installed already, even
with your single-core Athlon? Otherwise you are strangling the
processor to half its potential memory bandwidth, and your gaming
performance will suffer significantly.

> At the moment I think dual core in a
>non-feature, though, even worse than 64-bit support. If you are into
>productivity apps, you might benefit from dual core, but even then most
>people running typical office apps aren't going to see any real improvement.
>

Incorrect. Especially when manipulating several programs in multiple
windows. You see far less of the evil hour-glass and sticky windows.
A very pleasant improvement.

John Lewis
>
Anonymous
August 3, 2005 12:44:09 PM

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42f07eda.2645443@news.verizon.net...

> Really ?? More than 10 watts extra and I would be very surprised
> indeed. Audio processing takes little power, regardless of complexity.

The X-Fi has 51 million transistors running at 400 MHz. The Audigy 2 has
less than 5 million transistors running at 200 MHz, and its power
consumption was only a few watts, typical of sound cards. The new X-Fi is on
a 0.13 micron process, but I don't think that lowers the power consumption
all that much. There is a rumor that the PCI version, the one that is
expected to be the mainstream one, might require a Molex connection to the
power supply. It will be interesting to see how they pull this off, since
Creative will most likely opt for a fanless design; it might be the first
mainstream sound card with a heatsink.

> Don't you have a dual-channel pair of DIMMS installed already even
> with your single-core Athlon ? Otherwise you are strangling the
> processor to half the potential memory bandwidth and your gaming
> performance will suffer significantly.

All the benchmarks I have seen have shown minimal benefit from dual channel
memory in games right now. Currently I have a 1 GB DIMM, and I will upgrade
when the price of memory falls more. The Intel chips seem to benefit a lot
more from dual channel memory. I suspect, though, that all that extra
memory bandwidth should give the Socket 939 platform a long lifespan, and it
should scale with faster processors.

> Incorrect. Especially when manipulating several programs in multiple
> windows. You see far less of the evil hour-glass and sticky windows.
> A very pleasant improvement.

But that really has no bearing on games. And most people aren't running
the kinds of applications where it makes that much difference. My dad's
computer, for instance, is slow as hell, but that's mostly because he
probably has spyware and a registry that needs serious cleaning (or a Windows
reinstallation).
Anonymous
August 23, 2005 2:39:33 AM

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42eeff8f.37243797@news.verizon.net...
> On Mon, 1 Aug 2005 18:30:32 -0600, "NightSky 421"
> <nightsky421@yahoo.ca> wrote:
>
>>"No One Realy" <No one.com> wrote in message
>>news:6efse15ua60h1213i5mnl3c88ugji8oj59@4ax.com...
>>> Athlon64 X2 3800+ Feedback need I WANT one but is it good ??
>>>
>>>
>>> Whats the difference to the 4200. Which was the one i wanted but to
>>> expensive. Why is the 3800 X2 less expensive ?
>>>
>>
>>
>>It's apparently a very good CPU and very fast in single threaded apps too.
>>There's a good multi-page review here:
>>
>>http://www.techreport.com/onearticle.x/8616
>>
>>AMD came out with this processor in response to pressure that other dual
>>core offerings coming from them were too expensive.
>>
>>
>
>
> See also the extensive new review here:-
>
> http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...
>
> A performance comparison between the X2 3800+ and the
> Pentium D 830 ( not 820 !). Not at all flattering for the Intel
> processor !!
>
> And the X2 3800+ should overclock quite nicely with very reasonable
> air-cooling. Plus it is a plug-in upgrade to most existing Socket 939
> motherboards, with a BIOS change. Unlike the Pentium D, which
> requires a new 775-pin motherboard, DDR-2 memory ( zero
> speed advantage over DDR1 at the current memory-clock speeds)
> and 50 to 100 watts of extra power for inferior dual-core-CPU
> performance.
>
> John Lewis
>
> - Technology early-birds are flying guinea-pigs.

Hiya John...
I've made meself into a flying guinea-pig, I guess :) 

My first Athlon; haven't had an AMD chip since the K6-2 350 :)  I can see
they've got better since then. Just to the right of me on the desk, I have
a newly built box. It has a GA-K8NS Ultra-939 nF3 Ultra mobo, 2 x 1 GB DDR400
(Corsair), a 6800GT (AGP), and an A64 X2 4800+, running under either XP Pro
(standard) or XP x64.
There is ONE program that doesn't scream... and that is 3DMark05.
For once, the 6800GT is giving its all! :) 
I hear Magnulus is running x64?
McG.
Anonymous
August 23, 2005 2:42:34 AM

"Magnulus" <magnulus@bellsouth.net> wrote in message
news:%cHHe.6165$hp.4358@bignews4.bellsouth.net...
> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
> If you are buying a dual core CPU, you are paying more money for power
> that current games cannot use- and "power" being the literal term here.
> The processor is going to put out more heat, require more cooling, and
> require a bigger power supply.
>
> Developers had plenty of time to write code for hypertheading with Intel
> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't
> use multi-core CPU's or hyperthreading.
>
What temps does the A64 3800+ run at?

I have an X2 4800+ running right now at 1.38 V, CPU fan at 2000 RPM, and the
CPU temp is 41C.
McG.
Anonymous
August 23, 2005 2:49:05 AM

"Butterpants" <admin@gmail.com> wrote in message
news:bxPHe.4107$z91.338355@news20.bellglobal.com...
> Magnulus wrote:
>> The 3800 x2 looks slower in many games than the regular Athlon 64
>> 3800. If you are buying a dual core CPU, you are paying more money for
>> power that current games cannot use- and "power" being the literal term
>> here. The processor is going to put out more heat, require more cooling,
>> and require a bigger power supply.
>>
>> Developers had plenty of time to write code for hypertheading with
>> Intel CPU's, and yet almost none did. Even the current Battlefield 2
>> doesn't use multi-core CPU's or hyperthreading.
> So is all this 64-bit, multi-core just a marketing gimmick? It seems like
> it when it comes to games.

I just built an X2 4800+ rig this weekend. The only games I have installed are
Far Cry 64-bit and Doom 3. Far Cry is astounding (but a tad flaky; some
keypresses reset the machine in SP but not in the MP maps). Doom 3 with all
the goodies, in Ultra quality mode, at 1024x768, is a truly fantastic looking
and playing game. At 1280x1024 it's jerky and the mouse is mushy. Gotta fiddle
with something, it seems. But with Far Cry 64, I just maxed everything out
and played it.
I can say that so far, this machine with the X2 really bangs on my P4 3.0E
with 2 gigs of RAM. Both with the 6800GT.
McG.
>
> http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=248...
> I'm looking at the benchmarks here but I don't understand what those
> numbers are. In any case the dual core smokes the non-dual core in ICC
> SYSMark 2004. But I'm rather sepctical, points are meaningless to me. I
> need some other scale like time for example.
Anonymous
August 23, 2005 3:00:18 AM

"Magnulus" <magnulus@bellsouth.net> wrote in message
news:h1YHe.4119$jq.1549@bignews3.bellsouth.net...
>
> "John Lewis" <john.dsl@verizon.net> wrote in message
> news:42efbff5.4789544@news.verizon.net...
>> ........in the case of the AMD X2 processors ( 90nm, SOI process)
>> when compared to the current (130nm) version of the single-core
>> equivalent running at the same clock-speed.
>
> That's apples to oranges, you aren't comparing chips with the same
> process. And an Athlon 3800 is not exactly a low power chip anyways.
>
>>
>> Plug-in replacement ( after BIOS update ) on most Socket-939
>> motherboards. No need to upgrade power-supply or cooling.
>
> So you think a 430 watt power supply will hack it?
I'll be seeing soon enough. Mine is running on a Thermaltake PurePower 420W
right now. I'll pick up a good 550W ps though. I'll be adding a couple
more drives to this one.


> Somehow, I don't think so. Especially when you factor in every other
> component is consuming more power in a PC now, too. The Creative X-Fi is
> going to use alot more power than most ordinary sound cards, for instance.
> But I'd rather have my power going to something I will use, rather than
> something I won't. And I don't see dual core processors at all being
> useful for over a year or more, not for games anyways, though they might
> have productivity benefits.
>
>>
>> Any so-called 'enthusiast' building a 'performance' PC system today
>> would be an idiot if he/she did not actively consider dual-core ---
>> specifically one of AMD's offerings.
>
> I considered dual core, but I decided to go with the Socket 939
> motherboard and a regular Athlon 64 just for the potential to upgrade to
> dual core and dual channel memory. At the moment I think dual core in a
> non-feature, though, even worse than 64-bit support. If you are into
> productivity apps, you might benefit from dual core, but even then most
> people running typical office apps aren't going to see any real
> improvement.
>
The 64-bit dual core CPUs are bleeding-edge right now. There isn't
much 64-bit support for Windows x64. There are NO current drivers for my
printer, scanner or pen tablet. So I dual boot to standard XP Pro for that
stuff, and boot to XP x64 for horsing around with the games 'n stuff.
I remember when XP was released retail, lots of manufacturers were busy the
first year fixing driver problems.
But some manufacturers aren't going to fix their drivers for 'non-
professional hardware' to work with a 64-bit OS. Wacom is one of those.
Nvidia and ATI are at the front of the bus there :)  Looks like this is
where 77.77 is *good*... as 64-bit :) 
McG.
Anonymous
August 23, 2005 3:07:04 AM

"HockeyTownUSA" <cyberpilot at gmail dot com> wrote in message
news:VbednXdB7L3Tl23fRVn-2Q@comcast.com...
>
> "Magnulus" <magnulus@bellsouth.net> wrote in message
> news:%cHHe.6165$hp.4358@bignews4.bellsouth.net...
>> The 3800 x2 looks slower in many games than the regular Athlon 64 3800.
>> If you are buying a dual core CPU, you are paying more money for power
>> that current games cannot use- and "power" being the literal term here.
>> The processor is going to put out more heat, require more cooling, and
>> require a bigger power supply.
>>
>> Developers had plenty of time to write code for hypertheading with Intel
>> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't
>> use multi-core CPU's or hyperthreading.
>>
>
> Exactly. I don't know why everyone thinks they *NEED* an x2 processor. I
> agree its the "next big thing" but until there are multi-threaded apps and
> games, there's no point really. Even if you are ready for a new system now
> you're better off buying a powerhouse Athlon64 FX chip and then bumping up
> to an x2 proc in a couple years when support should be commonplace.
>
Well right, HT. It's cutting edge stuff, and only someone who really wants
to play with the stuff (and has a spare grand for the CPU alone sitting
around!) is likely to nudge a parts snagger into getting one for him.
Um... like me! I feel like you guys; it'll be at least another year before
we begin seeing some significant support for this stuff. 64-bit is
struggling right now, don't even think about dual core... unless someone
really, really starts pushing them!
If I'd been using good sense instead of letting the drool factor take
control... I'd have picked up an A64 4000+ instead of the X2 4800+. The
4000 would probably be faster in most things right now. I think.
McG.
Anonymous
August 23, 2005 3:10:25 AM

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42f02755.31257393@news.verizon.net...
> On Tue, 2 Aug 2005 20:12:20 -0400, "HockeyTownUSA" <cyberpilot at
> gmail dot com> wrote:
>
>>
>>"Magnulus" <magnulus@bellsouth.net> wrote in message
>>news:%cHHe.6165$hp.4358@bignews4.bellsouth.net...
>>> The 3800 x2 looks slower in many games than the regular Athlon 64
>>> 3800.
>>> If you are buying a dual core CPU, you are paying more money for power
>>> that current games cannot use- and "power" being the literal term here.
>>> The processor is going to put out more heat, require more cooling, and
>>> require a bigger power supply.
>>>
>>> Developers had plenty of time to write code for hypertheading with
>>> Intel
>>> CPU's, and yet almost none did. Even the current Battlefield 2 doesn't
>>> use multi-core CPU's or hyperthreading.
>>>
>>
>>Exactly. I don't know why everyone thinks they *NEED* an x2 processor. I
>>agree its the "next big thing" but until there are multi-threaded apps and
>>games, there's no point really. Even if you are ready for a new system now
>>you're better off buying a powerhouse Athlon64 FX chip and then bumping up
>>to an x2 proc in a couple years when support should be commonplace.
>>
>
> You are going to be upset when Nvidia release their multithreaded
> video driver currently in development and several multithread PC
> patches become available for popular action games... ( including BF2
> maybe)..... as a real-live test for upgrading the game-engines for the
> new multithreaded consoles. Sinking many $$$ in a FX chip as an
> upgrade now or in the immediate future is a really bad idea even for
> the enthusiast gamer...IMHO.
>
> John Lewis

Uh, John... you do know that the X2 4800+ is a dual FX-53 core at 90nm,
right? Toledo! LOL! There's no doubt about it, it's fast :)  I'm not at
all sure why 3DMark05 returned such lousy numbers. It was slower than my P4
3.0E, both rigs using the 6800GT and 77.77 drivers. Unless 3DM05 doesn't
agree with x64?
McG.
Anonymous
August 23, 2005 3:16:18 AM

"Magnulus" <magnulus@bellsouth.net> wrote in message
news:Gt1Ie.6404$jq.2228@bignews3.bellsouth.net...
> Most people aren't doing video editing, though. Most people are using
> Office or Outlook Express, or maybe ripping an MP3. I don't see how dual
> cores will really benefit those people.
>
> About as close as I get to actual work on a PC, working with graphics, is
> creating textures in Paint Shop Pro. And currently a single core is more
> than fast enough for that.
>
Yeah, but you oughtta see how fast PSP9 opens up. Then one second for each
additional instance of it. I usually run three to work in, and sometimes
another to go looking for stuff... browser.
Now is when we need the fastest disk drives.
McG.
Anonymous
August 23, 2005 7:36:14 AM

Thusly "McGrandpa" <McGrandpaNOT@NOThotmail.com> Spake Unto All:

>Yeah but you oughtta see how fast PSP9 opens up.

The big advantage of dual core isn't that things run faster - as a
rule they won't - but that it improves multitasking. It is probably
true that most people don't run a lot of apps in parallel at present,
but IMHO that's in no small part because single-CPU systems discourage
parallelism & multitasking. Once dual cores are common, hopefully people
will learn to multitask & run multiple apps at the same time.

For me, I look forward to the day when I get my dual core and a
CPU-intensive Photoshop action, a Norton full disk scan, or a hung
program doesn't stall Windows Explorer.
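A rough Python sketch of that effect, with made-up job names standing in for the Photoshop action and Explorer: two worker processes play the role of two cores, so the short job returns without queuing behind the heavy one.

```python
# Sketch of the multitasking benefit described above: with two worker
# processes (two "cores"), a quick task completes without waiting
# behind a long CPU-bound one.
from multiprocessing import Pool

def busy_sum(n):
    # Stand-in for a CPU-intensive job.
    return sum(range(n))

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        heavy = pool.apply_async(busy_sum, (20_000_000,))  # "Photoshop action"
        quick = pool.apply_async(busy_sum, (10,))          # "Explorer"
        print(quick.get())  # 45 -- ready while the heavy job may still run
        print(heavy.get())
```

On one core the same two jobs would time-slice, so the quick one feels sluggish; with two, it never waits.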
Anonymous
August 23, 2005 7:57:46 AM

"McGrandpa" wrote:
> if I'd been using good sense instead of letting the drool factor take
> control... I'd have picked up a A64 4000+ instead of the X2 4800+. The
> 4000 would probably be faster in most things right now. I think.
> McG.

Look at it this way, McGrandpa.

You, Sir, are READY!

Mark

P.S.: Gimme, please. :-)
Anonymous
August 26, 2005 12:12:33 AM

Archived from groups: comp.sys.ibm.pc.games.action (More info?)

Windows 64 bites. I feel so burned that I bought it. Lots of drivers for
stuff wouldn't work (I had to use Broadcom drivers instead of Linksys
wireless drivers, and I lost out on the 125 Mbps transfer rate). The last
straw was when I went out and bought a Creative X-Fi, and of course it did
not include drivers for x64. So I reinstalled Windows XP, activated it
(amazingly had no problems doing so), and now everything runs.

Windows x64 is for "enthusiasts", but I think more than a few gamers were
fooled into believing that means "gamers". In actual fact, Windows x64 does
have some legitimate applications (Star Wars Episode III was done with
Windows x64). But the average gamer doesn't need the ability to address
over 4 gigabytes of memory, which as far as I could tell, is the only real
benefit to Windows x64. Sure, there is a marginal speed increase in certain
non-gaming applications, but it's not worth it. Windows x64 is today's
Windows 2000.
August 26, 2005 1:08:03 AM

Archived from groups: comp.sys.ibm.pc.games.action (More info?)

"Magnulus" <Magnulus@bellsouth.net> wrote in message
news:g6tPe.330$2_.136@bignews6.bellsouth.net...
> But the average gamer doesn't need the ability to address over 4 gigabytes
> of memory, which as far as I could tell, is the only real benefit to
> Windows x64.

Besides memory, x64 doubles the number of general-purpose registers and can
do 64-bit integer math in a single register. Under a 32-bit OS the
registers are 4 bytes, so an 8-byte integer has to be split across two
registers, and at the machine code level the compiler plays tricks of
chopping it in half, carrying bits between the halves, etc. (Doubles
aren't really the problem - the x87 FPU has always handled 8-byte floats
natively.) So x64 can do the same work in fewer machine instructions,
which is an increase in speed for applications (or parts of applications)
doing lots of wide arithmetic.
Anonymous
August 26, 2005 5:45:24 AM

Archived from groups: comp.sys.ibm.pc.games.action (More info?)

"Magnulus" <Magnulus@bellsouth.net> wrote in message
news:g6tPe.330$2_.136@bignews6.bellsouth.net...
> Windows 64 bites. I feel so burned I bought it. Lots of drivers for
> stuff wouldn't work (I had to use Broadcom drivers instead of Linksys
> wireless drivers, and I lost out on the 125 mbps transfer rate). The last
> straw was I went out and bought a Creative X-Fi, and of course it did not
> include drivers for x64. So I reinstalled Windows XP, activated it
> (amazingly had no problems doing so), and now everything runs.
>
> Windows x64 is for "enthusiasts", but I think more than a few gamers were
> fooled into believing that's "gamers". In actual fact, Windows x64 does
> have some legitimate applications (Episode III Star Wars was done with
> Windows x64). But the average gamer doesn't need the ability to address
> over 4 gigabytes of memory, which as far as I could tell, is the only real
> benefit to Windows x64. Sure, there is a marginal speed increase in
> certain non-gaming applications, but it's not worth it. Windows x64 is
> today's Windows 2000.
>
Agreed about it being an "enthusiasts OS" for the moment. I discovered very
fast that common peripheral drivers for 64 bit mostly don't exist. Some
tech support types tell me (politely) not to expect 64 bit drivers to be
written for non-professional hardware :)  Ha! Like my printer and scanner
can't be 'professional'? The Graphire3 6x8 pen tablet? Hmmm. Methinks
tunes will change, or x64 will simply disappear. But with Linux already
supporting 64-bit well, I don't think M$ will bow out of this one at all.
They'll just take a couple years to catch up :) 
Things that do go faster: Far Cry 64!!!!! Even Doom3.
McG.
August 26, 2005 5:45:25 AM

Archived from groups: comp.sys.ibm.pc.games.action (More info?)

The reality is that MS is consumer-testing Windows x64. In a few years,
when a large percentage of machines are 64-bit and most software is written
in 64-bit code as well as 32-bit code, a second or third version of 64-bit
Windows will take full advantage of the hardware, and there will be huge
increases in speed and stability.

JK

"McGrandpa" <McGrandpaNOT@NOThotmail.com> wrote in message
news:UuuPe.247986$X76.152251@tornado.texas.rr.com...
>
> "Magnulus" <Magnulus@bellsouth.net> wrote in message
> news:g6tPe.330$2_.136@bignews6.bellsouth.net...
>> Windows 64 bites. I feel so burned I bought it. Lots of drivers for
>> stuff wouldn't work (I had to use Broadcom drivers instead of Linksys
>> wireless drivers, and I lost out on the 125 mbps transfer rate). The
>> last straw was I went out and bought a Creative X-Fi, and of course it
>> did not include drivers for x64. So I reinstalled Windows XP, activated
>> it (amazingly had no problems doing so), and now everything runs.
>>
>> Windows x64 is for "enthusiasts", but I think more than a few gamers
>> were fooled into believing that's "gamers". In actual fact, Windows x64
>> does have some legitimate applications (Episode III Star Wars was done
>> with Windows x64). But the average gamer doesn't need the ability to
>> address over 4 gigabytes of memory, which as far as I could tell, is the
>> only real benefit to Windows x64. Sure, there is a marginal speed
>> increase in certain non-gaming applications, but it's not worth it.
>> Windows x64 is today's Windows 2000.
>>
> Agreed about it being an "enthusiasts OS" for the moment. I discovered
> very fast that common peripheral drivers for 64 bit mostly don't exist.
> Some tech support types tell me (politely) not to expect 64 bit drivers to
> be written for non-professional hardware :)  Ha! Like my printer and
> scanner can't be 'professional'? The Graphire3 6x8 pen tablet? Hmmm.
> Methinks tunes will change, or x64 will simply disappear. But with Linux
> already supporting 64-bit well, I don't think M$ will bow out of this one
> at all. They'll just take a couple years to catch up :) 
> Things that do go faster: Far Cry 64!!!!! Even Doom3.
> McG.
>
Anonymous
September 1, 2005 7:03:02 AM

Archived from groups: comp.sys.ibm.pc.games.action (More info?)

Then everyone will program for multicore. Then everything will slow
down as much as it does for single-core CPUs and we're back where we started.

bahahah.


On Tue, 23 Aug 2005 03:36:14 +0200, Mean_Chlorine
<mike_noren2002@NOSPAMyahoo.co.uk> wrote:

>Thusly "McGrandpa" <McGrandpaNOT@NOThotmail.com> Spake Unto All:
>
>>Yeah but you oughtta see how fast PSP9 opens up.
>
>The big advantage of dual core isn't that things run faster - as a
>rule they won't - but that it improves multitasking. It is probably
>true that most people don't run a lot of apps in parallel at present,
>but IMHO that's in no small part because single-CPU systems discourage
>parallelism & multitasking. Once dual cores are common, hopefully people
>will learn to multitask & run multiple apps at the same time.
>
>For me, I look forward to the day when I get my dual core and a
>CPU-intensive Photoshop action, a Norton full disk scan, or a hung
>program doesn't stall Windows Explorer.
>