AMD 4000+ or X2 Model?

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780
definitely go with a dual core model... for the cost, an X2 3800+ is an excellent price/performance CPU... and if you're into overclocking, it can in most cases easily reach the level of an X2 4600+ (2.4 GHz) or better

i would only suggest investing in a more expensive model if you prefer not to overclock, so the CPU comes at a higher clock speed out of the box


really though, any X2 model you get (regardless of speed) will make a more noticeable overall improvement than a single core upgrade would
 
I am running an AMD 3200+ and looking to upgrade. Should I go with the 4000+ or go to an X2 model?
If it's mostly for gaming, the X2 3800+ is no faster than your current 3200+. In single core games the 4000+ is a great CPU, topped only by Intel's E6600 through X6800 and AMD's FX-55, FX-57, FX-60, FX-62, and X2 5000+. While the 4000+ does get beaten by more than a few CPUs, its price for performance in gaming is unbeatable.

Is this an AM2 or a 939 board? Prices may not go much lower for the 939. On 939 I would suggest an X2 4600+, as that will most likely be the last CPU you buy for that mobo. AM2 is mostly going to see a new CPU design at the start of next year, so for that mobo buy cheap now or wait for the new CPU design.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780
yep, by default both the 3200+ and X2 3800+ come clocked at 2.0 GHz, so clock for clock they're the same...

...even for gaming, though, the performance difference is more negligible (especially if your GPU is taking most of the graphical burden anyhow)... a 4000+ has a higher default clock speed, yes (2.4 GHz)... but then you basically only have half the total CPU. Games coded for single core usage can only make direct use of a single core, but with 2 cores you automatically benefit from the extra core anyway, because all the other OS and process threads are offloaded from your already burdened core while you're gaming, resulting in more fluid gameplay in general

for any other application that isn't so GPU-dependent, though... no comparison really. Even when I/O throughput is the bottleneck (your HDD, for example), there's still better performance in the sense that your computer won't just be locked up in the meantime
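That responsiveness point can be sketched in miniature. This is a hypothetical Python example (not from the thread): a slow "disk" task runs in a background thread while the main thread stays free to keep working, instead of the whole program locking up until the task finishes.

```python
import threading
import time

results = {}

def slow_io(key, seconds):
    """Stand-in for a long disk read: blocks, then records its result."""
    time.sleep(seconds)
    results[key] = "done"

def main():
    worker = threading.Thread(target=slow_io, args=("load", 0.2))
    worker.start()
    ticks = 0
    while worker.is_alive():
        ticks += 1          # the main thread keeps servicing "the UI"
        time.sleep(0.01)
    worker.join()
    return ticks

if __name__ == "__main__":
    print(f"main thread got {main()} ticks in while the load ran")
```

On a single core the two threads still time-share, but a second core lets the blocking work and the interactive work truly run at once.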
 
Most gamers shut down all other processes, and mostly what's left are GPU drivers, anti-virus, a software firewall, and connection processing. Everything but connection processing and the GPU drivers is single threaded, so your ping may hold up better and things get a little smoother. The only difference I've seen with dual cores and GPUs is ATI CrossFire: due to load balancing issues, dual cores do better there even in single core gaming. I don't see much use for CrossFire or SLI on anything less than an X2 4600+, so the GPU question isn't that big.

There are several dual-core-patched games where the 4000+ comes up to about the 4200+ at a cheaper price, but the real problem with dual core gaming right now is DX9. Microsoft has done a poor job of making DX9 dual core compatible.
 

Eviltwin17

Distinguished
Feb 21, 2006
520
0
18,990
get an X2, single cores are going to be obsolete soon. Plus if you upgrade to one of the higher end X2s then you are good to go for a while, at least 3-4 years. Overclocking might help you too if you buy the 3800+ or 4400+ models.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

yeah, if you close down all applications other than the game itself (so there's no real interfering extraneous CPU usage), i can see where you wouldn't benefit as much from having a dual core, especially if the game is only coded for single core usage and your GPU isn't the bottleneck at all...

though without the second core, your overall OS and application experience is more open to mishaps (application stalls in particular), where you're stuck waiting on something to load and can't do much else until it's finished processing... since all the processes have to load sequentially, one after the other, on a single core, instead of loading in parallel on a multicore... which a multithreaded OS such as Windows can take advantage of, running things side by side to finish tasks that much quicker
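The sequential-versus-parallel point can be illustrated with a small sketch (a hypothetical Python example, not from the thread): the same batch of independent CPU-bound tasks is run one after another on a single worker, and then side by side via a process pool, where the OS schedules each worker process on whichever core is free.

```python
import multiprocessing
import time

def busy_sum(n):
    """A small CPU-bound task: sum the integers below n in a plain loop."""
    total = 0
    for i in range(n):
        total += i
    return total

def run_sequential(jobs):
    """One worker: tasks queue up one after the other."""
    return [busy_sum(n) for n in jobs]

def run_parallel(jobs):
    """A pool of worker processes: the OS can run the tasks side by side
    on separate cores."""
    with multiprocessing.Pool() as pool:
        return pool.map(busy_sum, jobs)

if __name__ == "__main__":
    jobs = [500_000] * 4
    t0 = time.perf_counter()
    seq = run_sequential(jobs)
    t1 = time.perf_counter()
    par = run_parallel(jobs)
    t2 = time.perf_counter()
    assert seq == par  # same answers either way; only the wall time differs
    print(f"sequential: {t1 - t0:.3f}s  parallel: {t2 - t1:.3f}s")
```

On a single core the two timings come out about the same; on a multicore the parallel run finishes in a fraction of the sequential time.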
 
True, but programs must be dual core compatible to be loaded on the second core. World of Warcraft is dual core compatible, meaning you can run two instances with a dual core CPU. The OS can't simply assign a single threaded program to the second core unless it's at least dual core compatible. True, the OS is multithreaded, but any time you leave other apps running on a gaming machine, your games are competing against other processes for RAM, HD, and CD-ROM, and that's the reason gamers shut them down.

I see how the X2 will be needed greatly next year, but currently on an AM2 mobo I would stay cheap until the new AMD CPUs come out. If it's a 939, go for the X2 4600+, or do as choirbass said: get an X2 3800+ and OC it to X2 4600+ levels.
 

Titaniumrsx

Distinguished
Feb 1, 2006
7
0
18,510
Thanks very much for the input, guys. I have a 939 Asus A8932-MVP Deluxe CrossFire board, an ATI 1800XT, and 2 GB of HyperX. I will go with an X2, looking at the 4400+ since it has 1 MB of cache vs 512 KB on the 3800+ and 4600+.
 
I think you can OC the X2 4400+ to almost X2 5000+ performance if you're into OCing. The X2 4400+ may be a hard CPU to find, as AMD is stopping production of all non-FX 1 MB cache CPUs. If you can find it, better get it quick.
 

jimw428

Distinguished
Jul 9, 2006
392
0
18,780
I just upgraded to the X2 4400+ from a 3200+ Venice, and couldn't be more pleased. I debated buying the X2 4600+ vs the X2 4400+ (at roughly the same price) but chose the 4400+ for its 2 x 1 MB cache as opposed to the 200 MHz speed advantage of the 4600+.

I believe I can get at least 200 MHz back with a conservative overclock, and 2 x 1 MB cache socket 939 CPUs will be gone from the product line once current inventories are sold.

I may live to regret my decision, but not so far. :)
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

i'm almost positive, but wouldn't making full independent use of either core simply be a result of CPU affinity? (by default, applications have an affinity for all cores)... if an application is coded to only take advantage of one core (with its CPU affinity assigned to both cores), it can run on both cores but not take full advantage of both simultaneously (which is what coding an application for multithreading across both cores would do, like you said... instead of running all its threads on one core, like most single core applications do)

i don't think what i just said made much sense now that i reread it... but the CPU affinity for a single-core-coded application will allow it to run across all available CPU cores, while only taking full advantage of one core at a time (which is why you sometimes need to change an application's CPU affinity to just one core, because problems can happen when an application bounces between cores)

i think an affinity spanning multiple cores allows the OS to manage CPU usage more efficiently... if one core is tied up completely by a single core application, it can distribute the additional CPU requirements from separate applications to the other core
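The affinity mechanism being discussed can be poked at directly. A minimal sketch, assuming Linux, where Python's `os` module exposes the scheduler's affinity calls (`sched_getaffinity`/`sched_setaffinity`); on Windows the same idea is Task Manager's "Set Affinity" dialog.

```python
import os

def allowed_cores(pid=0):
    """Return the set of cores the process may run on (pid 0 = this
    process). By default this is every core; the OS picks among them."""
    return os.sched_getaffinity(pid)

def pin_to_core(core, pid=0):
    """Restrict a process to a single core -- the equivalent of
    unchecking every CPU but one in Task Manager."""
    os.sched_setaffinity(pid, {core})

if __name__ == "__main__":
    default = allowed_cores()
    print("default affinity:", default)
    pin_to_core(min(default))
    print("after pinning:", allowed_cores())
    os.sched_setaffinity(0, default)  # restore the default mask
```

The default mask containing every core is exactly what lets the OS shuffle a single-threaded program onto whichever core is free.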
 
OSes are highly multithreaded, but they can't anticipate the needs of any given program. A single threaded program that isn't dual core compliant will not even recognize the second core and couldn't operate correctly if forced onto it.

Blizzard had to make World of Warcraft dual core compatible before you could run 2 instances. Before Blizzard updated WoW, the program wouldn't even see the second core; as far as it was concerned, there was none.

I moved from the 3500+ to the X2 4200+, and ping is the only thing that changed for me in both WoW and CS:S. I couldn't even run Winamp without lowering performance, but that's an HD issue.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

an application can't be forced to completely use both cores... BUT, by default, all applications can run on either core... if you open Task Manager and look at the CPU usage graphs, you'll see that when an application opens, it's probably only tying up one core, but it is capable of running on either one...

you're right, though: a single core application is not aware of the second core unless it's coded to span its threads across more than one core

if you change an application's affinity from using both CPU 0 and CPU 1 to only CPU 0 or CPU 1, all of a sudden the free core that wasn't being used becomes completely tied up, and your original core that was tied up is now completely free... it just switches which available core gets used (never more than 50% total CPU usage in a dual core setup, though, unless the application is multithreaded for both cores, in which case both cores can spike up to 100%)... but with an affinity assigned to both available cores, CPU usage is better balanced... so not everything is stuck on one core with the second core going unused the whole time.

anyhow, yeah...
 
Well, the 4000+ is a very fast chip, especially for gaming; my buddy has one with two 7800 GTXs and it's amazing. However, I'd go for an X2 3800+ or 4200+ before they stop making them, or before the prices go back up due to availability. This will pretty much be your last upgrade with 939.
I agree, and stated so in one of my last posts. Staying with a single core here mainly makes sense on the AM2 socket, where AMD is promising new CPUs by the end of this year or early next.

Two GPUs on a 4000+ was an OK idea, but it's now kind of a low-end CPU. An SLI setup really gains little from a dual core, unlike ATI's, so yes, that makes for a good gaming system.

The last good upgrade for the 939 will be the X2 4600+ or an OC'ed X2 3800+, as all non-FX 1 MB cache CPUs are to stop production. I purchased the X2 4200+ back when it was at a high but decent price point and wish I had waited.
 
True, but affinity really amounts to dual core compatibility, as that's what allows it. The OS can't switch a program on its own; the program must be programmed to allow it. The OS's main function is to allocate resources, and programs must ask for those resources. I wish an OS could dynamically switch a program from one core to another, as that would make hand-multithreading unnecessary: you'd only have to break programs up into smaller pieces and let the OS do all the multithreading work.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

well... in a sense then, due to the OS dynamically allocating CPU cycles between cores, you could honestly say multithreading an application is even more pointless, for that very reason...

the OS does use additional CPU cores when a single core application starts up, even if one core is already in use (so there are no CPU usage conflicts)... by default, all single core applications load on CPU 0, but if CPU 0 is already at 100% usage because of one application, where would a newly launched single core application go? The application itself isn't aware of the extra core, but the OS it's running on is... the application doesn't need to be coded specifically to allow that... the OS takes care of it, and that's where CPU affinities come into play

i'm sure we can agree, though, that a single-core-coded application can only effectively harness one core, and for it to harness the processing capability of more than one core, it needs to be programmed that way
 
True, as long as the single threaded code has been programmed to meet dual-core-compatible code requirements; otherwise the second core isn't even recognized. Dual-core-compatible code increases overhead, which may reduce performance to a small degree. Those penalties cause some game developers, who need to drain every ounce of performance, to opt out of complying.
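The "compliance overhead" being described is essentially the cost of synchronizing shared state. A hypothetical Python sketch (not from the thread): a counter that takes a lock on every increment stays correct when several threads hammer it at once, and that per-operation locking is exactly the kind of overhead single-threaded code doesn't pay.

```python
import threading

class SafeCounter:
    """Thread-safe counter: each increment acquires a lock, the
    per-operation price of being safe to run from multiple threads."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self, times):
        for _ in range(times):
            with self._lock:
                self.value += 1

def hammer(counter, n_threads, per_thread):
    """Run several threads against the counter; return the final count."""
    threads = [
        threading.Thread(target=counter.increment, args=(per_thread,))
        for _ in range(n_threads)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter.value

if __name__ == "__main__":
    print(hammer(SafeCounter(), 4, 10_000))  # exactly 40000, thanks to the lock
```

Drop the lock and the count can come out short under contention, which is why "just run it on both cores" isn't free: the code has to be written to tolerate it.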
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

well, all i can really do is give an example of a game that came out in the late '90s, made by the same company that made WoW... Diablo 2 (which i was running yesterday to test this, and again just now)... it was out long before dual core CPUs, which only came out last year... the game has no dual core support, but it complies with exactly what i was saying

the game itself does not see the second core (it's purely and strictly a single-core-coded game; as far as it's concerned, there is only one CPU core)... but the OS is capable of managing the CPU's cores to the extent of running the game on either core

if you have a dual core CPU, you should try it (with a different game if you don't have Diablo 2)

or, run a CPU benchmark pinned to only CPU 0 and let it run for a few minutes... while it's running, start a game (don't change its CPU affinity though)... it will automatically be using CPU 1 (even though it could use either one), and so you now have both CPU cores tied up... but the game isn't lagging at all, running perfectly smoothly, and your benchmark is running just the same too... it's the OS dynamically assigning CPU cores, is all

IF, however, you assign only CPU 0 to both the game and the benchmark (and uncheck CPU 1)... both will run slower... the game will be choppier, and the benchmark will take quite a while to finish
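The experiment above can be reproduced in code rather than with a game and a benchmark. A sketch, assuming Linux and at least two available cores: two CPU-bound workers are timed once while both are pinned to the same core (sharing it, like the choppy case) and once with a core each.

```python
import multiprocessing
import os
import time

def spin(args):
    """CPU-bound worker pinned to one core via its affinity mask."""
    core, n = args
    os.sched_setaffinity(0, {core})
    total = 0
    for i in range(n):
        total += i
    return total

def timed_pair(cores, n=2_000_000):
    """Run two workers on the given cores; return the wall-clock time."""
    start = time.perf_counter()
    with multiprocessing.Pool(2) as pool:
        pool.map(spin, [(cores[0], n), (cores[1], n)])
    return time.perf_counter() - start

if __name__ == "__main__":
    cores = sorted(os.sched_getaffinity(0))[:2]
    if len(cores) == 2:
        shared = timed_pair((cores[0], cores[0]))    # both fight over one core
        separate = timed_pair((cores[0], cores[1]))  # one worker per core
        print(f"same core: {shared:.2f}s  separate cores: {separate:.2f}s")
```

The same-core run takes roughly twice as long, which is the command-line version of the choppy game plus slow benchmark described above.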
 
I've played D2 for going on 6 years now, and it was patched in the summer of last year (v1.11). It's truly a great game, but pallys are too buffed. D2 really isn't a system-stressing game like CS:S, so it would make little difference whether it's compliant or not. Many new programming-language compilers do help create dual-core-compliant code, but they require recompiling the entire codebase and/or updates. In system-stressing games these options are usually turned off to increase performance.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

yeah, it is (it's my all-time favorite game, seconded by D1). i didn't play it much online before LoD was released, though... always made Sorcs and Zons online :)... didn't do nearly as well with any of the melee characters though :\... too much clicking and missing with them; i ended up running around everything i attacked, especially against other players in duels, lol
 
I loved the versions before 1.03 because the low level uniques had no level requirements. In version 1.07 I was number 7 on the ladder and played with the number 1 on the ladder, heartsnow. I've not played in about 4 months due to WoW, but I've kept my chars from expiring. I have about 6 accounts of ladder chars and about 16 accounts full of non-ladder. I wish D3 would come out soon. D1 was OK until the pickup dupe was discovered; then I started playing Warcraft 2 and C&C.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

i haven't played WoW at all due to the monthly fee... but when i played d2, almost religiously, i stocked up numerous accounts as well. Only one main account of actual characters, though (that i kept cycling through and deleting when i made more), and the rest of the accounts were just full of mules, since i spent most of my time doing Meph runs and rushing people with my 99 Sorc (yay lol, only had one level 99 though, the rest were all level 90-98s)... i didn't trust any mephbots or pindlebots for MFing either... but Mousepad's maphack was okay, aside from the fee he charged with 1.10

the problem was, i was very much addicted to the game... the only way i was able to stop completely was having all the items stolen from my accounts, several times (from trusting people with my password who i thought i could trust, but really they just wanted the equipment stored on the account)... so i made a lot of 'friends' that way, lol... the last time i had everything stolen, though, i just didn't feel like building everything up from scratch again, and i quit, lol

edit: as for D3 though... i'm honestly not sure there's going to be one, and at the least it won't be as good as the first 2 (if it does get released), because the original people who worked on both games are no longer there... they moved on to form a different company after 1.10 was released... they basically quit to start over again.
 

I never trusted anyone on D2, but WoW is another story. In WoW, if you trust someone and they cheat you, they get banned and the GMs give your stuff back. Hacks and item sales for real money are banned, as they mostly attract the good program hackers. The things I don't like about WoW are getting ganked in the middle of quests, getting ganked on my way to quests, and getting ganked on my way back to town to turn the quests in. There are so many quests I've not even done them all yet, and I've been playing for 4 months now.

The cost is high, I agree, and I wish it were lower, but you can get a 10-day free trial from a friend.
 

choirbass

Distinguished
Dec 14, 2005
1,586
0
19,780

hmmm... for the trial, do you need a credit card number to activate the trial membership, or can you just sign up without one? ...i don't have one myself, so if it requires one, that might be a problem... otherwise i think i might give it a try :)

edit: ...scratch that, i guess... just checked a download site, and it does require one to sign up... oh well