Upgrading an eMachines Computer for Gaming

McGirk242
May 18, 2013
Hey There!

I've found a somewhat aged eMachines ET1331-40e PC that my dad bought at Costco some time ago, before three consecutive moves across the continent and the country. I'm interested in seeing what I can do to bring the ol' girl back up to speed.
Here's what we've got for specs:
* Windows 7 Home Premium 64 Bit on 500GB HDD
* AMD Athlon II X2 220 Dual Core CPU
* Nvidia GeForce 6150SE GPU
* 4GB DDR2 RAM

I know the PSU is most likely crap, so what are my upgrade options there? Will any regular PSU fit in it, or do I need a special OEM unit?

What about the CPU? Depending on the Motherboard, what's my best bet for upgrading the sucker to the best it can handle?

I know the GPU is about as good at playing games as winter is at heating you up. IT DOESN'T.
So what are my best bets? A low-profile Radeon? Or can I get by with one of those shiny new, super-efficient Nvidia GTX 750s?

And the RAM? What's the highest I can take it? 8GB is what I'm thinking, but I'm not really sure.

And if this doesn't work out, could I just take out the HDD and use it for a later build, with the OS along with it?

Since I live in Texas (the Houston area, specifically), I have good access to stores, but ordering online would be preferable. The Old Man does not enjoy driving into the city. Not. One. Bit.

I'll check the BIOS tomorrow to find the motherboard model and post further information.

Thank You!

 
The fact that it's still running DDR2 RAM and has an old, slow dual core...

She's on her last legs, and quite honestly, I'm not sure it's worth trying to get that last mile out of her... parts that old are going to be more expensive, since they're harder to find now. That means you could save up just a little more than you would spend on this rig and get a brand-new one.

You can definitely use the hard drive as a data drive, but I would want regular backups. (I don't ever trust hard drives, especially older ones.) Even if you got an SSD to put Windows on, you should be able to look up your OEM Windows key and use that to reinstall Windows 7 on the SSD.
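
(Side note on "look up your OEM Windows key": on Windows 7-era machines the locally installed key lives in the registry as the binary DigitalProductId value, and the base-24 decoding below is the widely documented scheme that utilities like ProduKey implement. This is a minimal, Windows-only sketch, not an official Microsoft API; be aware that big-OEM machines often store the factory SLP key there rather than the COA sticker key on the case.)

# Windows-only sketch: decode the locally stored Windows 7 product key
# from the registry. Base-24 scheme as commonly documented; not an
# official Microsoft API, so treat the byte offsets as convention.
import winreg

def decode_key(dpid: bytes) -> str:
    chars = "BCDFGHJKMPQRTVWXY2346789"
    raw = list(dpid[52:67])          # key material sits in bytes 52..66
    out = []
    for _ in range(25):              # a product key is 25 base-24 digits
        acc = 0
        for j in range(14, -1, -1):  # long division of the 15-byte integer
            acc = acc * 256 + raw[j]
            raw[j], acc = divmod(acc, 24)
        out.append(chars[acc])
    key = "".join(reversed(out))
    return "-".join(key[i:i + 5] for i in range(0, 25, 5))

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                    r"SOFTWARE\Microsoft\Windows NT\CurrentVersion") as k:
    dpid, _ = winreg.QueryValueEx(k, "DigitalProductId")
print(decode_key(bytes(dpid)))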
 
A) NO. The system is way outdated and behind, and the cost of parts would be about the same as, or MORE than, just buying a new computer.
B) NO, you can't reuse the Windows 7 license. eMachines, Gateway, Acer, whatever company made it, all discounted the 'cost' of the OS by buying licenses in huge bulk. The trade-off was that those licenses are ONLY allowed with that specific hardware (case, mobo, CPU) and will refuse to boot if used on any other system.
C) Honestly, there's no reason to keep the HDD, since normal new systems (as noted in A) come with 600GB to 1TB drives out the door. Further, current drives use SATA III, whereas that old computer and old HDD are probably SATA II, which would bottleneck your 'new' computer if you reused it.

Suggested systems: check www.slickdeals.net for an i5, Win8, 1TB HDD, 8GB, DVD desktop for around $400; if you spring for a little more, an i7 is only $549 or so (Lenovo and Acer run constant 'sales' through Staples). Grab a PSU from www.PCPartPicker.com (600W for $90), then the video card you want ($149-$499), and you'd have 1080p at around 60FPS, with settings on High (i5) to Ultra (i7) for any title, on the cheap.
 


...dude, seriously?

You can reuse an OEM key, provided that you do it by reinstalling a regular copy of Windows and using that key.

Your "C" point is even more ridiculous. First of all, even if new systems DID all come with a 1TB drive, why in the world wouldn't you keep the 500GB hard drive to give yourself 1.5TB of storage? On top of that, there is NO MECHANICAL DRIVE out there that would be bottlenecked by a SATA II connection. Even a RAID 0 setup would be barely any faster on SATA III than it would on SATA II. The drives read and write much slower than the 3Gb/s or 6Gb/s transfer speeds of the SATA connection.

As for the suggested systems, are you kidding me? An i7 isn't any better than an i5 for gaming, a DVD drive isn't a selling point, and those computers are absolutely horrible.

You come across as though you're shilling for the sales website, while at the same time not having a clue what you're talking about. I'm sorry, but that's the way it is.
 


http://download.microsoft.com/download/6/A/1/6A1647EE-3FC7-47F2-9AFE-470AD5E5D856/OEMSoftwareLicensingRulesandRestrictions.pdf

"OEM Software may NOT be transferred to another machine. Even if the original laptop, PC or Server is no longer in use, or if the software is removed from the original hardware, OEM licenses are tied to the device on which the software is first installed."

YES, seriously. Per Microsoft, per Dell, Gateway, eMachines (which is Gateway, btw), etc., for all editions of Windows up until Windows 8; with Windows 8 the policy changed. You are thinking of the OEM SYSTEM BUILDER edition, which works the same as an off-the-shelf copy of Windows from Best Buy/Walmart/etc. and is allowed on ANY one machine per copy of Windows.
And yes, in 20 years of doing this I have moved a Dell (et al.) drive to a 'different' (CPU, mobo, case) computer, and even from one model to a different model made by the same company (but again, CPU, mobo, and case were different), turned on the PC, and Windows wouldn't load; in some cases the BIOS itself will post a message that the wrong OS is installed, depending on the maker and such.



WOW, are you wrong here. First off, the read and write speed is based on the RPM of the drive, so if you're talking about a 5400 RPM drive, then no, neither SATA II nor SATA III will change the speed of the head reads and writes; but it would be STUPID to suggest buying a 5400 RPM drive anyway, since everyone EXPECTS 7200 RPM, which WOULD be impacted (bottlenecked) by a SATA II connection compared to SATA III (be it on the HDD's own controller or the mobo chipset) as data is passed between the HDD cache and RAM. The 7200 RPM drive would read or write at the speed of the spinning platter and moving head, but would be bottlenecked if the data it wants to pass along, or is waiting to receive, is choked by a smaller bandwidth.
THIS IS ESPECIALLY IMPORTANT ON NEW DRIVES USING 4K SECTORS INSTEAD OF THE OLD 512-BYTE FORMATTING.

That said, enabling RAID 0 on a SATA II mobo does NOT increase performance past the 3Gb/s limitation, nor does adding a SATA II drive to a RAID 0 SATA III mobo increase the HDD's performance beyond 3Gb/s. If the connection is SATA II, it is stuck at that maximum bandwidth for data passing from HDD cache to RAM or from RAM to HDD cache. SATA III doubles the bandwidth, so more data flows through without being bottlenecked waiting in RAM or the HDD cache to be passed one way or the other.

Honestly, there is no reason for consumers to use RAID, unless they want RAID 3 or better for immediate swap-out and failover support, like those running gaming home servers for their clans, so they have zero downtime and instant hot-swap recovery.



UHM, like, wow, get a grip here. Yes, these are not 'gamer's rigs' using the latest Gigabyte performance mobo and Killer-brand RAM, but they aren't "absolutely horrible" in the sense of the design specs being shoddy, low-end equipment. Yes, I've seen some crappy things occasionally thrown into both 'homebrew' and 'prebuilt' systems, like the epidemic of bad, low-end PSUs that have flooded the market, where even old 'reliable' makers have had QA problems. So let's not toss one group around here over the other; let's be honest.

As we are talking builds, I was pointing out that for the PRICE you get the hardware normally needed (yes, DVD drives are still needed; many people still have games only on DVD). So listing the main components ALREADY INCLUDED IN THE PRICE isn't "shilling for the sales website" (where the hell you're coming up with that, I don't know); it is informing the OP: hey, here is a cheap, EASY solution you MIGHT want to consider, since all you're asking about is "Upgrading an eMachines Computer for Gaming".

Since the OP's question shows he is not a learned homebrew veteran like yourself, nor a skilled and educated 20-year tech like myself, nor as highly qualified as the Mod, but an average consumer, a simple Joe who doesn't know what AMD or Intel is, much less what a CPU is, I offered something that would actually answer the question. So please don't be insulting in a biased tone over a simple, easy, targeted answer.

As for " i7 isn't any better than an i5 for gaming", you are wrong in the GENERAL BLANKET answer sense, but may be right in SPECIFIC single games you wish to point as evidence of your statement. In the GENERAL statement, YES a i7 is better, DUH, you talking 8 'cores' over 6, so that presents 2 more to process with, uhmm two extra workers to do the same work ALWAYS performs better then LESS workers, simple math there. Secondly, if you check almost every 'benchmark' for a video card, the 'standard' platform for testing is a i7, not i5, so obviously the professionals (i.e. get paid to test, evaluate then provide you/me/everyone the results) seems to think your wrong as well. Third, while I agree at this moment TODAY, why would I suggest a FPS of 94 for i7 in BF4 over a score of 89 for i5 when we can't see more then 60FPS, because I am not suggesting to the OP to buy a system JUST to play today's titles. If the OP is spending $700 or more for a new system (as we all are telling him to do) this is a investment the OP expects YEARS of the EXACT SAME PERFORMANCE as today (right or wrong it is the expectation) with the EXACT SAME SETTINGS no matter the game title. Why would I tell him to spend just enough for today's demands / game titles, then in 2016 (two years form now) or later on seeing the performance being impacted because he has a i5 as compared to a i7, especially when we are talking only $100 or so difference on the INVESTMENT. The $100 extra paid today will save him 3-4 yrs from now having to spend more then $100 to get the performance expected IF he had first invested in a i7 over a i5. How can I say that? Well simple, just see current titles benchmarks and review a i7-3xxx, i7-2xxx as compared to a i5-3xxx, i5-2xxx etc. The i7's still score better NOW then the 'marginal difference' when that 3xxx or 2xxx was first tested with that generation of games when those chipset were released.

For value, nominal cost, and 'future-proofing', never mind that the i7s always lead the benchmarks, it makes the most sense, and it was offered as an OPTION; I DID include an i5 as a choice as well. Your response, by comparison, is unwarranted and biased in tone.
 
You're still proving that you have no idea what you're talking about... Setting aside chips such as the $600 i7-3930K, neither an i5 NOR an i7 has six or eight cores. A desktop i5 is a quad core. A consumer-level i7 is an i5 with 3MB more L3 cache and Hyper-Threading. Big. Freaking. Whoop.
There are only a small handful of games that even support Hyper-Threading, and of those, some perform WORSE with Hyper-Threading on than with it disabled. The reason an i7 is the standard for bench testers is that they do a LOT of things beyond gaming, such as straight number crunching, rendering, or, say, video editing so they can get their bench results online faster.

You also claim that the human eye can't see more than 60FPS, which is a myth that has been around for years and been debunked for just as long. Nobody in 2014 should seriously be claiming that anymore, especially given the prevalence of 120Hz monitors.

The original poster is in fact reasonably proficient with technology, as you would see if you had read the original post rather than just the title. He understands that his power supply is likely to be shoddy and knows what parts do what... he's also obviously comfortable working inside computers and happy to swap out not only the RAM and GPU but also the CPU. He asked if he could save the hard drive and use it in a new system... that shows he's capable of building a rig himself and getting far better performance and reliability for the same money.

I don't know why you think gaming rigs require Gigabyte motherboards or Killer-branded RAM... but no, prebuilts often shave money off the parts that matter most for reliability, such as the power supply and motherboard. Yes, you can upgrade those later, but why not just build one yourself that's reliable from the start?

Your information on hard drives is, simply put, wrong. No hard drive, not even a 10,000 RPM VelociRaptor labeled "SATA III!", will saturate the bandwidth of a SATA II connection by itself. In addition, your original point is still ridiculous: why in the world should the OP get rid of perfectly usable storage just because it doesn't happen to carry the newest label? Also, if there is "no reason" for consumers to use RAID, why have tech enthusiasts been relying on it for far greater speed for decades?

As for transferring the OEM key, you still aren't bothering to read what's in front of you. I'm not thinking of the System Builder OEM; I'm thinking of a factory OEM install. You do not transfer the drive. You look up your OEM key and then reinstall Windows on the new computer using that key. You may then have to call Microsoft and explain the situation, but 90% of the time they'll activate it quite happily.


I might not have your 20 years of experience sitting at a help desk, but that doesn't say much anyway. When you're diagnosing computers it's a different story, I'll grant you that, but the technology field changes far too rapidly for anything other than recently gained experience to be of much help. Your views reflect ideas that were popular some three to five years ago and have since been disproven. I understand that you're just trying to help the OP, but much of your advice is no longer applicable.
 
I am breaking my response into two parts.

Regarding FPS and Hz.



Well, this took a while to research to prove or disprove, but that said: first off, you're completely off base, and you lose all credibility when you try to equate FPS to Hz. They are not the same thing, nor equal, but that is the second part of what I will address. First, to the point of human perception of visual stimulus: this seems to be another of those long-running arguments on the web, with everyone throwing in OPINION and citing a BLOG as a source rather than an official, legitimate, accountable source. I, on the other hand, spent the time to find some credible sources, and I believe the best example revolves around The Hobbit being released at 48FPS (see below for more about FPS and Hz).

So first, to quote APPLE in their own documentation for Final Cut Pro:
http://documentation.apple.com/en/finalcutpro/usermanual/index.html#chapter=D%26section=3%26tasks=true

"The limit of human perception: There is no reason to show more frames per second than the viewer can perceive. The exact limit of human motion perception is still up for scientific debate, but it is generally agreed that there is an upper threshold after which people can’t appreciate the difference."

In the case of film that number was 24FPS; for broadcast video, 30FPS; in games, the magic number has been 60FPS. Past that point, "people can't appreciate the difference", i.e., it all looks the same.

To discuss the issue further, as pointed out here http://movieline.com/2012/12/14/hobbit-high-frame-rate-science-48-frames-per-second/ about The Hobbit:
" scientists and researchers in the field of consciousness perception say that the human brain perceives reality at a rate somewhere between 24 fps and 48 fps — 40 conscious moments per second, to be more exact — and exceeding the limit of the brain’s speed of cognition beyond the sweet spot that connotes realism is where Jackson & Co. get into trouble. Movieline spoke with filmmaker James Kerwin, who lectured on the subject of the science of film perception and consciousness at the University of Arizona’s Center for Consciousness Studies. (His presentation included an analysis of the work of Dr. Stuart Hameroff and British cosmologist/philosopher Roger Penrose, and their quantum theory of consciousness.) According to Kerwin, there really is a simple scientific answer for why The Hobbit’s 48 fps presentation plays so poorly with some viewers — and it's not something we'll get used to over time.
HOW OUR BRAINS PERCEIVE REALITY
James Kerwin: “Studies seem to show that most humans see about 66 frames per second — that’s how we see reality through our eyes, and our brains. So you would think that 48 frames per second is sufficiently below that — that it would look very different from reality. But what people aren’t taking into account is the fact that although we see 66 frames per second, neuroscientists and consciousness researchers are starting to realize that we’re only consciously aware of 40 moments per second.”"

Finally, there is the University of Leicester and the Swift Gamma-Ray Burst Mission (yeah, I think people there MIGHT know something about visual detection), with Michael F. Deering of the very small, meaningless (sic) Sun Microsystems, providing an actual scientific basis for the discussion: http://www.swift.ac.uk/about/files/vision.pdf

". Assuming 60 Hz stereo display with a depth complexity of 6, it was estimated a rendering rate of approximately ten billion triangles per second is sufficient to saturate the human visual system"

Now, to discuss the matter of FPS and Hz and how you're incorrect, let's start with the father of film, Thomas Edison. When film was invented, it was determined over time that for a single frame of film to be perceived as different from the next (aka motion), frames had to be presented at a steady rate: a group of the same SINGLE frame, then the next group containing the second frame, and so on. This rate was the number of frames per second displayed, which was and has been 24FPS, standard film processing.

Much later, with the advent of broadcast TV and video, the cathode ray tube was created to DISPLAY an image, and it worked ONLY at 60Hz, which is to say that 60 times per second the ENTIRE SCREEN is "repainted" with the SOURCE image. These are very important words: DISPLAY and SOURCE. HOW the screen got repainted came in two methods, progressive and interlaced. Common TV used interlaced, where the CRT would scan rows 1, 3, 5, etc. from the top of the screen down to the bottom, then start over at rows 2, 4, 6, etc. This was evident on many B&W screens when you saw the 'rolling' effect of every other row bar across the screen. Progressive was better, as it instead drew rows 1, 2, 3, 4, etc. in sequence from top to bottom, which produced a much better image, especially for things moving on the screen.

The problem remained that the source (24FPS on FILM, though now many people film at 30FPS) could NOT be displayed directly on a 60Hz screen, as there wasn't enough 'film' to fill the display, so frames had to be repeated: 30FPS is simply doubled, and 24FPS frames are shown alternately 3 and 2 times each (the classic 3:2 pulldown), adding 12 extra repeats to reach 60. So the SOURCE video is still honestly 24/30FPS, but the DISPLAY runs at 60Hz (>>> screen refreshes, NOT frames of picture <<<), and Source and Display match up through that repetition of the SOURCE frames. Wait! I know of CRTs displaying 75Hz or 100Hz! YEP, and in that case 30FPS would be copied 3x, plus 10 extra copies of the same frames (3x30=90, +10=100), to UPSCALE the SOURCE to DISPLAY properly at 100Hz. But it still doesn't change that the SOURCE is still 30FPS, for example, and we're still talking all ANALOG signaling here.
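
To make that frame-repetition arithmetic concrete, here is a small sketch of mapping a source frame rate onto a display refresh rate by repeating frames (the helper name is illustrative; the 24-to-60 case is the classic 3:2 pulldown):

# Map a source frame rate onto a display refresh rate by repeating frames.
# 24 FPS -> 60 Hz is classic 3:2 pulldown: frames are shown alternately
# 3 and 2 refreshes each, so 12*(3+2) = 60 refreshes carry 24 frames.

def pulldown_pattern(source_fps: int, display_hz: int) -> list[int]:
    """Repeat count for each source frame within one second."""
    base, extra = divmod(display_hz, source_fps)
    counts = [base] * source_fps
    for i in range(extra):               # spread the extra repeats evenly
        counts[i * source_fps // extra] += 1
    return counts

print(pulldown_pattern(24, 60))   # alternating 3s and 2s, sums to 60
print(pulldown_pattern(30, 60))   # plain doubling: all 2s
print(pulldown_pattern(30, 100))  # 3x each, plus 10 extra repeats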

Here comes along LCD: completely different, and DIGITAL. Instead of the progressive or interlaced cathode-ray approach, it starts at pixel 0,0 and goes until it hits the bottom-right last pixel (all dependent on the screen size and resolution) to DISPLAY the whole image at one time. As the basis of the technology was the CRT, Hz was still used to describe how it displays, especially as the source was still film and video at 24/30FPS. We had to deal with the letterbox issue and so on as the new tech was worked through the old model's paradigm, until all the industries finally got on board and we arrived at the common 720p, 1080p, HDTV, etc. digital film and video SOURCES, changed over from analog.

Along the way, the LCD was 'stuck' at 60Hz as the base model for how things displayed (we did it for CRT for a day and age and it worked then, so it should work on your newfangled digital screenabobs). While the SOURCE was being changed from ANALOG to DIGITAL, not much could be done, but once the SOURCE was DIGITAL only, things changed on the LCD side, which now promotes 120Hz, 240Hz, 4K, etc. displays, and filming at 40-48FPS, producing the "soap opera effect" for film and video sources (a quick article on it here: http://www.cnet.com/news/what-is-the-soap-opera-effect/ ). The effect (which is where this discussion now sits) is the smoothing of motion RATHER THAN the 'blurring' effect (even when blur is programmed INTO the video/film in post-production) that makes the brain believe the video is more 'realistic', like our normal view of the world. For example, if you wave your hand quickly left and right in front of you, it moves smoothly; it doesn't 'blur'. But in a movie it would blur just enough for your brain to pick it up and realize it isn't real life moving.

This now brings us to video games, with that as our foundation. The computer video SOURCE (your GPU, CPU, etc., all working together) needs to produce above 30FPS for effective motion, because under 30FPS you pick up the lag and jerky motion in the game. The race on the SOURCE side has been to push FPS higher and higher, past the magic 60FPS and now into the 90FPS range, where the majority of people (going back again to Apple) "can't appreciate the difference." Now hook the LCD to that SOURCE: the LCD can display only 60Hz, 100Hz, or 120Hz, NO MATTER whether the SOURCE is sending 1FPS or 1000FPS; only 60, 100, or 120 screen refreshes (Hz) will be DISPLAYED on the screen.

So if you hit lag in a game and drop from 100FPS to 50FPS, are YOU saying the screen also drops to 50Hz? NO. Conversely, if your system is only capable of SOURCING (creating) 50FPS and you plug it into a 100Hz screen, that doesn't make your game 'increase' to 100FPS either.
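
As a toy illustration of that source/display decoupling, here is a deliberately simplified model (no v-sync or frame pacing, an assumption made for clarity) of a fixed-refresh display showing whatever frame the renderer finished most recently:

# Toy model: a display refreshing at a fixed rate shows whichever frame
# the renderer most recently completed. Rendering faster than the refresh
# rate cannot put extra frames on screen; rendering slower repeats frames.

def displayed_frames(render_fps: float, display_hz: float) -> int:
    """Distinct source frames that actually reach the screen in 1 second."""
    shown = set()
    for r in range(int(display_hz)):
        t = r / display_hz                 # time of this refresh
        shown.add(int(t * render_fps))     # latest frame finished by t
    return len(shown)

print(displayed_frames(1000, 60))  # -> 60: capped by the display
print(displayed_frames(50, 100))   # -> 50: capped by the renderer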
 
Apr 24, 2014
Hun, the PC is just too old, and those old eMachines have special OEM OSes that cannot be reused in another machine by MS rules. I mean, you can, but MS doesn't want you to.

I would do what Tom has suggested and check out Slickdeals. I get my suggestions from there also.

I picked up a nice Asus rig for $700 from Newegg, and it rocks the town and plays most games at max or ultimate resolution. I used Windows 7 Pro.

I really would not waste your time on the old eMachines; just eBay it, LOL.

Thank you for your time.

Claudia
 
The rest of the points and counterpoints.



Yes, the i7-3960X has 6 cores and 12 threads, and it appears AND PERFORMS as 12 cores because Hyper-Threading allows each core to process 2x by splitting up the work: http://ark.intel.com/products/63697

But the rest of the desktop PCs, CONSUMER grade (Staples, pick any), are all around the normal i7-4770, http://www.staples.com/i7/directory_i7?fids=142810&rpp=18&pn=1&sr=true , which has 4 cores and 8 threads, so it appears to, and performs for, the OS as 8 cores: http://ark.intel.com/products/75122/Intel-Core-i7-4770-Processor-8M-Cache-up-to-3_90-GHz

Consumer desktop i5s, here http://www.extremetech.com/deals/169285-et-deals-529-for-hp-pavilion-500-haswell-desktop
and here http://configure.us.dell.com/dellstore/config.aspx?oc=fddnrn1359&cs=19&dgvcode=ss&c=US&l=EN&dgc=SS&cid=274440&lid=5138339&acd=12309196895274941 , are all around the i5-4430/4440, which has 4 cores and 4 threads, so it appears to, and performs for, the OS as 4 cores: http://ark.intel.com/products/75036/intel-core-i5-4430-processor-6m-cache-up-to-3_20-ghz

BUT an i5 can be EITHER 2 or 4 cores and still present 4 threads:
http://www.intel.com/content/www/us/en/processors/core/core-i5-processor.html
http://www.techpowerup.com/reviews/Intel/Core_i5_4670K_and_i7_4770K_Comparison/

So your statement "NOR an i7 have six or eight cores" is incorrect as you even pointed to the i7-3930 And I pointed to the 3960, both are 6 CORE, but the latter is 12 threads (which is really 12 cores processing), "A desktop i5 is a quad core." is NOT correct as some are actual 2, but they all perform as 4 Threads, and yes I am talking "consumer level a i7 is (NOT) an i5 with 3MB more l3 cashe, and hyperthreading", there is much more including more cores. So YOU apparently throwing BLANKET statements around to justify your harassment doesn't fly when they are ALL incorrect, as proven.



HOLD IT! You're tossing stuff around and mixing it together when we are discussing different things, AND it has NOTHING to do with Hyper-Threading. Hyper-Threading is the technique Intel developed to keep multiple cores from 'getting in the way' of each other when trying to grab or pass on processed data, and to optimize each core to such an extent that it literally performs TWO processes at the same time INside each core. This allows PARALLEL processing of data, rather than the SERIAL processing AMD does. AMD kept adding more and more 'cores' (the FX-4xxx was quad, the FX-6xxx has 6 cores, the FX-8xxx has 8 cores), but as more cores were added they didn't have an optimized threading solution, as everything is performed in serial (think of a straight line from A to B as serial, while parallel is A to B and C to D right next to each other).

When we talk about gaming, NORMAL games tend to be single-THREADED applications, so under the serial AMD method the other cores tend to wait in line for a turn to grab code, wait to release code, etc. Whereas the parallel Intel Core model allows 2x the data to be grabbed by a single core, and as there is no wait, the next core does the same thing, and so on, basically doubling or more the performance while using fewer cores than AMD can match with 'true cores'. When an application is actually optimized for MULTIPLE threads (like dual cores), dedicated cores do improve performance for serial processing, but Intel's newer generations triumph over it with even more optimization. In fact, the newest Haswell chips increased the micro-op throughput more than ever before, pushing Haswell Core performance past even the highest-end FX chip available (surprisingly).

Now, IF AMD had not walked away from the fight (the FX chips are over 3 years old, http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors , compared to Haswell in 2013) and had instead come up with a BIOS update for manufacturers plus a driver change that would even partially parallelize the entire line of current and previously sold chips (say, an FX-4xxx with 4 cores presenting 6 threads), it would drastically change the marketscape and put a serious problem in Intel's dominant path. Or, conversely, if code were developed (a 'wrapper') to take non-multithreaded applications and 'smartly' divide the processing among the cores, AMD would again prevail a lot better than Intel, as the Core chips all have 'half' the cores of the AMD line.



Uhm, NO, and totally FALSE. The 'benchmark' systems are ONLY used for benchmarking and are kept under strict control. WHY? Because any other use can skew later benchmarks and cause FALSE results (as in the recent Samsung case: http://bgr.com/2014/03/05/samsung-benchmark-cheating-ends/). Outfits such as Ziff-Davis (owner of CNET, et al.), Tom's Hardware, CNN Tech, etc. all rely on accuracy and honesty, because if they don't, they are LIABLE for false reporting or for bias toward one product or maker over another, and they honestly don't want multimillion-dollar lawsuits. CPU Magazine, Tom's Hardware, etc. all provide a disclaimer on exactly HOW they performed their benchmarks, built the systems, and ran the testing, to ensure "journalistic integrity". Now, IF you're referring to MadR33fer's l33t YouTube 'benchmark videos' as your 'proof', I would direct you to the infamous phrase "THE CAKE IS A LIE". Any person can say whatever BS they want on a blog (personal log), a YouTube video, or even a forum like this. People with CREDIBILITY back it up with THIRD-PARTY evidence, such as the numerous citations I am providing here. That proves my point, especially when my sources ARE DIRECTLY FROM THE MAKERS OF THE STUFF. Not just overinflated trolling opinion spew.




Honestly, I was replying to your assertion. While you may crave self-built systems (as many people do), many people do not have the time, money, much less the patience to go all 'selfie' mode. Hence the second option of a prebuilt. Now, as for 'reliability', I have seen the same things happen prebuilt OR self-built; it doesn't matter. If a manufacturer makes mistakes, those parts aren't "sold only in prebuilts"; they are sold however they can be. The BEST assurance of the MOST LIKELY reliability is to buy from the most reputable (and, normally, the most expensive) makers (hence my examples), whose sales utterly RELY, make or break, on that rep. If they screw up, they can't shrug it off; they lose to competitors. Others (like MANY PSU makers) don't care; as long as they push enough sales out, they take the profits and screw the people who get the 'bad ones'. Usually that's because overseas markets carry less liability (i.e., the FCC/FTC/etc. can't touch them to stop them), so they can get away with it. I've heard it's so bad that they'll open a factory in China on one side of the street until they get caught, then just move down the block under a new name and do the whole thing over again, sweatshop-style, raking in the money: shut down, move, make more money.



http://www.tomshardware.com/charts/hdd-charts-2012/compare,2900.html?prod%5B5531%5D=on
Actually, the VelociRaptor 1TB 3.5" hard drive does exceed the max bandwidth of SATA II (300MB/s), with a score of 400.80.

Second to that, I would add that a lot of consumers (as the sales numbers out there show) add or switch to SSDs, which would be bottlenecked by a SATA II interface (killing the very reason to get an SSD).

Third, MOST drives now are 4K formatted, NOT 512-byte (see here if that's unfamiliar: http://www.bit-tech.net/hardware/storage/2010/04/01/the-facts-4k-advanced-format-hard-disks/1 ), and they physically read the platter differently, in case you haven't been paying attention (http://en.wikipedia.org/wiki/Disk_read-and-write_head):
"Perpendicular magnetic recording (PMR)
During the same time frame a transition to perpendicular magnetic recording is occurring (PMR), in which for reasons of improved stability and higher areal density potential, the traditional in-plane orientation of magnetization in the disk is being changed to a perpendicular orientation. This has major implications for the write process and the write head structure, as well as for the design of the magnetic disk media or hard disk platter, less directly so for the read sensor of the magnetic head."
This is not compatible or usable without a BIOS change, IF one even exists for the mobo; normally the solution is to upgrade to a 'current' mobo, aka SATA III controllers.
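
As an aside, if you want to verify a drive's sector sizes and 4 KiB partition alignment yourself, here is a minimal Linux-only sketch reading the standard sysfs attributes (the device name is a placeholder; on Windows, fsutil reports the equivalent sector information):

# Linux-only sketch: read a disk's logical/physical sector sizes from sysfs
# and check the first partition's offset for 4 KiB alignment.
# The device name is a placeholder; adjust for your system.
from pathlib import Path

disk = "sda"  # placeholder
base = Path(f"/sys/block/{disk}")

logical = int((base / "queue/logical_block_size").read_text())
physical = int((base / "queue/physical_block_size").read_text())
print(f"logical sector: {logical} B, physical sector: {physical} B")
if logical == 512 and physical == 4096:
    print("512e Advanced Format drive: partitions should be 4 KiB aligned.")

start = int((base / f"{disk}1/start").read_text())  # in 512-byte units
offset = start * 512
print(f"partition 1 starts at byte {offset}: "
      f"{'aligned' if offset % 4096 == 0 else 'MISALIGNED'} to 4 KiB")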

Fourth, as this is an OLD system, the warranty on the drive (as I believe I posted before) is long expired, and the normal 'lifespan' each manufacturer quotes is based on 40 hours per week of constant usage for 3 years (a normal business environment). So past that point, or if you repeatedly exceed those hours, the PHYSICAL hardware itself is no longer 'guaranteed to perform' as advertised (i.e., we expect it to break, DUH!). So if the OP takes this drive and 'relies' on it, he is adding a higher risk on an old, used piece of hardware that, per the manufacturer, may fail at any time. As an example from my OWN experience: on my Alienware M17xR2, I finally had to chuck one of my 500GB drives because, after these past 3 years of use (though the manufacturing date is 4 years back), it finally crapped out.

Fifth, DIDN'T you just say "prebuilts often shave off money on the parts that matter most for reliability"? So if you derail my suggestion for that reason, WHY do you then push him to take the eMachines (prebuilt) hard drive ("often shave off money on the parts") and rely on it ("most for reliability")? You conflict with YOUR OWN WORDS.

Lastly, as for RAID and "tech enthusiasts been relying on it for far greater speed for decades": incorrect. They have been relying on the faster RESPONSE in how the data is stored, compared to normal non-RAID, traditional in-plane-orientation mechanical drives using 512-byte-formatted, common 5400 RPM spindles, which were the 'standard' for off-the-shelf systems. As all of this has changed (how the data is stored, the sector size it is stored in, how the data is read, etc.), consumers haven't 'seen' a difference these past years, especially now with SSDs as the norm for systems. As you noted, "tech enthusiasts"; put it another way, a car tuner who builds his own engine would push the same argument when he or she sees all the Priuses and Hondas at the gas station on 'stock' settings. Enthusiasts take risks; the average OP just wants a simple answer.



POT, meet KETTLE. I am calling you on it here, because you are spouting "you still aren't bothering to read what's in front of you" when it is your own words you're not reading. First off, you say "why in the world should the OP get rid of perfectly usable storage", which has what on it? Windows, already installed. That is why I pointed out that just moving the drive to a new build wouldn't work, to which you respond "You do not transfer the drive." WTH, MAN? Get your story straight; you're talking out of both sides of your head here.

NO, Microsoft will NOT issue a 'new key' NOR activate it "happily" when you explain, "I just moved my eMachines OEM copy of Windows to a new computer." FIRST, it is NOT an activation issue (you're thinking of old XP before SP1); Windows WILL NOT INSTALL: the CD is inserted, you BOOT from the CD, it says you need to insert a proper DISK, and you can't install. So it isn't just an 'activation key problem'. Secondly, you did not read this, PER MICROSOFT (which I provided to you):

OEM Software may NOT be transferred to another machine. Even if the original laptop, PC or Server is no longer in use, or if the software is removed from the original hardware, OEM licenses are tied to the device on which the software is first installed

SO: IF the OEM software is NOT installed on eMachines model XYZ with mobo model ABC and CPU YYY, then IT WILL NOT INSTALL AND IS NOT PERMITTED. It doesn't matter how you "call Microsoft and explain the situation"; the problem stays exactly the same.

As you DETEST prebuilts, YOU are obviously NOT using a PREBUILT OEM Windows, which means you're using the OEM SYSTEM BUILDER edition, which is NOT limited this way, just like an off-the-shelf copy of Windows from Walmart/Best Buy/etc. With the OEM System Builder edition OR an off-the-shelf copy, YES, you can certainly "call Microsoft and explain the situation, they'll activate it just happily" through the automated system, or speak to a person who will run THE AUTOMATED SYSTEM with you. Can you understand it the SECOND time I say the same thing?



Again, you FAIL. You don't know shit about me, and you're trying to disparage a career (and that means the Mod and others here who have experience like mine, thank you) as "that doesn't say much anyways." 20 years of NOT sitting at a help desk ("1-800-IamanIdiot-Helpme"). No. I do hands-on work, remote work, designs, networks, and a whole lot more, never mind all my education to understand that there is more to 'tech' than 'tech': there is business, customer needs, expectations, ways to phrase things, etc. And no, it isn't "I did computers 20 years AGO"; I have BEEN doing computers since 1984, from a VIC-20 and PET computers to today's tablets running Windows 8, Android, Chrome, etc., and much more. These are not ideas "that were popular some three to five years ago". All I see is you being an ass right now, spewing things you honestly do NOT understand, while I am spoon-feeding you not only the CORRECT information but VALIDATING IT with sources, to prove I am not full of shit the way you are right now.

Honestly, your bunk is more outdated and incorrect than mine and is seriously NOT helping the OP, and from the get-go you jumped in with a negative tone and biased, unsubstantiated OPINIONS, not FACTS.

Step back, look at what I proved and how I proved it, and READ THE PROOF (OMG, you SO ignored Microsoft's OWN words, come on!!!), then reassess your postings. Honestly, look at what I am saying and look at what you're spouting. There is a serious problem with what you're saying.
 

sulumordna
Oct 16, 2011
I have this computer, well, sort of. Mine is an ET1331-07w that had an Athlon II 250u.

I have upgraded the PSU to a Cooler Master GX 450, a Radeon HD 6850, and a Phenom II X3 720 BE. I have a 120mm fan as an intake fan on the side and a 92mm exhaust fan. Everything else is stock.

It plays all my son's games fine (I have a 1366 x 768 monitor on it).
I initially tried replacing the motherboard with an extra AM3 board I had, with 2GB of DDR3 RAM, and using the 720 BE, but it was terribly slow with only 2 gigs of RAM. I had an Athlon II 250 3GHz CPU in it for a while, and it would lag at times in certain games he played, but now it plays with virtually no lag (the only lag is HDD load times and internet lag).

I'm not sure how long this setup will last, as this mobo is only designed for 89W CPUs and I'm using a 95W CPU on it. All I know is that a triple-core Phenom II CPU WILL work on the motherboard and makes a noticeable speed increase.