Hello. First time poster on this board, but over the years I've come to trust this site for computer hardware comparisons, so I hope someone can help me with this dilemma. My desktop is now about 6 years old and I'm in dire need of a new one. I've upgraded the CPU, RAM, and GPU before, but it's time to relegate this computer to backup duty and buy a new one.

Now here's the dilemma: when would be the best time this year to buy a desktop, mostly for gaming? I'm obviously looking for the best all-around price, and I've asked people about this before. Some say the end-of-year holidays, from Black Friday through Christmas, while others say the end of certain fiscal quarters, when there might be new hardware releases or a push to drive up sales numbers before the quarter closes.

I'm not looking to build a computer myself, so I've been planning to buy from CyberPower PC, which has competitive prices and is close enough for me to pick up my order. Anyway, any suggestions on when to buy this year? Thank you very much for any help.
Hmm... thanks for the advice. I'm actually thinking of getting a comp before StarCraft 2 comes out at the end of July. I'm looking at an Asus X58 mobo, an i7 930 CPU, and an ATI 5850 GPU. Not really interested in SSDs (solid state drives, right?) until they get way cheaper and have stood the test of time.
Alvin, I'm not sure what you mean by your reply about a "balanced build." Are you saying it'd be a waste because the GPU wouldn't utilize all of the CPU's processing power? I do plan to CrossFire 2x 5850 when the card drops in price in the future.
All I'm saying is that fps/detail at a given resolution is almost completely governed by the GPU ... with CPU clock rate a distant second, and with anything over 2 cores being mostly wasted.
I also meant to mention that more than 4GB of RAM is completely wasted ... for games.
I can't say (exactly) whether an AMD 440 (OC'd) with a 5870 would do any better than a 930 with a 5850, but it would likely come close (fps/detail) in most games and, with a second 5870 (in 18 months) ... your expenditures, over 5 years, would be way less, while your gaming experience would be similar (at first) and WAY better after CrossFire .... and, by that date ... two 5850s might just be laughable anyway.
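Just to put rough numbers on the "cheaper CPU + bigger GPU now, second GPU later" argument, here's a quick Python sketch. All prices are made-up ballpark figures (circa-2010 street prices), not real quotes, so treat the totals as illustrative only:

```python
# Toy 5-year cost comparison for the two upgrade paths discussed above.
# Every price here is a hypothetical ballpark figure, NOT a real quote.

budget_build = {
    "AMD Athlon II X4 440 (OC'd)": 100,
    "AM3 motherboard": 90,
    "Radeon HD 5870": 400,
    "2nd HD 5870 in ~18 months (after price drop)": 250,
}

enthusiast_build = {
    "Core i7 930": 290,
    "X58 motherboard": 220,
    "Radeon HD 5850": 300,
    "2nd HD 5850 later (after price drop)": 180,
}

def total(parts):
    """Sum the cost of all parts in a build."""
    return sum(parts.values())

print("Budget path:     $", total(budget_build))       # $ 840
print("Enthusiast path: $", total(enthusiast_build))   # $ 990
print("Difference:      $", total(enthusiast_build) - total(budget_build))
```

The point isn't the exact dollar figures (they'll be wrong the day after any price cut), it's that deferring the second GPU purchase lets the inevitable price drop work in your favor either way.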
... I am not much for debating this as I am so cheap that an Xbox or PS3 would satisfy my basic boredom-fighting requirement and be easier on my wallet.
Also ... graphics/render rigs tend to be nVidia, and I don't want mine to be really loud, so I'm going with a down-clocked 9800GT-EE (low power, noise, and heat).
Lots of happy gamers will opine ... just give them time.
I'm no computer whiz, but supply/demand is a pretty simple concept. High supply with low demand = lower prices; low supply with high demand = higher prices. With that said, when demand goes up, so does production of new models. New models may mean lower demand for older products, but not always. Sometimes a new product flops after production of the older, still-working models has already stopped.
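The supply/demand point above can be shown with a toy formula — price scaling with demand and inversely with supply. The model and the numbers are entirely made up, just to illustrate the two cases:

```python
# Crude illustration of the supply/demand point above.
# This is a toy model with invented numbers, not real market math.

def price(base, supply, demand):
    """Price rises with demand and falls with supply."""
    return base * (demand / supply)

# High supply, low demand -> price drops below the base price
print(price(300, supply=1000, demand=500))   # 150.0

# Low supply, high demand -> price rises above the base price
print(price(300, supply=500, demand=1000))   # 600.0
```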
The moral of the story? The market is a tricky hooker. You never know what she's going to do.
funny you should mention "toilet", in relation to the economy ... because ...
No matter HOW BAD the world economy gets, folks are STILL gonna need toilet-paper and tooth-paste, etc. ... Global population growth ensures demand ... ensures a "bounce" from "worry-inspired" market "dips" (however severe or protracted).
All those toilet paper and toothpaste companies have been putting off new hardware and OS upgrades because of the economy and Vista .... they will ALL be replacing half their desktops between Sept 1st and Feb 29th ... place your order before September ... August 15th will give the best pricing and "forward vision".
I know this sounds really weird, and far fetched, at the moment, but ...
In the end, OS and apps developers (like MS & Adobe) will send their "intermediate source" (secure/semi-compiled) code to Intel, where it will be "hardware compiled" into a custom-optimized "system on a chip" where each sw module and subroutine is uniquely hard-coded into custom cores and pipelines, complete with scaled caching and dedicated MLC "render-zones" (SSD zones).
All of this will be packaged into an SDHC-sized (postage-stamp) package which will also contain general local SSD storage and one-way data throughput, pumped on and off the module via dedicated read vs. write caches and lanes. It will have three copper connector lines for +V, earth ground, and "managed static ground" (a new power management feature) ... and those will be straddled by two light pipes, one for read and the other for write throughput.
These "app cards" will plug into an INTEL spec "light-bus" on the new 8 inch deep mini-rack "case standard", which will quickly evolve to the 2 inch deep micro-rack standard.
So ... you will purchase (as a consumer) a 1U (rack-space) or 2U space panel that has as many "app slots" as you need (can afford) ... It looks a lot like a card reader with rows of vertical slots ... consumer models will be able to load up to 96 (major) apps.
There will also be "generic compute modules" (discrete and/or integrated) that allow lesser (generic) apps and utilities to be run in a standard 24-core generic environment (almost like Core i7 + RAM).
Security dongle is built in and counterfeiting/piracy will be "dead", as far as OS and "Super-Apps" go.
= I just got back from the year 2018 ... I had less than two hours and I went straight online and stayed there ... did all the research I could, before my "continuity" reached "BINGO" =
I only had two hours so, naturally, I went for the 45-minute "Intel Virtual Roadmap" tour first ... so my view was distorted ... but it was clear that the micro-rack was an open standard and that the licensing of various patents was more the issue than who was doing the fab. The integration of dissimilar dies within a common multilayer substrate was IBM's main contribution ... TI were the ones who provided the "super local" DSP bridges, and SanDisk came up with "scalable SLC zones", etc ... Basically ... different fabricators bid on open contracts for various "specs", which were kludged up from an open-source toolbox of "API"-like plug-ins.
In a way, you might say that the programmers pick and choose from "logic engine architecture" toolsets, and ... that all fits within a "process flow-chart" design approach.
But ... physically ... there are no vast central resources (RAM, etc.) ... it's all like shops and strip malls along a branching superhighway ... I got one "search hit" that described it like a human circulatory system ... "semi-fractal/super-local", and everything (all data) moves in one direction (but there are many loops along the way).
Like I said ... most apps that used this architecture were more expensive, but compute compatibility and performance benches were "absolutely guaranteed" ... So you could buy five classes (speeds) of CS9, for instance ... like SDHC cards ... "Class 10" is faster.