I hate what the CPU is becoming!

September 13, 2010 1:52:32 AM

Hello everybody ::- ). My first post here. I'm going to try and make it a meaningful one. Please take into consideration that I am talking purely from an ENTHUSIAST's standpoint.

I hate the CPU's future. I hate the fact that pointless transistors and technology are being shoved down our throats. I never had and (hopefully) never will have integrated graphics in my computer. A discrete GPU cannot be replaced by these pathetic Sandy Bridge or Bulldozer architectures (at least not for 10 years). Ok, so they cater to the mainstream. I got no problem with that. But WHY are we, the people who buy their BEST products and, for sure, ensure a future for their top products, being treated like this?

I want to upgrade my CPU next year, but I hate the choices: Sandy Bridge has a stupid integrated GPU in all versions, while AMD is going for the exact same crap. So I'm going to shove my money into something I will NEVER USE. Annoying!

If at least they had thought to bring hybrid GPU technology to the desktop! But did they? Of course not. *sigh*. I would understand the usefulness of an integrated GPU if, in normal OS mode (no games), the discrete GPU were COMPLETELY SHUT DOWN and the system fell back on the CPU's integrated graphics.

I agree that it's a good package... for the masses. But not for us who still buy discrete GPUs. Discrete GPUs will not be replaced by any CPU GPU mongrel anytime soon. It's impossible (for the immediate future) due to the huge advantages a dedicated card has over some nickel-sized partition on a CPU.

I WANT CLEAN CPUs!


September 13, 2010 1:56:53 AM

First off, Bulldozer doesn't have a graphics core; AMD's Fusion is a completely different product called Llano. Second, only mainstream Sandy Bridge has the graphics; the high-end socket 2011 parts do not.
September 13, 2010 1:58:47 AM

AMD is not just going with a hybrid CPU lineup; they also have a new architecture for enthusiasts, Bulldozer, as you said. Also, the Intel 990X is coming soon, along with other K-series (unlocked multiplier) CPUs. You are not, and will not be, limited to CPUs with integrated GPUs.
September 13, 2010 2:20:39 AM

loneninja: Didn't know that Bulldozer will not have a graphics core. I thought both would.

Eugenester: as for Sandy Bridge, the K-series DO have integrated graphics as far as I know; the chipset doesn't use them, but the transistors are there, on the die. Waste, waste, waste.
September 13, 2010 2:36:18 AM

Quote:

I got no problem with that. But WHY are we, the people who buy their BEST products and, for sure, ensure a future for their top products, being treated like this?


If you think about it, even though you say you hate the choices for your CPU upgrade, eventually and inevitably you will buy the CPU. Intel and AMD both know this. Don't think for a minute that these companies are a bunch of dummies in suits sitting around a conference table yakking about their products; they are smart and they are all about making money, and to them you are a "nobody". And even though you may think that you ensure their future by buying their product, it isn't like that at all: you are going to buy their product either way, and they know you depend on them, so you have no choice. Really, it's the other way around. So unless you are actually going to do something about it, like not buying your next CPU because it has integrated graphics and you'd be paying for something you aren't using (and convincing other people to do the same), and sticking with your old CPU, it's pretty much a pointless effort.
September 13, 2010 2:46:25 AM

No news there. I was merely expressing my dissatisfaction, not asking for a revolution. And since I learned that Bulldozer will not have the GPU in it, apparently, I can (and will) do something about it: not buy Intel.
September 13, 2010 2:49:56 AM

It's a baby step. It will suck for a bit, but once Fusion is done it will be a huge boost in speed. There are two types of math a CPU will do: integer and floating point. 1 + 2 / 3, etc. is integer math. The cores found in CPUs are very good at doing this. 1.23 + 2.34 / 3.5678 is floating-point math. Care to guess why GPUs run some programs better than CPUs (other than having many more simple cores)? The architecture found in them is better suited to this type of math than what a CPU has.

This is where fusion comes in. You leave the Int cores on the CPU alone, and get rid of the FP core/cores. Fold a GPU into your CPU, and have it do all the FP math you come across. It won't be sitting there doing nothing; it will be running all your FP math faster than your CPU could ever hope to. Just because it CAN run graphics doesn't mean that's all it can do. This should provide a nice boost in speed, and not just for IGPs.
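For illustration, here is a minimal sketch of what "hand the bulk FP work to a GPU-style engine" looks like from the software side. This is ordinary CUDA running on a discrete card, not anything Fusion-specific, and the kernel and variable names are made up for the example:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread does one small piece of floating-point arithmetic;
// a wide, FP-oriented engine chews through millions of these in parallel.
__global__ void fp_kernel(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] * 2.34f + b[i] / 3.5678f;
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Prepare some host-side data.
    float* h_a = new float[n];
    float* h_b = new float[n];
    float* h_out = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = 1.23f; h_b[i] = (float)i; }

    // Copy to the GPU, run the FP work there, copy the result back.
    float *d_a, *d_b, *d_out;
    cudaMalloc((void**)&d_a, bytes);
    cudaMalloc((void**)&d_b, bytes);
    cudaMalloc((void**)&d_out, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    fp_kernel<<<(n + 255) / 256, 256>>>(d_a, d_b, d_out, n);
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

    printf("out[42] = %f\n", h_out[42]);

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_out);
    delete[] h_a; delete[] h_b; delete[] h_out;
    return 0;
}
```

The Fusion pitch described above is that the same kind of bulk FP work could be thrown at an on-die graphics engine instead of going across the PCIe bus to a card.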
September 13, 2010 3:44:30 AM

Exactly, that is the eventual goal. As Axonn points out, for many of us the first gen may bring little or no benefit. However, later on that GPU could turn into a great co-processor and a decent IGP if necessary. But if they don't start somewhere, they'll never get anywhere.
September 13, 2010 4:22:56 AM

I am familiar with integer/floating, all that stuff. I'm not debating that. Of course, for >90% of people the CPU/GPU union will pay off, especially at later stages (and hell, for most people even Sandy Bridge will rock compared to the crap Intel has given them so far).

But... when does the CPU/GPU mongrel get 2 GB of dedicated GDDR5 RAM? ::- ) When does it get a socket capable of pouring 250 W into it? Yeah, for Solitaire and HD movies it will be a cool thing, but for tri-monitor, super-quality 3D gaming? ::- D. In how many years? I can't even see a 3rd-generation Fusion product capable of doing that. Fusion is not enough. More is needed. Much more. And that is simply more than a CPU package can stretch to.

Let me put it this way: I think a good fusion would be moving the GPU *NEAR* the CPU, maybe even on-die, but with a huge amount of super-fast RAM near it. We'd need to be below 20 nm to be able to move huge GPUs on-die with the CPU, however. And I'm not even sure that will ever compete with discrete solutions.
September 13, 2010 6:57:01 AM

Quote:
I am familiar with integer/floating, all that stuff. I'm not debating that.


Not sure you understand it still. The point of Fusion isn't to get rid of the GPU. As EXT said, it's like a co-processor. Those of us who want great gaming will continue to use a GPU. And the "GPU/IGP" of the "CPU" won't be sitting there doing nothing. If you understand how things are hoped to work, you'll know what it will be doing.
September 13, 2010 7:15:35 AM

Moving the GPU onto the same die as the CPU, like AMD's Llano, is aimed more at casual users and laptops. Also, like others have pointed out, it will act as a sort of co-processor that should yield a nice speed boost. Besides that, there is always hybrid graphics to save power, which is very important in laptops. And think about it from a business sense: why make a processor without the graphics core on die when you can charge more for the same processor with the graphics core, and people will still buy it?
September 13, 2010 8:25:56 AM

Besides, the enthusiast market is tiny compared to the mainstream office and home market. Gaming may drive the high-end, performance market, but the real bread and butter for both Intel and AMD is the mass market in basic boxes.

You should be grateful. That mass market is what is keeping the costs relatively low for us.
September 13, 2010 9:33:54 AM

dipankar2007ind: that's exactly what I was complaining about, if you haven't been paying attention: I don't want pointless transistors (disabled) in my computer ::- ).

anonymousdude: that's the problem: it can't be the same price. Obviously those transistors cost /something/. It would be cheaper without that useless crap.

4745454b: ah, that I did not catch, indeed. A coprocessor, if used correctly by software, would be great indeed, no doubt about it! But today no software does that, and look how long they needed to go x64 and multi-threaded. *sigh* I'm afraid this is another thing that will be there for a decade before we can finally get something out of it.

Personally I think more than 4 threads is another complete waste, and proves how slowly software evolves. And it's not because they're stupid (the programmers) ::- D. I can tell you that because that's what I am. I've got almost 20 years of software coding under my belt. But it's just that most applications don't even need all that power. And the ones that do, well, it's not very easy to make something multi-core. Fortunately, multi-core-aware libraries have started to pop up, which makes things easier. But it took years! And even now, 8 cores => overkill & waste for most people (except the rendering / heavy computational crowd).
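To make the "libraries make it easier" point concrete, here is a minimal sketch of my own (not tied to any particular library mentioned in the thread), splitting an embarrassingly parallel job across cores with plain C++ std::async; the function and data are invented for the example. The hard part in real software is finding work that divides this cleanly.

```cpp
#include <cstdio>
#include <future>
#include <numeric>
#include <vector>

// Sum one half of the data per task; the runtime can schedule each task
// on its own core.
double partial_sum(const std::vector<double>& v, size_t lo, size_t hi) {
    return std::accumulate(v.begin() + lo, v.begin() + hi, 0.0);
}

int main() {
    std::vector<double> data(1000000, 1.5);
    const size_t mid = data.size() / 2;

    auto lower = std::async(std::launch::async, partial_sum, std::cref(data),
                            size_t(0), mid);
    auto upper = std::async(std::launch::async, partial_sum, std::cref(data),
                            mid, data.size());

    printf("total = %f\n", lower.get() + upper.get());
    return 0;
}
```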

So we'll see how long such a coprocessor takes to be used. Personally I'll be staying away from them as much as possible. If they come up with some use for them, like disabling the discrete GPU when in the OS (like nVidia Optimus does on laptops), I'll probably reconsider, since I code a shitload more than I game.
September 13, 2010 9:50:38 AM

All it would take is a CUDA-type library to use a modern embedded graphics core to generate a sizable increase in floating-point performance.
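For a sense of what such a library looks like in practice, here is a minimal sketch using Thrust, an existing C++ library layered on CUDA. It targets today's discrete GPUs rather than an embedded graphics core, and the sizes and values are arbitrary:

```cuda
#include <cstdio>
#include <thrust/device_vector.h>
#include <thrust/functional.h>
#include <thrust/transform.h>

// With a CUDA-style library, pushing a bulk floating-point operation onto
// whatever graphics engine the driver exposes is a couple of lines of host code.
int main() {
    const int n = 1 << 20;
    thrust::device_vector<float> x(n, 1.23f);
    thrust::device_vector<float> y(n, 2.34f);
    thrust::device_vector<float> z(n);

    // z[i] = x[i] * y[i], computed on the GPU.
    thrust::transform(x.begin(), x.end(), y.begin(), z.begin(),
                      thrust::multiplies<float>());

    printf("z[0] = %f\n", static_cast<float>(z[0]));
    return 0;
}
```

The point above is that the same kind of interface could, in principle, be pointed at an embedded graphics core without the application code changing much.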
September 13, 2010 11:48:27 AM

dipankar2007ind: PARTIALLY true with the mobos: I don't buy mobos with integrated graphics. Never did, never will. Sound, that's another story: you can't really avoid that, apparently *laugh*, but if I could, I would. I personally got a relatively decent Creative sound card, but lately I've heard onboard sound solutions have gotten decent, especially on the ASUS Rampage series.
September 13, 2010 12:07:17 PM

Basically we all know a single chip that can do everything is the future. Computers with the power of the most powerful stuff out now, in a package no bigger than a cell phone, will come in the future, but as another has said, it takes baby steps.
September 13, 2010 12:21:59 PM

Chicken and egg, then. Do you put the GPU onto the die, or wait for the library to be invented that will allow it to crunch FP? One has to come before the other. I get what you're saying, but it has to happen. Sure, it sucks for a while because we have to wait for this to happen and it's just disabled and drawing power/causing heat. But once they figure out how to make it work it will be heaven. I say let anyone do it; the eventual reward will be worth it.
September 13, 2010 12:50:32 PM

Quote:
But WHY are we, the people who buy their BEST products and, for sure, ensure a future for their top products, being treated like this?


Because they don't make money off us. It's the bottom 99% of people that Intel/AMD make their money off of. Let's face it, 90%+ of PCs use integrated GPUs; the discrete market is ever shrinking, thanks in no small part to the rise of console gaming. As such, it makes perfect sense to move the GPU onto the CPU die.

That's how capitalism works: the majority will always be catered to.
September 13, 2010 1:17:52 PM

Nobody complained there was wasted die space and money when the FPU co-processor was integrated into the main design. Or the cache.
September 13, 2010 2:13:40 PM

4745454b: completely true. I'd just have preferred Intel to give us an option, however: Sandy Bridge without a GPU, like AMD will hopefully do with Bulldozer.
September 13, 2010 2:45:32 PM

Axonn said:
4745454b: completely true. I'd just have preferred Intel to give us an option, however: Sandy Bridge without a GPU, like AMD will hopefully do with Bulldozer.

I figure that the prices will probably match up like they do with motherboards. Motherboards with integrated graphics are at the low end and, regardless of the IGP, are much less expensive than anything that doesn't have an IGP.
September 13, 2010 3:32:17 PM

Axonn, I don't see why they would do that. Why deprive yourself of a better FP engine? Besides, it would cost too much to make separate lines like that. I really feel like you believe it's a waste of die space, when the truth is that in the future it will be one of the best things to happen to the CPU since moving the L2 cache off the BSB and onto the die itself.
September 13, 2010 3:45:05 PM

My only concern with the IGP on the CPU die is that of memory access; I strongly suspect both SB and BD will be significantly affected by memory access times... I'm actually very interested in how all the communication between the CPU/IGP and RAM is carried out in both designs...
September 13, 2010 3:58:01 PM

Intel also has the top MARKETING brains ;;- ).

4745454b: not deprive. Just have an alternate lineup for the consumers. They got one for the servers, don't they? No GPU there ::- /. I want one of those released for a consumer socket.
September 13, 2010 5:50:41 PM

True ::- ). But let's not forget that there was a period when AMD completely humiliated & destroyed Intel (for a good 4-5 years), but they were idiots and did not know how to keep that advantage. A pity! Now they'll probably suck forever ::- D. Or who knows?
September 13, 2010 6:11:31 PM

I'm also not liking the integrated GPU. Not unless they can use it to do some other awesome things once a discrete card is added. I know it doesn't make sense, but if the on-die GPU parts could be used as a PhysX accelerator (or for whatever other physics engines finally start using it), that would be cool, or maybe it could somehow speed up the PCIe connections with discrete cards or something like that. If there's zero advantage to having it on die once you add your own GPU, it's a complete waste of space. Now, if they had a line of CPUs without a GPU for Sandy Bridge, maybe they could use that space to pack in another core or something? I don't know. I'd like to know if Ivy Bridge will be GPU-less.
September 13, 2010 6:37:43 PM

4745454b said:
This is where fusion comes in. You leave the Int cores on the CPU alone, and get rid of the FP core/cores. Fold a GPU into your CPU, and have it do all the FP math you come across.


Uh, no.

GPU floating point processing is generally fast but very high latency compared to CPU floating point. It's great for things like video processing where you can send data in large chunks and do something else while you wait for the response, but it sucks ass when you just want to calculate 1.3x9.716+3.1415926.

It also tends to be much lower precision than the CPU: AFAIR you can get up to 80-bit extended precision on a typical x86 CPU, vs 32-bit single precision as the fast path on a typical GPU.
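A quick host-side illustration of that precision gap, evaluating the same expression at the three widths involved (on most x86 compilers long double maps to the 80-bit x87 format, but that is platform-dependent):

```cpp
#include <cstdio>

int main() {
    // The same arithmetic at single, double and (on x86) 80-bit extended precision.
    float       f = 1.3f * 9.716f + 3.1415926f;  // the typical fast GPU path
    double      d = 1.3  * 9.716  + 3.1415926;   // standard CPU double precision
    long double e = 1.3L * 9.716L + 3.1415926L;  // x87 extended precision

    printf("float:       %.15f\n", (double)f);
    printf("double:      %.15f\n", d);
    printf("long double: %.15Lf\n", e);
    return 0;
}
```

The three results agree only in their leading digits; how much that matters depends entirely on the workload.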
September 13, 2010 6:47:52 PM

Quote:
Which period are you talking about? Intel always kicked AMD's butt since the dawn of computers.

I take it you left on a trip across the galaxy during the K8 days, before Intel came out with their Core architecture?
September 13, 2010 6:57:02 PM

Modest dual-cores will be sufficient for the next 5 years, I predict. The only thing is they'll change the PCI Express spec, so to run the latest games you'll have to buy a new mobo, and therefore a new CPU, etc.

I have already written myself a promise not to buy any more computer components for the next 5 years after my latest upgrade. I'm confident.
September 13, 2010 6:57:53 PM

enzo matrix said:
I take it you left on a trip across the galaxy during the K8 days, before Intel came out with their Core architecture?


From what I remember the AMD29000 was much faster than comparable Intel chips too. I think we were using 29000s in the 286 era when x86 wasn't even 32-bit yet.

Edit: actually, maybe Intel had the i860 around the same time? I forget.
September 13, 2010 7:26:48 PM

Until the K6, AMD simply made Intel clones, with a slight overclock in some models. In terms of actual differences in architecture, the K8 days are when AMD was supreme.
September 13, 2010 7:37:07 PM

Enzo Matrix: drop it. I smell fanboy ::- D. Don't waste your time.

gamerk316: exactly, but let's not all get into history lessons just because somebody, *ahem*, sees only in blue ::- >.
September 13, 2010 7:56:39 PM

There are also instruction sets, used for encryption and other routines, that most of us won't use until someone codes something interesting. Microsoft is making all sorts of hype about GPU acceleration in IE9; these are exciting times.
http://news.cnet.com/8301-13924_3-20016156-64.html
The IGP may not match discrete cards for playing games, but they may find uses for the extra parallel processing, for free. Surfing the net could become a whole new experience, like going from black-and-white TV to color, then to HD, now 3D.
There need to be two web layers: one for the smartphones/netbooks and the other for home entertainment broadband.

Quote:
Intel, on the other hand, is addressing acceleration from the hardware side. The chipmaker released a video Friday showing IE9 running on a Core i5 processor, claiming that "Internet Explorer 9 is hardware accelerated on any piece of graphics hardware that supports DirectX 9."

"The Intel Core i5 processor is calculating the movement of these images and then the built-in HD graphics is actually rendering these images on the screen," said Erik Lorhammer, Sandy Bridge graphics marketing manager, in the video.
Read more: http://news.cnet.com/8301-13924_3-20016156-64.html#ixzz...

http://www.youtube.com/watch?v=6KbTvJlwQR4
September 13, 2010 10:27:18 PM



edit: misread sorry
September 13, 2010 10:38:36 PM

He wasn't referring to Enzo as being a fanboy, I think.
September 13, 2010 10:44:21 PM

EXT64 said:
He wasn't referring to Enzo as being a fanboy, I think.

Oh ok. Then nevermind :na: 
September 13, 2010 10:44:43 PM

I really don't see the point here; companies can't cater to the superficial "needs" of 5% of the market when to the other 95% it's something pretty darn nice.

Besides, you'll still have LGA 2011 on the Intel side for SB, and something like it for Haswell and beyond due to server chips. "Clean" CPUs will probably be available for a pretty long while; they simply move more upmarket.

Saying this is like saying AMD shoves Eyefinity down your throat and is "wasting space with the DP port when they could opt for better cooling or $10 less", and Nvidia shoves CUDA and PhysX down your throat when they could "cut die space by removing CUDA and make it more power efficient".
September 13, 2010 10:46:07 PM

I wouldn't think a halfway-decent IGP on the CPU would be a waste, especially if all you are doing is browsing the web or something and your power-hungry discrete card is turned off. Plus, even during some intense gaming there could be a way for the game to use the IGP, such as for PhysX or other computational enhancements. Intel is using a high-speed ring bus to interconnect the CPU and GPU components, sharing the cache memory and some other tricks according to Anandtech, and I think AMD will be doing something similar.

The tech is too new to just write off completely for gaming - give it some time and see what evolves.
September 13, 2010 11:13:22 PM

If either company integrates a graphics processor on die with their newer high-end CPUs and keeps the TDP down, I will be very impressed.
September 13, 2010 11:15:20 PM

I agree with you. I was planning on upgrading from my stock HP when Sandy Bridge came out, but after much thought I decided to upgrade now, as I didn't like the idea of the GPU on the CPU.

I feel like it's wasting the true potential of Sandy, since some of the space is dedicated to the GPU. Makes me wonder just how much better the CPUs would be if they could use all of the die just for the CPU. (That's just my thought on all this; glad to see I'm not the only one.)

Although Sandy Bridge on a laptop with Nvidia's Optimus would make for an awesomely powerful gaming laptop that I could use for long periods of time and not worry about battery life. Just one more year (sigh).
September 13, 2010 11:28:49 PM

^ Mainstream Sandy's will be out in Q1, so you probably won't have to wait a full year for a lappy that you like.

IIRC the s2011 performance versions (without a GPU onboard) will be out 2H next year, so maybe 9 - 12 months or more from now. There will definitely be a 6-core version, maybe even 8-core or 10-core since Intel will be making a Xeon 10-core, 20-thread CPU according to their IDF presentation as reported by Anandtech. I'd bet the 10-core DT version goes for $1999 :p .
September 13, 2010 11:35:41 PM

fazers_on_stun said:
^ Mainstream Sandy's will be out in Q1, so you probably won't have to wait a full year for a lappy that you like.

IIRC the s2011 performance versions (without a GPU onboard) will be out 2H next year, so maybe 9 - 12 months or more from now. There will definitely be a 6-core version, maybe even 8-core or 10-core since Intel will be making a Xeon 10-core, 20-thread CPU according to their IDF presentation as reported by Anandtech. I'd bet the 10-core DT version goes for $1999 :p .

Over $1000 would be the first in a long time, wouldn't it? I really wish AMD would hurry up and bring some competition to the market.
September 13, 2010 11:39:38 PM

EXT64: exactly, I wasn't referring to him. But to certain people who seem to glorify Intel / nVidia / AMD when it is not the case. Enzo pointed out the obvious truth in AMD's history, a painful and sad truth that they were number 1 for a long time (painful and sad, for me at least, because they didn't manage to capitalize on that).

fazers_on_stun: unfortunately they aren't giving us Optimus on the PC yet. If the discrete GPU were turned off when in the OS, I would *TOTALLY LOVE* the integrated GPU on the CPU, as I am gaming much less than I am coding software, which is not very 3D intensive *laugh*.

Here's to hoping they do that.

Timtop: Your examples about Eyefinity / CUDA are not very relevant. CUDA is just a facet of 3D performance, while Eyefinity does not encumber the card as a GPU encumbers a CPU. As for the non-GPU Sandy Bridges, did you see what they want to sell there? 6 cores? *sigh*. 4 cores (preferably without HT) is the absolute top I would be willing to fork money for. Anything else is a waste of money.

I'm very picky when it comes to hardware. But if being like this will leave me stranded without a product to buy, that's my fault...

But /hopefully/ AMD will be earlier with Bulldozer (under pressure from Intel) so I can get a 4 core without a stinky GPU in it *grin*. Otherwise, I'll be forced to buy Sandy Bridge... I won't like it but it'll be the only choice for 6 months or more...
September 13, 2010 11:49:13 PM

Axonn said:
EXT64: exactly, I wasn't referring to him. But to certain people who seem to glorify Intel / nVidia / AMD when it is not the case. Enzo pointed out the obvious truth in AMD's history, a painful and sad truth that they were number 1 for a long time (painful and sad, for me at least, because they didn't manage to capitalize on that).

fazers_on_stun: unfortunately they aren't giving us Optimus on the PC yet. If the discrete GPU were turned off when in the OS, I would *TOTALLY LOVE* the integrated GPU on the CPU, as I am gaming much less than I am coding software, which is not very 3D intensive *laugh*.

Here's to hoping they do that.

Timtop: Your examples about Eyefinity / CUDA are not very relevant. CUDA is just a facet of 3D performance, while Eyefinity does not encumber the card as a GPU encumbers a CPU. As for the non-GPU Sandy Bridges, did you see what they want to sell there? 6 cores? *sigh*. 4 cores (preferably without HT) is the absolute top I would be willing to fork money for. Anything else is a waste of money.

I'm very picky when it comes to hardware. But if being like this will leave me stranded without a product to buy, that's my fault...

But /hopefully/ AMD will be earlier with Bulldozer (under pressure from Intel) so I can get a 4 core without a stinky GPU in it *grin*. Otherwise, I'll be forced to buy Sandy Bridge... I won't like it but it'll be the only choice for 6 months or more...

IIRC, CUDA is partially hardware-based also, and if it were strictly for gaming, there are plenty of parts you could cut from a GPU without compromising performance. But it's there for people who want to utilize the features.

So you're an "enthusiast" that's not willing to pay extra to buy "enthusiast" products, hmm....

Well, SB isn't really amazing so far, more a die shrink plus extras than a new architecture, so you could just get Nehalem and OC the crap out of it until you see something satisfying... Just saying.
September 14, 2010 12:04:46 AM

Timop: sorry for misspelling your name. And this Forum just won't let me edit my own posts. Keeps saying I have no right. *sigh*. Anyway...

I'm an enthusiast that is not willing to pay extra to buy *products which are not meant for him*. I am a software developer & a gamer. For me, more than 4 cores is overkill & waste. I don't do rendering or encoding. I want to invest 2000-2500 Euro in my next computer, solely for motherboard and what's on it, but I prefer going SSD RAID or something like that for my money, rather than 6 core. The HDD is a bottleneck, I'll invest more money there. I am an enthusiast but after 15 years of system building & career I know my needs very well ::- ).

oz73942: I hope they'll bring some sort of Optimus on PC as well. I'm kind of waiting on AMD for that, as for the time being, I would not buy nVidia as a GPU. I currently have an nVidia, but it's from the time when they rocked (7950 GT). Now, other than 460, they're really slimy. And they'll get even slimier when AMD pulls out the Southern Islands in a few weeks.
September 14, 2010 3:55:44 AM

Axonn said:
Timop: sorry for misspelling your name. And this Forum just won't let me edit my own posts. Keeps saying I have no right. *sigh*. Anyway...

I'm an enthusiast that is not willing to pay extra to buy *products which are not meant for him*. I am a software developer & a gamer. For me, more than 4 cores is overkill & waste. I don't do rendering or encoding. I want to invest 2000-2500 Euro in my next computer, solely for motherboard and what's on it, but I prefer going SSD RAID or something like that for my money, rather than 6 core. The HDD is a bottleneck, I'll invest more money there. I am an enthusiast but after 15 years of system building & career I know my needs very well ::- ).

oz73942: I hope they'll bring some sort of Optimus on PC as well. I'm kind of waiting on AMD for that, as for the time being, I would not buy nVidia as a GPU. I currently have an nVidia, but it's from the time when they rocked (7950 GT). Now, other than 460, they're really slimy. And they'll get even slimier when AMD pulls out the Southern Islands in a few weeks.


2000 to 2500 euros? That's going to be a pretty nice PC right there. I'm waiting till the 8-core Bulldozer CPU is released for desktops to upgrade. I'll just be buying a new motherboard along with the CPU. I don't plan on replacing my GTX 295 until there's a single-GPU card that beats it across the board.
September 14, 2010 4:22:53 AM

2000-2500 is A LOT... 2500 euros is basically 3200 American dollars. That's my budget plus 1700 more -_-. Jesus. If you really like quad-cores, the i5 750 is probably the only thing you need. Plus, say, the Mushkin Callisto (I just really like Mushkin) as an SSD. You really only need 60-120 GB for an SSD, and since you're doing RAID 0, 60 GB really is all you need; just get a Samsung F3 1 TB and it's good to go. O.O Want to send me some money?

Anyways, "slimier"... what does that mean? But if you're referring to the Cayman series from AMD, we have no way of knowing the performance just yet. However, I would look towards the 6850 if anything. Mainly because Cayman is like a Porsche, ahaha.
September 14, 2010 9:42:08 AM

aznshinobi: Slimier - adverbial form of slimy *i think*. English grammar is not my strong point ::- >. Not my native language ::- ).

yannifb: I don't buy computers often. Once every 3-4 years only. This one has been my longest running system. Dual Raptors (1st gen) in RAID, pretty top notch components for 2006.

To both of you: it's not such a huge budget. I calculated with 2000 euros and it won't buy science-fiction stuff. It is approximately what I invested in my current machine back in 2006, enough to keep it damn good for 2 years and then acceptable for another 2. But with that money I can barely fit 2 good-quality SSDs + a VaporX 5870 Radeon on a Maximus 3 Formula ASUS mobo. The problem is SSDs. They cost a lot! Other than that, a 2 TB hard drive, a good chassis & power supply, 6 gigs of RAM and an i5.

SLI is a waste in my opinion.

SSDs in RAID, I'm still curious about that. I didn't see benchmarks thoroughly covering that subject. Right now I've got Raptors in RAID, but I keep hearing RAID impairs fast access to small files, so maybe it's not such a good idea for the OS drive. It may compensate when large files are read, though...
September 14, 2010 12:19:06 PM

I think it would be good to wait on at least the GPU (hopefully that would at least drive prices down even if you didn't want a 6000 card). CPU is a harder choice. The current offerings are great and would last a long time and the replacements are still a decent way off. Also, wouldn't i5 7xx or i7 8xx be 4 or 8GB and i7 9xx be 6 or 12GB?
September 14, 2010 2:00:47 PM

Yes EXT64, that's right: socket 1156 ONLY runs dual-channel while 1366 runs triple-channel, and soon SB will run quad-channel. So 6 GB isn't the best memory choice for RAM speeds.