Resse

Distinguished
Nov 2, 2011
7
0
18,510
Greetings,

I'm thinking about upgrading my system, but I'm not sure if it really makes sense in certain areas. Right now my system specs are:

Intel Core 2 Quad q9650 (3GHz)
Asus P5 Mainboard series (intel p65 chipset - socket 775)
XFX Geforce GTX 285
8 GB DDR memory (I think it's DDR3-1066)
Auzentech X-Fi Prelude soundcard
and some more stuff that doesn't really have any influence on my decision



Right now I'm thinking about adding a GeForce GTX 570 to my system to get it ready for DirectX 11. I had a closer look at Windows 8, and the new features don't really impress me. So I think I could be set for the next 3-4 years with this card.

My question now is: would there be a bottleneck in my system, since the motherboard and CPU in particular aren't really up to date? I only want to upgrade them if it really makes sense.

In case of an upgrade I thought about buying:
Intel® Core™ i7-2600K processor (3.4 GHz)
GIGABYTE GA-Z68XP-UD4 Mainboard
GIGABYTE GV-N570OC-13I graphics card

Any thoughts or advice?
 

dalmvern

Distinguished
Jun 15, 2011
673
0
19,060
While your mobo and processor are a bit outdated, it's still a quad-core, so if you overclock it I don't think you will see any bottlenecking with the GPU.

For right now, if you get a 570 you will be fine.

On the other hand, upgrading your processor will benefit you greatly, and you will see a huge performance boost.

If I were you, I would probably upgrade the core components now and wait until the new series of GPUs comes out and buy one of those. Then you will essentially have a top-of-the-line build with futureproofing and a GPU that will last you quite a long time.
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
I already thought about getting a new GPU from the upcoming nVidia 600 series. But as far as I can see, they aren't that close to release. If I wait for that GPU I'll probably have to wait up to 6 months, and even then the price will be quite high :pfff:

Right now, with the GTX 285 I basically have the best DirectX 9 card there is; I won't get better results or more performance. DX10 was skipped by most developers as far as I know. That leaves DX11, and I don't believe that graphics in games will improve greatly within the next 2 years. We all know that most devs are on the console train, and we won't see the new PlayStation or Xbox before 2013.

Therefore I think I could be pretty safe with the GTX 570.

Any other thoughts on that? Is there anything on the horizon where the current generation of GPUs would scream in agony?
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
What do you mean by the 2011 six-cores? The only six-core CPUs available right now (as far as I can see) are the Intel® Core™ i7-980 and 990X on socket 1366 mainboards, and they are a bit expensive in my opinion at €519 and €899.

Do you mean one of those CPUs? I think in a few months they could have a more reasonable price.

Btw. I don't want to use AMD CPUs ;)
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
Btw. I just saw a few cards from "Point of View" with the GeForce 570 chip but with 2560 MB of memory (the standard card has 1280 MB). How much impact could the additional memory have on performance?
 

dalmvern

Distinguished
Jun 15, 2011
673
0
19,060



No, he is talking about Intel's Sandy Bridge-E series that will be coming out... it's socket LGA 2011.
 

dalmvern

Distinguished
Jun 15, 2011
673
0
19,060


It depends on a few things. If you are running multiple monitors or at a resolution higher than 1920x1200, then the increased VRAM will help. As far as gaming goes, you probably won't see much of a performance difference between the two; 1 GB is sufficient for all current top-end games.
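To put rough numbers on the resolution question (just a back-of-the-envelope sketch of my own, not a benchmark from this thread): the raw color framebuffer costs width × height × 4 bytes per surface, which stays tiny even when triple-buffered. That's why resolution alone doesn't demand a 2.5 GB card; textures are what actually fill VRAM.

```python
# Back-of-the-envelope framebuffer estimate (illustrative sketch only):
# a 32-bit color buffer costs width * height * 4 bytes per surface.
# Real game VRAM usage is dominated by textures, so this is a lower bound.

def framebuffer_mb(width, height, bytes_per_pixel=4, surfaces=3):
    """Memory for a triple-buffered 32-bit color framebuffer, in MB."""
    return width * height * bytes_per_pixel * surfaces / (1024 ** 2)

for w, h in [(1920, 1080), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB")
```

Even at 2560x1600, three full color buffers come to under 50 MB, so the extra memory on the 2560 MB cards mostly pays off for high-resolution textures and multi-monitor setups.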
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
Well, I have a FullHD 40" TV (1920x1080) and a 25.5" (1920x1200) display, so the memory could give me a slight bonus. Usually I only use my 25.5" display for gaming, because the TV only has a standard VGA port (or HDMI), and with the VGA connection the picture quality during gaming is not very good.
 

dalmvern

Distinguished
Jun 15, 2011
673
0
19,060
Well, if you aren't *using* the HDTV, then it won't matter. I like to game on one screen and watch movies on my second, both running off a 2GB card (Radeon 6950), so the extra VRAM is a benefit for me.

Also, on a side note, have you tried using a DVI-to-HDMI adapter or a DisplayPort-to-HDMI adapter from your computer to your TV? That will help the quality, but 1920x1080 on a 40" screen will be a bit grainy anyway.
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
Well, I didn't say I don't use my TV. :p

I just don't use it for PC gaming. I would probably have problems with the size of the screen anyway....

It's great for movies and I don't have any problems with the quality. I haven't tried a DVI-to-HDMI adapter yet (until recently I didn't know I could get something like that), but some of the new graphics cards have HDMI ports anyway, so I won't bother with the adapter for now.

Right now I'm leaning towards the card with the 2560MB Memory. I'll have a closer look at them again.
 

dalmvern

Distinguished
Jun 15, 2011
673
0
19,060
Gotcha. I actually prefer DVI-to-HDMI over a full HDMI cable, mostly because I don't have to mess with any settings to make sure it doesn't transfer sound as well as video... I hate the speakers integrated into my monitor; I prefer my Dolby 7.1.
:D
 

Resse

Distinguished
Nov 2, 2011
7
0
18,510
Hehe, I can relate to that. I have a proper 5.1 Dolby Surround system at home with a digital decoder station from Teufel (www.Teufel.de - only in German). It has enough power to wake all of my neighbours ;) and I only watch movies using it.