I've finally decided to sell off my AMD Opteron and get something new. I plan to play games and run OS X on the PC. I've got an 8800GT 512MB to game on.
I've got these components in mind and would love suggestions on which to buy:
Any other good board/chipset that would fit my needs is also welcome.
2x2GB G.Skill/Transcend/Corsair Value Select/Kingston DDR2 800
1x2GB G.Skill/Transcend/Corsair Value Select/Kingston DDR2 800
I plan to buy these parts within a week.
I need something stable enough to handle Mac OS X Leopard.
You might ask why I won't just buy an Apple directly: it's because they're too expensive for the hardware they offer, and you can't play games on them.
I won't be upgrading these components in the next two years, so they need to be good enough to last me that long. The graphics card will be dumped if something better than my 8800GT turns up from AMD/Nvidia.
For gaming the E8400 is perfecto mundo. The bus and cache are awesome, and it overclocks very well. But for video editing/rendering, go with the Q6600. I am running the E8400 and LOVE IT! I'm able to get a stable overclock of 3.6GHz on stock voltages.
So it's the E8400 then.
Which chipset would suit me best: P35, G965, or Nvidia? I need good frame rates in Crysis!
Please do reply!
I'll tell you this much, even if the Intel chipsets OC higher and are generally considered better: you will not do great in Crysis with a single 8800GT. I'm not saying it isn't very playable once tweaked, just that you'll be trading detail levels against playable fps. A fairly inexpensive 750i mobo like the P7N SLI Platinum, a 3.4GHz Q6600 or E8400, and a second 8800GT will spank any single GPU in Crysis regardless of how high you clock it or the CPU. There are parts of that game that bring a single GPU to its knees; even dual cards struggle there, but they still provide much higher fps.
This could change very soon with new GPUs coming out. But unless you go 9800GX2 now, your best Crysis performance will currently be SLI on an SLI chipset. All that said, P35 and X38 are very attractive overclocking chipsets. If you will not be adding another 8800GT, then by all means go for one of those.
Here's actual gameplay I tested with a Q6600 @ 3.0GHz, single 8800GT vs dual 8800GT, in Paradise Lost (a very GPU-demanding ice level) at 16x10, all high details, 0xAA/16xAF.
And a merged screenie, showing 16 fps for the single 8800GT and 29 fps for dual 8800GT, this time with 2xAA and again all high details, 16x10 and 16xAF. This happens to be the savepoint I used for the Fraps run above; it really puts a hurting on the GPU.
(edit: Merged and compressed to fit on ImageShack.)
Do you turn off your anti-spyware or antivirus?
Do you listen to music while you game?
Might you burn a CD while gaming?
A quad has almost two free cores, and Intel charges less per core for four cores than for two.
So it's a quad at 3.6GHz vs a dual at 3.8-4GHz.
Well, for pure gaming, yes, the dual is better, but who really does pure gaming?
How in the world do you determine the dual is better for pure gaming? Explain that to me. Forget the multitasking part of the argument; let's focus on pure gaming. If we could exit as many processes as possible, let's talk gaming. Review sites sure are not running an antivirus while they test. Don't argue that the quad isn't any better, prove the dual is better like you said (two different things, and I generally agree with the first one). But first, read over the four links I provided in that other thread, one of which shows a 3.6GHz quad outgaming a 4.2GHz dual.
Come back with your reasons for stating that dual is better for gaming. Give some proof that a dual core can offer a better gaming experience than a quad core. For every link you find showing duals winning a low-res scaling test, I could probably provide one showing a quad winning. But still, who games at 800x600 or 1024x768 with no FSAA, and lets those tests determine what makes for a better gaming CPU? So let's look at the actual settings we would game at when determining whether a CPU is really better for gaming. You'll find little difference, and if anything, at high clocks the quad will lead more often than not.
Look at the Xbit link. In single-threaded FEAR, a 3.6GHz Q6600 pulls ahead of a 3.85GHz dual. In the dual-threaded games it sometimes pulled ahead by more. In the quad-threaded games, it stomped the dual. Now, these were your typical low-res tests meant to scale the CPUs, so in reality there is not much of an advantage once you crank resolution and eye candy. But still, if some low-res tests are shown to put a dual in the lead, why not point out when the quads can pull ahead just the same?
What makes for better gaming: 200 fps vs 210 fps at 800x600 with no FSAA, or the smoothest acceptable gameplay at the highest possible settings you can crank things to?
Anyway, please understand I am not jumping on you. I want an open discussion and exploration of dual vs quad. Or better yet, I'd like to see the discussion come to an end, as it's been had over and over and over. I for one must admit I get somewhat tired of people repeatedly saying dual is better without reading the links I provide that show otherwise, or coming up with any counter-proof to back up their recommendations. Again, I am not referring to you specifically, just the comments that come up from whoever, over and over and over.
As you can read in that other thread, it is my position that in current games neither the quad nor the dual has any real gaming advantage over the other. Neither will provide a better gaming experience once we talk overclocking and, more specifically, the settings we actually game at. A 3.6GHz quad can easily hang with a 4.0GHz dual, trading blows, winning as often as it loses. I totally do not agree with the statement that a dual core is better for gaming. edit: I likewise won't argue that a quad core is better for gaming.
OK, now which motherboard do I go for?
Tell me some of the best P35-based or 965-based boards.
I also need to run Mac OS X Leopard, so I'd prefer Intel/Asus boards because they happen to be a bit more stable.
Do P35 chipset based boards have support for SLI, so that I can connect another 8800GT to the existing one in SLI to gain more performance?