
Core2Duo this year or skip & wait for QuadCore?

Tags:
  • CPUs
  • Quad Core
  • Product
August 28, 2006 2:12:11 AM

I'm a post-pro VFX student and I've kind of grown tired of my P4 640 at home.
I work a lot with Maya and After Effects and other very high-end programs, and every time I render something with nice hot summer weather outside, my Prescott inside goes right up to 71°C and sometimes even throttles... (probably thanks to Hyper-Threading, Maya uses the CPU to the absolute max).
And all this with a good Zalman cooler I bought... but never mind, on to my topic:


So I was fooled by NetBurst propaganda and now I'm a victim of the Conroe hype, and would love to buy an E6600 sometime before Christmas. But I've heard numerous rumors about quad-core-ready boards coming out this year too, and now I wonder if I should keep my Prescott over the winter (warm and cozy) and just skip the whole "dual core thing" and upgrade to quad? Or will there be boards with two 775 sockets for C2D? Basically, will I be able to just "buy another E6600" and stick it in a suitable board?

Would be so awesome to finally be able to afford such a quad-core monster for Maya...
Any information and help is appreciated!


August 28, 2006 2:18:34 AM

Quad core will be part of the Extreme Edition, so expect to spend over a grand for one.

Dual core is fine for 99% of the applications out there. Maybe 1% would even utilize 4 processors.

It's never a great idea to wait for technology. If you need to upgrade, then upgrade. If you wanna wait, then wait because you don't need to upgrade; don't wait because the next best thing isn't out yet. That's dumb.
August 28, 2006 2:39:01 AM

I never was one of those "let's wait for the next generation" types regarding upgrades, but the decision of which motherboard to choose for a C2D is difficult for me. If there will be boards supporting Kentsfield, or some type of dual-CPU Core2Duo board, then I would wait for that board... I'd love to just add another E6600 in a year or so and then effectively have quad cores.
Anyone know anything about what Kentsfield-ready / dual-CPU boards are coming out this year, if any?
August 28, 2006 2:39:59 AM

Unless you're absolutely fed up with/sick of the 'slow' speeds you get with your 640, there's no need to upgrade right now just to blow the cash, other than 'I want one.' Prescotts do run hot; the idle temp on one I recently came in contact with was in the low-to-mid 60s °C.

Two-socket boards probably aren't going to be a better option than a single quad core (if there are any; there usually aren't many dual-processor boards in the desktop market), and I wouldn't count on being able to plunk two C2Ds into a board. If I were you, I'd wait. Prices are guaranteed to drop after a while anyway. Not that the Conroe is a bad buy at this point in time. If you have to upgrade tomorrow, go with the E6300/E6400 & save yourself some cash. The E66/67s aren't a very good performance:price in comparison. I'm a tightwad though. :wink:

@xxsk8er101xx -- Extreme edition quad cores? You must be thinking of the AMD 4x4 architecture. Next year (possibly late this year, hard to tell launches with Intel) Intel is supposedly going to be releasing Kentsfield, basically a double Conroe chip (read: quad core). Then next year AMD will be releasing the native K8L quad core as a mainstream product line.

Just my .02.
August 28, 2006 2:47:13 AM

Quote:
Two-socket boards probably aren't going to be a better option than a single quad core (if there are any; there usually aren't many dual-processor boards in the desktop market), and I wouldn't count on being able to plunk two C2Ds into a board. If I were you, I'd wait. Prices are guaranteed to drop after a while anyway. Not that the Conroe is a bad buy at this point in time. If you have to upgrade tomorrow, go with the E6300/E6400 & save yourself some cash. The E66/67s aren't a very good performance:price in comparison. I'm a tightwad though. :wink:

@xxsk8er101xx -- Extreme edition quad cores? You must be thinking of the AMD 4x4 architecture. Next year (possibly late this year, hard to tell launches with Intel) Intel is supposedly going to be releasing Kentsfield, basically a double Conroe chip (read: quad core). Then next year AMD will be releasing the native K8L quad core as a mainstream product line.

Just my .02.


The 2-socket boards are AMD's 4x4; there isn't a C2D two-socket board, and there probably won't be, since Intel doesn't have a HyperTransport link.

The Kentsfield is planned to be in Intel's line-up as an Extreme Edition chip.

K8L is a server architecture, so don't plan on it at the desktop level until probably 2008.
August 28, 2006 2:53:13 AM

Thanks for the info, Cube!

My 3.2GHz 640 isn't that bad... it's just not made to render 1920x1080 images in Maya... and that's what I'm doing now and getting tired of it. But even with an E6600 it would "only" be twice as fast... which is sadly still not in the area of "smooth working speeds".
Guess I have to render on our renderfarm, even the test renders...

For someone like me, all that "wow, new CPU generation is 25% faster" hype is just laughable... to really work in a realtime environment on 2K footage I think I'd need at least 8 cores, 8GB RAM or something...

Back to the point... I think I'll wait at least till early next year... stupid Conroe hype has got me... I was ready to buy one in a couple of weeks... just to show my P4 CPU I don't like him anymore ;) 
August 28, 2006 2:55:57 AM

Oops, forgot:

http://www.xbitlabs.com/news/cpu/display/20060817124626.html
Quote:
The new processor will cost $999 in 1000-unit quantities and is likely to substitute the already announced officially 3.20GHz Core 2 Extreme chip with two processing engines. Even though the reasons behind such move are unclear, the transition of “extreme” processor to a multi-core design should emphasize the company’s plan to shift the attention of end-users to the number of cores, not clock-speed.



Quote:
Intel Kentsfield is expected to be drop-in compatible with some of the Intel 975X-based infrastructure that supports Intel Core 2 Duo and Core 2 Extreme processors.
August 28, 2006 3:02:02 AM

Thanks for all the clearing up!
So no two sockets from Intel; no K8L for me, 'cause it's not out and will probably be server market; and Kentsfield will be a $1000 EE, so that rules itself out... guess I have to go with the E6400 or E6600 for end of 2006/2007 then. Maybe this will be the CPU I'll finally be playing Duke Nukem Forever on when I'm not working ;) 

Man, I played D3D on a P90...after that 160, MMX233, P2 350, P3 866, P4 2,53, P4 3,2 and Duke still is in neverland...
August 28, 2006 3:08:12 AM

Ehh, if you want 2 x 2-core processors, look at Woodcrest: the Xeon 51x0 series. Those are the better-binned Core 2s used in servers, and I'm sure you can find a 2-socket server board for two Woodcrests... and possibly pop in two quad cores later for an 8-core rig. Maya would be so happy...
August 28, 2006 3:13:07 AM

With everybody around here upgrading, maybe you should start a "Sticky" for people to donate their old systems for you to make your own farm. :) 

I'm sure DukeNukemForever will make it out before Intel releases a 32-core. :lol: 


Cube3601 -> I re-read my reply to you and it seemed a little "dry". Sorry, that was not my intent.
August 28, 2006 3:18:09 AM

Might as well, since Conroes are still overpriced.
August 28, 2006 3:22:25 AM

Quote:
Classic -- I know one of the level designers on the DNF project, I haven't heard from him in 5 years, but he is still working on DNF.

Jack


:D 
Think DNF stands for Do Not Finish?

Reminds me of the old "I invented a perpetual motion machine. Ironically, I can't stop working on it."

Think it will be out before the "Very old computer working?" thread ends?
August 28, 2006 6:11:23 AM

Intel's Bad Axe 2 will be coming out in the October time frame. It will support the Kentsfield processor, with additional overclocking enhancements compared to the original Bad Axe motherboard.

The original Bad Axe will also support the Kentsfield processor, as seen by what Coolaler has achieved with it over at www.xtremesystems.org.

The board will let you buy a low-end Conroe E6300 and get the improved computing you are looking for until Kentsfield is released in Q4 2006. Then you can decide whether to buy one and reap the rewards of a fast, cool-running processor, or just keep running your old processor in it.
August 28, 2006 6:18:44 AM

Well then, get one of the new workstation/server boards that use the new dual-core Xeon 5000-series processors. The high-end ones use a 1333 FSB, run at 3.0GHz, and are cheaper than the X6800 Conroe. Sure, you will pay more for the server board, but you can buy one processor and hold off until you need to populate the second socket on the motherboard. You will also be paying more for the FB-DIMM memory, but hey, it is new technology.
August 28, 2006 9:06:39 AM

Quote:
I'm a post-pro VFX student and I've kind of grown tired of my P4 640 at home.
I work a lot with Maya and After Effects and other very high-end programs, and every time I render something with nice hot summer weather outside, my Prescott inside goes right up to 71°C and sometimes even throttles... (probably thanks to Hyper-Threading, Maya uses the CPU to the absolute max).
And all this with a good Zalman cooler I bought... but never mind, on to my topic:


So I was fooled by NetBurst propaganda and now I'm a victim of the Conroe hype, and would love to buy an E6600 sometime before Christmas. But I've heard numerous rumors about quad-core-ready boards coming out this year too, and now I wonder if I should keep my Prescott over the winter (warm and cozy) and just skip the whole "dual core thing" and upgrade to quad? Or will there be boards with two 775 sockets for C2D? Basically, will I be able to just "buy another E6600" and stick it in a suitable board?

Would be so awesome to finally be able to afford such a quad-core monster for Maya...
Any information and help is appreciated!


I built my rig (specs in my sig) to do just about the same thing you do, except I use Lightwave instead of Maya. Love my rig; much faster than my old P4 560J!
August 28, 2006 9:10:54 AM

Quote:
Dual core is fine for 99% of the applications out there. Maybe 1% would even utilize 4 processors.


If an app is programmed for dual core, then it is likely to be programmed for multicore and therefore will utilise 4 or more CPUs.

Whether they are required or not is another argument entirely, and I agree - to more than 99% of users, it simply won't be an option yet.
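The claim above (dual-core-aware apps tend to scale to more cores) comes down to how the work is divided. A renderer written for "however many cores there are", rather than hard-coded for two, splits the same job list differently on a dual core vs. a quad core with no code changes. A minimal modern sketch of that idea (the frame numbers and `split_frames` helper are hypothetical, just for illustration):

```python
def split_frames(frames, workers):
    """Deal frames out round-robin, one chunk per worker/core."""
    return [frames[w::workers] for w in range(workers)]

# 8 frames to render; the worker count is the only thing that changes.
frames = list(range(1, 9))
print(split_frames(frames, 2))  # dual core: 2 chunks of 4 frames
print(split_frames(frames, 4))  # quad core: 4 chunks of 2 frames
```

In practice each chunk would go to its own process or thread (e.g. sized from the reported core count), which is why an app structured this way picks up 4 cores for free.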
August 28, 2006 12:27:16 PM

Thanks for the info about the Bad Axe 2 board, pausert20!
Sounds like a good option.
What I really want is a good board that I could continue to use with the Kentsfield when it someday is in the $400 range.
If I buy a cheap little 6300 or 6400 I would definitely upgrade sometime, probably at the end of 2007.
All my other boards recently (RAM included) I've had to leave behind with every CPU upgrade, and it sucks...
August 28, 2006 12:52:16 PM

Quote:
K8L is a server architecture, so don't plan on it at the desktop level until probably 2008.
You say that like Opteron and Athlon are really that different. :roll:
August 28, 2006 4:12:51 PM

Quote:
What I really want is a good board that I could continue to use with the Kentsfield when it someday is in the $400 range.
If I buy a cheap little 6300 or 6400 I would definitely upgrade sometime, probably at the end of 2007.


For professional rendering work ('Maya' et al.), a good (read: expensive) OGL GPU is the best investment, whether you have a workstation or a [powerful] desktop.

Just my 2cs.


Cheers!
August 28, 2006 4:25:31 PM

Quote:
Oops, forgot:

http://www.xbitlabs.com/news/cpu/display/20060817124626.html
The new processor will cost $999 in 1000-unit quantities and is likely to substitute the already announced officially 3.20GHz Core 2 Extreme chip with two processing engines. Even though the reasons behind such move are unclear, the transition of “extreme” processor to a multi-core design should emphasize the company’s plan to shift the attention of end-users to the number of cores, not clock-speed.



Quote:
Intel Kentsfield is expected to be drop-in compatible with some of the Intel 975X-based infrastructure that supports Intel Core 2 Duo and Core 2 Extreme processors.


Will this make the Conroes cheaper?
August 28, 2006 5:20:02 PM

Hmmm... you should consider the following specs:

Intel Xeon "Woodcrest" 3.0GHz (this will give you the option of a second processor)
2-3 gigabytes of RAM
Nvidia Quadro series (not sure what the latest is)

That will be MORE than double the speed.

Since you sound like you do this for a living, you may want to consider forking out the bucks. You can either put one together yourself or configure one at Dell: just go to the small business section and click on Precision.

Quote:
Thanks for the info, Cube!

My 3.2GHz 640 isn't that bad... it's just not made to render 1920x1080 images in Maya... and that's what I'm doing now and getting tired of it. But even with an E6600 it would "only" be twice as fast... which is sadly still not in the area of "smooth working speeds".
Guess I have to render on our renderfarm, even the test renders...

For someone like me, all that "wow, new CPU generation is 25% faster" hype is just laughable... to really work in a realtime environment on 2K footage I think I'd need at least 8 cores, 8GB RAM or something...

Back to the point... I think I'll wait at least till early next year... stupid Conroe hype has got me... I was ready to buy one in a couple of weeks... just to show my P4 CPU I don't like him anymore ;) 
August 28, 2006 5:21:11 PM

Quote:

For professional rendering work ('Maya' et al.), a good (read: expensive) OGL GPU is the best investment, whether you have a workstation or a [powerful] desktop.

Just my 2cs.




Nope! Only if you want to render with a realtime render client like Gelato from Nvidia. A good OpenGL card is nice to have, but not essential. When it comes to final rendering, Maya does everything with the CPU.
Any GeForce 6800+ is more than sufficient for viewport rendering of the models and stuff...
So a C2D would give a lot more power than any $2000 Quadro card, at least in my Maya experience...
August 28, 2006 5:25:19 PM

http://configure.us.dell.com/dellstore/config.aspx?cs=04&oc=690750w32min&m_11=XPP2E&m_1=T3074&c=us&l=en&s=bsd&kc=6W463

Here's a little something I put together; not sure if it'll work... let's see.

Well, it kinda worked; it didn't really pick out the stuff I selected.

As you can see, you can add a second processor (giving you quad core).

If Maya is programmed to see 4 processors, that will dramatically help rendering time!

And yes, you're right: from my experience in Maya it's mostly the CPU that renders, not the GPU.
August 28, 2006 5:52:51 PM

Cool workstation... but $2400+ is a little too steep for me right now. I'm still studying and not earning a lot from my work yet.
We have those Xeon workstations at the academy and they rock, but I love working at home... hehe, so my desktop has to have some horsepower too.

I love that the lines between workstation and power desktop become more blurred every year. I'm a gamer too, so it's good to be able to work on stuff in a decent environment and play some Crysis here and there ;) 
I think I'll go with "just" a Core2Duo at Christmas or something... Kentsfield looks too expensive for at least a year or so, and a Xeon workstation like the Dell one isn't exactly what you would call a "student's" PC...
August 28, 2006 6:36:51 PM

You should check your CPU cooler... I've got a Zalman and my 640 runs cool with no throttling at all (over 70°C, throttling is almost a sure thing)... when rendering with Catia, my rig gets to 53°C at max... so maybe you should invest in a new cooler first and plan your next purchase better.
Quadro cards will give you better rendering: over 25% faster compared with regular VGA cards. One economical option is to get a 6800 Ultra card and, using RivaTuner, turn it into a Quadro... all my 6800 Ultras work as Quadros, and SLI is also supported.
Xeon systems are great for professional work (otherwise Apple would not have chosen them for the new dual workstations)... so maybe your jump should be bigger than C2D only... but it depends on your needs, of course. I got one demonstration of the Xeon power: I had some Catia maps to render (over 100,000 points) that require 3 hours on my Opteron 240 rig, 2h 45min on the 640 rig, and 1h 45min on the Opteron 165 rig... with a dual Xeon 5100 server: 22 minutes. Rendering with Maya will probably get similar results, although these results depend on VGA power and CPU/RAM power...

Maybe you should look harder at dual Xeon systems... cheaper than older Xeon systems... and really more powerful... (and you can also save some serious money using a 6800 Ultra as a Quadro, since there is only one Quadro model above the one based on the 6800 Ultra GPU... so it sure is powerful)
August 28, 2006 6:36:53 PM

Quote:
Nope! Only if you want to render with a realtime render client like Gelato from Nvidia. A good OpenGL Card is nice to have, but not essential. When it comes to final rendering, Maya does everything with the CPU.


That's odd! I'm much less knowledgeable on GPUs than on CPUs; however, aside from RT & quality, it's my belief that rendering speed also depends a lot on the GPU used... then again, I might be wrong, of course.

As a mere curiosity, here's a view on the Woodcrest platform:

http://www.realworldtech.com/includes/templates/articles.cfm?ArticleID=RWT052306090721&mode=print


Cheers!
August 28, 2006 7:09:33 PM

Quote:

That's odd! I'm much less knowledgeable on GPUs than on CPUs; however, aside from RT & quality, it's my belief that rendering speed also depends a lot on the GPU used... then again, I might be wrong, of course.

As a mere curiosity, here's a view on the Woodcrest platform:

http://www.realworldtech.com/includes/templates/articles.cfm?ArticleID=RWT052306090721&mode=print


Cheers!


Trust me I know my stuff on this one.

We worked with the Gelato renderer (like Mental Ray or Renderman for Maya) on our last project, and the only real advantage over other render clients is that it uses the Nvidia Quadro GPU for rendering. It is a little faster with lots of geometry, but it sucks at rendering many light sources.

Other render clients, like Mental Ray for Maya, don't use the GPU for final rendering.
You can do some cool viewport stuff (realtime shaders, realtime bumpmapping) with some plugins and a nice OpenGL card, but the quality and speed still suck compared to any game engine.
For final rendering it's raw CPU power with normal (as in, all except Gelato) render clients. And Gelato is very beta right now and not as good as Mental Ray, imho.

Greetz!
August 28, 2006 9:58:40 PM

Quote:
K8L is a server architecture, so don't plan on it at the desktop level until probably 2008.
You say that like Opteron and Athlon are really that different. :roll:


Not a big difference in architecture, but different in marketing. The new desktop chip (AM3 socket) will have to be backwards compatible with AM2, as AMD stated earlier. Plus, I doubt AMD will offer a desktop chip with performance similar to K8L right away, as that would probably make their 4x4 platform much cheaper than their server offering.
August 28, 2006 10:01:59 PM

Quote:
Oops, forgot:

http://www.xbitlabs.com/news/cpu/display/20060817124626.html
The new processor will cost $999 in 1000-unit quantities and is likely to substitute the already announced officially 3.20GHz Core 2 Extreme chip with two processing engines. Even though the reasons behind such move are unclear, the transition of “extreme” processor to a multi-core design should emphasize the company’s plan to shift the attention of end-users to the number of cores, not clock-speed.



Quote:
Intel Kentsfield is expected to be drop-in compatible with some of the Intel 975X-based infrastructure that supports Intel Core 2 Duo and Core 2 Extreme processors.


Will this make the Conroes cheaper?

I doubt it will push the bottom end (i.e. the E6300 & E6400) any lower. It will most likely drive the X6800 down.
August 28, 2006 10:35:48 PM

Wait for quad-core to come out, buy dual-core.
August 28, 2006 11:37:48 PM

An E6600 will be way more than twice as fast. Look at Tom's CPU compare and check the Max 8 comparisons: E6600 at 1:38 vs. the 640 at 4:20. That's just one frame, and it's about 2.6 times faster. In render land that's huge.
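For what it's worth, the arithmetic checks out; a quick sketch, using the per-frame times quoted above:

```python
def to_seconds(mmss: str) -> int:
    """Convert an m:ss render time to whole seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

p4_640 = to_seconds("4:20")  # 260 s per frame on the P4 640
e6600 = to_seconds("1:38")   # 98 s per frame on the E6600

speedup = p4_640 / e6600     # ~2.65, i.e. "about 2.6 times faster"
print(f"{speedup:.2f}x")
```

Over a few thousand frames of an animation, that ratio is the difference between days and a single overnight render.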
August 28, 2006 11:48:50 PM

Go with a Core 2, since the first quads are basically going to be two Conroes that communicate across the FSB; think of it as a cross between a Core 2 and a Presler. An E6600 is really good power at that.
August 29, 2006 8:21:07 AM

List of improved features for Bad Axe 2:

1) Marvell SATA Gen 2 RAID controller
2) Updated BIOS core
3) Much improved overclocking BIOS functionality
4) Digital pots to allow finer granularity for overvolting
5) Improved FSB margin, i.e. higher potential FSB overclock, >= 1333 FSB
6) Kentsfield support
7) PECI support for the digital diode
August 29, 2006 7:30:08 PM

Quote:

That's odd! I'm much less knowledgeable on GPUs than on CPUs; however, aside from RT & quality, it's my belief that rendering speed also depends a lot on the GPU used... then again, I might be wrong, of course.

As a mere curiosity, here's a view on the Woodcrest platform:

http://www.realworldtech.com/includes/templates/articles.cfm?ArticleID=RWT052306090721&mode=print


Cheers!


Trust me I know my stuff on this one.

We worked with the Gelato renderer (like Mental Ray or Renderman for Maya) on our last project, and the only real advantage over other render clients is that it uses the Nvidia Quadro GPU for rendering. It is a little faster with lots of geometry, but it sucks at rendering many light sources.

Other render clients, like Mental Ray for Maya, don't use the GPU for final rendering.
You can do some cool viewport stuff (realtime shaders, realtime bumpmapping) with some plugins and a nice OpenGL card, but the quality and speed still suck compared to any game engine.
For final rendering it's raw CPU power with normal (as in, all except Gelato) render clients. And Gelato is very beta right now and not as good as Mental Ray, imho.

Greetz!

Like I said earlier, I'm a graphic designer and PC game designer (specializing in 3D modeling; I primarily use Lightwave and MAX). I just built a new PC (specs in my sig) for my work at home, since I work FROM home. It's a C2D rig similar to yours that I use for gaming too.

You are absolutely right about an OGL card; they're really a waste for consumer 3D modeling applications like Maya, MAX, Lightwave, etc... They are, however, really good for CAD applications and people who have extremely high-resolution "super monitors", which, as you've stated, you don't plan to use. So I professionally second your choice of a good gaming card ;) .

And you're absolutely right about final rendering: it's ALL CPU, not GPU. Now, if you wanted to do real-time scene previews with AA and radiosity, etc., then you should go for a Quadro or FireGL... but seriously, we (as 3D artists) rarely do that. A high-end gaming card will give you more than adequate resources for consumer 3D modeling and rendering applications, and be MUCH better if you intend to use your PC for gaming as well, which you do, and so do I.

I personally chose an nVidia GeForce 7900GTX for my rig because it's the best nVidia card (other than the 7950GX2), and nVidia has a slight edge in OpenGL over ATi. And since Lightwave is an OpenGL program, I decided on nVidia over ATi. But then, I had $2500 to spend on my rig, and you are still a student (I think you said that, didn't you? Errrr, at least you're just starting out in the field?).

If you're on a budget and want a good gaming/design card, I would recommend something like an eVGA GeForce 7900GT; that's the card I'd personally pick for your situation and budget. And I like eVGA: my 7900GTX is eVGA and I've liked it a lot. Plus, they have a nifty trade-up program where, within 90 days, if you decide you want a better card, say a 7900GTX, all you have to pay is the price difference between the two cards, and they'll send you the new one and pay for shipping the old one back!

However, if you are further restricted by your budget, you could also go with an eVGA 7600GT. Of course you don't have to get eVGA for either card; that's just the brand I've chosen and have liked. Nothing against ATi either: you could always go with an ATi X1900GT or ATi X1800XT.



Now, as for your CPU: don't wait for quad core, get a C2D. For your budget, I'd go with an E6600. Pair it with a good CPU cooler like a Scythe Ninja+ and some good-quality RAM (Corsair XMS2 DDR2-800) and you can OVERCLOCK the hell out of your CPU and get awesomely fast render times ;) .
August 30, 2006 12:00:10 PM

Thank you LogicSequence for all the info!

Right now I'm in the middle of a project, so never change a "limping" system before it reaches the finish line.
I think I'll just buy board+ram+C2D sometime in the next months till Christmas.
I have a GeForce 6800GS from Gainward right now, and that should do the trick till the first good DX10 cards are in the $250-$300 range!