Socket 1366 obsolete, SMT a 'gimmick'

Status
Not open for further replies.

eleanor296

Distinguished
Nov 5, 2009
9
0
18,510
I was wondering when somebody would comment on that... saw it this morning.
Seems like it goes totally against the prevailing opinions on most forums... if it's true though, I'll rethink getting an i5.
 

roofus

Distinguished
Jul 4, 2008
1,392
0
19,290
Not really controversial. The real question is what currently exists to take advantage of tri-channel memory. It stands to reason the dual-channel numbers would look decent on benchmarks designed for existing tech.
 

MRFS

Distinguished
Dec 13, 2008
1,333
0
19,360
> the real question is what currently exists to take advantage of the tri-channel memory.


Maybe ramdisks e.g. for high-speed database access?

P55 supports up to 16GB, but X58 supports up to 24GB.

Subtract the RAM needed for the OS, and the rest is available for application programs and ramdisk(s).
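The headroom math above is simple enough to sketch; a toy calculation, where the 4 GB OS/application reservation is purely an illustrative assumption (the post doesn't specify one):

```python
# Rough ramdisk headroom on each platform (all figures in GB).
# The 4 GB OS/application reservation is an assumed figure.
def ramdisk_headroom(max_ram_gb, os_apps_gb=4):
    """RAM left over for ramdisk(s) after the OS and apps take their share."""
    return max_ram_gb - os_apps_gb

print(ramdisk_headroom(16))  # P55 platform (16 GB max) -> 12
print(ramdisk_headroom(24))  # X58 platform (24 GB max) -> 20
```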


MRFS
 

andyKCIUK

Distinguished
Jun 18, 2008
153
0
18,690
i5/P55 is clearly the best setup on the market now. Keeping in mind that x8/x8 PCIe costs only a few FPS, it's a no-brainer really.
 
The claim that SMT decreases performance in x264 and games goes against both my own testing and every reliable review that I've seen (and no, this doesn't count as a reliable review).
 

That's odd.

I could have sworn that an i7-975 was faster than an i5...




Now, I think you can make a decent case that the i5-750/p55 is the best value on the market right now (though there are some excellent values from AMD as well, especially at lower price points), but to call it the best setup on the market is ridiculous.
 

jennyh

Splendid
Imo it's pretty obvious: the more aggressive Turbo on the i5 is making far more of a difference than tri-channel memory.

I'm not sure about SMT being a gimmick but I'm starting to believe tri-channel memory was.
 

andyKCIUK

Distinguished
Jun 18, 2008
153
0
18,690



Sure it is. It's way more expensive too...

For a gamer (and I'm talking the highest in-game settings here), an i5-based rig is all they need. A good P55 mobo - which BTW is cheaper than X58 - will provide tons of overclocking fun, and the money saved on RAM can be put into a graphics card or an SSD. I really feel for people who got themselves an i7-based platform.
 

Kewlx25

Distinguished
Hyper-Threading would *technically* reduce single-thread performance because of the shared cache and a few shared internal resources. But it only reduces performance when you're barely using the CPU in the first place. If you actually load a Nehalem-based CPU, HT increases performance.

My lowly 2.66GHz 920 hovers around 40% CPU usage when compressing a 1920x1080 FRAPS dump with Xvid 1.2.2 to a max-quality 1920x1080 stream at 55fps, and the HDD bottlenecks. So I can compress a 1080p data stream at 55fps on 40% CPU.

Assuming I had my games/OS on a different HDD, I could compress a 1-hour 1080p movie in 33 minutes, keep one free core for playing my video games, and still do 5.2 gigaflops (2 cores) on Folding@home. (Decompressing the 1080p stream consumes about 10% CPU, which is almost one core's worth - that's why I said one for gaming.)
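That 33-minute figure follows directly from the encode rate; a quick sanity check, assuming the source footage plays back at 30fps (the FRAPS capture rate isn't stated in the post):

```python
# Back-of-the-envelope encode time: a 60-minute source at an assumed
# 30fps playback rate, encoded at the measured 55fps throughput.
source_minutes = 60
playback_fps = 30   # assumption -- FRAPS capture rate isn't given
encode_fps = 55     # measured throughput from the post

total_frames = source_minutes * 60 * playback_fps   # 108,000 frames
encode_minutes = total_frames / encode_fps / 60

print(round(encode_minutes))  # -> 33
```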

Some German magazine had a quad-socket hexa-core AMD (24 cores) getting beaten by a dual-socket quad-core 3GHz i7 (8 cores) in a few enterprise-level DB benchmarks.
 

someguy7

Distinguished
Dec 12, 2007
1,186
0
19,310
Tri-channel is not a gimmick. It just is what it is.

Same thing for dual channel: http://www.tomshardware.com/reviews/PARALLEL-PROCESSING,1705-11.html


http://www.tomshardware.com/reviews/amd-reinvents,1258-28.html


If you were actually expecting tri-channel to make a difference on the desktop, then you have been living in a hole/cave or something.


There was always this simple test as well: run your DDR2 in single-channel mode vs dual and see if you notice a difference. Nope. There are benchies on this as well, but I don't feel like looking them up.


 

Kewlx25

Distinguished
It was my understanding that the i7 can interleave memory accesses across the memory channels, which results in lower under-load latencies. So your best gains would be under heavy random load from multiple programs. But for the average user, running MSN in your taskbar while playing MW2 doesn't count as "heavy" load.
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
While I am normally against Intel, this article is stupid in its conclusion. Testing a 1st-gen i7 against the 2nd gen... duh. The conclusion should be that the i7-900 series is obsolete. Let's see you put an i9 in your Socket 1156.

And before you say x8/x8 isn't that much of a difference, let's see it put to the test with two 5970s or two 295s. Until you saturate the lanes, it doesn't make much difference. No one who plans on buying two ultra-high-end cards would dare cripple them.
 

notty22

Distinguished
Declaring a popular platform "dead" or "obsolete" is silly, and now I will have the displeasure of people quoting that term when they ask which build to go for.
*facepalm*
 

andyKCIUK

Distinguished
Jun 18, 2008
153
0
18,690

Who buys two 5970s?!? Noobs...

People with less than 2x 5970 (I dare say 99.9% of us) will never even notice the difference between x8/x8 and x16/x16.
 
Yes - in situations where you are not memory-bandwidth limited (read: nearly all desktop apps), triple-channel memory is not a feature your apps can take advantage of.

As a general rule, even LGA775 isn't bandwidth limited on the desktop - we've known that for a long time. It stands to reason that adding a lot more bandwidth wouldn't make a substantial difference.

Clearly a slow news day. ;)
 
Well, my i5/i7 ripoff thread was much more popular than I'd thought it'd be, but there's some truth to it.
I'm still expecting this usage to show up down the road with a "Fusion"-like product, where even tri-channel may not be enough.
 

roofus

Distinguished
Jul 4, 2008
1,392
0
19,290
I wouldn't say early adopters got bent over just because an article on Fud claims it so, but if there isn't some serious hardware around an i7, it is a waste. Quad SLI/Quadfire with all the trimmings would necessitate the 1366 platform. Any card that can saturate the x8 lanes at peak will suffer on the i5 platform, and you will find that goes beyond the 5970 and the 295 once people are volt-modding and liquid cooling.
It was a waste for me, but I came out of it unscathed lol
 


x8 PCIe is OK for last generation's video cards.

With the rise of the 5870 (essentially a 4870x2) and the 5970 (a 4870x4), you're going to start seeing massive bottlenecks soon.

I wouldn't be surprised if the 5970 - or even the 5870 - has already maxed out PCIe 2.0 x8.



EDIT:
http://www.tomshardware.com/reviews/pci-express-2.0,1915-10.html

As you can see, a PCIe x8 slot slightly bottlenecks a 9800GX2. A 5850 is around 20% faster than a GX2, and a 5870 is around 60+% faster.
Thus, on a 1156 platform, running two 5850s at x8 will result in bottlenecks, and running two 5870s at x8 each will result in significant bottlenecks.
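For context on the raw numbers behind that concern: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, giving roughly 500 MB/s of usable bandwidth per lane per direction. A quick sketch of what each link width offers:

```python
# Theoretical PCIe 2.0 bandwidth per direction.
# 5 GT/s per lane with 8b/10b encoding -> ~500 MB/s usable per lane.
MB_PER_LANE = 500

def pcie2_bandwidth_gbs(lanes):
    """Usable one-direction bandwidth in GB/s for a PCIe 2.0 link."""
    return lanes * MB_PER_LANE / 1000

print(pcie2_bandwidth_gbs(8))   # x8  -> 4.0 GB/s
print(pcie2_bandwidth_gbs(16))  # x16 -> 8.0 GB/s
```

So dropping from x16 to x8 halves the theoretical ceiling; whether a given card actually hits that 4 GB/s wall is exactly what the linked review measures.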
 
1366 obsolete...puh-leez...that's like saying AM2+ is obsolete because of AM3.

Intel made it known that the i7 was for the "enthusiast" and that i5/i3 was for the mainstream.

I think what this article really illustrates is that the mainstream computing platform and hardware (i5 and Phenom w/dual channel) has finally reached a level of performance that can run any and all applications even the most sophisticated of power users would need. Or to phrase it another way, software development has fallen so far behind hardware development that it doesn't take as much hardware to get the most out of all applications.

Also, what this article illustrates is that gamers no longer need uber-high-end hardware to play the latest titles.

This is quite different from the state of desktop computers of even three years ago, when the fastest CPUs and dual GPUs were necessary to run the latest gaming title with all the eye candy... yes, I'm referring to Crysis... and in some instances the "best" hardware at the time was still not enough to max out the eye candy and get playable frame rates. Heck, my old single-core Dell laptop with single-channel DDR would get bogged down solely from working in large Excel spreadsheets and large Access databases. Today, my Dell 630 with a Core 2 Duo and 2GB dual channel barely blinks at those spreadsheets and databases.

For anyone just getting into DIY computers within the past 2 years, this article may seem topical. But for anyone with a sense of history, or anyone who has been building computers for more than 5 years, this article could be considered pedantic.

Lastly, this is a Fudzilla article we're talking about...
 

jennyh

Splendid
Well, that's another thing, isn't it? Sure, Gulftown looks 'amazing', but how much of it is benchmarketing?

I won't be upgrading to Thuban even though I have one of the slowest X4 Phenom IIs, and I'd bet that's with Thuban costing a lot less than Gulftown. Is anybody who bought an i7 920 or so actually going to buy an i9?
 
Gulftown would be amazing. I do flow simulations in Solidworks that take 8 hours and load every core. I'll be buying a Gulftown if it is cheap enough and/or clocks high enough.
 
Status
Not open for further replies.