
Intel Core i3, i5 Arrandale and Clarkdale in Photos

Last response: in News comments
December 18, 2009 11:31:58 PM

Nice. Now, when will I have one of these in my case?
Score
5
December 19, 2009 12:15:05 AM

Shiny! I thought that Larrabee project was dropped?
Score
-13
December 19, 2009 12:23:17 AM

I'm hearing good things about the performance of these babies - around double the performance of previous generations, etc.

And no, Intel IGPs have nothing to do with Larrabee.
Score
6
December 19, 2009 12:51:37 AM

apache, so what chip do they use?
Score
-4
December 19, 2009 1:04:08 AM

Apparently some advanced version of the current X4500 featured in today's G45 chipsets.
Score
7
December 19, 2009 1:14:48 AM

They look awesome
Score
3
December 19, 2009 1:35:19 AM

I'm kind of outta the "news" as far as this CPU goes, but what can its on-board GPU be comparable to? For example: an Nvidia 9800GT? 8800? Or is it not even meant to be that beefy of a GPU?
Score
0
December 19, 2009 2:22:29 AM

HansVonOhain: Shiny! I thought that Larrabee project was dropped?


It was... it's not Larrabee.
Score
5
December 19, 2009 3:46:35 AM

any chance for higher res shots?
Score
0
December 19, 2009 4:00:33 AM

TidalWaveOne: It was... it's not Larrabee.

I don't remember Intel saying it was permanently dropped.
Score
-3
December 19, 2009 4:04:17 AM

drools........
Score
0
December 19, 2009 4:06:38 AM

Can't wait to see them in a MacBook Pro.
Score
-10
December 19, 2009 5:27:17 AM

LeJay: I don't remember Intel saying it was permanently dropped.

No, but "Delayed Indefinitely" might as well be the same thing as far as most of us are concerned.
Score
7
Anonymous
December 19, 2009 8:27:24 AM

"i'm kind of outta the "news" as far as this CPU goes, but what can it's on-board GPU be comparable to? for example: nvidia 9800GT? 8800? or is it not even meant to be that beefy of a GPU?"

Think more like an Nvidia 4200..... MAYBE a 6200.
Score
6
December 19, 2009 8:32:10 AM

ben850: i'm kind of outta the "news" as far as this CPU goes, but what can it's on-board GPU be comparable to? for example: nvidia 9800GT? 8800? or is it not even meant to be that beefy of a GPU?

Think 2x an older Intel IGP. In other words, not terrible for an IGP, but miserable for any serious use.
Score
7
December 19, 2009 11:33:30 AM

More likely its performance is close to a GeForce 7300GT GDDR2 version.
Score
0
December 19, 2009 1:14:57 PM

"On-chip" sounds miss leading. Usually people say a "chip" is a piece of silicon. The picture is obviously not one die. All they did is move the video card closer to the processor. Technically this is not on-chip graphics, its in package graphics.
Score
7
Anonymous
December 19, 2009 2:28:00 PM

zipzoomflyhigh got it right, this is to AMD's Fusion what Intel's hyperthreading was to AMD's dual-core. All gimmick, no substance.
Score
9
December 19, 2009 2:34:12 PM

ben850: i'm kind of outta the "news" as far as this CPU goes, but what can it's on-board GPU be comparable to? for example: nvidia 9800GT? 8800? or is it not even meant to be that beefy of a GPU?


Try something closer to an Nvidia GeForce 4.
Score
4
December 19, 2009 5:30:57 PM

Comparison to the ION?
Score
0
December 19, 2009 5:49:18 PM

Pretty pictures only?
Score
0
December 19, 2009 6:21:19 PM

Not even pictures; these look like renderings.
Score
-3
December 19, 2009 6:58:53 PM

Quote:
Comparison to the ION?


This will be slightly superior to AMD's "Llano" on the CPU side, yet vastly inferior on the GPU side.

Since Llano has already taped out, I think you may be able to find Llano systems very soon.
Score
0
December 19, 2009 7:44:30 PM

mmmmm nerd pornography.
Score
5
December 19, 2009 10:50:02 PM

ben850: i'm kind of outta the "news" as far as this CPU goes, but what can it's on-board GPU be comparable to? for example: nvidia 9800GT? 8800? or is it not even meant to be that beefy of a GPU?


You got a printer? Yeah, it'll be a tad faster than that... ;) 
Score
6
December 20, 2009 12:17:36 AM

That IGP will be great for things like web browsing, Word, Flash, and... erm... photo viewing, maybe some video usage too. Think of Win 98 era GPUs.
Score
0
December 20, 2009 12:44:17 AM

Boxa786: That IGP will be great for things like, web browsing and word and flash, and .... erm.... photo viewing, maybe some video usage too. Think of Win 98 era gpus.

And how do all of you guys know or assume this???
Score
2
Anonymous
December 20, 2009 1:21:03 AM

liquidsnake718: Evidence to suggest Intel has (or will ever have) a decent graphics solution or GTFO plz thx.


The laws of probability give the IGP a roughly 100% chance of sucking.
Score
0
December 20, 2009 2:29:40 AM

34kl3l4k"i'm kind of outta the "news" as far as this CPU goes, but what can it's on-board GPU be comparable to? for example: nvidia 9800GT? 8800? or is it not even meant to be that beefy of a GPU?"Think more like a nvidia 4200..... MAYBE a 6200.


From what I hear and have seen its about on par with a ATI 785GX GPU. Not too bad for such a little change.

zipzoomflyhighGreat. A crappy 45nm graphics chip to heat up your shiny new 32nm processor. YUCK.


And I am sure AMDs Fusion will be better. Hell lets slap a 5870 on or next to the CPU. I am sure a GPU thats used to a 50c idle will be happy with a CPU thats used to a 30c idle. And when the GPU heats up to 70-80c under load and the CPU is set to turn off at those temps, it will be awesome.

Its what it is. A low power, cheap decent IGP. Thats why Intel has the major market share of GPUs. Cheap.

beefy_mcpoozipzoomflyhigh got it right, this is to AMD's Fusion what Intel's hyperthreading was to AMD's dual-core. All gimmick, no substance.


Wait.... Intel had their dual core CPUs with no hyperthreading out a week before AMD had their dual cores out. Pentium D came out on May 26th 2005, Athlon X2 came out June 5th 2005. Hyperthreading was only in the Pentium 4 single core CPUs that competed with AMDs Athlon XP/64. So I am not sure how hyperthreading was meant to compete with a dual core rather pave the way for multicore programming......

As for the item itself, it is a on-chip GPU. The entire package is the chip. On die is a different story and TBH, will probably be harder to do since the lithography is different as is the process normally for a CPU and GPU. Those who think AMD will pull it off without a hitch beware. I doubt they will do it problem free nor will Intel. I bet on-die GPUs are still a few years away.
Score
2
Anonymous
December 20, 2009 3:09:18 AM

@jimmysmitty: but... I was referring to hyperthreading being a negligible performance gain that looks like 2 cores, vs. the near doubling of power that an extra core can give. AMD is aiming to move GPGPU to the mainstream; Intel has never proven itself capable in highly parallel computing. Even in their darkest of days, AMD/ATI graphics were light years ahead of anything Intel has ever done, and the recent Larra-fail fiasco only underscores that. I'd go as far as to say that the billions spent on Larrabee R&D were a complete waste and will never become a worthwhile product.
Score
-1
December 20, 2009 4:47:50 AM

Hmmmm..... more i5s, eh? What will the prices be like, though?
In Aus, the price of the i5 pushed the i7 (LGA 1366 socket) up :( 
Score
0
December 20, 2009 4:58:35 AM

I think this one is gonna replace Core 2 Duo...
Score
1
December 20, 2009 3:53:52 PM

Having an integrated graphics chip is a bad idea because it's useless and it will create additional heat. Nvidia's approach is the only right one at this moment, where things are done well with Fermi.
Score
0
December 20, 2009 5:35:36 PM

At the end of the day it's a 32nm CPU that will offer superior performance and reduced power usage. The GPU will be good enough for playing blu-ray movies and games such as WOW that have less intense graphics requirements.
Score
0
Anonymous
December 20, 2009 5:44:36 PM

lradunovic: Yes and no... I think the aim was to reduce latency. The problem with GPGPU now is that offloading to the GPU incurs enormous latency: everything must be copied to the GPU's memory, then processed, and sent back over to the CPU. If the GPU is integrated into the CPU, it can share the L3 cache at a latency of perhaps 50 cycles, instead of a latency measured in seconds (billions of cycles). You'll never be able to fit a teraflop+ GPU under the CPU's heatsink, but it could allow floating-point operations to be scheduled automatically by the CPU onto the GPU.
Score
0
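The latency argument above can be sketched with back-of-the-envelope arithmetic. Every constant below is an illustrative assumption (roughly a 3 GHz clock, ~8 GB/s effective PCIe bandwidth, ~10 µs of setup per transfer, a ~50-cycle shared-L3 hit), not a measured figure:

```python
# Back-of-the-envelope comparison: cycles burned just moving a small work
# unit to a discrete GPU over PCIe and back, versus handing it over through
# a shared L3 cache on an on-package GPU. All constants are assumptions.

CLOCK_HZ = 3.0e9        # assumed ~3 GHz CPU clock
PCIE_BW = 8.0e9         # assumed ~8 GB/s effective PCIe bandwidth
PCIE_SETUP_S = 10e-6    # assumed ~10 microseconds of setup per transfer
L3_HIT_CYCLES = 50      # assumed latency of a shared-L3 hit
CACHE_LINE = 64         # bytes per cache line

def discrete_gpu_cycles(payload_bytes):
    """CPU cycles spent on the PCIe round trip (copy out + copy back)."""
    transfer_s = 2 * (PCIE_SETUP_S + payload_bytes / PCIE_BW)
    return transfer_s * CLOCK_HZ

def integrated_gpu_cycles(payload_bytes):
    """CPU cycles to pull the same payload through a shared L3 cache."""
    return (payload_bytes // CACHE_LINE) * L3_HIT_CYCLES

payload = 4 * 1024  # a small 4 KiB work unit
print(f"PCIe round trip: ~{discrete_gpu_cycles(payload):,.0f} cycles")
print(f"Shared L3:       ~{integrated_gpu_cycles(payload):,.0f} cycles")
```

On these assumed numbers, the PCIe round trip for a small job costs tens of thousands of cycles before any GPU work even starts, while the shared-cache handoff is an order of magnitude cheaper. Not quite the "seconds" claimed above, but the qualitative point stands: for small offloads, transfer setup dominates.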
December 20, 2009 6:31:01 PM

Not too bad for an on-chip GPU. They'll get better in years to come. And better. And better. Like it or not, gamers, this is the future for all but the tippity-top, top-of-the-line hardcore ATX form factor tower computers. Computers will continue to get smaller and more energy-efficient. That means goodbye to ATX form factors and goodbye to graphics cards.

Never say never, oP3n_CL_pr0gramm3r. When they get down to 11 nanometers it'll be a whole 'nuther ballgame.
Score
-1
December 20, 2009 7:54:42 PM

Intel's hyper-threading is not a gimmick by any means; look at the benefits of it on the Core i7s in CPU-intensive tasks like rendering. It's not as good as a full dedicated hardware core, but it certainly helps drastically, at a much smaller cost to Intel and the end user.

We don't know how good these chips are, but it's a step in the right direction. Baby steps must be taken before pulling something huge off. I'm not an Intel fan, but I can't help but be impressed by their streak of wins recently.

This technology isn't aimed at the high-end desktop user (gamer), but it has its place in many different scenarios such as netbooks, media streaming boxes, ultra-portable internet devices, etc. Give it time, folks - why the negative criticism?
Score
0
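For what it's worth, the "helps, but not like a full core" trade-off Niva describes can be put into rough numbers with Amdahl's law. Both figures here are assumptions for illustration (a 95%-parallel rendering job, and the ~25% per-core SMT uplift quoted later in this thread), not benchmarks:

```python
# Amdahl's-law sketch: what a 4-core chip gains if hyper-threading adds
# ~25% effective throughput per core on a mostly parallel workload.
# The 0.95 parallel fraction and 0.25 uplift are assumed figures.

def speedup(parallel_fraction, effective_cores):
    """Overall speedup for a workload with the given parallelizable share."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / effective_cores)

cores = 4
smt_uplift = 0.25  # assumed extra per-core throughput from hyper-threading

without_ht = speedup(0.95, cores)
with_ht = speedup(0.95, cores * (1 + smt_uplift))
print(f"4 cores:      {without_ht:.2f}x")
print(f"4 cores + HT: {with_ht:.2f}x")
```

On these assumed numbers, hyper-threading buys roughly a fifth more throughput: clearly worthwhile for the tiny die cost, but nowhere near what four extra physical cores would deliver.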
Anonymous
December 20, 2009 8:26:03 PM

Niva: The typical real-world performance gain from hyperthreading is +10% to -10%, and personally I hope AMD never goes that route. Most of i7's (occasional) superiority is from the new SSE instructions, not the god-awful return of hyperthreading. Phenom II and i7 usually tie in games, because none of them take advantage of the new SSE instructions, whereas video encoding does, hence i7 wins that one hands down.
Score
-1
December 21, 2009 1:53:40 AM

Looks like some nice looking chips ;) 
Score
0
December 21, 2009 2:01:10 AM

For serious use like a graphics workstation, this is ridiculous. This chip is for the corporations. And where in hell does the width of the bitrate go? Right out to lunch; it started with the 775. I hope a people's computer returns again. Damn, I sound like Hitler to a Beetle.
Score
0
December 21, 2009 12:32:42 PM

YAY!! Maybe the Core i3 won't cost a million dollars!!??
Score
0
December 21, 2009 2:45:47 PM

riversdirect"On-chip" sounds miss leading. Usually people say a "chip" is a piece of silicon. The picture is obviously not one die. All they did is move the video card closer to the processor. Technically this is not on-chip graphics, its in package graphics.


It is in fact "on-chip"....you're confusing "on-chip" with "on-die"....it's essentially the same as Intel was doing with the Pentium-D...2 die's, 1 chip.
Score
1
December 22, 2009 9:47:40 PM

Great, nobody mentioned "will it play SuperMario"
Score
0
December 25, 2009 12:09:58 PM

Anonymous: Niva: The typical real-world performance gain from hyperthreading is +10% to -10%, and personally I hope AMD never goes that route. Most of i7's (occasional) superiority is from the new SSE instructions, not the god-awful return of hyperthreading. Phenom II and i7 usually tie in games, because none of them take advantage of the new SSE instructions, whereas video encoding does, hence i7 wins that one hands down.


Whoever posted this comment... please refrain from posting again. You don't have even the slightest clue what you're talking about. Now, on to facts... The Phenom II and Core i7 perform so closely in gaming benchmarks because games only use 1-2 threads and therefore don't make any use of HyperThreading. The processors are left to handle all the processing on physical cores. The +/-10% performance impact was for the Pentium 4.... in cases where HyperThreading is used on the Core i7, its impact is upwards of +25% according to the benchmarks posted everywhere on the net.

virtualban: Great, nobody mentioned "will it play SuperMario"


No....it won't run Crysis either...
Score
1