Intel's Future Chips: News, Rumours & Reviews - Page 3
Tags:
- rumors
- skylake
- Intel
- haswell
- CPUs
- broadwell
- Systems
- news
- Processors
Last response: in CPUs
mayankleoboy1
January 31, 2013 2:02:16 PM
skaughtz said:
Has there been any information released regarding where Haswell's graphics will fall in relation to Ivy Bridge, Trinity, etc.?
If you believe Intel, it will be "miles ahead. True revolution in graphics, with HD gaming."
If you believe AMD, "Trinity beats all competition. With Richland, we will be aheader."
If you believe Nvidia, "we have Nvidia optimized games for all platforms, which will suck on all other hardware."
Reply to mayankleoboy1
skaughtz
January 31, 2013 2:39:13 PM
mayankleoboy1 said:
If you believe Intel, it will be "miles ahead. True revolution in graphics, with HD gaming."
If you believe AMD, "Trinity beats all competition. With Richland, we will be aheader."
If you believe Nvidia, "we have Nvidia optimized games for all platforms, which will suck on all other hardware."
I believe in nussing, Lebowski.
... but I'll take that as a "no." I skimmed through the links posted in this thread, but I couldn't find anything definitive. Oh well.
Reply to skaughtz
skaughtz said:
I believe in nussing, Lebowski.
... but I'll take that as a "no." I skimmed through the links posted in this thread, but I couldn't find anything definitive. Oh well.
here's what i know so far. add 'rumor' in every sentence and take 'em with a grain of salt.
desktop haswell igpu will be hd4600 with 20 eu (shaders), 4 more than ivy bridge's 16 in hd4000. that one will probably be a minor improvement over ivy bridge. if there's more, it will come from mature (LOL) drivers and design tweaks/improvements.
there is another haswell igpu variant named gt3 (gt2 being the 20 shader hd4600 and gt1 the least powerful one) with 40 shaders which, so far, seems to be exclusive to ultrabook skus. in theory, 40 shaders in the igpu can deliver massive performance, by intel standards. but it looks like the extra shaders will be used for better power efficiency in ultrabook-class devices. intel did do a 'demo' of gt3 vs nvidia's gt 650M at ces showing gt3 being close to the 650M (bragging ensued). at this point, it's unclear whether there will be gt3 based skus for desktops and mainstream laptops.
as for faring against amd igpus, hd4000 can catch up to llano's 6550d igpu in some non gaming tasks while trinity's 7660d igpu can run circles around it. i don't think gt2 will outperform 7660d but gt3 might have a chance if it's given desktop/laptop class thermal headroom.
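to put rough numbers on the rumors above: a naive EUs-times-clock ratio gives an upper bound on how the tiers could stack up against HD 4000. everything here is a placeholder (the EU counts are rumored, the clocks are assumed), and real gains will be smaller because bandwidth and drivers don't scale along.

```python
# Back-of-envelope comparison of the rumored Haswell iGPU tiers against
# Ivy Bridge's HD 4000. EU counts for Haswell are rumors at this point;
# the clocks are rough placeholder assumptions, not confirmed specs.

def relative_shader_throughput(eus, clock_ghz, baseline_eus=16, baseline_clock_ghz=1.15):
    """Naive EUs * clock ratio versus HD 4000 (16 EUs at an assumed ~1.15 GHz)."""
    return (eus * clock_ghz) / (baseline_eus * baseline_clock_ghz)

tiers = {
    "GT2 / HD4600 (rumored 20 EU)": relative_shader_throughput(20, 1.15),
    "GT3 (rumored 40 EU)":          relative_shader_throughput(40, 1.15),
}
for name, ratio in tiers.items():
    print(f"{name}: ~{ratio:.2f}x HD 4000 raw shader throughput")
```

so even on paper GT2 is only ~25% more raw throughput than HD 4000, which fits the "minor improvement" expectation; GT3's 2x is the only tier that could move the needle, if it ever reaches desktops.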
Reply to de5_Roy
truegenius said:
if putting more shaders in gave a linear performance boost, then amd would have already done it, as they have plenty of tdp room in the 5600k. but it is not linear.
question: is intel's eu a cluster of shaders, or a single shader?
any technical specifications about it?
http://www.realworldtech.com/ivy-bridge-gpu/
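short answer to the question above, going by that RealWorldTech writeup: a Gen7 EU is a cluster of SIMD ALUs, not a single shader. if memory serves, each Ivy Bridge EU has two 4-wide FPUs, so peak FLOPS works out as below; the lanes-per-EU and clock figures are my reading of that article, treat them as illustrative rather than official.

```python
# Sketch of how Ivy Bridge (Gen7) iGPU peak FLOPS is usually derived.
# An "EU" is a cluster of SIMD lanes; figures follow the RealWorldTech
# Ivy Bridge GPU article and are illustrative, not an official spec.

def peak_gflops(eus, lanes_per_eu=8, flops_per_lane=2, clock_ghz=1.3):
    """EUs x SIMD lanes x 2 (multiply-add counts as two FLOPs) x clock."""
    return eus * lanes_per_eu * flops_per_lane * clock_ghz

# HD 4000: 16 EUs at a 1.3 GHz max turbo
print(peak_gflops(16))  # ~332.8 GFLOPS, the commonly quoted peak
```

which is why intel's "shader" counts are slippery: 16 EUs hide 128 FP lanes, so comparing raw EU counts across vendors tells you very little.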
Reply to JAYDEEJOHN
mayankleoboy1 said:
If you believe Intel, it will be "miles ahead. True revolution in graphics, with HD gaming."
If you believe AMD, "Trinity beats all competition. With Richland, we will be aheader."
If you believe Nvidia, "we have Nvidia optimized games for all platforms, which will suck on all other hardware."
If it proves to be true that the next gen consoles are to use AMD, that leaves Nvidia the odd man out. Serves them right... So much for PhysX!
Reply to RussK1
mayankleoboy1
February 1, 2013 5:36:19 AM
de5_Roy said:
here's what i know so far. add 'rumor' in every sentence and take 'em with a grain of salt.desktop haswell igpu will be hd4600, ............. thermal headroom.
Just to add a bit: the ultrawide RAM, a.k.a. Crystalwell, is real. But as CharlieD says, its pricing makes it very difficult for OEMs to integrate.
Reply to mayankleoboy1
i guess: go to bios > set the current and power limits as high as you can > go crazy with the voltage settings.
haswell will likely have the same ever-smaller-cpu-space-on-die issue as ivy bridge, so water cooling would be needed for high overclock....
although, i don't think intel allows their cpus to use so much power that they fry instantly but gradual degradation is quite possible.
otoh, i think amd unlocked cpus can be 'overwatted' quite easily. amd's are fully unlocked.
Core i7 4765T is 35W quad Haswell
http://www.fudzilla.com/home/item/30339-core-i7-4765t-i...
Intel Y-series Ivy Bridge lasts until Q4 13
http://www.fudzilla.com/home/item/30338-intel-y-series-...
Reply to de5_Roy
lazykoala
February 4, 2013 5:12:31 AM
RussK1 said:
If it proves to be true that the next gen consoles are to use AMD, that leaves Nvidia the odd man out. Serves them right... So much for PhysX!
Very much doubt that; PhysX isn't going away, for no other reason than that no one else has bothered to create a dedicated physics engine as capable as PhysX yet. Wouldn't be shocked if DX eventually adds a physics API that looks VERY similar to PhysX in layout. [Hey, SM 1.1 was basically NVIDIA's proprietary Shader Model implementation, so it wouldn't be the first time...]
Reply to gamerk316
mayankleoboy1
February 4, 2013 1:06:57 PM
gamerk316 said:
Very much doubt that; PhysX isn't going away, for no other reason than that no one else has bothered to create a dedicated physics engine as capable as PhysX yet. Wouldn't be shocked if DX eventually adds a physics API that looks VERY similar to PhysX in layout. [Hey, SM 1.1 was basically NVIDIA's proprietary Shader Model implementation, so it wouldn't be the first time...]
Not sure how 'capable' PhysX is. IIRC, it uses ancient x87 instructions to cap performance on modern processors.
Most gamers would be ecstatic if DirectX added a vendor agnostic GPU accelerated physics engine...
Reply to mayankleoboy1
mayankleoboy1 said:
Not sure how 'capable' PhysX is. IIRC, it uses ancient x87 instructions to cap performance on modern processors.
*Used*. The API was re-written as of version 3 (May 2011 or so, if I recall).
And in terms of what it is reasonably able to simulate at a reasonable speed (at least on a GPU, which IIRC is where dynamic physics *should* be computed), nothing comes even close.
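for anyone wondering why the x87-vs-SSE point matters: a physics engine spends its time in per-particle loops like the toy integrator below. SSE packs four of these float operations into one instruction, while x87 grinds through them one at a time; that is the whole performance argument. pure-Python sketch, purely illustrative, not PhysX or Havok code.

```python
# Toy illustration of the per-particle math a physics engine grinds
# through every tick. This is the kind of loop that SIMD (SSE) speeds up
# and scalar x87 does not. Hypothetical example, not any engine's API.

def euler_step(positions, velocities, gravity=-9.81, dt=1.0 / 60.0):
    """Semi-implicit Euler: update velocities first, then positions."""
    new_vel = [v + gravity * dt for v in velocities]
    new_pos = [p + v * dt for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

pos, vel = [10.0, 20.0], [0.0, 0.0]
pos, vel = euler_step(pos, vel)
print(pos, vel)  # each particle falls a little in one 60 Hz tick
```

a GPU runs thousands of these updates in parallel, which is why both posters above agree that dynamic physics belongs on the GPU.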
Reply to gamerk316
ph1sh55
February 4, 2013 4:31:12 PM
lazykoala said:
Haswell will be compatible with all current and soon to be released GPUs, right? I don't want to buy a GPU now and end up having it be incompatible with a new CPU in a few months!
The CPU won't cause an incompatibility with a discrete GPU; you don't have to worry about this. Any motherboard for the foreseeable future should have a PCIe slot for your GPU.
Reply to ph1sh55
gamerk316 said:
Very much doubt that; PhysX isn't going away, for no other reason than that no one else has bothered to create a dedicated physics engine as capable as PhysX yet. Wouldn't be shocked if DX eventually adds a physics API that looks VERY similar to PhysX in layout. [Hey, SM 1.1 was basically NVIDIA's proprietary Shader Model implementation, so it wouldn't be the first time...]
Havok... favored by far, and it uses SSE instructions.
Reply to RussK1
blingturd
February 4, 2013 11:32:05 PM
blingturd said:
IHRD NW PROCCSSOR WILL HVE AMD GRAFIKS
I highly doubt that Intel's new processor will have AMD-designed graphics. NVIDIA would be more likely, but that is still very unlikely to happen. I bet they stick with the current stuff they're running until/unless they end up buying NVIDIA, in which case they'll roll in GeForce GPU technology. Intel would actually do much better to hire programmers who can write a GPU driver worth half a crap than to throw more hardware at their GPU problems right now.
Also, you might want to work on your spelling and capitalization. Tom's isn't a text message or Twitter; you can use regular English and aren't limited to 140 characters.
Reply to MU_Engineer
Intel's iGPU is like a spray gun: in a few isolated tasks it is comparable, but in general HD 4000 is at least 10-15% slower than the iGPU on a 3850/3870K and around 40-45% slower than the one on the 5800K. GT3 will probably match the high end Llano parts in gaming and Trinity's 7660D in some tasks. The problem is the expense ratio and eventual SKU pricing for GT3 parts. It will only be on the high end i7's and i5's, which cost 2.2x and 3.5x the cost of a 5800K, and manufacturing wise even more.
I would say the true improvement is around 30%, give or take, but in some games it still falls flat, almost like the developers don't recognise Intel HD with their engines. And no, Skyrim doesn't max out, not by a long shot.
Where Intel will probably beat AMD is on the mobility front. With AMD still working on power issues in mobility, AMD's integrated solutions on mobile platforms are very cut back. Hopefully with Richland and Kaveri AMD will lower the thermals to throw more stream processors onto a very skint mobile solution.
Reply to sarinaide
hixbot
February 5, 2013 3:44:48 PM
Anonymous
February 5, 2013 3:52:44 PM
ph1sh55
February 5, 2013 10:19:50 PM
Anonymous
February 6, 2013 3:09:22 AM
you can see the last two generations here:
http://en.wikipedia.org/wiki/Sandy_Bridge#Desktop_platf...
http://en.wikipedia.org/wiki/Ivy_Bridge_(microarchitecture)#Desktop_processors
Reply to Anonymous
Chad Boga
February 7, 2013 10:29:15 AM
Whatever slim chance Itanium had of a meaningful future appears to be over:
Intel has updated the definition of the next generation Itanium® processor, code name “Kittson”. Kittson will be manufactured on Intel’s 32-nm process technology and will be socket compatible with the existing Intel Itanium 9300/9500 platforms, providing customers with performance improvements, investment protection, and a seamless upgrade path for existing systems. The modular development model, which converges on a common Intel® Xeon®/Intel Itanium socket and motherboard, will be evaluated for future implementation opportunities.
So it doesn't look like HP is prepared to pay Intel to develop a 22nm successor to Poulson.
It looks like Kittson will be to Poulson what Montvale was to Montecito, a meaningless new name for the same chip and a pitiful frequency increase.
Where this leaves HP's Itanium based customers in the future will be an interesting fallout to observe.
Presumably HP will want to move them to Xeon based systems, but I wonder how many customers they will lose in the process.
All this will probably come as no great surprise to anyone who has been following Itanium's rocky path for the last few years.
Can't help but think that the encroachment by ARM simply meant that Intel decided to focus all its efforts on x86; maybe this was an inevitable consequence anyway, perhaps sped up by 5 or so years.
I wonder when the official announcement of the cessation of Itanium based systems from HP will be made, and if this will have any outcome in the legal wranglings between HP and Oracle.
Reply to Chad Boga
mayankleoboy1
February 7, 2013 12:16:14 PM
Reply to mayankleoboy1
mayankleoboy1
February 7, 2013 1:20:09 PM
Haswell desktop transition significantly slower
http://www.fudzilla.com/home/item/30389-haswell-desktop...
penticide
http://www.fudzilla.com/home/item/30388-21-pentium-proc...
Reply to de5_Roy
GT3's performance is as expected; the bigger issue is that GT3 is only going to feature on the high end parts, for which the end user will buy discrete graphics anyway. I don't understand the line of reasoning behind it. With GT2's performance quite a drop off from GT3, it doesn't quite lend itself to the budget user.
Reply to sarinaide
sarinaide said:
GT3's performance is as expected; the bigger issue is that GT3 is only going to feature on the high end parts, for which the end user will buy discrete graphics anyway. I don't understand the line of reasoning behind it. With GT2's performance quite a drop off from GT3, it doesn't quite lend itself to the budget user.
I believe it's an attempt to get people who don't want a discrete GPU to pay more for a higher-end CPU with the fully-functional IGP. Think of businesses who don't want one more component that can fail in their machines. Or people making ultra small form factor systems who are space/thermally limited, which again includes many business machines. HTPC users would also apply here.
Reply to MU_Engineer
MU_Engineer said:
I believe it's an attempt to get people who don't want a discrete GPU to pay more for a higher-end CPU with the fully-functional IGP. Think of businesses who don't want one more component that can fail in their machines. Or people making ultra small form factor systems who are space/thermally limited, which again includes many business machines. HTPC users would also apply here.
Most businesses in the sense mentioned are not going to need anything more than HD 2500/3000, let alone HD 5000, and if the business is graphics design then they will use a multicore CPU with a professional FirePro or Quadro GPU. Forcing people to buy higher end just to get an iGPU solution which is still weaker than what the competitor offers at a lower cost is just unethical to the core, but it is Intel we are talking about.
GT3 would have served the i3 family well; it would have offered enough competition to the APUs to possibly pressure AMD in the budget and HTPC spectrum. It would be nice if we had a low cost Intel setup of repute, but that seems to have gone out the window. If you do not add a discrete card to an Intel setup it is completely hopeless, and adding discrete to an i3 3220 is rendered moot when an FX-6300 is the same cost and offers more. It is like you are forced to build a high end Intel setup just to get value for money. I say this in the sense that you want a low cost setup that does more than word processing, Excel, QuickBooks etc.; if you want a fun element to a setup at a target budget, Intel offers absolutely nothing. I rarely ever get a person buying an i3, Pentium or i5 under a 3570K, but conversely sell like 20-30 APUs a week, plus lots of FX-6300's and 8350's.
Essentially my point here is that GT3 on the highend parts makes absolutely no sense at all.
Quote:
I thought I read somewhere that they are doing away with the i3 SKU, instead just making an i5 dual core variant. Though that might have been about laptops. Wolf in sheep's clothing.
Intel should have the following SKUs: i5 standard and K, i7 standard and K, and Intel Extreme. Just lose that 3820 abomination; being an i7 990X owner, that part makes me cry custard, it's just so far from extreme.
Reply to sarinaide
It just makes no sense, and the argument to force people to spend $250 on an i5 to get GT3 is ludicrous if the person's needs are just some iGPU love with a kick. GT3 is slower than the 7660D but still a good enough solution for the average Joe who doesn't want a discrete card, but doesn't want to spend a mortgage payment on a CPU either.
Reply to sarinaide
Anonymous
February 9, 2013 12:44:23 PM
Well, the FX-8350 is for the most part around $190, and the 8320 was as low as $150 recently, but the FX parts are there for the enthusiasts who will use a discrete card. The FX-6300 is a mere $10 more than an i3 3220 while offering more overall value, and for a person who wants to run an iGPU setup the A10 5800K is a paltry $110 if you shop around. Not to mention that with the FM2 A85 ATX/MATX/ITX offerings, there is the option for a person on a skint budget to get a pretty decent system that can actually play games, along with the high end features and connectivity provided by the chip and platform. With Intel you are almost forced into buying a GPU. GT3, for the point of argument, would have been best suited to i3's and lower end i5's, where the end user may not be a power user but would get very acceptable performance.
Bear in mind the argument is not on price, it's on what you get for your money. GT3 on lower end parts would have remedied this to a fair extent.
Reply to sarinaide
sarinaide said:
It just makes no sense, and the argument to force people to spend $250 on an i5 to get GT3 is ludicrous if the person's needs are just some iGPU love with a kick. GT3 is slower than the 7660D but still a good enough solution for the average Joe who doesn't want a discrete card, but doesn't want to spend a mortgage payment on a CPU either.
GT3 is their "Extreme Edition" for mobile processors, GPU wise. Expect them to be well over $250.
Reply to Cazalan
GT3 will also feature on high end desktop parts, followed by GT2 and GT1 variants. As the chip gets cheaper, so does the iGPU solution. Right now GT3 has its place in mobility, particularly while AMD's mobile iGPU solutions remain very basic. The iGPU makes money at the lower price segment, where an i5 4570K or i7 4770K is not needed but better integrated graphics is more valuable to the end user.
Reply to sarinaide
Anonymous
February 9, 2013 3:17:19 PM
sarinaide said:
Well, the FX-8350 is for the most part around $190, and the 8320 was as low as $150 recently, but the FX parts are there for the enthusiasts who will use a discrete card. The FX-6300 is a mere $10 more than an i3 3220 while offering more overall value, and for a person who wants to run an iGPU setup the A10 5800K is a paltry $110 if you shop around. Not to mention that with the FM2 A85 ATX/MATX/ITX offerings, there is the option for a person on a skint budget to get a pretty decent system that can actually play games, along with the high end features and connectivity provided by the chip and platform. With Intel you are almost forced into buying a GPU. GT3, for the point of argument, would have been best suited to i3's and lower end i5's, where the end user may not be a power user but would get very acceptable performance.
Bear in mind the argument is not on price, it's on what you get for your money. GT3 on lower end parts would have remedied this to a fair extent.
you're throwing around a lot of apples and oranges.
the point is: what other offerings are out there for an "upper mainstream/enthusiast class" cpu with a decent igpu? sorry, but you're just rehashing the now two year old argument that intel only puts the better igpu on the higher end SKUs . . . me thinks folks would have gotten over that by now. so intel doesn't focus on the budget market; looking at the last few years in stock prices, i think they know what side of the bread gets the butter . . . so how's that working out for AMD?
and now looking . . . oh mmyyyyyyy.
Reply to Anonymous
It's not good if one of your lines of work is retail; we have basically been left with excess Intel i3's and lower end i5's, including Sandy Bridge, because nothing below the $200 mark for Intel sells well. We can sell a fortune of 3570K's and 3770K's. Again, Looniam, it's just the principle that the lower end segment makes more sense for GT3 than the 45xx and 47xx family, where most will be using high end graphics cards. I.e., Intel is not selling the GT3 component, they are selling the CPU component's performance, as no power user will ever use GT3 unless the shiny new GTX 780 goes poof.
Reply to sarinaide
Yuka said:
Like I've said many times, the only reason I'd want a good iGPU on a mid/high end CPU is because of Lucid's software. Too bad AMD hasn't taken that chance to actually take real advantage of it. I'll have to wait for HSA to bear fruit, I guess.
Cheers!
You do realize that Gigabyte, ASRock and ASUS all have Lucid MVP on their top line A85 and A75 offerings? I believe even Sapphire and Biostar will include it; the only one catching up is MSI, who only have the A85 GD65, which as we know is their mid level offering. Lucid MVP with the HD 7660D is a world of good, even more useful when you don't need the discrete card and can just shut it down.
Reply to sarinaide
sarinaide said:
You do realize that Gigabyte, ASRock and ASUS all have Lucid MVP on their top line A85 and A75 offerings? I believe even Sapphire and Biostar will include it; the only one catching up is MSI, who only have the A85 GD65, which as we know is their mid level offering. Lucid MVP with the HD 7660D is a world of good, even more useful when you don't need the discrete card and can just shut it down.
Ah, that's very good to know.
And you really have to see Lucid's MVP thingy working with a discrete card to be impressed by it. I know the 7660D is a good iGPU, but when AMD gets a little better CPU alongside the iGPU, Lucid's thingy is going to be really awesome in AMD+anything.
Cheers!
Reply to Yuka
Anonymous
February 10, 2013 2:19:51 PM
sarinaide said:
It's not good if one of your lines of work is retail; we have basically been left with excess Intel i3's and lower end i5's, including Sandy Bridge, because nothing below the $200 mark for Intel sells well. We can sell a fortune of 3570K's and 3770K's. Again, Looniam, it's just the principle that the lower end segment makes more sense for GT3 than the 45xx and 47xx family, where most will be using high end graphics cards. I.e., Intel is not selling the GT3 component, they are selling the CPU component's performance, as no power user will ever use GT3 unless the shiny new GTX 780 goes poof.
you are completely overlooking that there are i3 cpus with the 3000/4000 HD graphics.
(hint: their number ends in a 5)
so i would believe the decision is/was based on past market performance weighed against manufacturing costs. personally, i'd like Porsche to have a line of budget orientated family vehicles . . .
Reply to Anonymous
Anonymous
February 10, 2013 4:15:17 PM
sarinaide said:
But Porsche and Ferrari both have family level SUVs, sedans and sports cars targeting the average person; both were forced to do so.
I know the 3225 has it; similarly, the older 2320 and 2130 (or 2125/2105) had the top end graphics but never seemed to have available SKUs.
ok, there you are skewing the point. i said "budget orientated family vehicles". i really would like to see a family of 5 with a dog fit into a Cayenne at any price . . .
and really, on a personal note, i am there with you and would like to see the lower end SKUs get a few more bells and whistles. but it is seemingly not economically feasible for intel to provide such a product. and i say that as someone who only took an economics class because it was necessary, which i slept through and passed because of multiple choice exams.
it seems to work for them. so after two years of seeing such, since the release of sandy and intel HD 3000 graphics, i have learned to get over it and move on . . . (hint)
Reply to Anonymous