Your question

ATI Radeon HD 6xxx series graphics cards?

Last response: in Graphics & Displays
July 18, 2010 5:22:44 PM

AMD/ATI is planning on releasing the Radeon HD 6xxx series graphics cards sometime in late 2011. There isn't much information about these cards, but approximately how much faster are these cards said to be than the current HD 5xxx series? Will it be a huge 2x performance increase like the 4xxx to 5xxx series? Or a small 30-35% performance increase?

I also heard that these cards will use DX11.1.

AMD is also releasing their 8-core "bulldozer" chips circa 2011. I'd like to see how they perform compared to Intel's 8-core Sandy Bridge.
July 18, 2010 5:39:33 PM

ambam said:
AMD/ATI is planning on releasing the Radeon HD 6xxx series graphics cards sometime in late 2011. There isn't much information about these cards, but approximately how much faster are these cards said to be than the current HD 5xxx series? Will it be a huge 2x performance increase like the 4xxx to 5xxx series? Or a small 30-35% performance increase?

I also heard that these cards will use DX11.1.

AMD is also releasing their 8-core "bulldozer" chips circa 2011. I'd like to see how they perform compared to Intel's 8-core Sandy Bridge.


if there "isn't much information about these cards" then how can anyone compare them to the 5xxx series?

also, the 5xxx are not 2x faster than the 4xxx series. While they may share similar names (4770, 4850 vs. 5770, 5850, etc.), they had totally different price points at launch: the ATI 4770 launched at around $100 and the 4850 at around $200, while both the 5770 & 5850 had much higher launch prices.
July 18, 2010 5:41:00 PM

The 6000 series isn't too clear yet. I'm guessing the performance jump will be similar to the 4000-to-5000 transition, maybe a bit more. If ATI keeps up with the naming scheme, then I'm guessing (for example) a 5770 would be the equivalent of a 6670 or so. It has taken 20-28 months for DX11 to really catch on (and it still hasn't, really) after its announcement. DX12 hasn't been announced yet (has it?), so DX11.1 is most likely. But at the same time, the 5000 series should support DX11.1...

Bulldozer is hard to say. This will be AMD's higher-end chip, so it should perform well, especially for its price. Sandy Bridge is a bit of a gain and a lot of optimization; putting the graphics and memory controller on a single 32nm die makes manufacturing a lot easier too.

Honestly though, I'm not sure 2011 will need 8 cores. Six is already starting to be un-useful.
July 18, 2010 5:46:05 PM

sprunth said:
Six is already starting to be un-useful.


starting?? i can only think of one game that actually needs 4 cores, and that's because of a piss-poor job porting the game from console to PC
July 18, 2010 6:41:06 PM

ct1615 said:
starting?? i can only think of one game that actually needs 4 cores, and that's because of a piss-poor job porting the game from console to PC


And which game is that, exactly?
July 18, 2010 6:42:40 PM

I'd like to be able to play future titles such as Deus Ex 3 and Crysis 2 on very high or maximum settings. I'm not sure if a Core i7 and an HD 5970 are enough. Games are becoming EXTREMELY GPU-demanding; Metro 2033 has already overtaken Crysis in terms of hardware demands.
July 18, 2010 6:51:16 PM

ambam said:
I'd like to be able to play future titles such as Deus Ex 3 and Crysis 2 on very high or maximum settings. I'm not sure if a Core i7 and an HD 5970 are enough. Games are becoming EXTREMELY GPU-demanding; Metro 2033 has already overtaken Crysis in terms of hardware demands.


crysis 2 is supposed to be console-optimized, so it really will be less demanding than the original Crysis, which they didn't sell on consoles because the consoles had problems running it.
July 18, 2010 6:57:03 PM

combatpro said:
crysis 2 is supposed to be console-optimized, so it really will be less demanding than the original Crysis, which they didn't sell on consoles because the consoles had problems running it.


What about Deus Ex 3? Will it be more demanding than Crysis? Can it be maxed out with currently existing hardware?
July 18, 2010 7:04:39 PM

ambam said:
What about Deus Ex 3? Will it be more demanding than Crysis? Can it be maxed out with currently existing hardware?

I don't know much about that game, but you currently have a 5970 and an i7, which are the fastest GPU and CPU on the market; you shouldn't have any problems with any upcoming games.
July 18, 2010 7:22:37 PM

Exactly. We can't know what the future requirements will be; the only thing we can do is predict them from comments or reviews by the makers of the game. If they say it's going to have better graphics, then obviously it's going to be more demanding than their previous game; but if they say it's going to be optimized, then it's going to be less demanding, because the game uses the system components in a better way.
July 18, 2010 7:22:59 PM

ambam said:
I'd like to be able to play future titles such as Deus Ex 3, and Crysis 2 on very high or maximum settings. I'm not sure if a Core i7 and an HD 5970 is enough. Games are becoming EXTREMELY GPU-demanding, Metro 2033 has already overtaken Crysis in terms of hardware demand.


deus ex 3 and crysis 2 will both be made to run on consoles; neither one will be as demanding as the original Crysis

games are not becoming GPU-demanding, most are made for consoles. Crappy ports and eye candy make games demanding, but a simple dual core + 8800GT can still run any game out there on medium settings.

your system of an i7 + ATI 5970 will be more than enough for very high settings on any current or near-future game
July 18, 2010 8:11:45 PM

ambam said:
AMD/ATI is planning on releasing the Radeon HD 6xxx series graphics cards sometime in late 2011. There isn't much information about these cards, but approximately how much faster are these cards said to be than the current HD 5xxx series? Will it be a huge 2x performance increase like the 4xxx to 5xxx series? Or a small 30-35% performance increase?


First off, it depends on what you are thinking of here. The likely successor to the HD6K name is the Southern Islands update, and it's likely to be late 2010, not late 2011; Northern Islands is the major refresh for late 2011. SI is bridging the gap because NI was designed for 32/28nm, and all the delays in the fabs have forced them to make an intermediary refresh that incorporates some NI features in SI but is still made on the 40nm TSMC process.

Since it's not a process/node change and not a full architecture change, the increase in performance will be similar to that of a modest update, more like an HD4870->4890 than something like the HD3K->4K->5K.

Sure, some situations will see an above-200% increase, but those would be the ones that are currently noticeably limited (setup/shader/memory), and not all of those will likely be addressed either. You will also see some major 30-60% boosts in some situations, but they will likely also be rare, although likely those focused on by PR. On average I would expect something in the 20-40% range rather than anything major. The bigger boost would come from a true leap to the Northern Islands architecture, and this intermediary step will likely make that boost appear a little smaller.
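(For anyone who wants to sanity-check what a 20-40% generational boost actually buys, here's a quick back-of-the-envelope in Python. The baseline fps figures are made up purely for illustration, not benchmark results:)

```python
# What a 20-40% generational uplift means in frame rates.
# Baseline FPS numbers below are hypothetical, for illustration only.
baselines = {"HD 5770": 45.0, "HD 5850": 60.0, "HD 5870": 72.0}

for card, fps in baselines.items():
    low, high = fps * 1.20, fps * 1.40
    print(f"{card}: {fps:.0f} fps -> {low:.0f}-{high:.0f} fps after a 20-40% boost")
```

The point being: a 20-40% boost turns a borderline-playable 45 fps into the mid-50s to low-60s, noticeable but nothing like the 2x jumps people are hoping for.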


* WTF is wrong with the forum today, posts getting truncated and deleted and not appearing? Missed the second half about CPUs not being utilized (heck, even with physics they could do a lot more). Friggin' annoying. :fou: 
July 18, 2010 8:49:43 PM

TheGreatGrapeApe said:
First off, it depends on what you are thinking of here. The likely successor to the HD6K name is the Southern Islands update, and it's likely to be late 2010, not late 2011; Northern Islands is the major refresh for late 2011. SI is bridging the gap because NI was designed for 32/28nm, and all the delays in the fabs have forced them to make an intermediary refresh that incorporates some NI features in SI but is still made on the 40nm TSMC process.

Since it's not a process/node change and not a full architecture change, the increase in performance will be similar to that of a modest update, more like an HD4870->4890 than something like the HD3K->4K->5K.

Sure, some situations will see an above-200% increase, but those would be the ones that are currently noticeably limited (setup/shader/memory), and not all of those will likely be addressed either. You will also see some major 30-60% boosts in some situations, but they will likely also be rare, although likely those focused on by PR. On average I would expect something in the 20-40% range rather than anything major. The bigger boost would come from a true leap to the Northern Islands architecture, and this intermediary step will likely make that boost appear a little smaller.


* WTF is wrong with the forum today, posts getting truncated and deleted and not appearing? Missed the second half about CPUs not being utilized (heck, even with physics they could do a lot more). Friggin' annoying. :fou: 



Did you ever hear about the 70,000 blogs and video streaming sites that were scrubbed from the net on copyright and abuse grounds by the US government?
July 18, 2010 9:06:29 PM

I don't think it's that, looking more like all the st00pid ad servers are clogged and F'ed up, likely because they're in Europe and it's Sunday night and they're being cycled in/out.
July 18, 2010 9:13:22 PM

TheGreatGrapeApe said:
I don't think it's that, looking more like all the st00pid ad servers are clogged and F'ed up, likely because they're in Europe and it's Sunday night and they're being cycled in/out.



Who knows, it could just be power outages; there have been a lot of those lately.
July 18, 2010 10:07:00 PM

ct1615 said:
starting?? i can only think of one game that actually needs 4 cores, and that's because of a piss-poor job porting the game from console to PC


Just because gaming doesn't use them doesn't mean that more than 4 cores are useless!
July 18, 2010 10:18:53 PM

victomofreality said:
Just because gaming doesn't use them doesn't mean that more than 4 cores are useless!


Exactly what I was going to comment.

I said 6 was somewhat useless because so few programs utilize that many cores well (or at all).

But I do use my i7 860 well. Nowhere in my comment did I say gaming performance, nor is this the gaming forum. Compared to a single or dual core, 4 cores / 8 threads is great. There are programs (like Blender) that use more cores. The PS3 has 1+6 (plus one processor reserved for the OS).

Just my $0.02
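The "extra cores go unused" point above is basically Amdahl's law: if only part of a workload is parallel, more cores hit diminishing returns fast. A tiny sketch (the 60% parallel fraction is an assumed figure for a typical game engine of the era, not a measured one):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallel fraction of the workload (assumed 0.6 here), n = core count.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 2, 4, 6, 8):
    print(f"{n} cores: {amdahl_speedup(0.6, n):.2f}x")
# With only 60% of the work parallel, going from 4 to 8 cores adds
# well under 0.3x of extra speedup.
```

Which is why Blender (highly parallel, large p) loves extra cores while most games of this period barely notice them.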
July 18, 2010 10:41:43 PM

victomofreality said:
Just because gaming doesn't use them doesn't mean that more than 4 cores are useless!


No one said they were simply useless; the term was 'starting to be un-useful', and then the rest of the comments were focusing on gaming being limited due to improper optimization, especially from console ports.

But thanks for standing up for multi-core CPUs; without that I think they would've returned to single-core / single-thread. :kaola: 
July 18, 2010 10:58:41 PM

they are bound to make a more powerful 5xxx card before the 6xxx series arrives. Way too early to call 6xxx series performance; wait until you see a card. Otherwise it will be like all the speculation and people crapping on about Fermi cards and how awesome they were gonna be, then a big letdown when they came out and they couldn't even knock ATI off the top.
July 18, 2010 11:04:55 PM

Crysis came out almost 3 years ago. That there has only been one game since that matches or surpasses it in terms of graphical requirements shows clearly that games are not becoming more GPU intensive. Perhaps there will be a game or two intended mainly for PC that really embraces tessellation but otherwise I really don't see games getting much more graphically intensive in general until there are new consoles.
July 18, 2010 11:15:15 PM

TheGreatGrapeApe said:
No one said they were simply useless; the term was 'starting to be un-useful', and then the rest of the comments were focusing on gaming being limited due to improper optimization, especially from console ports.

But thanks for standing up for multi-core CPUs; without that I think they would've returned to single-core / single-thread. :kaola: 


seriously, i was about to ditch my Phenom II X4 and stick my Athlon XP back in, until Einstein there told me that games can use more than one core....
July 18, 2010 11:15:50 PM

iam2thecrowe said:
they are bound to make a more powerful 5xxx card before the 6xxx series arrives...


No one said they couldn't; in fact, with the new ASUS ARES they already have.

However, he's talking about an HD6K, and the SI cards coming before the end of the year are likely to be HD6K series cards, whereas any minor speed-boost jump would simply become the 5x90 series instead of a new 6K series card. NI is even further off.

SI should be somewhat similar to the change to the HD3K cards, which weren't a major jump from the HD2K; they were a good refinement, which is what SI looks to be, but with a bit more boost on the high end, since 2K->3K involved some improvements and some drops (transistors, memory bandwidth). Performance from SI is likely to be a bit better than HD2K->HD3K, but not as good as HD3K->4K->5K.
July 18, 2010 11:21:23 PM

wasn't ati supposed to launch the 5890 before going for the 6xxx series? heard some rumors about it
July 18, 2010 11:29:24 PM

jyjjy said:
Crysis came out almost 3 years ago. That there has only been one game since that matches or surpasses it in terms of graphical requirements shows clearly that games are not becoming more GPU intensive.


Well, it depends on your idea of being more GPU intensive.

Stalker and BattleField Bad Company are more GPU intensive and both make use of DX11 and DX10 features, and even AvP and Metro 2033 are more graphically demanding hardware wise, regardless of the debate about the resulting images/gameplay.

Considering the split for consoles is already DX9 v DX10, it's not like this adds anything new for DX9 v DX11+.

New consoles are a long way off.
July 18, 2010 11:35:24 PM

TheGreatGrapeApe said:
Well, it depends on your idea of being more GPU intensive.

Stalker and BattleField Bad Company are more GPU intensive and both make use of DX11 and DX10 features, and even AvP and Metro 2033 are more graphically demanding hardware wise, regardless of the debate about the resulting images/gameplay.

Considering the split for consoles is already DX9 v DX10, it's not like this adds anything new for DX9 v DX11+.

New consoles are a long way off.


STALKER becomes more GPU intensive in DX11; in DX10 I can double my FPS in STALKER CoP compared to Crysis. If Crysis had a DX11 mode it would look like a PowerPoint slideshow about PowerPoint slideshows. Also, I noticed in most benchmarks for Crysis/Warhead the AA is usually turned off or set to 2xAA because of the eye-candy hit. Even with Metro or BFBC2, they set AA on.

All those games are more CPU intensive than Crysis.


July 18, 2010 11:40:01 PM

combatpro said:
wasn't ati suppose to lunch the 5890 before going for the 6xxx series? heard some rumors about it


That was the rumour, supposedly, if Fermi came anywhere near expectations (where the GTX470 was supposed to come close to and compete with the HD5970, and the GTX480 was supposed to surpass it). And tight production due to the 40nm issues means less flexibility to run multiple products through TSMC.
Fermi fell short, demand for existing HD58xx cards still outpaces supply at MSRP (thus prices are still jacked), and AMD/ATi still has the halo card in the HD5970, so why bother with a new refresh card if the current ones are selling well? I think if there weren't 40nm production issues you might have seen one, likely pre-emptively just before the GTX480 launch (similar to the X1900 & GF8800 Ultra launches), or to clear the cautious 'just in case' stockpile just after the GTX480 launch (like the GTX285/295/4890 launches).
July 18, 2010 11:48:58 PM

TheGreatGrapeApe said:
Stalker and BattleField Bad Company are more GPU intensive

More GPU intensive than Crysis, or than earlier games in the series?
Doesn't DX11 actually increase CoP's performance compared to DX10?
July 18, 2010 11:58:12 PM

ct1615 said:
STALKER becomes more GPU intensive in DX11; in DX10 I can double my FPS in STALKER CoP compared to Crysis. If Crysis had a DX11 mode it would look like a PowerPoint slideshow about PowerPoint slideshows.


But we're talking about the workload as it exists, not what Crysis would be with DX11. I'm sure it would be very system-hoggish if DX11 were there, and it might be beautiful and worth it, but it's not there. Just like Oblivion with texture mods, we can make things tougher even without DX11, but I wouldn't be surprised if both titles would benefit, both visually and performance-wise, from DX11 features like tessellation and better handling of things like vegetation and lighting.

Quote:
Also, I noticed in most benchmarks for Crysis/Warhead the AA is usually turned off or set to 2xAA because of the eye-candy hit. Even with Metro or BFBC2, they set AA on.


The ones I've looked at had AA on. Sometimes some people disable some of the features, or in Metro use AAA, which is worse than 4X MSAA.

Quote:
All those games are more CPU intensive than Crysis.


Perhaps, but I'm looking at the max-resolution and AA tests, where CPU influence is reduced. And a game like AvP or Stalker goes from a higher minimum fps at lower resolution to a lower one at higher resolutions and AA levels, which would indicate a restriction on Crysis that isn't due to graphics alone.

Although of course other things play a factor, like VRAM, etc. You can make GTA4 crawl without 2GB of VRAM if you set the view distance to an extreme, but I wouldn't say it's tougher in general beyond that one issue.

To me it's about always pushing forward. I don't care if they launch DX12, or DX20 next year as long as they keep moving forward and give people the option to build and code for it.

Better that than being told X or Y feature would greatly improve things, but we've decided not to give it to you yet because we're delaying the DX11.1 / 12 launch until at least 33% adoption. Worrying about the laggards and those who complain about change makes no sense; they would complain about DX10 if it were just being launched now.
July 19, 2010 1:37:55 AM

So far as I am concerned there won't be very large boosts over the current generation till much later. I am still using a single 3870 and it does the job, just not quickly enough. At least the prices for the 5xx0 series will go down, making a nice opportunity for budget gamers. Hell, I wish I had bought a second VF1000 so I could use my 8800GTX.
July 19, 2010 9:48:29 PM

Funny to see Fudo essentially agree with what I said above, although of course more pessimistically.

http://www.fudzilla.com/graphics/graphics/radeon-hd-600...

Surprised at the idea of the September timeline (to fit in Q3); I was hearing end of October / beginning of November at the earliest, but just in time for Xmas shoppers.
July 19, 2010 10:30:21 PM

ambam said:
AMD/ATI is planning on releasing the Radeon HD 6xxx series graphics cards sometime in late 2011. There isn't much information about these cards, but approximately how much faster are these cards said to be than the current HD 5xxx series? Will it be a huge 2x performance increase like the 4xxx to 5xxx series? Or a small 30-35% performance increase?

I also heard that these cards will use DX11.1.

AMD is also releasing their 8-core "bulldozer" chips circa 2011. I'd like to see how they perform compared to Intel's 8-core Sandy Bridge.

Yep, AMD Fusion is also coming out. The 6xxx series will probably have 2 GPUs because of the 5970 having 2 GPUs.
July 19, 2010 11:20:01 PM

ipenguins said:
Yep, AMD Fusion is also coming out. The 6xxx series will probably have 2 GPUs because of the 5970 having 2 GPUs.


I'm sure some of the higher 6000s, like the 6900s, will have 2 GPUs, but I'm pretty sure the lower-end cards will not. For example, the 5500 series targets the OEM/basic-video audience; those cards need to be cheap, and 2 GPUs would be a waste. AMD will just bring out a new architecture and let their board partners come out with dual-GPU cards. The 5970 isn't the first dual-GPU card either... people could have said the same thing about the 4870X2.
July 20, 2010 6:15:10 AM

Out of curiosity, do we have any idea how the cards will be named? Considering the NI coming in Q3/4 2010 will still be 40nm, albeit refreshed, I wonder if they'll earn the 6xxx naming, or if they'll be 5xxx with a numbering increment.
July 20, 2010 4:37:22 PM

boosterfire said:
Out of curiosity, do we have any idea how the cards will be named? Considering the NI coming in Q3/4 2010 will still be 40nm, albeit refreshed, I wonder if they'll earn the 6xxx naming, or if they'll be 5xxx with a numbering increment.


I believe they should go with the 6000 naming scheme. Without something like 'GTS/GTX' in front, the naming may get confusing.

For example, say the '6770' card used a 5000-series name. That would mean it should be something like 5780 or 5760. It doesn't seem right, since 1) the card is using a different architecture, so it wouldn't CrossFire with the old cards, and 2) the newer lower-end cards (say, a 'new' 6550) could be more powerful than a 5570.

I would still have AMD not go with a 'GTS/GTX' naming scheme, and just move on up the names.

An interesting idea, when we get to the 10000+ series ( :D  ), is to separate the name with a dash. So it would be something like 12-570 or 12-850 (instead of 12570 and 12850).

Like the early ATI consumer gfx cards (the X300...X1600 series) and the current FirePro series (S400, V8800), ATI does still kinda use a lettered naming scheme, but each architecture keeps a single letter that doesn't change until the next generation.
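The 4-digit scheme being debated above decodes mechanically. A hypothetical sketch (the field meanings are my reading of ATI's convention as described in this thread, nothing official):

```python
# Decode an ATI "HD XYZZ" model number: first digit = generation,
# second digit = market segment, remaining digits = variant within the segment.
# Field names here are made up for illustration.
def decode_model(model: str) -> dict:
    digits = model.strip()
    return {
        "generation": int(digits[0]),
        "segment": int(digits[1]),
        "variant": int(digits[2:]),
    }

print(decode_model("5770"))  # {'generation': 5, 'segment': 7, 'variant': 70}
```

Which also shows why the scheme runs out of road at the 10000 series: a fifth digit makes the generation prefix ambiguous, hence the dash idea above.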
July 20, 2010 4:58:12 PM

sprunth said:
I believe they should go with the 6000 naming scheme. Without something like 'GTS/GTX' in front, the naming may get confusing.

For example, say the '6770' card used a 5000-series name. That would mean it should be something like 5780 or 5760. It doesn't seem right, since 1) the card is using a different architecture, so it wouldn't CrossFire with the old cards, and 2) the newer lower-end cards (say, a 'new' 6550) could be more powerful than a 5570.

I would still have AMD not go with a 'GTS/GTX' naming scheme, and just move on up the names.

An interesting idea, when we get to the 10000+ series ( :D  ), is to separate the name with a dash. So it would be something like 12-570 or 12-850 (instead of 12570 and 12850).

Like the early ATI consumer gfx cards (the X300...X1600 series) and the current FirePro series (S400, V8800), ATI does still kinda use a lettered naming scheme, but each architecture keeps a single letter that doesn't change until the next generation.

they're saving the 1000's for the future ;D
July 20, 2010 5:25:34 PM

i would prefer 6xxx for SI and 7xxx for NI, just to make the card naming less confusing
July 21, 2010 1:42:59 PM

It's gonna be a premature launch this year, everyone! I think it's because of nVidia's 460 Fermi breathing down AMD/ATI's neck... lol
July 21, 2010 2:02:07 PM

damasvara said:
It's gonna be a premature launch this year, everyone! I think it's because of nVidia's 460 Fermi breathing down AMD/ATI's neck... lol


Well, yes and no. In a way, it's a very mature launch, because it's on an actually refined 40nm process, so whatever they turn out to be, I guess those cards will be very efficient and fast. I just wouldn't expect them to have new technology; that'll probably be for the cards of 2011.

Keep in mind that AMD/ATI hasn't launched new cards in a long while now, and that they're not at the same stage of development as NV.
July 21, 2010 2:12:26 PM

damasvara said:
It's gonna be a premature launch this year, everyone! I think it's because of nVidia's 460 Fermi breathing down AMD/ATI's neck... lol


i would imagine the reason why they are coming out with new cards is that the 5xxx came out a year ago, so it's about right for a new series
July 21, 2010 2:19:35 PM

damasvara said:
It's gonna be a premature launch this year, everyone! I think it's because of nVidia's 460 Fermi breathing down AMD/ATI's neck... lol


Don't think so. I can see your logic, but the time frame was always looking at being where it is now. It's meant to be a yearly cycle, so November for a release is right.
The only undecided thing, as far as I could see, was just what was going to come out.
NI has been scuppered by production issues, but it was never a given that they would actually release NI next anyway.
People have been wondering... well, just about the day after the 5 series came out, people were posting "what's coming next."
We could have been given a clocked-and-tweaked 5 series refresh and it wouldn't have been a real surprise.
The 460 has sure stirred things up some, and I'm sure the driver teams are looking to get better performance out of the new chips even as we speak.
I'm not claiming to know if it was always planned this way or not, but they have done this type of mixing aspects of a new chip with older chips before. I guess it gives them a look at the parts in action and gives them time to polish getting the most out of them.

Mactronix
July 21, 2010 2:32:02 PM

From news on the main page:

Quote:
"We will start introducing the second-generation of our DX 11 products before the end of the year." - Dirk Meyer, AMD CEO

ATI's second generation DX 11 products, code named Southern Islands, will be fabricated on the 40nm process as well and be branded as the Radeon HD 6000 series.


Which pretty much settles the naming issue, and confirms that these will be 40nm products. I'm curious, though... is it really a new generation, or something more like generation 1.5? I'm hesitant to have high expectations for the 6000 series, especially about the introduction of newer technologies. :heink: 
July 21, 2010 2:34:48 PM

The most room for improvement is at the 5770/5830 price points. IMO, you're not going to see those parts this year. But when you do, I guess the drivers will be fully mature (wink wink) :) 
July 21, 2010 2:36:53 PM

Southern Islands will basically be a new uncore wrapped around the existing engine, similar to what we got with the 2 series to 3 series progression. I believe TGGA mentioned this earlier in the thread.
Thought so.
TGGA wrote,
"SI should be somewhat similar to the change to the HD3K cards, which weren't a major jump from the HD2K; they were a good refinement, which is what SI looks to be, but with a bit more boost on the high end, since 2K->3K involved some improvements and some drops (transistors, memory bandwidth). Performance from SI is likely to be a bit better than HD2K->HD3K, but not as good as HD3K->4K->5K."

Mactronix
July 21, 2010 3:37:13 PM

I'm saying this because the launch date was previously set for early next year. The reason AMD changed their mind is probably the 460 Fermi hype. Stop a hype with a bigger hype; this way AMD will draw attention off the 460. But that's just a stupid me who's talking... lol

Whatever the reason is, the 6xxx series launch can only mean one thing: price cuts for the 5xxx series!!! Hooray!!!
July 21, 2010 5:03:08 PM

Can't see AMD dropping any prices significantly any time soon, even after the launch of a new series. They are selling all they can make and so have no reason to drop prices. August is said to see the launch of what will probably be the 450, which will take care of the 5770 price point.
I'm in the UK, and to be honest, prices here make the 460 a very good purchase indeed. It's only a little more than a 5770 in its 768MB guise, and the 1GB card is more or less the same price as the 5830. Either will kill a 5770, and depending on the game, the 5850 can be beaten by the 1GB version.

Mactronix
July 21, 2010 11:04:05 PM

damasvara said:
I'm saying this because the launch date was previously set for early next year. The reason AMD changed their mind is probably the 460 Fermi hype. Stop a hype with a bigger hype; this way AMD will draw attention off the 460.


Dude, you're WAY off, and confusing companies.
NI was scheduled for this year Q4'10, and Evergreen was schedule for beginning of Q3 (preferably July), but they got delayed with 40nm on Evergreen and then 32/28nm was so delayed that they knew they would have to fill the gap before they could get 28nm production going at either GloFo or TSMC. So they created SI to fill that gap, and its announcement was long long before the GTX465's launch let alone GTX460, might even have been before the GTX480 actual hardware launch (not the 3 paper launches nV did prior to that for the 'fight hype with hype' strategy you speak of). These chips take a long time to bring to market, and it's not like they just decided to put SI together when the GTX460 lanched.

Quote:
But that's just a stupid me who's talking...lol


Yeah that's what I was thinking. :kaola: 
BTW, AMD has no pressure; as Mac mentioned, they are still selling more than they can make, so it's not like they have to rush out a replacement like they did after the HD2900, or like nV did once the G200 was EOLed.


Quote:
Whatever the reason is, the 6xxx series launch can only mean one thing: price cuts for the 5xxx series!!! Hooray!!!


Perhaps, or maybe just the removal of a player in a category. If you replace the HD5970 with an HD6870, do you really affect the other prices if there aren't derivatives for those segments to get pricing pressure from?

Don't count on anything, especially since 40nm production at TSMC is still below target, so it's not like you get a lot of leverage from the HD6000 series to push down prices.

I'll just be happy if they can meet the August-September target they were mentioning in their conference call. I suspect Oct-Nov is far more realistic.
July 21, 2010 11:11:37 PM

True. For a lesson from recent history the HD5000/GTX 400 series really didn't do much to lower the prices of the HD4000 or GTX 200 cards. The opposite in some cases.
July 21, 2010 11:39:43 PM

jyjjy said:
True. For a lesson from recent history the HD5000/GTX 400 series really didn't do much to lower the prices of the HD4000 or GTX 200 cards. The opposite in some cases.


To be honest, I think it was because instead of gently flowing into the 5xxx series, AMD launched the WHOLE series in a VERY short amount of time. This made the 4xxx series less of a lower end and more of something to be discontinued, so they didn't bother with the price; most of the 5xxx series covered every possible spot on the use spectrum and generally beat the 4xxx card in its budget sector, so there wasn't a need to lower costs. With Nvidia, however, I don't know if it was the production costs or them being dumb or stupid, but they didn't even touch their prices. They must have been dead in the market for a few months there, considering you could buy a GTX 285 for more than a 5850 for a while, and still can! Nvidia has also taken this release very slowly: in a matter of weeks ATI got 10 cards out, while Nvidia has had a few weeks and has 3 real cards out; the GTX 465 just got caught up in friendly fire. Therefore, and I don't mean this in a fanboy sort of way, if it falls to ATI to lower their past lineup's prices, they probably will, depending on the speed of the 6xxx release; however, I doubt Nvidia will by much. Just my 2 cents.
July 22, 2010 12:19:54 AM

ares1214 said:
To be honest, I think it was because instead of gently flowing into the 5xxx series, AMD launched the WHOLE series in a VERY short amount of time. This made the 4xxx series less of a lower end and more of something to be discontinued,


But that's something ATi has done since the R9500 series, with very quick top-to-bottom refreshes. Even nV is somewhat similar (except for the G200 series), although usually on a longer timeline.

Quote:
Therefore, i dont mean this in a fanboy sort of way, but if it falls to ATI to lower their past lineups prices, they probably will depending on the speed of 6xxx release, however i doubt Nvidia will by much. Just my 2 cents.


I think nV is more cost-prohibited from lowering prices; as for ATi, I think it depends on whether they planned an SI mid-range. Without that mid-range you likely won't see much movement, because, as you mentioned, the HD6000 series likely won't affect GTX4xx pricing as much, since 'loyal customers' are willing to overpay and keep prices high.

The biggest issue IMO will be what replaces the segment the GTX460 is currently competing in. If AMD can bring a new part to that segment, it will likely drag and push prices all over the place. But if they launch the HD6800 series only, then you may see only a mild change at the top (5970 and 5870, maybe), if that.

July 22, 2010 12:29:00 AM

i don't care about how many cores; honestly i think 2 is plenty, make that 4 if you count HT. what i want is a 5GHz CPU :) 