
Fermi could run an OS

Anonymous
October 13, 2009 3:27:20 PM
October 13, 2009 4:05:54 PM

Highly optimized.
And pft! even my 7300 can run an OS.... ON A CPU!!! >=D
October 13, 2009 4:15:04 PM

nvidia is making a mistake, we gamers are their biggest customers.
October 13, 2009 4:20:18 PM

So... according to this guy... it can run an OS... BUT... it has to be highly optimized... BUT... it can't REALLY run an OS. You'll still need a CPU.

That logic makes perfect sense!
Anonymous
October 13, 2009 4:26:06 PM

Quote:
I have a 3-year-old cell phone that runs an OS and it's not 12" long. LOL

Still talking about something coming in the future, huh? How about those awesome GT210/220's they just launched, THEIR 1ST 40NM CARDS LOL.

John, you've got your head so far up nvidia's ass, I bet you have a doll that you named Fermi and you sleep with it at night, don't you?

They aren't even developing chipsets anymore. Nvidia is screwed, they are soooo far behind. Bye bye!


never say never...remember that
October 13, 2009 4:29:51 PM

Quote:
never say never...remember that

He never said never.
October 13, 2009 4:40:53 PM

rescawen said:
nvidia is making a mistake, we gamers are their biggest customers.


Not for long.
October 13, 2009 4:44:03 PM

smithereen said:
Not for long.

There's no reason to buy from them anymore. The only people who would are the fanbois or the people who think that because the GTX295 costs $600 it has to be better than the $380 5870.
Anonymous
October 13, 2009 4:47:54 PM

Nvidia will never let down hardcore gamers... they just want to produce a cGPU that you can use for something other than just games... Larrabee is knocking at the door, so Nvidia needs to prepare for that, and Fermi is the answer.
October 13, 2009 5:48:26 PM

Nuff said.
Just go on nvidia's website.
"FailMI" can be found under products/HIGH PERFORMACNE COMPUTING.

nVidia itself doesnt consider this a gaming GPU. Your waiting over crap man.
And with all that massive technology, x8 faster then any cpu, ECC control, gigafail threading, the cost of this card will be somewhere around the 1000~$. They did charge us 800 for a mere 8800ULTRA right?
It's over. Nvidia was too optimistic.
They failed. "FailMI" says it all.

Ati has the entire market for them. and nvidia is going up against the leader of computing technology from grounds up. The whole fermi thingy was a pure optimism. In fact, the cake was a lie.
October 14, 2009 1:42:18 PM

Quote:
Nuff said.
Just go on nvidia's website.
"FailMI" can be found under products/HIGH PERFORMACNE COMPUTING.

nVidia itself doesnt consider this a gaming GPU. Your waiting over crap man.
And with all that massive technology, x8 faster then any cpu, ECC control, gigafail threading, the cost of this card will be somewhere around the 1000~$. They did charge us 800 for a mere 8800ULTRA right?
It's over. Nvidia was too optimistic.
They failed. "FailMI" says it all.

Ati has the entire market for them. and nvidia is going up against the leader of computing technology from grounds up. The whole fermi thingy was a pure optimism. In fact, the cake was a lie.


The Fermi has everything needed for extreme gaming that the 5800's have, and THEN some... don't spew b/s if you have no idea what you're talking about.

And it is unlikely it will be $1000, or even $800 for that matter; prob $500 to $600.
October 14, 2009 2:08:04 PM

What corporation do you know that will rush headlong into GPU compute? They like to stick to what they know (fact). I don't expect them to sell a great deal.
October 14, 2009 2:37:09 PM

rangers said:
What corporation do you know that will rush headlong into GPU compute? They like to stick to what they know (fact). I don't expect them to sell a great deal.


Do you remember when the GTX200's were overpriced as hell, and then AMD released the 4800's with performance way above what nvidia thought they'd be, at a better cost ratio?

Then nvidia caved on their 200-series pricing to stay competitive, and is still trying to recover.

They won't make that mistake twice.
October 14, 2009 2:43:12 PM

RealityRush said:
Quote:
Nuff said.
Just go on nvidia's website.
"FailMI" can be found under products/HIGH PERFORMACNE COMPUTING.

nVidia itself doesnt consider this a gaming GPU. Your waiting over crap man.
And with all that massive technology, x8 faster then any cpu, ECC control, gigafail threading, the cost of this card will be somewhere around the 1000~$. They did charge us 800 for a mere 8800ULTRA right?
It's over. Nvidia was too optimistic.
They failed. "FailMI" says it all.

Ati has the entire market for them. and nvidia is going up against the leader of computing technology from grounds up. The whole fermi thingy was a pure optimism. In fact, the cake was a lie.


The Fermi has everything needed for extreme gaming that the 5800's have, and THEN some... don't spew b/s if you have no idea what you're talking about.

And it is unlikely it will be $1000, or even $800 for that matter; prob $500 to $600.



Yes obviously.
They sold a mere 8800 Ultra at $800. What was so special about that chip?
Features

* Full support for both Microsoft® DirectX 10 and DirectX 9 for unparalleled levels of graphics realism and performance
* NVIDIA PureVideo™ HD2 technology provides unsurpassed Blu-ray and HD DVD movie picture quality
* NVIDIA 8 Series graphics processors are essential for accelerating the best Microsoft® Windows Vista™ experience

Wow, was that even special? It was just the best chip back then.
Just like the GTX 295: a double GPU, so they sold it for nearly 700 CAD.
Now what do we have here?
• 512 CUDA cores
• NVIDIA Parallel DataCache technology
• NVIDIA GigaThread™ engine
• ECC support

Do you even know what ECC support is? 512 freakin CUDA cores?
That's twice the previous amount.
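For anyone wondering what ECC support actually buys you: it lets the hardware detect and fix single-bit memory errors on the fly. Here's a toy sketch of the idea using a Hamming(7,4) code in plain C; the real SECDED ECC the Fermi whitepaper describes (covering DRAM, caches and register files) is far more involved, and this is only an illustration.

Code:
#include <stdio.h>

/* Encode a 4-bit value into a 7-bit Hamming(7,4) codeword. */
static unsigned encode(unsigned d) {
    unsigned d0 = d & 1, d1 = (d >> 1) & 1, d2 = (d >> 2) & 1, d3 = (d >> 3) & 1;
    unsigned p1 = d0 ^ d1 ^ d3;   /* parity over codeword positions 1,3,5,7 */
    unsigned p2 = d0 ^ d2 ^ d3;   /* parity over codeword positions 2,3,6,7 */
    unsigned p4 = d1 ^ d2 ^ d3;   /* parity over codeword positions 4,5,6,7 */
    /* codeword positions 1..7 are: p1 p2 d0 p4 d1 d2 d3 */
    return p1 | (p2 << 1) | (d0 << 2) | (p4 << 3) | (d1 << 4) | (d2 << 5) | (d3 << 6);
}

/* Locate and flip back a single corrupted bit; returns the corrected codeword. */
static unsigned correct(unsigned c) {
    unsigned b[8];
    for (int i = 1; i <= 7; ++i) b[i] = (c >> (i - 1)) & 1;
    unsigned s1 = b[1] ^ b[3] ^ b[5] ^ b[7];
    unsigned s2 = b[2] ^ b[3] ^ b[6] ^ b[7];
    unsigned s4 = b[4] ^ b[5] ^ b[6] ^ b[7];
    unsigned pos = s1 + 2 * s2 + 4 * s4;   /* 0 means no error detected */
    if (pos) c ^= 1u << (pos - 1);
    return c;
}

int main(void) {
    unsigned stored  = encode(0xB);          /* store the nibble 1011 */
    unsigned flipped = stored ^ (1u << 4);   /* a stray bit flip on readback */
    printf("stored %02x, read %02x, corrected %02x\n", stored, flipped, correct(flipped));
    return 0;
}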
And the last card was launched at $600.
This card will be sold for MORE than $800, you can be sure of that.
What company would waste so much money making such a big card only to sell it at $600 because some people on the internet are begging to buy it, because for some reason they don't want to get an ATI that is STRONGER and costs WAY LESS?

You're waiting over crap. PURE crap.
This card isn't made for gaming.
It HAS the capabilities for it, but they added WAY too much useless crap to it, making its production cost too high to sell to mere gamers.
The best you can hope for is that they produce a lower-CUDA-count version to lower the price; maybe you will be able to buy that one: at $600. For 10 fps more than an HD5870 at $400.
Have fun with FailMI
October 14, 2009 2:54:45 PM

bboynatural said:
Yes obviously.
They sold a mere 8800 Ultra at $800. What was so special about that chip?
Features

* Full support for both Microsoft® DirectX 10 and DirectX 9 for unparalleled levels of graphics realism and performance
* NVIDIA PureVideo™ HD2 technology provides unsurpassed Blu-ray and HD DVD movie picture quality
* NVIDIA 8 Series graphics processors are essential for accelerating the best Microsoft® Windows Vista™ experience

Wow, was that even special? It was just the best chip back then.
Just like the GTX 295: a double GPU, so they sold it for nearly 700 CAD.
Now what do we have here?
• 512 CUDA cores
• NVIDIA Parallel DataCache technology
• NVIDIA GigaThread™ engine
• ECC support

Do you even know what ECC support is? 512 freakin CUDA cores?
That's twice the previous amount.
And the last card was launched at $600.
This card will be sold for MORE than $800, you can be sure of that.
What company would waste so much money making such a big card only to sell it at $600 because some people on the internet are begging to buy it, because for some reason they don't want to get an ATI that is STRONGER and costs WAY LESS?

You're waiting over crap. PURE crap.
This card isn't made for gaming.
It HAS the capabilities for it, but they added WAY too much useless crap to it, making its production cost too high to sell to mere gamers.
The best you can hope for is that they produce a lower-CUDA-count version to lower the price; maybe you will be able to buy that one: at $600. For 10 fps more than an HD5870 at $400.
Have fun with FailMI


As I just explained above, they got massacred price-wise because of the 4800's vs. the 200's; unless they are completely retarded, they won't do that twice. The president of nVidia has said on a couple of different occasions that consumers will be "pleasantly surprised" by the 300-series pricing. It's nVidia, so I take that with a LARGE spoonful of salt, but it at least means it won't be $800 like everyone thinks it will be for a single-GPU card.

As for what the card offers... again, it offers everything the 5800's do for graphics and gaming, all of it. You're also getting some neat computing tools that help gaming in subtle ways you won't notice, but you're right, the majority of them don't impact you (some of them will though, so don't just ignore the CUDA cores :p ). You're also right that the 300's will be more money, so what are you paying for? Well, if the whitepaper proves to be more than just that, paper, and the performance is as intended, you're also paying for another 20% increase in speed.

Everyone needs to realize something: AMD is the performance/price ratio king, absolutely, they always will be, and nVidia will always be more expensive. But the thing is, nVidia is more expensive (and owns more of the market) for a good reason: they make better, higher-performing cards.

Whether they release this year or next, the new card will be better than anything AMD has on the market right now. At which point AMD will release a comparable card a few months later for less money, and then nVidia will... etc. Just enjoy the benefits competition provides to us gamers.
October 14, 2009 6:01:59 PM

They will do so in all likelihood.

The high-end will likely come out first, maybe in small quantities at the end of this year. That way nVidia can show up ATI (or fail epically at doing so) and regain its crown (or lose half its marketshare). Then it can take a couple of months and release the mainstream cards next year.

This is what I hope for at least.
October 14, 2009 6:16:25 PM

All their innovation went into CPU and server functionality. They have already said they are shipping mostly Teslas first. I don't think we will see new nvidia gaming GPUs available by the end of the year.

Oh wait, I'm wrong. They did fire back at ATI with the release of the 210 and 220. :p 
October 14, 2009 6:31:01 PM

Hmmmm :sleep:  I'll be sticking with my G80 and G92 cards for now; as for ATI, it's a vintage R580, but hopefully soon there will be an R800 somewhere in that collection.
October 14, 2009 9:06:40 PM

RealityRush said:
As I just explained above, they got massacred price-wise because of the 4800's vs. the 200's; unless they are completely retarded, they won't do that twice. The president of nVidia has said on a couple of different occasions that consumers will be "pleasantly surprised" by the 300-series pricing. It's nVidia, so I take that with a LARGE spoonful of salt, but it at least means it won't be $800 like everyone thinks it will be for a single-GPU card.

As for what the card offers... again, it offers everything the 5800's do for graphics and gaming, all of it. You're also getting some neat computing tools that help gaming in subtle ways you won't notice, but you're right, the majority of them don't impact you (some of them will though, so don't just ignore the CUDA cores :p ). You're also right that the 300's will be more money, so what are you paying for? Well, if the whitepaper proves to be more than just that, paper, and the performance is as intended, you're also paying for another 20% increase in speed.

Everyone needs to realize something: AMD is the performance/price ratio king, absolutely, they always will be, and nVidia will always be more expensive. But the thing is, nVidia is more expensive (and owns more of the market) for a good reason: they make better, higher-performing cards.

Whether they release this year or next, the new card will be better than anything AMD has on the market right now. At which point AMD will release a comparable card a few months later for less money, and then nVidia will... etc. Just enjoy the benefits competition provides to us gamers.



Yeah I like your explanation.
But it's just not that simple. The pricing that we will be happy about won't be for the full Fermi GTX300 that they are talking about, for one good reason:
nVidia cannot FORCE a price. The technology they are using in this GPU is AMAZING. BUT it costs A LOT just to produce these cards; they cannot set a price lower than the production cost, and they cannot sell at the production cost either, since they would not earn a single penny. It's not their fault, they just make cards that are too good. But seriously, I never paid $100 per 10 fps of performance; that is why I never paid for nVidia cards... Maybe someday when I'm drunk...

The only explanation for a possible "pleasant" price would be a card without ECC support and with fewer CUDA cores. Because ANYWAY, we gamers do NOT need that and won't be paying $200 for a subtle 5 fps increase. I think nVidia should concentrate on SLI. Their SLI is the best dual-GPU technology; it scales perfectly. In fact, you can play Crysis at over 60 FPS at 1600x1200 (the current "gamer" resolution) with 2 GTX 260s (not even the 216 edition; with the 216 it will be better). That was shown in a biased benchmark of the i7 against the Phenom II 955 here at Tom's (they used a 790X chipset and CF 4870s against an X58 with SLI 260s; more biased you don't get).
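To put a rough number on "it scales": divide the dual-GPU frame rate by twice the single-GPU frame rate. A tiny host-side sketch; the frame rates below are made-up placeholders, not benchmark results.

Code:
#include <stdio.h>

int main(void) {
    double fps_one_gpu  = 35.0;  /* hypothetical single-card result   */
    double fps_two_gpus = 65.0;  /* hypothetical SLI/CrossFire result */
    /* 1.0 means perfect scaling: the second GPU adds a full card's worth of frames */
    double scaling = fps_two_gpus / (2.0 * fps_one_gpu);
    printf("scaling efficiency: %.0f%%\n", scaling * 100.0);  /* about 93% here */
    return 0;
}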

Well, I can't wait to see how nvidia will climb out of this pit... But as I said, you can't hope to get THE Fermi at less than $800 no matter WHAT marketing technique they use. It's just not possible. I'm sure they are planning a full gaming "Fermi" instead of the high-performance-computing one.
October 14, 2009 10:24:50 PM

I think you're wrong ^^ CrossFire scales better than SLI.
October 14, 2009 10:27:07 PM

put an end to this and say fermi should be moved to the CPU section
October 14, 2009 10:47:44 PM

If nVidia wants to lose in the gamer market, that's their problem.
October 14, 2009 10:56:02 PM

rangers said:
I think you're wrong ^^ CrossFire scales better than SLI.


NOW, yeah, no sh!t!! The HD5770 scales to 100% in some games. INCREDIBLE!!!

But back then, with the 4800 series vs. the GT200, I think nVidia had the upper hand... Well, that's what the benchmarks were showing...
I dunno...
October 15, 2009 12:35:25 AM

John gets the biggest troll/fanboy in the Universe Award. Every post mentions Nvidia multiple times with only good things. Are you JHH's son?
October 15, 2009 1:51:01 AM

Nvidia is doing what it has to in order to defend itself from LBR. Intel is expanding and coming into Nvidia/ATI territory quite aggressively. If any company wants to survive, it needs to expand. It's as simple as that. Both companies are now being challenged by the giant Intel. Intel has much more money/resources than both companies put together.
October 15, 2009 2:03:33 AM

A ***ing CPU is $300; don't even imagine the GPU.
AMD will beat them in this domain anytime. The only reason they don't in CPUs is that the price of a CPU is somewhat low ANYWAY, so some people don't care about the performance/price ratio.
In a GPU? Intel is bound to lose.
October 15, 2009 2:17:56 AM

"AMD will beat them in this domain anytime."
"Intel is bound to lose."

Any source to back these claims?

October 15, 2009 2:22:27 AM

No source.
I'm just talking logic: Intel charges absurd prices for their... performance... if you can call it that.
In a price-competitive GPU world, where there are already 2 big companies fighting for the market, Intel doesn't stand a chance. They will be like a second nvidia.
And since AMD beats Intel on price, so will ATI.
Pure logic. Of course, pure speculation too.
October 15, 2009 2:32:41 AM

bboynatural said:
No source.
I'm just talking logic: Intel charges absurd prices for their... performance... if you can call it that.
In a price-competitive GPU world, where there are already 2 big companies fighting for the market, Intel doesn't stand a chance. They will be like a second nvidia.
And since AMD beats Intel on price, so will ATI.
Pure logic. Of course, pure speculation too.


Intel has pretty much clobbered AMD in CPUs ever since the i7s came out last year. I read an article a few weeks ago saying AMD has been losing money on CPUs for years.

Intel will need to buy nvidia if they want to match up to AMD in GPUs... Intel's Larrabee-whatchamacallit has no chance.
October 15, 2009 2:40:26 AM

If LBR does somehow offer great performance in games and somehow manages to compete with Nvidia/ATI's low/mid segments, that's all they need to do, really. Nvidia/ATI make most of their profit from low/mid-range products. As much as I don't want this to happen, in my opinion I think LBR will at least scratch their low-end discrete cards.
October 15, 2009 3:00:03 AM

bboynatural

Way to be biased; Intel has clearly been winning in CPUs for years against AMD, and AMD bleeds money like a hemophiliac.

Hell, Nvidia is worth the same as AMD+ATI combined, and frankly I think ATI carries AMD even though AMD bought them.

Intel currently wins in the ~$180+ range, or about $150 and up. And Intel CPUs still beat AMD below that when you OC them. It's great that AMD finally shored up the $120-and-below range at stock speeds, but frankly Intel is much more of a brand name and is more likely to be bought by someone; hell, they advertise. When was the last time you saw an AMD commercial? The most AMD does is slap a sticker on a laptop or computer.

Everyone by now should know the first Fermis and such are high-end business and server cards, basically Tesla replacements.

Nvidia looks like it wants to make a move into those sectors, pretty much only in competition with CPUs; Nvidia wants to fill a niche and believes they can make a good sum of money in the supercomputing niche. Since it can run assembly and C++ natively, if it's given great support (which I'm guessing Nvidia will provide) it could do quite well.
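For anyone who hasn't seen what "running C++ on the GPU" means in practice, here is a minimal CUDA sketch: a SAXPY kernel launched across about a million threads. The names and sizes are arbitrary illustration, not anything from Nvidia's Fermi material, and real HPC code would add error checking.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* Each thread handles one element of y = a*x + y. */
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main(void) {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *hx = (float*)malloc(bytes), *hy = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc((void**)&dx, bytes);
    cudaMalloc((void**)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);   /* thousands of threads in flight */
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 5.0)\n", hy[0]);
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}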

Nvidia is pushing the high-end gaming market onto the back burner, which I think is a bad thing; Nvidia's "The Way It's Meant to Be Played" program is very important to PC gaming, and that's a lot of money leaving PC game development when the PC gaming industry is pretty much in a steep decline in terms of game quality and number per year. Is it a good thing? Well, as long as Nvidia focuses on the gaming market at the $150 price range it won't hurt them much at all; IMO that is where the most cards are bought, and the $70-150 range is the most important. So whatever Nvidia decides to do, it will probably do fine; hell, it has been turning a profit for years, so unless they made some really bad gambles it's unlikely that will change.
October 15, 2009 4:14:07 AM

IzzyCraft said:
bboynatural

Way to be biased; Intel has clearly been winning in CPUs for years against AMD, and AMD bleeds money like a hemophiliac.

Hell, Nvidia is worth the same as AMD+ATI combined, and frankly I think ATI carries AMD even though AMD bought them.

Intel currently wins in the ~$180+ range, or about $150 and up. And Intel CPUs still beat AMD below that when you OC them. It's great that AMD finally shored up the $120-and-below range at stock speeds, but frankly Intel is much more of a brand name and is more likely to be bought by someone; hell, they advertise. When was the last time you saw an AMD commercial? The most AMD does is slap a sticker on a laptop or computer.

Everyone by now should know the first Fermis and such are high-end business and server cards, basically Tesla replacements.

Nvidia looks like it wants to make a move into those sectors, pretty much only in competition with CPUs; Nvidia wants to fill a niche and believes they can make a good sum of money in the supercomputing niche. Since it can run assembly and C++ natively, if it's given great support (which I'm guessing Nvidia will provide) it could do quite well.

Nvidia is pushing the high-end gaming market onto the back burner, which I think is a bad thing; Nvidia's "The Way It's Meant to Be Played" program is very important to PC gaming, and that's a lot of money leaving PC game development when the PC gaming industry is pretty much in a steep decline in terms of game quality and number per year. Is it a good thing? Well, as long as Nvidia focuses on the gaming market at the $150 price range it won't hurt them much at all; IMO that is where the most cards are bought, and the $70-150 range is the most important. So whatever Nvidia decides to do, it will probably do fine; hell, it has been turning a profit for years, so unless they made some really bad gambles it's unlikely that will change.


^^^ QFT. And do people honestly believe Nvidia will release a card weaker than a 5870? Not saying it will be affordable, but it is the truth. Regardless, this gen of graphics cards seems like a flop, seeing as no games will catch up to the hardware. In my opinion, ever since the G80 series, software has been moving at a slower pace and is usually 1 generation behind the technology (of course except for the badly optimized outlier **cough** **cough** Crysis).

I still believe the 8800GTX and Q6600 were the best bang for your buck if you bought them when they came out, and I don't think there will ever be such a leap in technology again in the graphics card industry. Yeah, you spent a ***load of money, but that combo can still play games at the highest settings almost 3 years later.
October 15, 2009 4:24:19 AM

IzzyCraft said:
bboynatural

Way to be biased; Intel has clearly been winning in CPUs for years against AMD, and AMD bleeds money like a hemophiliac.

Hell, Nvidia is worth the same as AMD+ATI combined, and frankly I think ATI carries AMD even though AMD bought them.

Intel currently wins in the ~$180+ range, or about $150 and up. And Intel CPUs still beat AMD below that when you OC them. It's great that AMD finally shored up the $120-and-below range at stock speeds, but frankly Intel is much more of a brand name and is more likely to be bought by someone; hell, they advertise. When was the last time you saw an AMD commercial? The most AMD does is slap a sticker on a laptop or computer.

Everyone by now should know the first Fermis and such are high-end business and server cards, basically Tesla replacements.

Nvidia looks like it wants to make a move into those sectors, pretty much only in competition with CPUs; Nvidia wants to fill a niche and believes they can make a good sum of money in the supercomputing niche. Since it can run assembly and C++ natively, if it's given great support (which I'm guessing Nvidia will provide) it could do quite well.

Nvidia is pushing the high-end gaming market onto the back burner, which I think is a bad thing; Nvidia's "The Way It's Meant to Be Played" program is very important to PC gaming, and that's a lot of money leaving PC game development when the PC gaming industry is pretty much in a steep decline in terms of game quality and number per year. Is it a good thing? Well, as long as Nvidia focuses on the gaming market at the $150 price range it won't hurt them much at all; IMO that is where the most cards are bought, and the $70-150 range is the most important. So whatever Nvidia decides to do, it will probably do fine; hell, it has been turning a profit for years, so unless they made some really bad gambles it's unlikely that will change.



Relax man, I didn't say ANYTHING biased. I said it: Intel does beat AMD in performance. The way I wrote it (with dots) is just because I think that the mainstream, us, gamers, are just buying the i7 for the bragging rights. Nothing else. ONCE AGAIN, I DO SAY THAT INTEL BEATS AMD IN PERFORMANCE, SO PLEASE KEEP YOUR EGO DOWN:

http://www.tomshardware.com/reviews/phenom-x4-955,2278-...

Same fkin performance.

http://www.tomshardware.com/reviews/phenom-x4-955,2278-...

Here in World in Conflict, the i7 looks like it has the advantage. But it's weird how a commonly used option such as AA makes the i7 drop by 20 FPS, while the 955 drops only 10. Anyway, you will HARDLY see a difference for the ~$300 (mobo + CPU + RAM) you're paying. This FPS drop is very present with Intel CPUs. If you get an Intel CPU that performs the same in gaming as a Phenom, you will get a much more unstable frame rate, sometimes dropping from 60 to 30, making the game look extremely lame compared to a constant 45-35 from the Phenom.

http://www.tomshardware.com/reviews/phenom-x4-955,2278-...

The Phenom is more power efficient, and this is what gamers need; the power budget should go to the GPUs and let us pay less for the PSU.

Now let's talk about that ADVANTAGE of the i7 that MOST (not ALL, because that would be a lie), and by MOST I mean at least 70%, NEVER use.

http://www.tomshardware.com/reviews/phenom-x4-955,2278-...

Less than a minute, less than 50 seconds, and a 10-second difference. Is that EVEN such a big deal? I would rather pay $300 less and wait 1 minute for something I might use... once every year? I never zip anyway, and most people don't. We torrent and UNZIP, not ZIP. The only REAL disadvantage is the video encoding that MAYBE some of us use. But $300 pops into my mind again, and 1 min/$300 is FAR from a good deal to me.

ONCE AGAIN, THIS IS NOT BIASED; I'M TALKING ABOUT THE GAMER COMMUNITY.
As you see, the ONLY reason AMD is bleeding money is that people buy for the bragging rights; they don't even take the time to compare against their needs and often end up paying more.
$300, man. $300: add the $200 you were going to spend on your GPU, and you can get a $500 GPU: 2 HD5850s, or 1 HD5870, or even 1 GTX295. Do you understand how much gamers waste? This is why I wrote it in a "biased" way.
This is the only reason AMD is bleeding money: nobody takes the time to READ about the actual technology; they don't even ask themselves what they will do with that damn CPU they're getting for over $300. And please don't mention the i5; I really don't want to go dig up those scathing reviews that cut that CPU to pieces. Intel is definitely not made for "mid-range" CPUs.

One review is not enough? Here's more:
http://www.hardwarecanucks.com/charts/cpu.php?pid=69,70...

Watch what happens if you overclock it to a mere 3.8 GHz.
Yes, it reaches the performance of a $1000 CPU.

http://www.hardwarecanucks.com/charts/cpu.php?pid=69,70...

It beats the i7.

Now don't get me wrong, the i7 is DEFINITELY the better CPU. DEFINITELY.
But is it worth the money? Is it worth $300 that you could've spent on a better GPU?
Check, for example, the Valve particle simulation: of course the i7 beats the Phenom II 955, but look at the frame rates; they're already past 60, so you will not notice any difference because YOUR MONITOR IS SET TO 60HZ. Even the best monitors are only 75.

Do you understand? AMD isn't bleeding money because they suck; AMD is bleeding money because PEOPLE SUCK.
Computing used to be a professional domain; now any sick @ss can come along and claim he can use a computer, while hardly knowing what provides his gaming power. Seriously, I could show you the guy on this forum who said something like "anyway, the graphics card only provides 40% of the power, the rest is CPU" in a thread about the HD 4650. I was astonished that this guy even had the RIGHT to use a damn computer. If people would take the time to CHECK and LEARN, AMD would be rich by now.

Now, I didn't take the time to write all this just to get dissed or start an i7 vs. Phenom II 955 war; I already lost. I'm talking GAME-WISE. And since I am on the graphics forum, I think AMD should rule this part. But it doesn't, because most of you people only care about brand name or fashion: "since everybody is going i7, I'm going too".

And FIRST OF ALL, I never meant to write this comment; I was never biased; I just felt the need to show the truth to everyone who SHOULD see it. I personally would prefer saving my money and laughing at idiots who will NEVER use the REAL power of the i7 but still wasted 300 bucks.

Anyway, if 2 reviews aren't enough and you want to talk about the "awesome" scaling power the i7 has to offer, here's a little something for you:

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4...

People claim that CPU bottlenecking can be seen at maximum quality. Here's your bottlenecking.

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4...

Once again, the moment we crank the AA, i7 shows its true weakness.

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4...

Okay, I give the i7 ONE win. GG. We NEARLY reached 60 fps with the Phenom though, and if you overclock it, you'll go over 60. Once again, this is performance that will not be seen.

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4...

Enough said. I am ready to get flamed, torn apart, and jumped on for this insane blasphemy. Go on. Just remember: I am talking GAME-WISE. I admit an i7 is a better CPU than a Phenom II 955, but still ask yourself if 1 minute is worth the $300 you could gain 30 fps with. Love me or hate me, it's an obsession; love me or hate me, that is the question...

I will admit something FRANKLY: last generation's quad cores DID beat AMD! I will NEVER deny this. But I am talking about the current generation. Also, 6-core and 8-core (Bulldozer) chips are coming from AMD while the i9 and 8-core chips are coming from Intel. Now I wonder which one will cost less for performance that will never be used, and which one idiotic gamers living on their parents' money will get. ^^

One last thing: I mentioned that the i5 is totally out of the question, and this is why. First, I read a review saying the testers were disappointed by the i5 and that the i7-860 was the real deal for LGA1156. Another thing is that the i5 only runs dual x8 for the same price as a dual x16 setup from AMD. And last but not least, LGA1156 is already dead: this is Intel's response to next year's 6-core and 2011's 8-core: http://www.fudzilla.com/content/view/15749/35/
Less cache, more pricey, still dual x8. GG Intel.
October 15, 2009 4:57:39 AM

Way too long for me to care, let alone read; I'll just post something about what I caught.

"Do you understand? AMD isn't bleeding money because they suck; AMD is bleeding money because PEOPLE SUCK."

Naw, it's because AMD does suck; they lost the great reputation they gained during Intel's shitty Pentium 4 years around the start of the first-gen Phenoms, and they weren't looking so hot when the C2D series came out from Intel.

AMD doesn't advertise; it doesn't, IMO, try to compete on a business level against Intel. Intel does everything to stay ahead.

AMD has some pretty great products out currently, but I couldn't really say that about a year ago, or two years ago for that matter, compared to Intel.

Really, I think AMD dropped the ball a while back and is still feeling the consequences. When you sell a line of products, unless you are the undisputed best, you have an image you have to uphold to sell to manufacturers and consumers. Hell, if I were some random guy off the street, would I buy a shitty Atom CPU or a shitty VIA CPU? They are both pretty shitty, but I'd buy Intel's CPU over VIA's just because I'm more likely to know the brand.

That's the one fault of AMD I've found: they assume they are already a brand name with quality associated with it, like Sony or Samsung or Toyota. But frankly, in the US I never see any form of advertisement from them in stores or on TV.
October 15, 2009 5:01:22 AM

Okay, and? What's your point? I still proved, big time, that your OMFG SUPER SPECIAL AWESOME INTEL POWER!!!!11!!11!1!!!! is just a lack of research in the gaming world.
Give the company a rest; you can't hold past mistakes against them forever. And just because of that, you think they deserve not to be bought anymore? Maybe they CAN'T afford advertising??

You said it yourself: even Intel had shitty CPUs in the past. So your comment doesn't stand on any foundation to begin with; it's just your personal hate for a company that made a mistake and made you QQ back then.
October 15, 2009 5:12:26 AM

You can hate on a company as much as you want.
But I think you should first start and run a company that gave the world some of the best CPUs and was able to hold its own against one of the world's first CPU companies, before saying HOW YOU THINK they dropped the ball or what they should do.
They ARE losing money, because of stupid people, once again.
But the laptop market, which is basically made of people who don't give a fk whether they have Intel or AMD and just want the lowest price to run what they need, is keeping and WILL keep AMD alive.

Also, ATI does enough "advertising" to promote the name of AMD (because it's a GPU, and a GPU is generally bought by people who care, or at least care about the budget), and to make SMART AND MONEY-EFFICIENT gamers, like many in this world, do some... REGULAR... research on the internet and discover the awesome power behind the name AMD.
Of course there are some stupid gamers (talking about those who never zip or video encode, so relax your ego) who basically don't give a fk about money; those are Intel's market. Crazy how this world is mainly made of idiots, huh?
October 15, 2009 6:18:35 AM

I am not a janitor, and I do not have time to clean up every thread you two post in, so if you're not going to talk about the subject of the OP, or even any vaguely-related subject, don't post in the damn thread!
October 15, 2009 6:20:56 AM

randomizer said:
I am not a janitor, and I do not have time to clean up every thread you two post in, so if you're not going to talk about the subject of the OP, or even any vaguely-related subject, don't post in the damn thread!

Sorry...
October 15, 2009 6:41:56 AM

Lol, what happened to my post...
OK, the corrected post is:

Fermi card cost = $1500. :p 