
Very interesting read regarding upcoming consoles

July 2, 2005 12:19:31 AM

Microsoft's Xbox 360 & Sony's PlayStation 3 - Examples of Poor CPU Performance

In our last article we had a fairly open-ended discussion about many of the challenges facing both of the recently announced next-generation game consoles. We discussed misconceptions about the Cell processor and its ability to accelerate physics calculations, as well as touched on the GPUs of both platforms. In the end, both the Xbox 360 and the PlayStation 3 are much closer competitors than you would think based on first impressions.

The Xbox 360's Xenon CPU features more general purpose cores than the PlayStation 3 (3 vs. 1); however, game developers will most likely only be using one of those cores for the majority of their calculations, leveling the playing field considerably.

The Cell processor derives much of its power from its array of 7 SPEs (Synergistic Processing Elements), however as we discovered in our last article, their purpose is far more specialized than we had thought. Speaking with Epic Games’ head developer, Tim Sweeney, he provided a much more balanced view of what sorts of tasks could take advantage of the Cell’s SPE array.

The GPUs of the next-generation platforms also proved to be quite interesting. In Part I we speculated as to the true nature of NVIDIA's RSX in the PS3, concluding that it's quite likely little more than a higher clocked G70 GPU. We will expand on that discussion a bit more in this article. We also looked at Xenos, the Xbox 360's GPU, and characterized it as equivalent to a very flexible 24-pipe R420. Despite the inclusion of 10MB of embedded DRAM, Xenos and RSX ended up being quite similar in our performance expectations. That pretty much summarized all of our findings: the two consoles, although implementing very different architectures, ended up remarkably similar.

So we’ve concluded that the two platforms will probably end up performing very similarly, but there was one very important element excluded from the first article: a comparison to present-day PC architectures. The reason a comparison to PC architectures is important is because it provides an evaluation point to gauge the expected performance of these next-generation consoles. We’ve heard countless times that these new consoles would offer better gaming performance than anything we’ve had on the PC, or anything we would have for a matter of years. Now it’s time to actually put those claims to the test, and that’s exactly what we did.

Speaking under conditions of anonymity with real world game developers who have had first-hand experience writing code for both the Xbox 360 and PlayStation 3 hardware (and dev kits where applicable), we asked them for nothing more than their brutal honesty. What did they think of these new consoles? Are they really outfitted with the PC-eclipsing performance we've been led to believe they have? The answer is actually quite frequently found in history; as with anything, you get what you pay for.

Learning from Generation X

The original Xbox console marked a very important step in the evolution of gaming consoles - it was the first console that was little more than a Windows PC.
It featured a 733MHz Pentium III processor with a 128KB L2 cache, paired up with a modified version of NVIDIA's nForce chipset (modified to support Intel's Pentium III bus instead of the Athlon XP it was designed for). The nForce chipset featured an integrated GPU, codenamed the NV2A, offering performance very similar to that of a GeForce3. The system had a 5X PC DVD drive and an 8GB IDE hard drive, and all of the controllers interfaced to the console using USB cables with a proprietary connector.

For the most part, game developers were quite pleased with the original Xbox. It offered them a much more powerful CPU, GPU and overall platform than anything before it. But as time went on, there were definitely limitations that developers ran into with the first Xbox.

One of the biggest limitations ended up being the meager 64MB of memory that the system shipped with. Developers had asked for 128MB and the motherboard even had positions silk screened for an additional 64MB, but in an attempt to control costs the final console only shipped with 64MB of memory.
The next problem was that the NV2A GPU ended up not having the fill rate and memory bandwidth necessary to drive high resolutions, which kept the Xbox from being used as an HD console.

Although Intel outfitted the original Xbox with a Pentium III/Celeron hybrid in order to improve performance yet maintain its low cost, at 733MHz that quickly became a performance bottleneck for more complex games after the console's introduction.

The combination of GPU and CPU limitations made 30 fps a frame rate target for many games, while simpler titles were able to run at 60 fps. Split screen play on Halo would even stutter below 30 fps depending on what was happening on screen, and that was just a first-generation title. More experience with the Xbox brought creative solutions to the limitations of the console, but clearly most game developers had a wish list of things they would have liked to have seen in the Xbox successor. Similar complaints were levied against the PlayStation 2, but in some cases they were more extreme (e.g. its 4MB frame buffer).

Given that consoles are generally evolutionary, taking lessons learned in previous generations and delivering what the game developers want in order to create the next-generation of titles, it isn't a surprise to see that a number of these problems are fixed in the Xbox 360 and PlayStation 3.

One of the most important changes with the new consoles is that system memory has been bumped from 64MB on the original Xbox to a whopping 512MB on both the Xbox 360 and the PlayStation 3. For the Xbox, that's an 8x increase, and over 12x the total memory present on the PlayStation 2.

The other important improvement with the next-generation of consoles is that the GPUs have been improved tremendously. With 6 - 12 month product cycles, it's no surprise that in the past 4 years GPUs have become much more powerful. By far the biggest upgrade these new consoles will offer, from a graphics standpoint, is the ability to support HD resolutions.

There are obviously other, less-performance oriented improvements such as wireless controllers and more ubiquitous multi-channel sound support. And with Sony's PlayStation 3, disc capacity goes up thanks to their embracing the Blu-ray standard.
But then we come to the issue of the CPUs in these next-generation consoles, and the level of improvement they offer. Both the Xbox 360 and the PlayStation 3 offer multi-core CPUs to supposedly usher in a new era of improved game physics and reality. Unfortunately, as we have found out, the desire to bring multi-core CPUs to these consoles was made a reality at the expense of performance in a very big way.

-------------------------------------------------------------

Problems with the Architecture

At the heart of both the Xenon and Cell processors is IBM’s custom PowerPC based core. We’ve discussed this core in our previous articles, but it is best characterized as being quite simple. The core itself is a very narrow 2-issue in-order execution core, featuring a 64KB L1 cache (32K instruction/32K data) and either a 1MB or 512KB L2 cache (for Xenon or Cell, respectively). Supporting SMT, the core can execute two threads simultaneously similar to a Hyper Threading enabled Pentium 4. The Xenon CPU is made up of three of these cores, while Cell features just one.

Each individual core is extremely small, making the 3-core Xenon CPU in the Xbox 360 smaller than a single core 90nm Pentium 4. While we don’t have exact die sizes, we’ve heard that the number is around 1/2 the size of the 90nm Prescott die.

IBM’s pitch to Microsoft was based on the peak theoretical floating point performance-per-dollar that the Xenon CPU would offer, and given Microsoft’s focus on cost savings with the Xbox 360, they took the bait.

While Microsoft and Sony have been childishly playing this flops-war, comparing the 1 TFLOPs processing power of the Xenon CPU to the 2 TFLOPs processing power of the Cell, the real-world performance war has already been lost.

Right now, from what we’ve heard, the real-world performance of the Xenon CPU is about twice that of the 733MHz processor in the first Xbox. Considering that this CPU is supposed to power the Xbox 360 for the next 4 - 5 years, it’s nothing short of disappointing. To put it in perspective, floating point multiplies are apparently 1/3 as fast on Xenon as on a Pentium 4.

The reason for the poor performance? The very narrow 2-issue in-order core also happens to be very deeply pipelined, apparently with a branch predictor that’s not the best in the business. In the end, you get what you pay for, and with such a small core, it’s no surprise that performance isn’t anywhere near the Athlon 64 or Pentium 4 class.
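The performance cliff described above can be sketched with the textbook effective-CPI model. All of the inputs below are assumed round numbers chosen purely for illustration, not measured figures for Xenon or any other CPU:

```python
# Textbook model: effective CPI grows with branch frequency, mispredict
# rate, and the pipeline-flush penalty of a deep pipeline. Every number
# here is an illustrative assumption, not a measurement of any console CPU.
def effective_cpi(base_cpi, branch_freq, mispredict_rate, flush_penalty):
    return base_cpi + branch_freq * mispredict_rate * flush_penalty

# Shallow pipeline, strong predictor (desktop-class core):
desktop = effective_cpi(1.0, 0.20, 0.05, 12)   # 1.12 cycles/instruction
# Deep pipeline, weaker predictor (the scenario described above):
console = effective_cpi(1.0, 0.20, 0.15, 23)   # 1.69 cycles/instruction
print(desktop, console)
```

Even at identical clock speeds, the deeper pipeline in this toy example loses roughly a third of its throughput to branches alone.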

The Cell processor doesn’t get off the hook just because it only uses a single one of these horribly slow cores; the SPE array ends up being fairly useless in the majority of situations, making it little more than a waste of die space.

We mentioned before that collision detection can be accelerated on the SPEs of Cell, despite being fairly branch heavy. The lack of a branch predictor in the SPEs apparently isn't that big of a deal, since most collision detection branches are essentially random and can't be predicted even with the best branch predictor. So not having a branch predictor doesn't hurt; what does hurt, however, is the very small amount of local memory available to each SPE. In order to access main memory, the SPE places a DMA request on the bus (or the PPE can initiate the DMA request) and waits for it to be fulfilled. According to those with experience on the PS3 development kits, this access takes far too long to be usable in many real world scenarios. It is the small amount of local memory each SPE has access to that keeps the SPEs from working on more than a handful of tasks. While physics acceleration is an important one, there are many more tasks that can't be accelerated by the SPEs because of the memory limitation.
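A rough way to see why the small local store and long DMA latency bite: even with double buffering (fetching the next chunk of data while computing on the current one), the loop is paced by whichever of compute or DMA is slower, and small chunks amortize the latency poorly. This is a generic streaming model with made-up numbers, not actual Cell SDK code:

```python
def stream_time(total_bytes, chunk_bytes, compute_per_byte, dma_latency):
    """Cycles to stream a buffer through a small local store with
    double buffering. All parameters are illustrative assumptions."""
    chunks = -(-total_bytes // chunk_bytes)          # ceiling division
    compute_per_chunk = chunk_bytes * compute_per_byte
    # After the first fill, each chunk's DMA overlaps the previous
    # chunk's compute; the slower of the two paces the loop.
    return dma_latency + chunks * max(compute_per_chunk, dma_latency)

# Tiny chunks: the DMA latency dominates every iteration.
small = stream_time(1_000_000, 16_384, 1, 50_000)    # 3,150,000 cycles
# Larger chunks (still bounded by local store): latency is mostly hidden.
large = stream_time(1_000_000, 131_072, 1, 50_000)   # 1,098,576 cycles
print(small, large)
```

The smaller the usable buffer, the more often the SPE pays the full round trip, which is exactly the complaint developers raised.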

The other point that has been made is that even if you can offload some of the physics calculations to the SPE array, the Cell’s PPE ends up being a pretty big bottleneck thanks to its overall lackluster performance. It’s akin to having an extremely fast GPU but without a fast CPU to pair it up with.

-------------------------------------------------

What About Multithreading?

We of course asked the obvious question: would game developers rather have 3 slow general purpose cores, or one of those cores paired with an array of specialized SPEs? The response was unanimous: everyone we have spoken to would rather take the general purpose core approach.

Citing everything from ease of programming to the limitations of the SPEs we mentioned previously, the Xbox 360 appears to be the more developer-friendly of the two platforms according to the cross-platform developers we've spoken to. Despite being more developer-friendly, the Xenon CPU is still not what developers wanted.

The most ironic bit of it all is that according to developers, if either manufacturer had decided to use an Athlon 64 or a Pentium D in their next-gen console, they would be significantly ahead of the competition in terms of CPU performance.

While the developers we've spoken to agree that heavily multithreaded game engines are the future, that future won't really take form for another 3 - 5 years. Even Microsoft admitted to us that all developers are focusing on having, at most, one or two threads of execution for the game engine itself - not the four or six threads that the Xbox 360 was designed for.

Even when games become more aggressive with their multithreading, targeting 2 - 4 threads, most of the work will still be done in a single thread. It won't be until the next step in multithreaded architectures where that single thread gets broken down even further, and by that time we'll be talking about Xbox 720 and PlayStation 4. In the end, the more multithreaded nature of these new console CPUs doesn't help paint much of a brighter performance picture - multithreaded or not, game developers are not pleased with the performance of these CPUs.

What about all those Flops?

The one statement that we heard over and over again was that Microsoft was sold on the peak theoretical performance of the Xenon CPU. Ever since the announcement of the Xbox 360 and PS3 hardware, people have been set on comparing Microsoft's figure of 1 trillion floating point operations per second to Sony's figure of 2 trillion floating point operations per second (TFLOPs). Any AnandTech reader should know for a fact that these numbers are meaningless, but just in case you need some reasoning for why, let's look at the facts.

First and foremost, a floating point operation can be anything: it can be adding two floating point numbers together, performing a dot product on two floating point vectors, or even just calculating the complement of an FP number. Anything that is executed on an FPU is fair game to be called a floating point operation.

Secondly, both floating point power numbers refer to the whole system, CPU and GPU. Obviously a GPU's floating point processing power doesn't mean anything if you're trying to run general purpose code on it and vice versa. As we've seen from the graphics market, characterizing GPU performance in terms of generic floating point operations per second is far from the full performance story.

Third, when a manufacturer is talking about peak floating point performance there are a few things that they aren't taking into account. Being able to process billions of operations per second depends on actually being able to have that many floating point operations to work on. That means that you have to have enough bandwidth to keep the FPUs fed, no mispredicted branches, no cache misses and the right structure of code to make sure that all of the FPUs can be fed at all times so they can execute at their peak rates. We already know that's not the case as game developers have already told us that the Xenon CPU isn't even in the same realm of performance as the Pentium 4 or Athlon 64. Not to mention that the requirements for hitting peak theoretical performance are always ridiculous; caches are only so big and thus there will come a time where a request to main memory is needed, and you can expect that request to be fulfilled in a few hundred clock cycles, where no floating point operations will be happening at all.

So while there may be some extreme cases where the Xenon CPU can hit its peak performance, it sure isn't happening in any real world code.
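The argument about keeping the FPUs fed can be put into numbers with a one-line utilization model. Everything here is an assumed round figure for illustration, not a vendor spec:

```python
# If the FPUs stall for `miss_penalty` cycles after every
# `ops_between_misses` floating point operations, sustained throughput
# is peak scaled by the fraction of time the FPUs are actually busy.
# All inputs are illustrative assumptions.
def sustained_flops(peak_flops, clock_hz, ops_between_misses, miss_penalty):
    ops_per_cycle = peak_flops / clock_hz
    busy_cycles = ops_between_misses / ops_per_cycle
    return peak_flops * busy_cycles / (busy_cycles + miss_penalty)

# A hypothetical 3.2GHz core with a 25.6 GFLOPs peak, one cache miss
# per 1000 FP ops, and a few-hundred-cycle trip to main memory:
print(sustained_flops(25.6e9, 3.2e9, 1000, 500))   # 5.12e9, i.e. 5.12 GFLOPs
```

One miss per thousand operations is enough to cut the headline number by a factor of five in this sketch, which is why peak figures are nearly meaningless.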

The Cell processor is no different; given that its PPE is identical to one of the PowerPC cores in Xenon, it must derive its floating point performance superiority from its array of SPEs. So what's the issue with the 218 GFLOPs number (2 TFLOPs for the whole system)? Well, from what we've heard, game developers are finding that they can't use the SPEs for a lot of tasks. So in the end, it doesn't matter what the peak theoretical performance of Cell's SPE array is if those SPEs aren't being used all the time.

Another way to look at this comparison of flops is to look at integer add latencies on the Pentium 4 vs. the Athlon 64. The Pentium 4 has two double pumped ALUs, each capable of performing two add operations per clock, that's a total of 4 add operations per clock; so we could say that a 3.8GHz Pentium 4 can perform 15.2 billion operations per second. The Athlon 64 has three ALUs each capable of executing an add every clock; so a 2.8GHz Athlon 64 can perform 8.4 billion operations per second.

By this silly console marketing logic, the Pentium 4 would be almost twice as fast as the Athlon 64, and a multi-core Pentium 4 would be faster than a multi-core Athlon 64. Any AnandTech reader should know that's hardly the case. No code is composed entirely of add instructions, and even if it were, eventually the Pentium 4 and Athlon 64 will have to go out to main memory for data, and when they do, the Athlon 64 has a much lower latency access to memory than the P4. In the end, despite what these horribly concocted numbers may lead you to believe, they say absolutely nothing about performance. The exact same situation exists with the CPUs of the next-generation consoles; don't fall for it.
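The arithmetic above, plus a crude memory-latency correction, can be written out directly. The peak figures come from the text; the miss interval and memory latencies are assumptions picked only to show how the ranking can flip:

```python
# Peak "ops per second" exactly as the marketing math goes:
p4_peak  = 3_800_000_000 * 4   # two double-pumped ALUs: 4 adds/clock
a64_peak = 2_800_000_000 * 3   # three ALUs: 3 adds/clock
assert p4_peak == 15_200_000_000 and a64_peak == 8_400_000_000

# Now assume one trip to main memory every 400 ops, with the Athlon 64
# enjoying a lower memory latency (illustrative figures, not measured):
def effective_ops(ops_per_clock, clock_hz, ops_per_miss, latency_ns):
    busy  = ops_per_miss / ops_per_clock      # cycles of useful work
    stall = latency_ns * 1e-9 * clock_hz      # cycles lost per miss
    return ops_per_clock * clock_hz * busy / (busy + stall)

p4  = effective_ops(4, 3.8e9, 400, 100)   # ~3.17e9 ops/s
a64 = effective_ops(3, 2.8e9, 400, 50)    # ~4.10e9 ops/s
print(p4, a64)
```

With those assumed latencies the "slower" chip comes out ahead, which is the whole point: peak figures say nothing about delivered performance.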

------------------------------------------------------

Why did Sony/MS do it?

For Sony, it doesn't take much to see that the Cell processor is eerily similar to the Emotion Engine in the PlayStation 2, at least conceptually. Sony clearly has an idea of what direction they would like to go in, and it doesn't happen to be one that's aligned with much of the rest of the industry. Sony's past successes have really come, not because of the hardware, but because of the developers and their PSX/PS2 exclusive titles. A single hot title can ship millions of consoles, and by our count, Sony has had many more of those than Microsoft had with the first Xbox.

Sony shipped around 4 times as many PlayStation 2 consoles as Microsoft did Xboxes; regardless of the hardware platform, a game developer won't turn down working with the PS2 - the install base is just that attractive. So for Sony, the Cell processor may be strange and even undesirable for game developers, but the developers will come regardless.

The real surprise was Microsoft; with the first Xbox, Microsoft listened very closely to the wants and desires of game developers. This time around, despite what has been said publicly, the Xbox 360's CPU architecture wasn't what game developers had asked for.

They wanted a multi-core CPU, but not such a significant step back in single threaded performance. When AMD and Intel moved to multi-core designs, they did so at the expense of a few hundred MHz in clock speed, not by taking a step back in architecture.

We suspect that a big part of Microsoft's decision to go with the Xenon core was its extremely small size. A smaller die means lower system costs, and if Microsoft indeed launches the Xbox 360 at $299, the Xenon CPU will be a big reason why that was possible.

Another contributing factor may be the fact that Microsoft wanted to own the IP of the silicon that went into the Xbox 360. We seriously doubt that either AMD or Intel would be willing to grant them the right to make Pentium 4 or Athlon 64 CPUs, so it may have been that IBM was the only partner willing to work with Microsoft's terms and only with this one specific core.

Regardless of the reasoning, not a single developer we've spoken to thinks that it was the right decision.

---------------------------------------------------------

The Saving Grace: The GPUs

Although both manufacturers royally screwed up their CPUs, all developers have agreed that they are quite pleased with the GPU power of the next-generation consoles.

First, let's talk about NVIDIA's RSX in the PlayStation 3. We discussed the possibility of RSX offloading vertex processing onto the Cell processor, but more and more it seems that isn't the case. It looks like the RSX will basically be a 90nm G70 with Turbo Cache running at 550MHz, and the performance will be quite good.

One option we didn't discuss in the last article was that the G70 GPU may already feature a number of disabled shader pipes to improve yields. The move to 90nm may allow those pipes to be enabled, allowing for another scenario where the RSX offers higher performance at the same transistor count as the present-day G70. Sony may be hesitant to reveal the actual number of pixel and vertex pipes in the RSX because, honestly, they won't know until a few months before mass production what their final yields will be.

Despite strong performance and support for 1080p, a large number of developers are targeting 720p for their PS3 titles and won't support 1080p. Those that are simply porting current-generation games over will have no problems running at 1080p, but anyone working on a truly next-generation title won't have the fill rate necessary to render at 1080p.

Another interesting point is that despite its lack of "free 4X AA" like the Xbox 360, in some cases it won't matter. Titles that use longer pixel shader programs end up being bound by pixel shader performance rather than memory bandwidth, so the performance difference between no AA and 2X/4X AA may end up being quite small. Not all titles will push the RSX to the limits however, and those titles will definitely see a performance drop with AA enabled. In the end, whether the RSX's lack of embedded DRAM matters will be entirely dependent on the game engine being developed for the platform. Games that make more extensive use of long pixel shaders will see less of an impact with AA enabled than those that are more texture bound. Game developers are all over the map on this one, so it wouldn't be fair to characterize all of the games as falling into one category or another.
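The shader-bound vs. bandwidth-bound argument above reduces to a simple bottleneck model: each frame is paced by the slower of pixel-shader work and memory traffic. All of the millisecond figures and the "AA doubles bandwidth" factor below are invented for illustration:

```python
# Toy bottleneck model of the AA argument. Every figure here is an
# illustrative assumption, not a measurement of RSX or any real GPU.
def frame_ms(shader_ms, bandwidth_ms, aa_bandwidth_factor=1.0):
    # The frame takes as long as the slower of the two resources.
    return max(shader_ms, bandwidth_ms * aa_bandwidth_factor)

# Long-shader title: AA roughly doubles memory traffic, but that cost
# hides behind the shader cost, so the frame time barely moves.
assert frame_ms(30.0, 12.0) == frame_ms(30.0, 12.0, 2.0) == 30.0

# Texture/bandwidth-bound title: the same AA factor lands in full.
assert frame_ms(10.0, 12.0) == 12.0
assert frame_ms(10.0, 12.0, 2.0) == 24.0
```

Which regime a game falls into depends entirely on its engine, which is why developers are "all over the map" on whether the missing embedded DRAM matters.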

ATI's Xenos GPU is also looking pretty good and most are expecting performance to be very similar to the RSX, but real world support for this won't be ready for another couple of months. Developers have just recently received more final Xbox 360 hardware, and gauging performance of the actual Xenos GPU compared to the R420 based solutions in the G5 development kits will take some time. Since the original dev kits offered significantly lower performance, developers will need a bit of time to figure out what realistic limits the Xenos GPU will have.

--------------------------------------------------------

Final Words

Just because these CPUs and GPUs are in a console doesn't mean that we should throw away years of knowledge from the PC industry - performance doesn't come out of thin air, and peak performance is almost never achieved. Clever marketing however, will always try to fool the consumer.

And that's what we have here today, with the Xbox 360 and PlayStation 3. Both consoles are marketed to be much more powerful than they actually are, and from talking to numerous game developers it seems that the real world performance of these platforms isn't anywhere near what it was supposed to be.

It looks like significant advancements in game physics won't happen on consoles for another 4 or 5 years, although it may happen with PC games much before that.

It's not all bad news however; the good news is that both GPUs are quite possibly the most promising part of the new consoles. With the performance that we have seen from NVIDIA's G70, we have very high expectations for the 360 and PS3. The ability to finally run at HD resolutions in all games will bring a much needed element to console gaming.

And let's not forget all of the other improvements to these next-generation game consoles. The CPUs, despite being relatively lackluster, will still be faster than their predecessors and increased system memory will give developers more breathing room. Then there are other improvements such as wireless controllers, better online play and updated game engines that will contribute to an overall better gaming experience.

In the end, performance could be better; the consoles aren't what they could have been had the powers that be made some different decisions. While they will bring better quality games to market and will be better than their predecessors, it doesn't look like they will be the end of PC gaming any more than the Xbox and PS2 were when they were launched. The two markets will continue to coexist, with consoles being much easier to deal with, and PCs offering some performance-derived advantages.

With much more powerful CPUs and, in the near future, more powerful GPUs, the PC paired with the right developers should be able to bring about that revolution in game physics and graphics we've been hoping for. Consoles will help accelerate the transition to multithreaded gaming, but it looks like it will take PC developers to bring about real change in things like game physics, AI and other non-visual elements of gaming.






This was from Anandtech.com; unfortunately it was pulled seconds afterwards, but I managed to find the whole thing posted on another site.
July 2, 2005 1:48:10 AM

If people are too lazy to read this whole thing, the simplest summary is this:

The upcoming consoles' CPUs blow, but their graphics cards are decent, not revolutionary.
July 2, 2005 1:53:11 AM

Read important points over at AT, before it was taken down.
July 2, 2005 1:58:23 AM

Why would this be pulled?

I find a couple things interesting... they basically say, "Things aren't as great as they could have been in a perfect scenario; we will have to wait and see how badly they messed up."

Who bloody cares? I am waiting till the Xbox 360 ships. By then working samples will be out with games that I can go look at. When the PS3 comes out I will go look at it and figure out if it's good enough to buy.

My guess is that both will be worth buying, not only for the gameplay but for the graphics and all the stuff people are bitching about.

Yeah, it may not be the best thing of all time, and maybe some computers could beat it graphically. But how many people are going to actually own the $600+ NVIDIA/ATI GPU and the $500 CPU needed to match these consoles at the time of release, and get that same $1100 performance out of a $300-400 machine?

hope this made sense

__________________________________________
July 2, 2005 2:01:59 AM

I think the reason why it was pulled is because the Microsoft insider was afraid of being caught, so Anandtech pulled it.
July 2, 2005 2:04:20 AM

I do kinda wonder about the post though, since it's really not up to AnandTech quality.
July 2, 2005 2:08:37 AM

BTW i read that whole thing....

It's like in school when they teach you about the freedom of speech, but that same teacher is the first person to send you to the office for asking, AND I QUOTE, "How come the entire world hates jews? they have been kicked out of or made to be 2nd class citizens in every major country in known history"

It's basically saying don't listen to reports and critics about the consoles... listen to their report by their critic.

Maybe I should run by Medina (Bellevue) when I go to the clubs tonight to ask Billy Gates what the real inside scoop on the 360 is...

__________________________________________
July 2, 2005 2:09:55 AM

I gave up on speculating, just want to see the consoles for myself.
July 2, 2005 2:15:52 AM

I can't wait to see GTA....I want to kill people with a double ended dildo in HD.

Xbox360 has no games that are system sellers for me. PS3 has 2.

Is everyone buying both? neither? or waiting to see full running games?

__________________________________________
July 2, 2005 2:18:07 AM

That article was a complete joke, no wonder it was pulled. It makes me wonder how it even got put up.

How can you say Xenos is not revolutionary? Um I don't see unified shaders in other graphics cards, do you?

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
July 2, 2005 2:21:57 AM

I smell console fanboyism:D 
July 2, 2005 2:22:55 AM

For me I think i'm going to wait for the price to drop a little, and find out which one is easier to mod and pirate games for.
July 2, 2005 3:25:50 AM

You smell wrong. My only interest is Xenos, since it is the future, at least for ATI.

And how is that fanboyism? Are they not valid points? If there's any fanboyism, I'd say it's from you for posting that article in the wrong section.

Edited by Action_Man on 07/01/05 11:54 PM.
July 2, 2005 5:26:39 AM

Quote:
That article was a complete joke, no wonder it was pulled. It makes me wonder how it even got put up.

Quote:
And how is that fanboyism? Are they not valid points? If theres any fanboyism I'd say its from you for posting that article in the wrong section.

ROFLMAF!! VALID POINTS?! LOL!! give me some of that shlt you're smoking right now!!
July 2, 2005 5:39:49 AM

Did you miss my point on unified shaders?

Seriously, did you even read the article? Comparing them to PC processors is the wrong comparison to make. PCs have OoOE and the consoles don't; of course PC-style code isn't going to run well.

Another point: these CPUs are supposed to only offer twice the performance of a 733MHz P3 Celeron, yet they're using dual G5s as alpha kits which are supposed to be only 40% of the final hardware. Hmm.

I also doubt the XCPU is 1/2 the size of a Prescott. At 135mm^2 for the new P4 2MB chips, 1/2 the size would be about ~70mm^2. I think not.

Quote:
given that its PPE is identical to one of the PowerPC cores in Xenon

*rolls eyes*

Quote:
the SPE array ends up being fairly useless in the majority of situations, making it little more than a waste of die space.

*rolls eyes again*

Quote:
but it looks like it will take PC developers to bring about real change

*more eye rolling*

*smokes more of his sh!t*

Edited by Action_Man on 07/02/05 03:57 AM.
July 2, 2005 8:30:49 AM

No offence mate, but one thing's for sure: you get what you pay for.

State-of-the-art gaming engines will not be in the next-gen consoles but in PCs; it's a bit obvious to see, and that's what I think this article was supposed to be about.

The GPU power is very high, but remember you're going to be paying $299-$399 for these consoles. Do you think Sony or Microsoft like running at a loss? They are obviously going to cut corners and use marketing strategies to pull in brainless zombies like you by saying they can produce supercomputers at a fraction of the cost.

If you can produce a supercomputer for less than $1000 I will buy one, but by no means think that the PS3 or 360 is one. If they were, don't you think all computer manufacturers would be making them?

I look around and hear the chatter that the 360 and PS3 will decimate all, but it's all sh!t. Think about it, if you can still think beyond the unified shaders you're ranting about. ROFL

The stupidity of people surprises me.



BOW DOWN AND SUCK MY eD!cK
July 2, 2005 8:48:50 AM

Quote:
no offence mate, one thing you get what you pay for.

AMD CPUs are cheaper than Intel's and are better. Sorry, I had to say that.

Quote:
state of the art gaming engine will not be in the next gen consoles

Thanks for stating the obvious.

Quote:
but actually in PCs its a bit obvious to see, thats what i think this article was sopposed to be about.

Anandtech is a PC site, and they're mainly bitching that the consoles aren't using PC CPUs and about how the PC is so much better. The PC is better at being a PC and a console is better at being a console.

Quote:
they are obviously going to cut corners and use marketing strategies to pull brainless zombies like you in by saying they can produce super computers at the fraction of the cost.

Um, did I ever say that these are supercomputers? Ummmm, no! And where are they cutting corners? If I'm a brainless zombie, I'd hate to think what you are.

Quote:
but by no means think that the ps3 or 360 is one if so dont you think all computer manufacturers will be making them?



Where did I say or imply that? I never brought up supercomputers; you did!

Quote:
i look around and hear the chatting of the 360 and ps3 will dessimate all but its all sh!t, think about it if you still can think beyond the unified shaders your ranting about


Where did I say they would decimate all? Where? Please tell me; my eyesight just isn't what it used to be. I didn't say that, imply that, think that, or have anything to do with that line of thought.

Ranting? Um, let's see: I highlighted that it was revolutionary (which it is) and that ATI's future cards (the R600) will be using unified shaders, which is what I'm interested in, since that's when I'll be buying my next graphics card. NVIDIA isn't taking that route (judging by their comments on the matter), so if unified shading proves badass with Xenos (which appears to be the case), I'll be getting an R600. I then pointed out that it was a valid comment that was ignored, and then I was insulted.

Quote:
The stupidity of people surprises me.

I'll say!

PCs are PCs and will continue to be PCs.
Consoles are consoles and will continue to be consoles.

Both are very enjoyable, why can't people get along?

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.<P ID="edit"><FONT SIZE=-1><EM>Edited by Action_Man on 07/02/05 05:20 AM.</EM></FONT></P>
July 2, 2005 11:04:53 AM

Some of the things I said are not aimed at you; they're more a general observation of the people I've seen around me, where I work, on the net, and on my degree course.

July 2, 2005 11:15:38 AM

Oh, OK. It just seemed that way to me, which was sort of odd.

July 2, 2005 3:24:02 PM

Sorry mate.

July 2, 2005 10:45:12 PM

It's fun to write things out of nowhere and invent numbers and conclusions out of nothing concrete. Yes, it is.

July 3, 2005 1:30:06 AM

No problem. I do agree with most of your generalisations.

Reading over the article again, it seems to me they're trying to take away from the hype of the new consoles, since there isn't much to hype on the PC side of things.

G70? Boring: higher clocks and more pipes.
R520? Boring: SM3, higher clocks and more pipes.
Dual cores? Interesting, but not as fast as single cores and way too expensive.

The PS3 has the Cell, which is ahead of its time.
The Xbox 360 has Xenos, which has unified shaders, which are the future.

They're more interesting, and they're being hyped like they are every refresh. (Interesting != better.)

July 3, 2005 8:33:38 AM

LOL!

Yeah, I saw this initially at work.

The funniest part to me was the VPU section. Let's see what camp Anand could possibly be in:


<font color=green><b>First, let's talk about NVIDIA's RSX in the PlayStation 3</b>. We discussed the possibility of RSX offloading vertex processing onto the Cell processor, but more and more it seems that isn't the case. It looks like the RSX will basically be a 90nm G70 with Turbo Cache running at 550MHz, <b>and the performance will be quite good</b>.

One option we didn't discuss in the last article, was that <b>the G70 GPU may feature a number of disabled shader pipes already to improve yield. The move to 90nm may allow for those pipes to be enabled and thus allowing for another scenario where the RSX offers higher performance at the same transistor count as the present-day G70.</b> Sony may be hesitant to reveal the actual number of pixel and vertex pipes in the RSX because honestly they won't know until a few months before mass production what their final yields will be.

Despite strong performance and support for 1080p, a large number of developers are targeting 720p for their PS3 titles and won't support 1080p. Those that are simply porting current-generation games over will have no problems running at 1080p, but anyone working on a truly next-generation title won't have the fill rate necessary to render at 1080p.

<b>Another interesting point is that despite its lack of "free 4X AA" like the Xbox 360, in some cases it won't matter</b>. Titles that use longer pixel shader programs end up being bound by pixel shader performance rather than memory bandwidth, so the performance difference between no AA and 2X/4X AA may end up being quite small. Not all titles will push the RSX to the limits however, and those titles will definitely see a performance drop with AA enabled. In the end, whether the RSX's lack of embedded DRAM matters will be entirely dependent on the game engine being developed for the platform. Games that make more extensive use of long pixel shaders will see less of an impact with AA enabled than those that are more texture bound. Game developers are all over the map on this one, so it wouldn't be fair to characterize all of the games as falling into one category or another.
</font color=green>


<font color=red>ATI's Xenos GPU is also looking pretty good <b>and most are expecting performance to be very similar to the RSX, but real world support for this won't be ready for another couple of months</b>. Developers have just recently received more final Xbox 360 hardware, and gauging performance of the actual Xenos GPU compared to the R420 based solutions in the G5 development kits will take some time. Since the original dev kits offered significantly lower performance, developers will need a bit of time to figure out what realistic limits the Xenos GPU will have.</font color=red>

Hmm, about 3 times as much focus on the RSX.

Interesting: it seems that while there will be questions about the R500/Xenos's performance for quite some time, until proper tests and such can be run, the RSX/pseudo-G70 is definitely going to have 'performance that will be quite good', despite the number of pipes and the architecture being less definite and/or revealed than the R500/Xenos's.

I loved the segment where they say AA doesn't matter that much, as a counter to there being 'native' support built into the Xenos from the start.

Of course it could just be that they don't have the sources to have the same amount of info on both systems.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
July 3, 2005 6:54:43 PM

The article really isn't great; like I said, it didn't seem to be up to the usual AT quality. The reason it really grabbed my attention was the fact that it was taken down a few minutes after it was posted.
July 3, 2005 6:57:12 PM

Hey Action_Man, I agree with your points; the R500 does look very promising. I just don't like to use the word revolutionary very often, and I think in this case the R500 is best described as evolutionary.


LOL, and my previous post was simply to take what you said out of context, thereby questioning your credibility :D  Sorry, that's what we've been learning in political science, so I have to use it somewhere :D 
July 3, 2005 10:35:53 PM

Revolutionary certainly does get thrown around way too much. I do think this will bring a lot of change, especially in the low- and mid-range cards. Either way, it'll be very interesting.

Quote:
Sorry, that's what we've been learning in political science, so I have to use it somewhere :D 


hahaha.

July 3, 2005 11:08:30 PM

Very interesting indeed. Considering that ATI is having production problems with the "simpler" R520, I wonder what kind of delays and problems they're going to run into with the R500.
July 4, 2005 1:39:52 AM

Well, actually the R520 is the more complex chip: it's 300-350 million transistors, while the R500 is 232 million, plus 105 million for the eDRAM and some logic. Being split across different packages should help improve yields a fair bit.

July 4, 2005 1:22:32 PM

An interesting overview by Ars Technica of the AT article:

<A HREF="http://arstechnica.com/news.ars/post/20050629-5054.html" target="_new">http://arstechnica.com/news.ars/post/20050629-5054.html...;/A>

Interesting viewpoints on the CPU side of things, nothing about the VPUs unfortunately.

I like the comment about the 'free ride', as I think that's necessarily so. The only way to make consoles truly forward-looking was to approach them from a multi-thread/core environment, and if you're going to be stuck with a single design for a while, you might as well overdesign to some extent if the cost factor isn't that much more, because unlike with PCs you won't get a second chance to add a PhysX engine or a better sound processor later.

I still find the whole thing hilarious from the standpoint that the article is still pulled, yet it's all over the place. :evil: 

Also interesting that in this month's CPU mag, Anand praises the move to HD in his article: <i>"The move to HD is so critical for the Xbox 360; I can't believe that it is 2005 and we still don't have a console that has games rendered at higher resolutions than 640x480"</i>. Yet you get the feeling from this pulled AT article that it's not as important, given his pull-back on 1080p. And since there is no mention of the PS3/RSX there, you see a slightly different view of the Xenos and its benefits (the eDRAM gets a bit better shake).

Reading the CPU magazine piece, you definitely get a different feel than from this pulled AT article.


July 4, 2005 3:54:16 PM

I was very interested in the comment that the 3.2GHz processor will only be equivalent to 2x the performance of the original Xbox's CPU, which was a Celeron/P3 hybrid at 733MHz. So does that make each Xenon core equivalent to a 1.4GHz Willamette P4?
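The arithmetic behind that comparison can be sketched in a few lines. It assumes single-threaded performance scales linearly with clock speed within a P3-class design, which is a crude simplification, and the 2x figure is just the claim quoted above:

```python
# Rough sanity check of the "2x the original Xbox CPU" claim.
# Assumes single-threaded performance scales linearly with clock
# speed within a comparable architecture -- a crude simplification.

xbox_cpu_mhz = 733          # original Xbox Celeron/P3 hybrid clock
claimed_speedup = 2.0       # reported single-threaded speedup per Xenon core

equivalent_mhz = xbox_cpu_mhz * claimed_speedup
print(f"Equivalent P3-class clock: ~{equivalent_mhz:.0f} MHz")
# A 3.2GHz Xenon core doing the work of a ~1.47GHz P3-class chip
# is roughly in line with the ~1.4GHz Willamette comparison above.
```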
Anonymous
July 4, 2005 5:27:21 PM

Why is it that I posted that in the CPU forums and got almost no reply? LOL!

That comment shocked me too, but I think they're talking about single-threaded performance.
Now my question is: do developers really need more than 2x the CPU power right now?
Three years down the road this power won't be enough, but by then games should have become more multithreaded, and they should be able to take better advantage of the triple core.

In a heavily multithreaded environment, I'm sure this CPU will do much more than 2x the performance of that old hybrid CPU...
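That intuition about multithreaded scaling can be illustrated with Amdahl's law; the parallel fractions below are made-up numbers for illustration, not measurements of any real game workload:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelisable fraction of the work and
# n is the number of cores. The p values are illustrative only.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.0, 0.5, 0.9):
    print(f"p={p:.1f}: {amdahl_speedup(p, 3):.2f}x on 3 cores")
# A mostly serial game (p=0.0) sees no gain from extra cores,
# while a heavily threaded one (p=0.9) gets ~2.5x on 3 cores.
```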

Asus P4P800DX, P4C 2.6ghz@3.25ghz, 2X512 OCZ PC4000 3-4-4-8, MSI 6800Ultra stock, 2X30gig Raid0
July 4, 2005 5:41:28 PM

I knew they were talking about a single-threaded environment, or a hyperthreaded one. But still, that is a horrible 3.2GHz; it means a 1.4GHz Dothan totally destroys it.

A step up in cores and clock frequency, but a huge step back in architecture. That article sort of explained how they could pack 3 cores into such a small die.
July 4, 2005 6:45:00 PM

Here's the funny thing about the CPU performance (ironic, considering I chastised someone for talking CPUs in the GPU forum :wink: ): while it may perform only 2x as well as the previous Xbox core, really think about what it will be handling. The cores will be doing physics, sound, networking, AI, etc. If properly programmed, you don't need a 5GHz P4 to do all that; you'd prefer a bunch of slower processors that can dedicate themselves to those tasks. Overall I'd rather have the equivalent of six 1.4GHz, or even 1GHz, cores that can be dynamically assigned tasks, and that when combined seem to clobber even the strongest CPUs in current titles.
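A minimal sketch of that task-per-core idea, with placeholder subsystems standing in for real game code (none of these names come from an actual engine):

```python
# Sketch of the task-per-core idea: physics, sound, networking and AI
# each run on their own worker thread instead of one fast serial core.
# The subsystem function is a placeholder, not a real game loop.
import threading

def run_subsystem(name: str, results: dict) -> None:
    # Stand-in for one frame's worth of physics/sound/AI/network work.
    results[name] = f"{name} updated"

results: dict[str, str] = {}
threads = [threading.Thread(target=run_subsystem, args=(name, results))
           for name in ("physics", "sound", "networking", "AI")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # ['AI', 'networking', 'physics', 'sound']
```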

I like Ars Technica's view that you really can't comment on the power until you have games actually working on these parallel setups.

The other thing AT seems to forget is that while it was possible to build it with a dual-core P4 or AMD64, that's the way to guarantee that eventually a PC will have more power than the console. Going the parallel route makes it a little difficult to compare, and to really achieve parity you would need more than just a dual-core CPU. Of course, the move to quad core may make that a moot point, but really I think the issues involved in coding for these systems are very similar, if not the same, as those developers will face when coding for the X2 and Pentium D.

Also, WTF, is it really THAT easy right now porting games from PC to PS2? 'Cause that doesn't seem to have kept them from building the largest library out there.


July 4, 2005 8:25:24 PM

Not necessarily, if the cores are built using an EPIC-like architecture. You can have far fewer transistors and much less wasted die space going that route, yet still achieve things like two-fold+ performance improvements in FP, as seen with the Itanium 2.

--
The <b><A HREF="http://snipurl.com/blsb" target="_new"><font color=red>THGC Photo Album</font color=red></A></b>, send in your pics, get your own webpage and view other members' sites.
March 14, 2009 2:09:02 PM

The PS4 may use more SPEs in its Cell architecture...

I think the Wii 2 will be released sometime soon, possibly even this year. I hope so! I'm eager to see some new hardware...

Here is some info on the probable release dates and features of the Wii 2 and others:

Wii 2 release date & features
http://techtadka.net/articles/gadget/186-wii-2-release-...

PS4 release date & features
http://techtadka.net/articles/gadget/183-playstation-4-...

XBox 720 release date & features
http://techtadka.net/articles/gadget/185-xbox-720-relea...


PSP 2 release date & features
http://techtadka.net/articles/gadget/187-psp-2-release-...

:) 
!