
New R600 benches

March 20, 2007 4:11:08 PM

VR-Zone has some preliminary 3DMark06 benches of the R600:

"Next, we got hold of some preliminary benchmarks figures of the R600 XTX card with core clock at 800MHz vs a GeForce 8800 GTX card. Using a Core 2 Extreme 2.93GHz processor on an Intel 975X board, the 3DMark06 score at 1600x1200 resolution is 97xx on the R600XTX compared to 95xx on the 8800GTX."


March 20, 2007 4:12:37 PM

linkage??
March 20, 2007 4:14:28 PM

Quote:
linkage??


Yeah seriously.
March 20, 2007 4:19:30 PM

Thanks Grape.
March 20, 2007 4:24:18 PM

Interesting link... I hope they improve performance before launch, though. I'd be unimpressed if it took them half a year to get out a card that wins by a mere 200 3DMarks, not even considering an 89xx.

But hopefully they'll be able to increase clockspeed and improve drivers decently by the time it's available.
March 20, 2007 4:28:15 PM

Like I said in another thread, expect the R600 series to do mediocre/poorly in 3Dmark, but better in actual apps.

Look at the areas the GF8600 does well in compared to the GF7900/X1900: it improves mainly in SM2.0, not SM3.0. Considering the early statements about R600's geometry shader power, expect the differences to favour other situations.

That's why all this Bungholiomark stuff means so very little without the actual individual tests.
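
To put some numbers on that: here's a toy Python sketch (made-up weights and subtest scores, NOT the real 3DMark06 formula) showing how a single composite number can hide big per-test differences:

# Toy composite: not the actual 3DMark06 scoring formula, just a
# weighted mean, to show how one total hides per-test swings.
def composite(sm2, sm3, cpu, w_sm2=0.4, w_sm3=0.4, w_cpu=0.2):
    return w_sm2 * sm2 + w_sm3 * sm3 + w_cpu * cpu

# Hypothetical subtest scores for two cards:
card_a = composite(sm2=4000, sm3=3500, cpu=2500)  # strong in SM2.0
card_b = composite(sm2=3400, sm3=4100, cpu=2500)  # strong in SM3.0
print(card_a, card_b)  # 3500.0 3500.0: a dead tie overall, even
                       # though the SM2.0/SM3.0 subtests differ ~15%

Two cards can tie on the total while behaving very differently per shader model, which is exactly why the individual tests matter.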
March 20, 2007 4:29:54 PM

*shakes wrist up and down*

that's about what those benchies are worth...


I'm still wondering if we'll see anything before the end of March, because I'm holding out for one of these cards.
March 20, 2007 4:46:54 PM

Sorry, I was lazy.
March 20, 2007 5:22:43 PM

Well, I hope they can improve that score. A core clock of 800MHz means it's an overclocked version; according to some reports, the stock speed is supposed to be 750MHz (2900XTX).
March 20, 2007 5:27:23 PM

I don't think the r600 benches are worth much... unless players who are not actively playing on a basketball court can sit on it.

Can 8 full grown basketball players sit on it?
Sure, it's a beefy card... but can it support that?

I just don't know anymore.
March 20, 2007 6:04:20 PM

I'm thinking its performance might be limited by the prerelease drivers, as mentioned in the article. Also, delays on the card mean it's highly possible that the next card in the generation is near (it happened to ATI last generation with the X1800/X1900 cards).

The benchmark was also done at 1600x1200, without specifying AA levels. Does anyone with a similar setup know what AA levels it is running at by comparing the 8800 scores? Could R600 be CPU limited at that resolution and AA level?
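
One rough way to get at the CPU-limited question from scores alone (a heuristic with assumed numbers, nothing official): if the score barely falls while the pixel count rises, the CPU is the bottleneck.

# Heuristic sketch with made-up scores: compare how the score falls
# against how the pixel count grows between two resolutions.
def pixels(res):
    w, h = res
    return w * h

def bottleneck(score_lo, score_hi, res_lo=(1280, 1024), res_hi=(1600, 1200)):
    pixel_ratio = pixels(res_hi) / pixels(res_lo)  # ~1.46x more pixels
    score_ratio = score_lo / score_hi              # >1 if the score fell
    # Near 1.0 -> CPU-limited; near pixel_ratio -> fully GPU-limited.
    return "CPU-limited" if score_ratio < (1 + pixel_ratio) / 2 else "GPU-limited"

print(bottleneck(score_lo=10000, score_hi=9700))  # barely drops: CPU-limited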
March 20, 2007 6:34:54 PM

Of course drivers are going to help.

About 95XX is the score for the 8800, as it says in the article quote.

I'm thinking you're not going to have too much CPU limitation at a res of 1600x1200. I think that was part of the reason to run the test at this res. Also, I think they ran it with all AA and AF off.
March 20, 2007 6:49:16 PM

Let's hope this doesn't reflect real world gaming performance.
March 20, 2007 7:26:20 PM

Those numbers are horrible, and whatever the 8800 Ultra is, it's a pretty safe bet it will beat those numbers easily. If there is any truth to them, then the R600 is shaping up to be an NV30-scale disaster. The silence from DAMMIT at this point is deafening. When are we going to get some real numbers? If they were good, why are they being kept a secret? When are we going to get a solid launch date?

I suspect the worst.
March 20, 2007 8:03:30 PM

I think we've been left so long that all we can do is suspect the worst. Also with that talk of the launch of R700 and the naming of the R600 to 2900XTX all shows a limit had been reached with this card and the silence just adds to it.
March 20, 2007 8:15:50 PM

Quote:
Those numbers are horrible, and whatever the 8800 Ultra is, it's a pretty safe bet it will beat those numbers easily. [...] I suspect the worst.

Wha? You are saying that rumored 3DMark scores of a non-retail version of R600 equate to it shaping up to be an NV30 disaster? There is so much wrong with that logic.

For fun, let's just say that they have a reference board as it will hit retail. Aren't you still putting too much weight into a bungilo_06? The X1900XT 512MB only beats a 7900GTO by 100 3DMarks in 06, yet play Oblivion or NFS Carbon on them, and the X1900XT totally spanks it. Have a look for yourself how little a 3DMark score means, especially a bloated SM2.0 score as TGGA mentioned. Apart from 2 ties, the X1900XT wins every game in this review:

3dmark06:
http://www.firingsquad.com/hardware/evga_e-geforce_7900...

Oblivion:
http://www.firingsquad.com/hardware/evga_e-geforce_7900...


Who knows at this point how R600 will stack up to the 8800 and 8900 over time, but one thing's for sure: this 3DMark score means diddly. Like Oblivion in my example above, imagine if that 200-point 3DMark score ends up being a 100% Crysis stomping. Doubtful, but who knows.
March 20, 2007 8:43:37 PM

Quote:
Like I said in another thread, expect the R600 series to do mediocre/poorly in 3Dmark, but better in actual apps.


Right, we need a new version of 3DMark for those cards anyway.
March 20, 2007 10:02:54 PM

Quote:
Like I said in another thread, expect the R600 series to do mediocre/poorly in 3Dmark, but better in actual apps.


Right, we need a new version of 3DMark for those cards anyway.

Exactly. What the heck are they doing over there at Futuremark anyway? They haven't come out with a "new" bench for over 2 years. '06 was basically just a more intensive version of '05 with HDR and dual-core support. And ever since nV has been "enhancing" their drivers to run better on 3DMark, the accuracy has gotten worse and worse.

I will only rely on real game benches and IQ comparisons to judge these cards. I do think, however, that once R600 cards are released, choosing over G80 will be a difficult decision. It's likely that one will run better on DX9 than DX10. Many will need to decide whether they want to buy the card that supports a handful of DX10 games, or the other, which is better on many DX9 titles.

I for one am unfortunately stuck with an SLI mobo, and must choose nV. Unless R600 can run Xfire on SLI mobos and I could trade in my 8800 GTXs for R600s... that would be NIIIICE *Borat thumbs up* :lol: 
March 20, 2007 11:25:52 PM

Quote:
8800 ultra


8800 Ultra? :?:

What am I missing? What's an 8800 Ultra?
March 21, 2007 12:19:13 AM

Quote:
Those numbers are horrible [...] I suspect the worst.

Wha? You are saying that rumored 3DMark scores of a non-retail version of R600 equate to it shaping up to be an NV30 disaster? [...] Have a look for yourself how little a 3DMark score means, especially a bloated SM2.0 score as TGGA mentioned. [...]

Good post. 3dMark figures aren't everything.

3dMark06
7900 GTX: 6299
x1900 XT: 6100

Oblivion performance
7900 GTX: 20.2 fps
x1900 XT: 33.7 fps

I'll remain quiet until I see some "real" R600 benchmarks.
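
Putting those two gaps side by side (same figures as above, just quick Python arithmetic):

# Percentage gaps from the figures quoted above.
mark_7900, mark_x1900 = 6299, 6100
obl_7900, obl_x1900 = 20.2, 33.7
print(f"3DMark06 gap: {(mark_7900 - mark_x1900) / mark_x1900:.1%}")  # 3.3%
print(f"Oblivion gap: {(obl_x1900 - obl_7900) / obl_7900:.1%}")      # 66.8%

A 3.3% synthetic lead against a 66.8% in-game deficit is about as clear as it gets.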
March 21, 2007 5:19:53 AM

3DMark is a bench we can mostly only use to test our systems when tweaking our PCs, or for comparing the same hardware. They will always fix their drivers to run it faster as long as 3DMark remains popular.
I'm stuck with a Crossfire mobo and have an 8800 GTS :evil: 
Not that I will use Xfire or SLI after all, but it's an option.
March 21, 2007 7:04:42 AM

Well, we can hope... All I know is that I want at least 20% above the 8800GTX, or else I'm waiting for the next generation :x
March 21, 2007 2:06:21 PM

Quote:
Those numbers are horrible [...] I suspect the worst.

Wha? You are saying that rumored 3DMark scores of a non-retail version of R600 equate to it shaping up to be an NV30 disaster? [...]

Good post. 3dMark figures aren't everything. [...] I'll remain quiet until I see some "real" R600 benchmarks.

Well R600 BEAT 8800 in 3DMark, so hopefully it will spank it more than your example of the x1900xt did... just hopeful speculation. :wink:
March 21, 2007 2:31:57 PM

Quote:
Those numbers are horrible [...] I suspect the worst.

Wha? You are saying that rumored 3DMark scores of a non-retail version of R600 equate to it shaping up to be an NV30 disaster? [...]

Good post. 3dMark figures aren't everything. [...] I'll remain quiet until I see some "real" R600 benchmarks.


It's true that 3dmark isn't everything. But who did the test? It sounds kinda like ATI themselves did the test and showed the results, but if the R600 is so disadvantaged in 3dmark06, you'd think they wouldn't test with it.

Also, regarding the numerous Oblivion references, it's not a good comparison either. ATI cards always do better in Oblivion. As was mentioned, the x1900XT beat the 7900GTX in Oblivion, but it's widely considered that overall the 7900GTX is more the equivalent of the x1900XTX.

I guess until we find out how it performs in a variety of tests, we really can't conclude anything... this isn't encouraging though.
March 21, 2007 5:34:49 PM

Quote:
Also, regarding the numerous Oblivion references, it's not a good comparison either. ATI cards always do better in Oblivion. As was mentioned, the x1900XT beat the 7900GTX in Oblivion, but it's widely considered that overall the 7900GTX is more the equivalent of the x1900XTX.

I wouldn't completely agree; as things shaped up with shader-heavy titles, I think the plain X1900XT 512MB is easily equal to (if not better than) the 7900GTX overall. Although from the onset they looked about identical, the X1900XTX would be ahead of the 7900GTX IMO.

Look at Need for Speed Carbon: the X1900XT demolished the 7900GTX (and the 1:1 X1800XT) in that also. Oblivion isn't everything, but it's somewhat a sign of things to come IMO. While the 7900GTX will compete most of the time and win some games too (OGL), I wouldn't be surprised if the X1900XT stomping the 7900GTX becomes more and more frequent down the road. But my comparison showed that the X1900XT only beats the 7900GTO by 100 3DMarks, yet beats it pretty handily in all but a couple of games, Oblivion being the biggest lead at 100%, so I linked that one. And as much as the X1900s crushed the GF7900s in Oblivion, the GF8800s did the same right back. So Oblivion doesn't favor ATI cards; the X1900 architecture just shines in it compared to GF7.

Quote:
I guess until we find out how it performs in a variety of tests, we really can't conclude anything

Yeah, exactly my feeling. It will take time with retail R600s being tested in many games to get the best picture. And as always, it can change as new games (like DX10 ones) come out. So it's possible the 8800GTX could, say, equal the R600XTX in current DX9 titles, but down the road R600 pulls ahead and proves its muscle in newer games. Shoot, for all we know it could be the opposite.
March 21, 2007 5:55:20 PM

ATI could wait until August to release the R600 and it still wouldn't matter.

The market doesn't have enough DX10 titles to justify this jump in technology.

Instead of releasing mid-range R600 cards, they should just make R500 cards with DX10 support, like an X1900XT for $150-175.

I would've preferred Nvidia get off the whole "Ultra" branding, because most of their Ultra cards were spanked by ATI's offerings.
March 21, 2007 6:24:04 PM

To further get at what Pauldh was saying: yes, Oblivion tends to perform better on ATi cards than nVidia cards, but it's not because the game arbitrarily "favors" one maker over the other.

Rather, it's that Oblivion changes the whole dynamic of game engines by placing a very, very high number of shader scripts into use, to the point where the load they bring dwarfs that brought by the textures. And this is even when the game can over-fill a 256MB card's buffer, so it's not because they skimped on texturing. It's that each color-map texture is ALSO accompanied by an additional map for normal-mapping, specular-mapping, and often parallax- and diffuse-mapping as well. Once you add in the shader-based procedural generation of foliage, the shaders for reflective water, and the presence of HDR, it works up to a massive load.

And the results are quite visible: even a year after the game's release, it's still arguably the best-looking game released yet. Only Crysis and Unreal Tournament 3 really stand as games close to release that could beat it; everything else is largely bound to the maximum capabilities of the 7th-generation consoles, namely the Xbox 360 and PS3. Really, only Oblivion and those two games surpass what the consoles can do (which is why the settings for both console versions of Oblivion are reduced).

So, I would most certainly concur with what Pauldh says: that Oblivion is a sign of what's to come. Of course, time will tell; the modded version of GameBryo that was used for Oblivion likely won't show up outside of Fallout 3, but it'll be really interesting to see the Unreal 3 engine and Crytek's CryEngine 2, as those are likely to be widely adopted within a year after their respective games' releases. However, I feel the results will likely resemble those for Oblivion, as they show the same signs of going in the same direction.
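
For anyone wondering what all those extra maps cost, here's a minimal software-shading sketch in Python (made-up texel values, not Oblivion's actual shader code): every additional map is another per-pixel fetch plus math, which is where the shader load piles up.

import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)

def shade(color_texel, normal_texel, spec_texel, light_dir, view_dir):
    # The normal-map texel is treated as a unit surface normal here.
    diffuse = scale(color_texel, max(dot(normal_texel, light_dir), 0.0))
    # Blinn-style half vector drives the specular-map contribution.
    half = tuple(l + v for l, v in zip(light_dir, view_dir))
    length = math.sqrt(dot(half, half)) or 1.0
    half = scale(half, 1.0 / length)
    specular = scale(spec_texel, max(dot(normal_texel, half), 0.0) ** 16)
    return tuple(d + s for d, s in zip(diffuse, specular))

# One pixel: a color texel, a normal texel and a specular texel.
print(shade((0.6, 0.5, 0.4), (0.0, 0.0, 1.0), (0.3, 0.3, 0.3),
            (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # -> (0.9, 0.8, 0.7)

Multiply that by every pixel, every layer and every frame, and you can see why shader math, not texture size, becomes the load.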
March 21, 2007 8:05:52 PM

Quote:
Rather, it's that Oblivion changes the whole dynamic of game engines by placing a very, very high number of shader scripts into use [...] Once you add in the shader-based procedural generation of foliage, the shaders for reflective water, and the presence of HDR, it works up to a massive load.
Many newer games use multiple texture layers (diffuse/color {they're NOT different things}, normal, specular {may be part of the normal map, but could also be separate as a "specular color" map}, parallax); Oblivion's foliage is not procedurally generated; and many games have reflective water. Oblivion's engine is just very poorly optimized. No geometry occlusion during scene management is the most notable problem, wasting CPU cycles and precious bandwidth with overdraw. And the specular shader, which renders a cube map, means even more overdraw.
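
To make the "no geometry occlusion during scene management" point concrete, here's a toy occlusion-culling pass in Python (invented scene, nothing Oblivion-specific): rasterize big occluders into a coarse depth grid, then skip any object whose screen rectangle sits entirely behind them.

# Coarse software occlusion grid (toy example, made-up scene).
W, H = 16, 9
FAR = float("inf")
depth = [[FAR] * W for _ in range(H)]

def rasterize_occluder(x0, y0, x1, y1, z):
    # Record the nearest occluder depth in each covered cell.
    for y in range(y0, y1):
        for x in range(x0, x1):
            depth[y][x] = min(depth[y][x], z)

def is_occluded(x0, y0, x1, y1, z_near):
    # Hidden only if every covered cell has something strictly closer.
    return all(depth[y][x] < z_near
               for y in range(y0, y1) for x in range(x0, x1))

rasterize_occluder(4, 2, 12, 7, z=10.0)      # e.g. a city wall
print(is_occluded(5, 3, 8, 5, z_near=25.0))  # house behind it: True, skip it
print(is_occluded(0, 0, 3, 2, z_near=25.0))  # off to the side: False, draw it

Everything culled this way is a draw call the CPU never has to batch, which is the overhead being described.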
March 21, 2007 8:18:56 PM

Quote:

Instead of releasing mid-range R600 cards, they should just make R500 cards with DX10 support, like an X1900XT for $150-175.


I agree with the 'NEED' to release statement, but don't understand how you think they would tack on DX10 support to an R5xx series card?

They pretty much have to redesign for DX10 support, so they might as well make it what became the X2600.
March 21, 2007 8:41:33 PM

Quote:
Oblivion's foliage is not procedurally generated,


Actually, that's not what I've read. While the greater 'canopy' is not procedurally generated, the individual trees, bushes, plants and grass are.

Quote:
No geometry occlusion during scene management is the most notable problem, wasting CPU cycles and precious bandwidth with overdraw.


Actually, I agree with that; it's one of the big issues I had with Bethesda's implementation in Oblivion, especially since culling would greatly improve performance on many cards that struggle with the grass and complex situations. This was an issue in Morrowind as well, where even from inside a building the game was calculating the render of the city as if it could see beyond the walls.
March 21, 2007 8:44:43 PM

Quote:
Many newer games use multiple texture layers (diffuse/color {they're NOT different things}, normal, specular {may be part of the normal map, but could also be separate as a "specular color" map}, parallax),

Technically, I consider the diffuse/color maps to be different, as while yes, the diffuse map is pointless without the color map, it's quite possible to run it and disable the diffuse-mapping; simply open the console, and type in "tlb," which disables all lighting-related shaders in the game.

Quote:
Oblivion's foliage is not procedurally generated,

Oblivion's foliage is most certainly procedurally generated, as one can find by repeatedly loading the same save-point, and noting that the trees and grass look different each time. It's called SpeedTree. The trees themselves are hand-placed, though the actual structure of them is generated each time. The grass can be "randomized" in fact, simply by spinning around 360 degrees.
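
A tiny sketch of that idea in Python (purely illustrative; SpeedTree's real algorithm and API are far more involved): the placement is fixed data, but the branch structure is regrown from a random seed, so two loads can look different.

import random

def grow_tree(seed, depth=3):
    # Same growth rules, different seed -> different branch layout.
    rng = random.Random(seed)
    def branch(level):
        if level == 0:
            return []
        return [(rng.uniform(-30.0, 30.0), branch(level - 1))
                for _ in range(rng.randint(2, 4))]
    return branch(depth)

# The tree's position is hand-placed; its structure is per-load.
print(grow_tree(seed=1) == grow_tree(seed=1))  # True: a seed reproduces
print(grow_tree(seed=2) == grow_tree(seed=1))  # False: new load, new look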

Quote:
and many games have reflective water.

Yes, plenty of games have reflective water. However, most only reflect the basic terrain layer and skybox, and do not include any objects; depending on the settings, Oblivion goes from including large building objects to potentially just about everything, which means that the game goes to drawing everything twice.

Quote:
Oblivion's engine is just very poorly optimized. No geometry occlusion during scene management is the most notable problem, wasting CPU cycles and precious bandwidth with overdraw. And the specular shader, which renders a cube map, means even more overdraw.

No, it's not poorly optimized; I've done extensive personal testing to find this out. The overdraw actually has a minimal impact in most places, and besides, using a Binary Space Partitioning system like the Source engine's, let alone a portal-based engine like Doom 3's, would prove impossible in such an open-structured game; those primarily work because they're put into "tunnel-based" games. It is the Z-based renderers used by Oblivion, Crysis, and Unreal Tournament 3 that allow them to have such an open area.

The actual performance drain occurs primarily from things such as the AI. This is demonstrated by the fact that simply eliminating the NPC activity provides the single largest boost to performance most can find in the cities, which are classically the worst-performing areas of the game. Similarly, approaching an Oblivion gate drops the framerate once the daedra around it spot you and combat begins, spurring them into full-scale combat AI mode.

It's not the fault of the graphics engine; it runs rather fine. The fact that it performs worse on most GeForce cards has nothing to do with how optimized it is. Such claims are a tiring, misled rant that I've been putting up with for about a year now...
March 21, 2007 8:51:48 PM

Quote:

Oblivion's foliage is most certainly procedurally generated, as one can find by repeatedly loading the same save-point, and noting that the trees and grass look different each time. It's called SpeedTree.


Aren't some of the ground textures even procedurally generated? IIRC I thought that some of the terrain was also like that, but it's been a while and I forget which is/isn't anymore.
March 21, 2007 9:08:09 PM

Quote:
Oblivion's foliage is most certainly procedurally generated [...] It's called SpeedTree. [...]

Yes, plenty of games have reflective water. However, most only reflect the basic terrain layer and skybox [...]

No, it's not poorly optimized; I've done extensive personal testing to find this out. [...]

I've looked closely, and I haven't noticed any procedural generation of tree foliage. As for grass, the placement of the grass in the grass clumps is randomized, but this isn't a very complicated (and therefore not very intensive) process.

Oblivion also only reflects the skybox and basic terrain (unless you modify the INI), and its performance is terrible.

If the performance hit from overdraw were as trivial as you put it, no modern games would implement occlusion culling. The problem isn't that the GPU can't handle the useless triangles. The problem is the batching overhead on the CPU.
Crysis is an open-area game, and it occludes hidden geometry (and even dynamically, it seems; look at the new Sandbox 2 demonstration videos). Though I'm sure you'll go and say "it's newer than Oblivion"... so how about Far Cry as an example? Open-area game, 2 years older than Oblivion, and it uses occlusion culling.

Quote:
Aren't some of the ground textures even procedurally generated? IIRC I thought that some of the terrain was also like that, but it's been a while and I forget which is/isn't anymore.
No, they're not.
March 21, 2007 9:23:49 PM

Quote:
Well R600 BEAT 8800 in 3DMark, so hopefully it will spank it more than your example of the x1900xt did... just hopeful speculation. :wink:

Linkage?
March 21, 2007 9:24:03 PM

Quote:
No, they're not.


Care to expand on that? IIRC erosion patterns and some dungeon parts were procedurally generated.

At this point you'd need to provide more detail, because based on your replies to NTK it seems like you're just guessing from observation.

I'd need something more to support your statement before I took it as anything more than a guess on your part.
March 21, 2007 9:44:41 PM

Quote:
No, they're not.

Care to expand on that? IIRC erosion patterns and some dungeon parts were procedurally generated. [...]

You could load up the TES Construction Set and find there's nothing in there about procedural generation of texture patterns and dungeons. Erosion, mass object placement, etc. in the game world is generated offline/precomputed. There are no Daggerfall-style (or even semi-Daggerfall-style) randomized dungeons in Oblivion (though I was hoping that they'd do something like that for the realms of Oblivion, the interior parts at least).
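
The offline/precomputed distinction in a nutshell, as a hypothetical Python sketch (the file name and format are invented stand-ins for an ESM/ESP): the procedural step runs once in the tools, and the game only ever loads the baked result.

import json, random

def generate_erosion(seed, size=8):
    # The "procedural" part: runs in the editor tools, not in the game.
    rng = random.Random(seed)
    return [[rng.random() for _ in range(size)] for _ in range(size)]

# Tool side: generate once and bake to disk.
with open("worldspace_baked.json", "w") as f:  # stand-in for an ESP file
    json.dump(generate_erosion(seed=42), f)

# Game side: just load the baked data; no generation code runs here.
with open("worldspace_baked.json") as f:
    terrain = json.load(f)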
March 22, 2007 12:00:56 AM

Quote:
You could load up the TES Construction Set and find there's nothing in there about procedural generation of texture patterns and dungeons. Erosion, mass object placement, etc. in the game world is generated offline/precomputed.


Well, now that I'm at home and have access to my laptop and old links, I found one of the articles that supports the generation of texture patterns for erosion, and it's from the developers, so I'll take their word for it:

http://www.gamechronicles.com/qa/elderscrolls4/oblivion...

"We're combining procedural generation of landscape (based on its soil type and years of erosion), trees (based on species and random growth clustering), and grass (base on regional patterns) to create some really amazing areas."

I'm certain there was also mention of dungeon features in another article, but it doesn't matter, because that statement from Todd Howard supports the idea that there is procedural generation in Oblivion, and it does include terrain.
March 22, 2007 1:23:33 AM

Quote:
I'm certain there was also mention of dungeon features in another article, but it doesn't matter, because that statement from Todd Howard supports the idea that there is procedural generation in Oblivion, and it does include terrain.
It's procedurally generated... offline. Precomputed. I know about the tools that do said landscape generation, texturing, and object placement; I've worked a bit with the TES CS before. You could reverse engineer the TES4 ESM/ESP file format and not find any procedurally stored data.
March 22, 2007 6:54:03 AM

Quote:
Let's hope this doesn't reflect real world gaming performance.


Oh, god forbid; it's a synthetic benchmark, so it's synthetic performance.
March 23, 2007 9:57:39 PM

Quote:
I've looked closely, and I haven't noticed any procedural generation of tree foliage. As for grass, the placement of the grass in the grass clumps is randomized, but this isn't a very complicated (and therefore not very intensive) process.

It is for a GeForce card. The benchmarks suggest that on the surface, and further testing bears it out.

Quote:
Oblivion also only reflects the skybox and basic terrain (unless you modify the INF), and its performance is terrible.

The performance of the water is pretty decent; comparable, from what I've tried, to how it is in almost any other game. I should note that even at the lowest level, it still reflects any object that shows up in LOD cells, such as cities and the large bridge near the Imperial City, which is more than can be said for the water in >90% of games, where they just get the basic land and skybox.

Quote:
If the performance hit from overdraw was as trivial as you put it, no modern games would implement occlusion culling. The problem isn't that the GPU can't handle the useless triangles. The problem is the batching overhead on the CPU.

I'm not saying that it's trivial; I'm saying that the performance impact is actually in the minority of what's causing the extreme slow-downs in many areas. Obviously, given that it is partly a CPU-load thing (I recognized that), reducing it could help, but much of the load is based upon frame-buffer bandwidth as well. Once you have a video card with decent memory bandwidth, most of the overdraw's performance hit vanishes. In this case, it's drastically reduced for the tile-based Xbox 360, and is also vastly reduced for a number of other cards that have particularly high bandwidth and efficiency for their segment.
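
A back-of-envelope Python calc (assumed buffer sizes and frame rate, not measured figures) of why overdraw is largely a bandwidth tax:

# Each extra layer of overdraw re-touches the color and depth buffers.
w, h, fps = 1600, 1200, 60
bytes_per_pixel = 4 + 4  # roughly: 32-bit color + 32-bit depth
for overdraw in (1, 2, 4):
    gb_per_s = w * h * bytes_per_pixel * overdraw * fps / 1e9
    print(f"overdraw x{overdraw}: ~{gb_per_s:.1f} GB/s of raw buffer traffic")

Texture fetches multiply on top of that, which is why high-bandwidth or tile-based hardware hides overdraw better.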

Quote:
Crysis is an open area game, and it occludes hidden geometry (and even dynamically, it seems, look for the new Sandbox 2 demonstration videos). Though, I'm sure you'll go and say "it's newer than Oblivion"... so how about Far Cry as an example? Open area game, 2 years older than Oblivion, and it uses occlusion culling.
Their occlusion is nowhere near as complete as most people make it out to be. For the most part, it is based upon the terrain; i.e., the terrain blocks things, and it checks for that. Most of Oblivion's overdraw problems are in cities, and stem from the buildings. Most FPS titles would simply use a BSP tree or the like, something that really can't be used in this game without WORSENING performance in plenty of other areas.
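
Terrain-based occlusion of that sort is simple enough to sketch (toy heightmap, not Far Cry's actual code): march along the line of sight and test whether the ground rises above it.

def terrain_occludes(heightmap, eye, target, steps=32):
    # eye/target are (x, y, z); heightmap[y][x] is the ground height.
    (ex, ey, ez), (tx, ty, tz) = eye, target
    for i in range(1, steps):
        t = i / steps
        x = ex + (tx - ex) * t
        y = ey + (ty - ey) * t
        ray_z = ez + (tz - ez) * t
        if heightmap[int(y)][int(x)] > ray_z:
            return True  # a hill blocks the view: cull the target
    return False

hills = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 0]]
print(terrain_occludes(hills, eye=(0, 0, 2), target=(3, 3, 2)))  # True

Buildings, on the other hand, need explicit occluder geometry or queries, which is the part missing in Oblivion's cities.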

Quote:
Aren't some of the ground textures even procedurally generated? IIRC I though that some of the terrain was also like that, but it's been a while and I forget which is/isn't anymore.
No, they're not.
While editing, you can use a few procedural generation tools to apply filters to the ground heightmap, particularly that of soil erosion. However, once it's saved to the .ESM/.ESP file, it's all done and ready, no more generation necessary.
April 15, 2007 7:24:45 AM

Quote:
Well, we can hope... All I know is that I want at least 20% above the 8800GTX, or else I'm waiting for the next generation :x


Yeah, G90 releases this fall, and R600 will have a nasty premium till then anyway. ATI really tripped up on this one; I was expecting a lot better than a 10% performance lead. And I'd just about bet nVidia will trim the prices on the 8800s after R600 launches.
April 15, 2007 8:37:45 AM

nV would have to trim a lot if they're pricing against the $400 R600; they'd have to cut $100 off their card to equal the price, and that's at less performance.

Either way in a few weeks it'll be an interesting marketplace.
April 15, 2007 9:09:18 AM

Going by the early tests, the performance gap between the R600 and G80 is not that big. We just have to see which one offers superior image quality.
April 15, 2007 9:19:20 AM

Well I doubt the IQ differences will be much. Even now the difference between the X1900 and GF80 wasn't that big. The biggest diff was AA levels, and with ATi adding 24X they should both have the 'more than you need' factor.
April 15, 2007 9:25:32 AM

Yeah, 24xAA is a good addition to the R600, but I don't really use much AA; at 1280x1024 I only go up to 4xAA, which is good enough for me. But there's also onboard sound and whatever features they are still keeping in the dark.
April 15, 2007 9:48:01 AM

Quote:
ATI could wait until August to release the R600 and it still wouldn't matter.

The market doesn't have enough DX10 titles to justify this jump in technology.

Instead of releasing mid-range R600 cards, they should just make R500 cards with DX10 support, like an X1900XT for $150-175.

I would've preferred Nvidia get off the whole "Ultra" branding, because most of their Ultra cards were spanked by ATI's offerings.
It doesn't work that way. :lol: 
Quote:
Well I doubt the IQ differences will be much. Even now the difference between the X1900 and GF80 wasn't that big. The biggest diff was AA levels, and with ATi adding 24X they should both have the 'more than you need' factor.
Yeah, I mean at 16x basically the only visible aliasing comes from the limited resolution of any monitor.
April 16, 2007 2:49:19 PM

Quote:
Well R600 BEAT 8800 in 3DMark, so hopefully it will spank it more than your example of the x1900xt did... just hopeful speculation. :wink:

Quote:
Linkage?

Maybe read the whole thread?

:roll:
April 16, 2007 7:16:05 PM

Quote:
Yeah, I mean at 16x basically the only visible aliasing comes from the limited resolution of any monitor.


And from a post I read by Humus (the ATi guru) in the B3D forums on alpha-blended AA: in order to appreciate the higher AA levels (above 8x IIRC), you need to ensure your monitor is properly adjusted. With normal LCDs you're going to run into a wall where added AA levels will not make a visible difference, which makes sense, since to render a half-covered pixel you need to blend the difference, and if the gamma, contrast and colour are off, the effect would be ruined at the highest end.
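
That half-pixel blending point is easy to demonstrate (toy gamma-2.2 numbers): blend edge samples in gamma space instead of linear space and the edge brightness comes out wrong, which is exactly what a badly adjusted monitor does to high AA levels.

# A white edge over black at 50% coverage, blended two ways.
GAMMA = 2.2
def to_linear(c): return c ** GAMMA
def to_gamma(c): return c ** (1.0 / GAMMA)

fg, bg = 1.0, 0.0
naive = (fg + bg) / 2                                    # gamma-space blend
correct = to_gamma((to_linear(fg) + to_linear(bg)) / 2)  # linear-space blend
print(naive, round(correct, 3))  # 0.5 vs 0.73: a clearly visible shift

Past a certain sample count the per-pixel differences get smaller than what a miscalibrated display can even show, so the extra AA buys nothing.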