Shader Model 4.0 or DX10 question

illuminatirex

Distinguished
Jan 23, 2006
1,149
0
19,290
Hi, I've been following the development of the Project Offset game for some time now (http://www.projectoffset.com/game.html). Does anyone know whether the units look the way they do because of Shader Model 4.0? Or is it still Shader Model 3? Or maybe they were done in DX10 (emulated) and that's why they look so good? I'm a newbie when it comes to shader topics. The images on the site come from an in-game video, and the two others were made during development. I know the Offset Engine uses a fully 64-bit floating point HDR rendering pipeline; is that the reason the units and environment look so nice?
 

hannibal

Distinguished
Check the forum on their site. If it isn't there, they probably aren't talking about any specs yet.


"Project Offset"
- Console and command parsing system
- OS support layer (everything that deals directly with the underlying operating system)
- File system, with full mod support
- Resource management / hash table lookups
- Profiler / performance stat output
- Generalized "Buffer" class, used for just about every type of buffered IO in the engine (file, network, memory)
- Texture manager

The texture manager gives a few insights into the kinds of features you'll be seeing in the renderer (a rough interface sketch follows the list). It supports the following features:

- Reads .tga, .dds, .png, and .jpg formats
- Writes .tga and .jpg formats
- Supports DXT1, DXT3, and DXT5 compression
- Supports cube maps and volume (3d) textures
- Supports floating point (64 bit and 128 bit) textures
- Supports normal maps
- Internally converts between image types
- Supports Industrial Light & Magic's OpenEXR library (which includes 16-bit floating point support and is used in some of the image conversion routines)
- Supports render targets and dynamic textures
- New formats and conversion routines can easily be added when needed
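
Just to illustrate what that feature set implies for the code, here's a rough sketch of what such a texture manager's interface might look like. This is purely hypothetical: the class and method names below are mine, not the Offset Engine's, and the bodies are left as declarations.

```cpp
// Hypothetical sketch (not the actual Offset Engine API) of a texture
// manager interface covering the features listed above.
#include <cstdint>
#include <memory>
#include <string>

enum class TexFormat {
    RGBA8,                // 8 bits per channel, 32 bits per pixel
    DXT1, DXT3, DXT5,     // block-compressed formats
    RGBA16F,              // 64-bit floating point (4 x FP16)
    RGBA32F               // 128-bit floating point (4 x FP32)
};

enum class TexKind { Tex2D, CubeMap, Volume };

struct Texture {
    TexKind   kind;
    TexFormat format;
    uint32_t  width, height, depth;  // depth > 1 for volume textures
};

class TextureManager {
public:
    // Reads .tga/.dds/.png/.jpg (and EXR via ILM's library), converting
    // internally to the requested format when they don't match.
    std::shared_ptr<Texture> load(const std::string& path, TexFormat wanted);

    // Writes .tga or .jpg, chosen by the file extension.
    bool save(const Texture& tex, const std::string& path);

    // Render targets and dynamic (CPU-updatable) textures.
    std::shared_ptr<Texture> createRenderTarget(uint32_t w, uint32_t h,
                                                TexFormat fmt);
};
```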

- SM 3.0 and full floating point support
- Fully 64-bit floating point HDR rendering pipeline
- All objects in the world are affected uniformly by shadowing, with correct self-shadowing on characters and other complex objects. Shadow softness and resolution can be controlled on a per-light basis.
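
As a concrete (and entirely hypothetical) reading of that last point, per-light shadow control usually just means each light carries its own shadow parameters rather than sharing global ones, something like:

```cpp
#include <cstdint>

// Hypothetical illustration of "softness and resolution controlled on
// a per-light basis" -- not actual Offset Engine code.
struct ShadowSettings {
    uint32_t mapResolution = 1024;   // shadow map size for this light
    float    softness      = 0.5f;   // filter radius / penumbra width
    bool     enabled       = true;
};

struct Light {
    float          position[3];
    float          color[3];
    ShadowSettings shadow;           // tweakable per light
};
```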
 
Yeah, it just looks like SM3.0 right now, but like so many other games it'll likely move into the SM4.0 realm before launch, once all the artwork and gameplay is finished. It's far enough off that I'd suspect it'll be a bit of both.

BTW, 64-bit HDR in this case is not 64 bits per channel; it's just the standard FP16 per channel.
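
To spell out the arithmetic: four channels at 16 bits each (FP16) gives 64 bits per pixel, which is where the "64-bit HDR" figure comes from. A quick sanity-check sketch:

```cpp
#include <cstdint>
#include <cstdio>

// "64-bit HDR" = 4 channels x 16 bits (FP16), i.e. 64 bits per PIXEL,
// not per channel. FP16 storage is shown here as raw 16-bit words.
struct HdrPixel64 {
    uint16_t r, g, b, a;  // each holds an IEEE 754 half-float bit pattern
};

int main() {
    static_assert(sizeof(HdrPixel64) == 8, "4 x FP16 = 8 bytes = 64 bits");
    std::printf("bits per channel: %zu\n", sizeof(uint16_t) * 8);    // 16
    std::printf("bits per pixel:   %zu\n", sizeof(HdrPixel64) * 8);  // 64
    return 0;
}
```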
 

illuminatirex

Distinguished
Jan 23, 2006
1,149
0
19,290
Thanks for the info. Would the X1900 XT (non-CrossFire) be able to recreate images similar to the ones in the demo videos (if the game remains a Shader 3.0 game), or would it be rather unlikely to get those results in a regular game (with 2 GB of RAM and a 2.4 GHz Core 2 Duo CPU)?
 
Couldn't say for sure, but it doesn't look like they're doing anything outside of DX9.0c. The self-shadowing in the GDC demo is impressive, but considering it's all supposed to be real-time, it's got to be DX9.0c for now: there's no DX10 hardware available to render on in real time yet, only CPU emulation, so it has to be running on DX9 hardware. And based on their relationship, I'd think it was done on nV hardware.
 

illuminatirex

Distinguished
Jan 23, 2006
1,149
0
19,290
Yes, it was done using Nvidia's 7900 I think, or maybe the SLI of those VGAs.

Thanks for the info. So I guess the X1900 XT with the 2 GB of RAM and the 2.4 GHz Core 2 Duo could handle it, shouldn't it?
 
Yeah, it should handle it fine, and maybe give you OpenEXR HDR + FSAA, which nV can't do 'well'. The limitation for ATI may be Render2Vertex, but we'll see if it's even a concern if they support the workaround (you always wonder with TWIMTBP games).

I'd be interested in seeing the physics implementation they plan on using. The motion blur demo in last year's example was also interesting, but it made me wonder about the motion blur: it's supposed to aid the fluidity of motion, not make it noticeably blurrier, and it should be virtually unnoticeable when running in real time. The first exposure of those blocks looked like they had smoke trails coming off them. Motion blur should lead into one frame and trail the next, but it all seemed to be trailing blur. Other than that, it should be good.
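
To make the lead/trail distinction concrete, here's a hypothetical sketch (nothing from the actual demo): centered blur distributes samples on both sides of the object's current position, so it leads into the next frame as much as it trails the previous one, while trailing-only blur smears entirely behind the object, which is what produces the smoke-trail look.

```cpp
#include <cstdio>

// Generate 1D blur sample offsets for an object moving with the given
// per-frame velocity. Centered blur spans -v/2..+v/2 around the current
// position; trailing-only blur spans -v..0 (everything behind).
void blurOffsets(float velocity, int samples, bool centered, float* out) {
    for (int i = 0; i < samples; ++i) {
        float t = (samples > 1) ? float(i) / float(samples - 1) : 0.5f;
        out[i] = centered ? velocity * (t - 0.5f)   // -v/2 .. +v/2
                          : velocity * (t - 1.0f);  // -v   .. 0
    }
}

int main() {
    float off[5];
    blurOffsets(2.0f, 5, true, off);   // centered: -1 -0.5 0 0.5 1
    for (float o : off) std::printf("%.1f ", o);
    std::printf("\n");
    blurOffsets(2.0f, 5, false, off);  // trailing: -2 -1.5 -1 -0.5 0
    for (float o : off) std::printf("%.1f ", o);
    std::printf("\n");
    return 0;
}
```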

Oh, and PS:

Yes, it was done using Nvidia's 7900 I think, or maybe the SLI of those VGAs.

Except that the first demo was in the middle of last year, prior to the GF7900, so it's likely a GF7800 in the tech demo, and then a GF7900/GF7950 for the Dragon GDC demo.
 

illuminatirex

Distinguished
Jan 23, 2006
1,149
0
19,290
:D Nice, so I might be able to play it relatively well (with relatively high graphics)... hopefully. BTW, what is Nvidia's counterpart to the X1900 XT? Is it the 7900 GT? (I'm not much into Nvidia... sorry.)
 
Yeah, it'll be nice on either of them really.

And while they don't line up directly in everything, the competing parts are pretty much:

X1900 XTX = GF7900 GTX
X1900 XT = GF7900 GT
 

illuminatirex

Distinguished
Jan 23, 2006
1,149
0
19,290
Thank you for the Nvidia/ATI comparison. I always got confused by Nvidia's naming, but I guess that's because I almost always paid attention to ATI; with Nvidia, I just knew they had a better or worse GPU at a certain time. I think that's because of my past issues with the 5500 series GPU (I forget its exact name), which I got right after it was released. It had some issues; I didn't know if they were with XP (which was relatively new then) or something else. Then I got an ATI card, which I'm still using (the ATI Radeon 7500 All-In-Wonder), and it worked fine the first time I installed it.
 
A lot of the members of their forum are under the impression that it's actually OpenGL-based.

Could be, but most of the stuff I read, including the comparison to Oblivion, mentioned DX.

Regardless, the technology to support the demo would be found in both ATI and nV hardware; it'd be more about playing to the extensions and speed than anything else, which may come into play depending on the Z handling.
 
If it is, it might not work on Vista, would it? I read somewhere that OpenGL might not be supported in Vista.

Vista will support OpenGL; it's just a question of at what level and with what performance penalty, if any.

OGL 1.5 is embedded in Vista, and 2.0+ will be supported through a compatibility layer, which may result in some performance drop compared to OGL 2.0+ on XP, but no one knows for sure yet. Some people are saying M$ is bending to the will of gamers and developers (now they're afraid of losing customers since they're already delayed?), but we'll see when it ships.
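
For what it's worth, an application can check at runtime which GL version the driver (or compatibility layer) actually exposes. A minimal sketch, using GLFW purely for context creation (my choice of library for the example, not anything Vista-specific; any way of creating a GL context works the same):

```cpp
// Query which OpenGL version the installed driver exposes.
// Build (Linux example): g++ glver.cpp -lglfw -lGL
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    // Invisible window: we only need a current context, not rendering.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "glver", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    // GL_VERSION reports what the driver/compatibility layer provides,
    // e.g. "1.5" from a fallback layer vs. "2.0"+ from a full ICD.
    std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```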