Forget the mole. It makes her look more like a real person with a slight imperfection on her face. If you want to complain about details, the overall skin texture/makeup makes her look more like a store mannequin than anything else, especially in the second pic.
Maybe it's not a mole, sunspot, or birthmark; lol, it could be an artifact. Wouldn't that suck, hahaha.
Do you think we will ever see the day when a video card is made that is so powerful it gets bottlenecked by even the fastest CPU? Dual/quad core or not?
Can someone please tell me why the skin on all of these models shimmers? It looks pretty cartoonish in the second and third pictures, but the first one looks pretty good, especially in high resolution:
I have said this many times before, but pictures like that mean NOTHING AT ALL. Still shots like that can be rendered on a GPU from 10 years ago; it may take a while to render, but it will still look the same. Still shots and supposed DX10 shots like that are totally meaningless. What actually matters is being able to render that in real time.

I used to use 3D Studio Max, and you could do shots like that all day long, but rendering a video of that in real time is a different story. What really counts is actual gameplay, not a picture of what a card can supposedly do. 3DMark and the like are fine, synthetic or not, but all the hype over Ruby is pointless and tells you nothing about what a card is capable of. I would bet you money that Ruby was created in 3D Studio, Maya, or a similar program, meaning that pretty much any card on the market can render the exact same shot regardless of its DirectX version. It is all about the program itself, not the card. An old card might take all day to render the SAME picture instead of, say, a few minutes on a current-gen card, but the fact remains that both cards can render the same shot, and it has nothing to do with what version of DirectX they're using.

You could make any game with DX10-level detail and graphics, and it would run on any card you want (regardless of whether it's a DX10 card); being able to render the game in real time is the only thing that is different. The same goes for any videos showing Ruby (if there are any): they were produced in the same 3D program and again mean nothing, since it could have (and probably did) take a few hours to render the video itself, which therefore means nothing in the real world. Show me real-time rendering or shut up.
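To put some rough numbers on the real-time point, here's a quick back-of-the-envelope sketch. The 60 fps target and the 5-minutes-per-frame offline render time are assumptions for illustration, not figures from the thread:

```python
# Compare the real-time frame budget to an assumed offline render time.
TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS  # time allowed per frame in real time

# Hypothetical: an offline renderer takes 5 minutes for the same frame.
offline_frame_s = 5 * 60
speedup_needed = offline_frame_s / (frame_budget_ms / 1000)

print(f"Real-time budget: {frame_budget_ms:.1f} ms per frame")
print(f"A 5-minute offline frame is {speedup_needed:,.0f}x too slow for real time")
```

So both the old card and the new card can produce the same still; the gap only shows up when you have to deliver a new frame every few milliseconds.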
I have said this many times before, but pictures like that mean NOTHING AT ALL. Still shots like that can be rendered on a GPU from 10 years ago; it may take a while to render, but it will still look the same.
But that's not a still shot :roll: ; it's actually from the demo shown at CeBIT.
And I doubt they would show you pre-rendered stuff as an official ATI demo, since 99% of ATI demos are later released to the public to test out on their own cards. Also, you'll notice fluctuation in the frame rate, and I doubt that would be there if it were pre-rendered.
I have said this many times before, but pictures like that mean NOTHING AT ALL. Still shots like that can be rendered on a GPU from 10 years ago; it may take a while to render, but it will still look the same.
I see your point, but you're sadly once again talking bullsh!t.
It can be done. Now don't get me wrong, I am not saying anything bad about the new R600, only about the current obsession with pictures that are supposed to show what a card can do. I did watch the video posted earlier, and it was pretty good, but not all that much better than what could be done on current-gen hardware. The exception was the last scene with the aforementioned dike lips shot; I personally have a hard time believing that was rendered in real time, but I could be totally off base. The snow effects were great, though. It is just that the pics from DX10 games and such that have been floating around are a bit overhyped. I think DX10 will help developers program games to better use the hardware, but any developer could program a game to get the same visuals; it just wouldn't run at a high enough fps to be playable.