Fugger and others, FPU performance
I haven't seen or found anything to support your claims about video rendering. Could you elaborate? I know the professional cards can do way more than any mainstream products out there, but not for actual precise rendering to disk. I don't believe the likes of Toy Story can be hardware rendered the way you describe, but I am open to your input. I thought this needed a new thread since we kinda took over the other one that wasn't about this.
I guess the main factor here is cost.
A URL to software that utilizes hardware rendering to disk at its best.
More real-time 3D, 2D, chroma, live, to-disk hardware-based rendering.
Intel based with Glint graphics ~ Onyx2
Not sure if you've seen these yet from the GeForce3 (I know this is not "render to disk" but it looks good).
Not sure why you put FPU in your subject.
Look around some; it's been around for years now.
Oh yeah, I forgot to talk about FPU. Obviously AMD is much better at the core of what the FPU is used for. I will use computer animation rendering as the main topic. It is a fact that SSE does not affect this use of the FPU. I don't understand why it can't, because I don't know too much about SSE except that it doesn't work in this case. Does anyone know if SSE2 would help?
Those images you showed me are good (for realtime). Sure, I'd love a video card like that so my previews ran faster, but not even professional cards replace the CPU for rendering.
Sorry, but you seem to be wrong. I am talking about advanced computer animation, not flying logos in realtime or really lame diagrams meant for game shows or TV news. Realtime is at least 2 years behind CPU rendering. You'll probably never see a video card handle rendering the way CPUs do, so your claim that the FPU for Intel sucks because you don't need it is in error.
Huh, full-length feature films don't count?
Complete scene morphing.
Whatever dude, stay lost in your shallow world. It's been possible for years. Just because you can't afford it, or have never seen it for that matter, doesn't mean that it doesn't exist.
If you think professional studios still render frames of animation with software (complex frames @ 2+ minutes per frame), then preview, tweak, re-render, view, tweak, preview, you're fricken nuts.
The links I posted were apps for SGI Onyx2 w/ 700+ Gbs on the bus.
That has nothing to do with Intel FPU.
"I am taking advanced computer animation" AHAHAHAHAHA, more like a 3D studio max class.
What modeling program are you using? I need a good laugh.
How can I put this politely? Fugger, you don't know what you are spouting about. Houdini uses a software renderer based on the RenderMan Specification. All Onyxes use MIPS processors.
SGI just recently started to use Nvidia chips in their NT systems, following a patent infringement settlement. The graphics of their higher-end systems are completely proprietary.
There isn't a movie that has been hardware rendered; it is impossible at this point in time. The resolution just isn't high enough with hardware rendering. Hardware rendering is appropriate for training fighter pilots and examining the structure or style of a car interior. The best hardware rendering is slightly better than the best video game, but nowhere near movie resolution. Most studios, ILM, Digital Domain and so on, use RenderMan, which is very fast at extremely high resolutions. Single frames of Toy Story 2 run as high as 2 gigs; this is why Pixar has approx. 2000 processors: ten 100-processor Sun servers, approx. 300 Octanes and 300 Octane2s, and a host of others.
The info is out there, Fugger. You just have to read it.
Does fugger have a clue or not? Can we settle this? I'm sick of going back and forth. Since I know jack I'd love to know...
Look at how far we've come with Toy Story 2 and Dinosaur and the upcoming Final Fantasy movie. Look at all the CG out there. What's making it?
"I think I brained my damage"
Another important aspect of CG for broadcast and film is the 2D work. Most things, for example Stuart Little, are rendered out in layers then composited. It is easier and faster to separate the layers. That way, if corrections need to be made, the whole scene doesn't have to be re-rendered. 2D animation and compositing account for a lot of what is finally seen.
More info can be found on:
Since you don't believe me, and you've got gimps chiming in for no other reason but to discredit me.
"The SGI Onyx 3000 series is the premier choice for high-end TV and film production and post-production, real-time broadcast effects, digital museums and theme parks, and real-time processing of high-definition satellite images. Graphics and digital media-optimized processing, together with an architecture created for high throughput, allow users to work with high-definition uncompressed video or film-resolution images, create multilayered 2D and 3D effects, and edit simultaneous streams of standard or high-definition video in real time. The scalability of the SGI Onyx 3000 series allows small and large entertainment complexes to flexibly create, share, and tailor small exhibits all the way up to full-scale Digital Planetariums and interactive theme parks. "
Your brain is small and you're thinking on the "PC platform" level. You're limited by your budget too. So do some research before you post stupid stuff again.
All that and high definition too "real time", gotta love it.
Go back to modeling and rendering your simple scenes.
Why don't you just say it straight: Fugger is a dick who thinks he is right about everything and will never listen to another person's opinion without insulting them first. Fredi, it's about time you banned him; he doesn't help anyone, just likes to get into arguments.
- "I don't write Tom's Hardware Guide, I just preach it"
Well, if you have read some of FUGGER's work before this, or a lot of it like myself, you would know that FUGGER works for SGI and that he doesn't seem to play well with people that question Intel. Which is sad but very fun at the same time. As far as banning him, it will never happen. I love reading his posts and so do others. Look at the views and posts; anything he is in draws a crowd. (Very good job FUGGER, even if that is not your intention.)
Opinions are like assholes, everyone has one and most of them stink
Dude, the 500+ processors they use with that don't need FPU, do they? If those were filled with Athlons it would be way better, you can be sure of that. Just because the P4 doesn't have any FPU worth mentioning doesn't mean you have to say the world doesn't need FPU. Face it, Intel [-peep-] up.
If you want to rest your case on Intel releasing the P4 crippled to get everyone used to it, then go ahead. If that is the case, then Intel shouldn't get away with it. They pretty much make you pay for overpriced BETA products. You paid hundreds on Rambus, hundreds on the P4, and you got [-peep-] performance, and the only thing you can say about it is "wait for .13 micron". Why? So you can pay another $800 for a product you should have gotten the first time around?
As it is right now, the Athlon is the best CPU around, and AMD has fair prices, which adds to the people that go AMD. Until Intel finally releases a CPU in the form it should be, don't talk about the P4 anymore. As far as I'm concerned it is just a beta product that Intel fools people into buying, whether it is gullible people that follow the blue men like drones or morons like you.
I'm glad that shut you guys up. I'm happy to prove you wrong yet again.
FPU, blah blah blah.
FUGGER, blah blah blah
Just a bunch of whiny bitches.
The Athlon will never be considered in our clustered environments. AMD makes a shitty CPU without thermal protection = no good for clusters = no good for mission-critical applications.
The AMD Hammer "software simulator" is the bomb. Much better approach than a final-beta P4.
At least developers had a chance to start early on SSE2.
I'm glad Intel's future is not in the hands of some simulator. It would have been a lot easier to make a Macromedia movie. =)
This is some info on the upcoming Shrek movie from PDI using RenderMan.
Hell of a lot of wimpy machines, Origins with R10k at 180MHz, woohoo.
That is peanuts compared to over 700 gigs per sec of bandwidth, from a single box, not 836 boxes. At least they are using a fibre network. That page you posted must be old, because the O2 is, um, the wrong/sad choice for a render farm. The R10k is an older CPU, still decent, but the R12k owns it hardcore.
Back to AMD in clusters, one major flaw you overlooked. It's called SMP, yeah, that ugly acronym that AMD cannot touch. GG, have a nice day.
Bandwidth is nice, but many things take FPU as a priority before X amount of bandwidth. A job can use half the bandwidth a CPU can put out but need an unlimited amount of FPU power. The day Intel releases a CPU with the highest FPU power will be the day I say they are better. Intel needs to stop relying on software to power their CPUs. If they have a trick to change the way software is coded so it runs faster on their chips, then great, I'm looking forward to seeing the increase when the software is out, but don't rely solely on that.
Just a comment from the outside, because I don't know much about anything you are talking about and don't know you, fugger. BUT, just from reading this, I have to ask you, fugger: are you this immature in real life, or is it just an internet thing?
You can still be right about things without being cocky and demeaning to other people. You must be a very unhappy person...
Even if you were the smartest person in the world, when you act like you do, people don't like you or respect you. It's not worth it to be unkind to other people, ever, especially not over subject matter such as this, which is really relatively unimportant when you put it into perspective.
Danny, I'm very well respected.
Not sure why people take what I write and get so worked up over it. But I guess I do a good job at messing with the AMD lemmings, pushing all the right buttons, judging from the replies I get.
Telling me real-time compositing and 3D rendering is not possible is like me telling you that man never visited the moon. Would you agree that man has never been to the moon?
Seems stupid, but AMD lemmings try to argue a point based upon what they don't know just to discredit me. I love those the best.
I post links to back up what I post. I just don't post stuff like...
"Danny, you're a moron, you suck and you don't know what you're talking about; Danny, you're stupid and don't know your ass from an AMD salesman."
That is quite typical of the crap people try to discredit me with.
I am calm, I am mature. Just read between the lines and you will see a picture of everyone vs. me and AMDmeltdown. You will see AMD lemmings chiming in to back each other up over stupid stuff like "AMD does not have a thermal problem" and "AMD does not have incompatibility issues", when in fact both problems are an AMD joke; AMD is unable to add core logic to protect your investment.
Fugger, I am trying to have a normal conversation with you. I'm not a lemming, so please don't treat me like one, though I haven't taken any offence since you haven't seemed like that toward me. Agreed that there are a lot of people that are morons, but that aside.
I didn't mean that good realtime 3D isn't possible, but it still can't compete with the software renderings you see. What I am trying to point out here is that FPU is very important, especially with today's professional applications. If that machine was loaded with 500+ processors with better FPU, whether from AMD or Intel, then it would be many times better. You can't dispute that.
I'm just trying to say that Intel messed up with the P4 because it has a crippled FPU and they tried to rely on SSE2 alone. SSE with the P3 was good, but the P3 still had the best FPU when it was released. I bought it knowing that but was disappointed with the results of SSE. Now that SSE2 is out, it will still be quite some time before software takes advantage of it.
Saying that today's benchmarks aren't fair isn't right. SSE/2 should be treated as a luxury because it is one; it may be an advantage to your app or it may not.
I like AMD because of its pure FPU. I don't own one but want to. If the tables were turned, I'd like whatever company had the best pure FPU. As it stands, AMD is the best because of the FPU (others may see Intel chips as the best for other reasons).
Unless Intel's real P4 comes out (not the crippled beta Intel suckered people into buying), it had better have the same pure FPU as AMD, or they will lose a lot of sales from people that need FPU like me. If SSE2 gives a big FPU boost like Intel says it will, then that will be great, but even if it makes the P4 perform as well as the Athlon, I still won't be getting it because of the poor base FPU. An Athlon with SSE2 will be something I am going to be looking for in the future.
Well, you shouldn't say stuff like the P3 FPU "sucks". You know the P6 core is 6 years old. Let's see, I say it's one of the best FPUs ever made, even better than the Athlon core will ever be. Athlon is a good core, but 1995 is so far away.
P3 500e@1ghz/i815ep/386mb/36gb/sb mp3+/radeon32mb :cool: k6-2 300/430tx/64mb/13gb/sb512/Sav4 32mb
You don't know your ass from a hole in the wall. You haven't got the slightest clue of how an FX studio works or how movies are actually made.
On one level, there is no way that you could work for SGI in any other capacity than janitor. But on another level you could, because the only thing in SGI's sights are its own feet. SGI's bent for self-destruction is only matched by your ignorance.
I read all the article links you posted, but I'm still confused as to why you think cinema-quality 3D rendering (like that used in movies ranging from Jurassic Park to A Bug's Life to Disney's Dinosaur) is done in hardware.
They use complicated 3D physics and calculations. Have you seen any 3D hardware that claims to do full raytracing with Phong shading?
A proper answer would be appreciated.
"2 is not equal to 3, not even for large values of 2"
For those of you following this thread, this is a link to one of the last pages in Ace's Hardware's review of the P4. It deals specifically with rendering and wireframe manipulation.
Quote:Back to AMD in cluster, one major flaw you overlooked. its called SMP. yeah that ugly acronym that AMD cannot touch. GG have a nice day.
That's so, eh? Well, I will be laughing at you, paying four times as much as me for your dual P4 system, while I sit here with my dual Thunderbird 1.2GHz system, kicking your ass all the way back to Satan Clara.
"640kb is all the space anyone would ever need!"
Bill Gates, 1980s