
Geforce3 Ultra vs. ATI 8500

Anonymous
August 25, 2001 6:20:13 AM

Wow!! I have heard some buzz about Nvidia announcing successors to the GeForce3, namely a GeForce3 Ultra. If this is in fact true, please help me find some information about it.
But what I am really interested in is whether it will be worth waiting for, or whether the ATI 8500 looks like the better choice. I am really torn because of my devotion to Nvidia, but ATI is a great company as well.

I heard the ATI 8500 can bend polygons, which Nvidia's current card cannot do. And I have heard it has higher bandwidth, more room for longer instructions, and better video quality.

I will mostly be using the card for gaming, but I do watch a lot of DVDs on my computer.

Another quick question: would it be possible to have the best of both worlds (in my opinion)? Could I run a GeForce3 Ultra card and still have TV inputs/outputs with a TV Wonder card from ATI? Would that solve the issue of great gaming and great DVD viewing?

Thanks for any information!

Andrew P.


August 25, 2001 6:33:15 AM

I've heard nothing at all about a GeForce3 Ultra; where did you hear this?
The ATI 8500 has better features than the GeForce3, and will probably be faster than it by the time it's released.
The real question is: do you want to wait for Nvidia's next release? There will ALWAYS be a better card just over the horizon.

"Ignorance is bliss, but I tend to get screwed over."
August 25, 2001 9:23:46 AM

Rumours have been going around the web, and they seem well founded: nVidia is soon to release an Ultra and an MX version of the GeForce3. For the first time, the MX version is supposed to have 128-bit DDR memory.
The Ultra (or Pro; the naming is still not final) is an increase in core and memory frequencies over the regular GeForce3.



How terrible is wisdom when it brings no profit to the wise
August 25, 2001 12:42:30 PM

I heard the GeForce3 Ultra will be clocked lower than the Radeon 8500, so the Radeon 8500 might beat it.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
August 25, 2001 3:25:35 PM

I heard that there will be two cards based on the NV25: the GF3 Ultra and the GF3 Advanced. As for the specs, I know nothing but rumors.


Tomorrow I will live, the fool does say
today itself's too late; the wise lived yesterday
Anonymous
August 25, 2001 3:36:52 PM

I heard about it from this discussion board, I believe?
August 25, 2001 5:24:46 PM

The Radeon 8500 MAXX is their answer to the GeForce3 Ultra. It uses some newer things to get it running, like a chipset managing it instead of Windows. HyperTransport will also be used. It's rumored to have around 17.6 GB/s of bandwidth. Sorry Wusy, there is going to be a new champion of the 3D world.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
August 25, 2001 5:55:52 PM

A GeForce3 Ultra can't compete with the Radeon 8500 MAXX; in some cases the Radeon 8500 MAXX may be twice as fast. The GeForce3 technology is a case of too early, too little: Nvidia released the GeForce3 too early and didn't add support for DX8.1. ATI had a few months to learn from Nvidia's mistakes and its own.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
August 25, 2001 6:04:34 PM

The Radeon 2 MAXX could own the GeForce3 Ultra and the GeForce4.

Imagine the R300!

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
August 25, 2001 6:56:19 PM

Something to consider is that the Radeon 8500 is a HydraVision-capable video card, meaning you can hook up two monitors at once, a monitor and a TV set, or even an HDTV set using a DVI-to-component output adapter. That means watching high-quality DVDs, or playing your games on an HDTV screen of 65" or bigger, is possible while using your monitor at the same time.

The superior DVD playback of the Radeon 8500 comes from the GPU itself rather than a separate ATI video chip; how ATI handles the MPEG-2 decoding of a DVD is the reason for the superior playback. ATI does both motion compensation and iDCT on the card, which not only reduces CPU usage (less important now with today's more powerful CPUs) but allows for a 9-bit error term rather than the truncated 8-bit error term used by non-ATI cards, meaning better video quality. Using the iDCT engine also reduces the data that needs to be transferred over the AGP bus, and the freed-up CPU cycles enable advanced video features such as time-shifting and digital video recording. In addition, the ATI Radeon/8500 uses a 10-bit DAC rather than an 8-bit DAC, meaning much better color in video, something only high-end DVD players have.
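
To make the error-term point concrete, here is a little Python sketch (my own illustration, not ATI's code; the pixel values are made up): MPEG-2 iDCT residuals span a 9-bit signed range, and a decoder that only carries an 8-bit error term has to clip them before adding them to the motion-compensated prediction.

    # Illustration only (not ATI's code): MPEG-2 iDCT residuals ("error
    # terms") are 9-bit signed values in [-256, 255]. A pipeline that only
    # carries an 8-bit term must clip the residual first, losing detail on
    # high-contrast blocks.

    def reconstruct(prediction, residual, term_bits):
        """Add an iDCT residual to a predicted pixel, carrying the residual
        at the given signed bit width, then clamp to the 8-bit video range."""
        lo, hi = -(1 << (term_bits - 1)), (1 << (term_bits - 1)) - 1
        clipped = max(lo, min(hi, residual))      # precision limit of the term
        return max(0, min(255, prediction + clipped))

    print(reconstruct(30, 200, term_bits=9))      # 230: full detail preserved
    print(reconstruct(30, 200, term_bits=8))      # 157: residual clipped at 127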

Another aspect is image quality: buying a high-end card should mean high-end image quality. On Nvidia-chipped cards this hasn't always been the case; the image quality varies tremendously between the different manufacturers of Nvidia cards. Right now only the Gainward GF3 has 2D quality comparable to the Radeon, while all the other GF3 cards fall short. This was tested by AnandTech, comparing the 2D image quality of different video cards. Now, no one can tell me that if the 2D is blurry, the 3D will be crystal sharp. If the 2D is blurry at high resolutions, then any detailed texture at high resolutions will also be blurry in 3D, because it is still going through the same video circuit, RF filters, and DAC.

To me, performance is a broad area and not limited to FPS: 2D, video, DVD, color quality, image sharpness, drivers, reliability, support, features, and FPS together make up overall performance.

2d - Radeon 8500 hands down
Video - Radeon 8500
DVD - Radeon 8500
Color quality - Radeon 8500
Image sharpness - Radeon 8500
Drivers - GF3
Reliability - about equal
Support - ATI in general has better support: try getting support from Asus or Abit on a video card failure, while ATI will replace theirs. Some Nvidia cards have better support than what ATI provides, such as Hercules. Rage3D.com is probably the best support group of any video card out there, dedicated to the Rage/Radeon cards.
Features - ATI Radeon 8500, by far
FPS - the Radeon 8500 looks hot; some benchmarks will still be faster on the GF3 while others will be faster on the Radeon 8500.

My biggest beef is with people saying "my card is better than yours because I can play Quake III at 95 FPS while you can only play at 85 FPS; your card is junk, no good, etc.," yet in another game that same card plays at 35 FPS while mine plays at 33 FPS. Well, if my card was junk playing at 85 FPS in Quake III, then your card must also be junk now, playing at 35 FPS or below. In other words, if 85 FPS wasn't sufficient, why should 35 FPS later on in a different game be sufficient? Bottom line: if your game plays smoothly for you with all the features turned on that you want and looks awesome, then who cares if the competing card can do 100 FPS more but looks like crap. FPS idiots, get a clue.

ATI's TV Wonder cannot compensate for the poor DVD decoding of non-ATI graphics chips; some people are pickier than others. It does offer a tuner, video in/out, software for recording, etc.

Edited by noko on 08/25/01 03:00 PM.
August 25, 2001 8:48:57 PM

Even if they are equivalent in terms of core and memory frequency, the RD8500 may be superior to the GF3 (even in an Ultra version) due to a more advanced feature set and some other tricks it performs (bandwidth-saving techniques, etc.).

But it will depend strongly upon drivers, and ATI's track record isn't all that brilliant in that area.

How terrible is wisdom when it brings no profit to the wise
August 25, 2001 9:10:30 PM

Quote:
It uses some newer things to get it running, like a chipset managing it instead of Windows.

Could you please be a bit more clear on this and explain this feature?

BTW, HyperTransport is going to be used for what? I really don't see an application for it within a high-performance video card; I doubt even AMD would see one :wink:

Quote:
It's rumored to have around 17.6 GB/s of bandwidth.

Don't think so. For the card to have that kind of actual bandwidth, it would have to have:
-550 MHz DDR RAM (actual frequency, more than twice the current GF3's);
or
-275 MHz QDR memory (highly unlikely; QDR hasn't appeared at even 100 MHz);
or
-a dual 128-bit (256-bit) memory interface and memory banks;
or
-embedded RAM.

As none of the above is likely (or even hinted at in the RD8500 previews), you may be referring to an "effective bandwidth" concept for comparison with traditional renderers: the bandwidth-saving techniques can (in ATI's opinion) double effective memory bandwidth, a bit like tile-based architectures do. Then this 17.6 GB/s figure could make some sense.
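
For reference, the arithmetic behind those scenarios is just bus width times transfers per clock times clock rate. A quick Python sketch (my own illustration; the figures are the rumored ones, not confirmed specs):

    # Peak memory bandwidth = bus width (bytes) x transfers per clock x clock.
    # Rumored figures, illustration only.

    def bandwidth_gb_s(bus_bits, transfers_per_clock, mhz):
        return bus_bits / 8 * transfers_per_clock * mhz * 1e6 / 1e9

    print(bandwidth_gb_s(128, 2, 550))   # 17.6  (550 MHz DDR on a 128-bit bus)
    print(bandwidth_gb_s(128, 4, 275))   # 17.6  (275 MHz QDR)
    print(bandwidth_gb_s(256, 2, 275))   # 17.6  (dual 128-bit DDR, assuming 275 MHz)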




How terrible is wisdom when it brings no profit to the wise
August 25, 2001 9:32:55 PM

The Radeon 8500 MAXX will use a technique similar to a dual 128-bit (256-bit) memory interface. Read about the old Rage Fury MAXX: it had twice the bandwidth of the Rage Fury Pro, because each engine shared the bandwidth.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
August 25, 2001 11:35:30 PM

The GF3 does actually support DX8.1. The shortcomings were in fact the other way around: DX8 didn't support 3D textures, pixel shaders above version 1.1, etc. The GF3 supports up to 1.3 and the R200 supports up to 1.4.

Saying the GF3 doesn't fully support DX8.1 because it has no n-patches would not be entirely true. Both DX8 and DX8.1 require the graphics card to be able to implement higher-order surfaces. While DX8 only allowed the polynomial model of HOS, 8.1 added support for n-patches. ATI chose this method instead of the polynomial route, which is what the GF3 implements.

Now, I don't think many games will actually use any shaders above 1.1. As for higher-order surfaces, many argue that a decent CPU will still do better tessellation than the graphics hardware in either of those cards. The power of the tessellation units in either card has not been proven; we can only find out as games implement them (which I think is unlikely). The Radeon 8500 will do its n-patches on most current games, though, so we might yet get a glimpse of how well it performs.
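
The n-patch idea itself is simple enough to sketch: the hardware sees only the positions and normals of a coarse triangle and bends each edge into a cubic Bezier curve. A little Python illustration of one curved edge, following the published curved-PN-triangle construction (my own sketch, not ATI's actual hardware path):

    # Toy n-patch (PN-triangle) edge: positions and normals in, curved edge out.
    # Illustration only, not ATI's implementation.

    def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
    def add(*vs):    return tuple(sum(xs) for xs in zip(*vs))
    def scale(v, s): return tuple(x * s for x in v)
    def dot(a, b):   return sum(x * y for x, y in zip(a, b))

    def curved_edge_point(p1, n1, p2, n2, t):
        """Point at parameter t on the cubic Bezier edge from p1 to p2."""
        # Interior control points pull the 1/3 and 2/3 points of the edge
        # toward the tangent planes defined by the vertex normals.
        b1 = scale(sub(add(scale(p1, 2), p2), scale(n1, dot(sub(p2, p1), n1))), 1 / 3)
        b2 = scale(sub(add(scale(p2, 2), p1), scale(n2, dot(sub(p1, p2), n2))), 1 / 3)
        u = 1 - t
        return add(scale(p1, u ** 3), scale(b1, 3 * u * u * t),
                   scale(b2, 3 * u * t * t), scale(p2, t ** 3))

    s = 0.5 ** 0.5                         # normals tilted outward: a curvature hint
    p1, n1 = (0.0, 0.0, 0.0), (-s, s, 0.0)
    p2, n2 = (1.0, 0.0, 0.0), (s, s, 0.0)
    print(curved_edge_point(p1, n1, p2, n2, 0.5))  # approx (0.5, 0.125, 0): it bulges

The coarse mesh stays the same; only the normals tell the tessellator where to bulge, which is why it can be applied to existing games.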

Anyway, DX9 will add higher-level language support for the shaders and full NURBS support, so we might see the limited polynomial-vs-n-patches argument thrown out the window.

I wonder what STM and the Bitboys are up to. Will STM stick to the deadline they gave themselves? Will the Bitboys ever be able to put that paper plan of theirs into silicon? Or will they have the NV50 killer on paper, then the NV100 killer, again just on paper?

I'm still waiting to see a collision detection and inverse kinematics engine built into the graphics/gaming hardware. Now that is what I want to see as the next "ground-breaking" feature. Any new graphics feature we see in the new hardware isn't likely to "wow" any of us, except for a few fanboys here and there. We all know who they are.


Tomorrow I will live, the fool does say
today itself's too late; the wise lived yesterday
August 26, 2001 2:15:25 AM

Quote:
I'm still waiting to see a collision detection and inverse kinematics engine built into the graphics/gaming hardware.

Me too. It would allow truly interactive environments where you can pick up objects, throw them, build things, etc. That would be something to make your mouth hang open. N-patches seem like a good solution, because all of the other video cards lack any sort of higher-order surfaces, and a game built on parametric higher-order surfaces (which only the GF3 supports) would break on them. It seems like the only viable solution for a developer to use while letting their games play easily on both new technology and old. If the GF3 supports 1.3, why wouldn't the Xbox? Plus, isn't the Xbox using DX8.1 as well? That means we should see broad support for Pixel Shader 1.3 and not just 1.1. My thoughts on the matter.
August 26, 2001 10:36:33 AM

If I remember correctly (memory can be tricky), the MAXX used two similar processors onboard working through AFR (alternate frame rendering). That's not really a unified 2x128-bit memory interface, which no one has implemented yet (AFAIK), but (similar to 3dfx's SLI) two separate memory banks, each for its own processor.

If that's true, then in the context of today's games the board would have to carry two processors and 128 MB of DDR memory onboard to be competitive with a GeForce3 Ultra. And although it could use slower cores and memory (166 MHz memory would be enough), it would have to use twice as much of them; besides the doubled cost of (cheaper) components, the available onboard space, power consumption, and heat dissipation would make it an engineering challenge. Voodoo 6000, anyone?



How terrible is wisdom when it brings no profit to the wise
August 26, 2001 12:35:33 PM

I'm not entirely sure, but I believe ATI has implemented an enhanced version of AFR that unifies the memory interface of both chips.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
Anonymous
August 27, 2001 12:46:41 AM

Why predict? Let's just wait and see! I want the Radeon even if it's slower! I like the TruForm technology!
Anonymous
August 27, 2001 6:40:01 PM

I agree 100% with noko about the FPS rage. I'm completely happy with the performance I get out of my V5 5500. Yeah, I would like a GF3 or maybe even the Radeon 8500, but for now it's not worth spending another $400 after I spent $300 in October on this one. Performance is awesome in all my games. The V3 2000 I had before this one was an excellent card too, and it still performs well in one of my lower-end PCs!
August 27, 2001 9:52:17 PM

1st: Like I said before, the GPUs share the RAM bandwidth, like on the Rage 128 MAXX.

2nd: It's a chipset managing the GPUs instead of Windows. Instead of Windows having to find both GPUs, it's tricked into thinking there is one GPU, so MAXX mode wouldn't need driver fixes to run correctly. It could even trick Linux, with the right drivers. MAXX mode will be handled by hardware instead of Windows managing it, making it a better design.

3rd: What better than using AMD's HyperTransport for the transfers from the chipset to the GPUs? Basically, the Radeon 8500 MAXX would kind of be SMP for the Radeon.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
August 27, 2001 10:54:26 PM

Actually, the Rage Fury MAXX doesn't share RAM; each chip had 32 MB reserved to it. So with the Rage Fury MAXX, each frame in a game only gets 32 MB and half the bandwidth. Through buffering, though, one chip is always one frame ahead of the other, so if you jump from a complex scene to a simple one or vice versa (e.g. stepping through a portal in Quake III), the frames will be jumpy and it may seem to skip some. It doesn't really skip them; some frames just stay on screen a few milliseconds too long while others disappear a few milliseconds too early. Most people won't notice this, but some find it annoying, because you get really bogus frame rates that make the Rage Fury MAXX seem worse than the Rage Fury Pro when it's actually better.
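
Here's a toy Python sketch of that pacing effect (numbers made up for illustration): two chips alternate frames, each frame takes time proportional to the scene it happened to draw, and the average FPS can look fine while the frame-to-frame gaps jump around.

    # Toy model of AFR frame pacing, illustration only. Even frames go to
    # chip 0, odd frames to chip 1; a frame can't be shown before its
    # predecessor, so uneven render times show up as uneven display gaps.

    render_ms = [10, 10, 10, 30, 30, 30]    # scene cost jumps mid-stream

    def afr_display_times(costs_ms):
        chip_free = [0.0, 0.0]              # when each chip can start its next frame
        display = []
        for i, cost in enumerate(costs_ms):
            chip = i % 2
            finish = chip_free[chip] + cost
            chip_free[chip] = finish
            display.append(max(finish, display[-1] if display else 0.0))
        return display

    times = afr_display_times(render_ms)
    print([b - a for a, b in zip(times, times[1:])])   # [0.0, 10.0, 20.0, 10.0, 20.0]

A single-chip card drawing the same scenes would show steady 10 ms gaps, then steady 30 ms gaps; the AFR pair oscillates instead, which is the stutter described above.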

Anyway, I read somewhere that ATI has fixed this problem by allowing both chips to access the same unified RAM at the same time. I'm not sure if this is entirely true, since the Radeon 8500 MAXX won't come out for another six months or so, but that's the rumor I hear.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the ULTIMATE PC processor
August 28, 2001 5:57:52 PM

If the GPUs share bandwidth, then the board will probably be bandwidth-limited.

If each GPU has its own memory bus, then the amount of memory will have to be duplicated.

If ATI implements a dual 128-bit unified memory controller (a la nForce, but with 128-bit DDR, so twice the traces...), then I would like to see what they'll use to connect the memory controller to the GPUs.

It will be interesting to see the chipset diagram of that card.


How terrible is wisdom when it brings no profit to the wise
August 28, 2001 6:59:43 PM

Shader 1.3 would be nice, but all the Xbox SDKs are still kind of buggy. The Xbox uses a "special" version of DirectX, which is likely to have all the features of DX8 + 8.1 and more. One thing to note is that console licensors make their licensees put restrictions on the features they put into "open" ports (i.e. to the PC). So it is likely that Microsoft is exercising that on itself by restricting certain features in the PC DirectX (e.g. 3D textures). This could mean the Xbox has had shader 1.3 support right from the beginning. It could also mean PC developers haven't had access to DX8.1 from the beginning, for games like Aquanox and such. I may be wrong, though.

Tomorrow I will live, the fool does say
today itself's too late; the wise lived yesterday