
GF4 Ti4600 article, discussion please

March 4, 2002 4:20:34 PM

I'm starting a new idea: a single thread for each article posted on THG. This will help keep everything together, instead of people posting thoughts in different threads and the conversations becoming hard to follow. My thoughts will follow; I'm just now starting to read the article.

The GeForce 4 Titanium series: neither capable of withstanding 4 G's nor made of titanium. Discuss.

<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 4, 2002 4:28:59 PM

Wow, all four of those cards are ugly as sin. I wouldn't want to put any of them in my computer. The Asus one would look nice if the PCB was a different color.

For those who didn't know, this article mentions that the NV25, or GeForce 4 Ti series, is the same chip as used in the Xbox.

Anyone know the reason for the larger VGA connector on all GF4Ti cards? That's something that has puzzled me.

Kudos to Leadtek for the impressive heatsink (although fairly loud, apparently). That should offer some good overclocking.

244 FPS in Quake 3...does anyone even care anymore?

I'm surprised by how close the Asus and Creative cards were in all benchmarks. They both followed the reference design closely though, so it's not too surprising.

Wow, an extra 14 FPS in Quake 3. Overclocking sure paid off. Guess I was wrong about the overclockability, but these are pre-production cards after all.



<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 4, 2002 5:34:52 PM

I only have two things to say...
1- they're way too expensive (the Ti4600, anyway).
2- they're way too big for a video card (I thought the R8500 was big enough... until nVidia busts out a card that's half the size of a motherboard).

<b><A HREF="http://gamershq.madonion.com/compare2k1.shtml?2649487" target="_new">P4 + DDR333</A>=<font color=blue>OK</font color=blue></b>
March 4, 2002 7:00:31 PM

I think the Leadtek just looks sexy! Damn! I'd get one just to display it :)

Sig of the week.
March 4, 2002 7:06:11 PM

It's okay... There's just no Doom 3, Quake 4, or UT2 to test it out on.

THGC: saving PC users from buying a GeForce4 MX, one at a time.
March 4, 2002 7:08:19 PM

Hah! Wanna talk size? Look at the Voodoo5! Especially the 6000!

Sig of the week.
March 4, 2002 10:24:40 PM

Nice sinks.

<font color=red>:</font color=red> <font color=white>:</font color=white> <font color=blue>:</font color=blue>
March 4, 2002 11:24:52 PM

Sure, the Leadtek cooler looks butt-ugly... but it works! And works well!

I admire the way they've done it: huge surface area (good), good airflow because of the 2 fans (good), and evenly applied thermal paste.

2.8ns RAM. Wow. They've got to be having issues with interference and path lengths at such ridiculous speeds, considering it's running at at least 325MHz while system RAM is back at 166 to 200 max.

I'm just wondering when graphics cards will go dual channel :smile: . It's a very logical place to go: you could use slower (cheaper) RAM, use higher densities, and still get far superior memory bandwidth.
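As a rough sanity check on that 2.8ns figure, here's a back-of-the-envelope sketch (my own illustration, not from the article; the helper name is made up): a DRAM chip's nanosecond rating roughly bounds the clock it can sustain, at f = 1 / cycle time.

```python
# Convert a DRAM cycle-time rating (ns) to the maximum clock it implies.
# The figures below are the ones quoted in this thread.

def rated_clock_mhz(cycle_time_ns: float) -> float:
    """Maximum clock in MHz implied by a DRAM cycle-time rating."""
    return 1000.0 / cycle_time_ns

print(rated_clock_mhz(2.8))  # ~357 MHz, just above the Ti4600's 325MHz memory clock
print(rated_clock_mhz(5.0))  # ~200 MHz, in line with system RAM of the day
```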


I love helping people in Toms Forums... It reinforces my intellectual superiority! :smile:
March 4, 2002 11:25:52 PM

Don't forget Serious Sam!!!


I love helping people in Toms Forums... It reinforces my intellectual superiority! :smile:
March 5, 2002 12:03:49 AM

Quote:
Hah! Wanna talk size? Look at the Voodoo5! Especially the 6000!


It's not how big it is that matters, it's how you use it.

"Ignorance is bliss, but I tend to get screwed over."
March 5, 2002 4:56:00 AM

Quote:
I'm just wondering when graphics cards will go dual channel


That's a very interesting idea, and I think a very good one. It would help with RAM latency as well as raw bandwidth.

<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 5, 2002 6:44:52 AM

So what's this blurb about the GF4 not supporting DX8? What does that do to DX-based games?

Arguing on the internet is like running in the Special Olympics.
March 5, 2002 7:54:46 AM

Too big? You can never have a video card that's too big! I remember my Voodoo 2 was huge. I loved that thing...

What the hell is the point of these cards anymore? I mean, c'mon, it's a waste of money! I have a GeForce 256 SDR and it works GREAT! I get over 120 fps in Q3 @ 1024x768 and all my other games run just fine. I just don't get what people's obsession is with all this. Is there anybody else who feels the same way? I mean, I enjoy reading these articles and everything, but it seems people are getting out of hand. A lot of the people here are really obsessed with their 3DMark scores. I dunno, I just don't get it (or maybe I just can't afford a new card). :)

Kidane


There is no bad weather - only bad clothes
March 5, 2002 12:27:01 PM

Eh, guys, isn't that exactly what the crossbar memory controller has done since the GeForce3 came out? I think <A HREF="http://www.tomshardware.com/graphic/01q1/010227/images/..." target="_new">this image</A> from <A HREF="http://www.tomshardware.com/graphic/01q1/010227/geforce..." target="_new">this page</A> of the article called <A HREF="http://www.tomshardware.com/graphic/01q1/010227/index.h..." target="_new">High-Tech And Vertex Juggling - NVIDIA's New GeForce3 GPU</A> actually shows what a multi-channel memory controller does. But correct me if I'm wrong...

Bikeman

<i>Then again, that's just my opinion</i>
March 5, 2002 12:44:57 PM

Ahhh yes, the feature that keeps on giving!
Did anybody notice this article was one of those gems by THG? Really, when Tom wrote it, he meant it to be informative and unbiased!

--
For the first time, Hookers are hooked on Phonics!!
March 5, 2002 12:46:58 PM

Yes, that's one excellent feature that's unique to the GF3 and GF4 cards.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 5, 2002 12:57:23 PM

I could be mistaken here, so don't frickin' flame...

I believe that with nVidia's crossbar memory controller, their cards are already effectively dual channel.
March 5, 2002 1:02:35 PM

Guys, as soon as Unreal 2 hits the shelves I'm going to own one of those Leadtek cards. I'll push the Ti500 down to my wife's computer and push the GTS 64MB into the reserves. Hehe, my wife's computer gets all the leftovers and it's still more badass than most computers out there. Hehe.

I just think that big heatsink is kewl-looking. It might not work for crap - too early to tell - but it looks cool.

My real hope is that ATI will come out with something to trump the GF4 before U2 comes out... I'd hate to get the latest and greatest just to have it get beat (like happened with my Ti500). I have more money than sense, I think (not that I'm a rich man by any stretch).
March 5, 2002 7:00:25 PM

Quote:
I have a GeForce 256 SDR and it works GREAT! I get over 120 fps in Q3 @ 1024x768 and all my other games run just fine.

120 fps - what is that, 256 colors? What other games? Tetris? Solitaire? Pong? Certainly not RTCW or Ballistics or AquaNox. Why don't you bump that res to 1600x1200 @ 16-bit in one of those last three and see what happens to your 256. (Choke, sputter, sputter.)

I agree, buying a new GF4 right now or within the next 2-3 months would be something like buying a brand new Porsche 911 - more money than brains (on that note, a nice used V12 Ferrari Testarossa for about 70k would be purrrfect). Myself, the GF2 Pro is working fine too, but I'll at least admit that when U2 comes out, it might mean an upgrade to the original GF3.

BTW FB, the GF4 cards looked OK, but nothing so far beats that damn red Gainward GF3. Now she was sexy.

<font color=blue><i>On Company time... :cool:
March 5, 2002 7:45:08 PM

Quote:
BTW FB, the GF4 cards looked OK, but nothing so far beats that damn red Gainward GF3. Now she was sexy.


I agree, that was a nice looking card. I'm not sure which was my favorite, though. Abit's black and silver is nice, and Hercules has always made a classy looking product.

<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 5, 2002 9:04:36 PM

Supposedly a card based on the ATI R250 will come out in July with a 350MHz core and 350 x 2 DDR (700MHz) memory. Then, probably 6 months later, the R300 will come out, but I really don't have a clue what its specs will be. The R250 will only be DirectX 8.1 compliant according to the news I read, as DirectX 9 is quite a ways away (also rumor). Somehow I don't see ATI being able to take the cake with the R250, as by that time there will probably be a GeForce 4 Ultra to increase clock speeds. I can only hope the R250 is better than the GeForce 4, though; it'll cause some nice price wars and get better video cards to the average consumer.

"Why can't I be the man? I mean, I DO have harmony balls..." -epoth
March 5, 2002 9:11:29 PM

The Radeon 8500 is already approaching lower-end GF4 Ti performance, so I'm willing to bet that the RV250 will match or exceed the GF4 Ti4600. Don't forget that it'll be based on the R200 core, so the drivers will already be mature.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 5, 2002 9:22:27 PM

Screw you guys, I have an ATI Rage II+ and it runs UT at 23fps. IT'S JUST FINE HAHAHA. (AT HIGH SETTINGS)

What's this?...ERROR ERROR ERROR...*CRASH & BURN*
March 5, 2002 9:30:29 PM

I think it's interesting that several companies have started making cards for ATI. Have they wanted to all along, while ATI wanted to make its own? Or does ATI have something up its sleeve? It's interesting. Supposedly nVidia raised its prices to Hercules for making Kyro II cards, so that would give Hercules a valid reason for giving nVidia the finger (so to speak, though I guess it could've been literal). I just think it's interesting that suddenly Hercules, Powercolor, etc. started making ATI's cards.

Could be no more than ATI suddenly deciding to outsource, of course.

<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 5, 2002 9:37:31 PM

I noticed that ATI's codenames are in sync with the NV number codenames. Notice the R200 is the NV20's competition, while the R250 is the NV25's competition! That makes the R300 the competition for the NV30, the GeForce 5...
Make sense?

--
For the first time, Hookers are hooked on Phonics!!
March 5, 2002 9:40:50 PM

Quote:

I think it's interesting that several companies have started making cards for ATI. Have they wanted to all along, while ATI wanted to make its own? Or does ATI have something up its sleeve? It's interesting. Supposedly nVidia raised its prices to Hercules for making Kyro II cards, so that would give Hercules a valid reason for giving nVidia the finger (so to speak, though I guess it could've been literal). I just think it's interesting that suddenly Hercules, Powercolor, etc. started making ATI's cards.

Could be no more than ATI suddenly deciding to outsource, of course.

I think that ATI is just tired of being the second-best gaming graphics company. They want to compete with nVidia neck and neck, the way Intel and AMD are. This is, of course, better for us, because it means cheaper, more powerful graphics cards.

ATI is positioning the R8500 against the GF3 Ti200, GF4 MX460, and Ti4200. Licensing the chip to many companies allows them to gain market share. Improving and expanding their driver team allows ATI to significantly improve performance.

They're just trying to catch up with nVidia in the retail gaming market.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 5, 2002 10:05:34 PM

Remember a little company not too long ago called 3DFX?
Guess who made Voodoo cards? Well, 3DFX did, and that's about it. Now you're a consumer: you go into a store and see four GeForce-based video cards, all from different brands, and one Voodoo card from one brand. Which do you think is more likely to get sales?
ATI is just trying to get more people to buy their cards, and if they want that to happen they need to let third parties start making cards for them. If nVidia had been the only company building GeForce-based boards, and 3DFX had made one or two changes, they'd probably both be around fighting each other today. Instead nVidia is the victor, and now ATI is stepping up as the next challenger.
STMicro-something-or-other (makers of the Kyro chipsets) is probably in the process of going the way of 3DFX right now because of how few of their cards are being sold. It's a shame that nVidia can charge Hercules more just because they expand their product area, but I hope Hercules can pull through with ATI and maybe more Kyro-based cards. nVidia is for the most part the big bad guy on the block (just like Intel), and when they flex their muscles the little guy gets the squeeze (most of the time).

"Why can't I be the man? I mean, I DO have harmony balls..." -epoth
March 5, 2002 10:17:29 PM

STMicro has announced that they're pulling out of the graphics chip market. That doesn't mean there won't be a Kyro III, as another company is also involved in the development of the Kyro series. Someone else could explain more fully, I'm sure.

<font color=orange>Quarter</font color=orange> <font color=blue>Pounder</font color=blue> <font color=orange>Inside</font color=orange>
Don't step in the sarcasm!
March 5, 2002 10:30:51 PM

I think the actual Kyro technology and intellectual rights belong to Imagination Technologies. But I'm not sure how impressive the Kyro III will be. The only thing from them that impressed me was the very first PowerVR chip and what it could, or rather was supposed to be able to, do. Since then it's just been a speed bump of the same thing, with the addition of one HSR feature. And with the hype surrounding that feature, it was as if they'd invented the concept of tile-based rendering.

As for the Ti4600, I don't think it's a good buy. At ~£400 in the UK it costs around £100 more than the Ti4400. I don't think the 5 to 10 fps increase justifies that kind of price. I wonder if it will be possible to overclock a 4400 to near the performance of its bigger brother.


<font color=red><i>I refugee from Guatanamo Bay,
dance around the border like I'm Cassius Clay
</i></font color=red>
March 5, 2002 10:35:45 PM

Not really...
The crossbar just allows more efficient memory -> GPU transfers by utilising all available bandwidth to the best of its capabilities - sort of a load balancing method - but the maximum bandwidth obtainable is still single channel.
i.e. 650MHz x 128 bits = 10.4GB/sec.
Having real dual channel would double the bandwidth, or allow greater bandwidth while using less expensive chips.
i.e. instead of using very costly 2.8ns 650MHz chips you could get away with using bog-standard 5ns 400MHz units across two channels and still have over 20% greater bandwidth.
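To make that arithmetic concrete, here's a minimal sketch (my own, using the clocks and bus widths quoted above; 650MHz being the DDR-effective rate of the Ti4600's memory, and the function name being made up):

```python
# Peak memory bandwidth = effective clock x bus width.

def bandwidth_gb_per_s(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given effective clock and bus width."""
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

single = bandwidth_gb_per_s(650, 128)      # costly 2.8ns chips: ~10.4 GB/s
dual = bandwidth_gb_per_s(400, 2 * 128)    # cheap 5ns chips, two channels: ~12.8 GB/s
print(f"{single:.1f} vs {dual:.1f} GB/s ({(dual / single - 1) * 100:.0f}% more)")
```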

A question, though: how easy would it be to implement dual channel?

Twice the number of paths would be required, one would think, and has the GPU itself got any bottlenecks?




I love helping people in Toms Forums... It reinforces my intellectual superiority! :smile:
Anonymous
March 5, 2002 10:45:38 PM

GF4 Ti4600 is the best card on the market.
March 6, 2002 12:27:41 AM

Quote:
Anyone know the reason for the larger VGA connector on all GF4Ti cards? That's something that has puzzled me.

I don't know, but I'll take a guess. They've shielded the VGA output stage. This way they don't have to use the image-degrading RF filters of the earlier nVidia designs.

<b>We are all beta testers!</b>
March 6, 2002 1:16:46 AM

ATI products don't have this problem and they don't have a larger VGA connector. How strange...

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 6, 2002 2:04:20 AM

Does it really matter how they look? C'mon, they're in your case; who cares? One thing that interested me was that the GeForce4 MX 460 continually beat the GeForce3 Ti200 and came close to the Ti500. I was considering getting a GeForce3 Ti200, but now I'm not so sure, considering they are more expensive than an MX 460.

What's your take on this?
March 6, 2002 2:07:18 AM

Quote:
ATI products don't have this problem and they don't have a larger VGA connector. How strange...

My old ATI All-in-Wonder had miserable video quality until I bypassed the filters on the RGB output. Yes, very strange...

<b>We are all beta testers!</b>
March 6, 2002 2:12:32 PM

Personally, I found it interesting that several of the cards have been rumored to be overdriven by default, but none of the ones in this test were.

I still think the GF4 is a hit-and-run way for nVidia to cash in on the GeForce and nVidia names with a product that isn't as revolutionary as people would like to think.

It will be interesting to see if the Ti4400 and Ti4200 can be overclocked to Ti4600 levels. If so, their pricing scheme, as stilted as it is, could backfire on them.

I also have to wonder how much more driver optimization the GF3 will be getting now that they are dropping that product line and keeping the GF2. I'm sure there will be some generic upgrades, but beyond that, they have no reason to try to squeeze any more performance out of the GF3 specifically.

I do not like it Tom you see,
I do not like green PCB.
March 6, 2002 2:20:58 PM

Actually, that is just a coincidence. The R100 was the Radeon, the R200 was the next step, and the R250 is a half step up to the R300.

I do not like it Tom you see,
I do not like green PCB.
March 6, 2002 2:22:18 PM

The GF4 MX series is not fully DX8 capable.

I do not like it Tom you see,
I do not like green PCB.
March 6, 2002 2:38:44 PM

If you start getting into dual channel memory, you get closer to other issues. I heard somewhere that one way to go dual channel memory on a GPU is to split the memory controller onto a separate chip. I don't know how effective that would be, however.

As for GPU bottlenecks, that depends on the code and how the GPU is optimized.

Also, I find it interesting that the ATI boards run synchronous GPU and memory speeds while the GeForce boards usually run them independently (usually the memory is faster). Is that just how the architecture is? Or is there a theory behind the different approaches? Also, with that theory, is there some part of the card that's better to push harder than the other when overclocking?

I do not like it Tom you see,
I do not like green PCB.
March 6, 2002 2:47:27 PM

Quote:

Also, I find it interesting that the ATI boards run synchronous GPU and memory speeds while the GeForce boards usually run them independently (usually the memory is faster). Is that just how the architecture is? Or is there a theory behind the different approaches? Also, with that theory, is there some part of the card that's better to push harder than the other when overclocking?

The original Radeon had to run the core and memory synchronized, but the Radeon 8500 and Radeon 7500 can run them independently. In fact, the Radeon 7500 runs at 290/230.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 6, 2002 6:49:51 PM

What are you talking about? In the "Beyond XBOX" article, the R8500 was clearly behind the Ti500!

Sig of the week.
March 6, 2002 6:59:29 PM

That article is old news using old drivers; ATI updates its drivers nearly once a week. Of course, they're leaked, not officially released.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 6, 2002 11:17:02 PM

Also of conspiratorial note: have you noticed how Elsa went belly-up? That slapped Nvidia's ass in the European market. Plus, last quarter ATI outsold Nvidia (thanks to marketplace domination in portables and longstanding OEM contracts).

Couple those two with the fact that Nvidia exceeded earnings estimates by a huge degree last quarter (thanks, in huge part, to initial Xbox sales), and I predict that Nvidia starts to lose its place as the dominant provider of high-end 3D graphics cards to ATI.

Call me crazy, but the 3D market is cutthroat, and few companies (if any) have maintained the pole position for more than 2-3 years. Nvidia's time is running out.

And notice how they've also resumed a 6-month product release cycle? Before the initial Radeon, after putting 3dfx to bed, Nvidia adopted a 1-year product cycle, but intensifying competition from ATI forced Nvidia (although it won't admit it) to adopt a more aggressive release schedule.

Just some thoughts and observations...

"Put your desk in the corner." - Stephen King
March 7, 2002 11:44:24 PM

GF4!!!

HO HUM.

I don't see the GF4 as a big technical jump in capability.

<b>Just some:</b>

Fine tuning,

dual monitor support,

no DX8.1 pixel shaders??? bummer dude.

Which is probably OK since very few games use vertex and pixel shaders.

To me the GF4 is like what the Ultra was to the GF2: a big speed-up, and that's about it.
March 8, 2002 1:07:33 AM

Does it really have to be a big jump in tech ability? I mean, look at the numbers it puts out. It's obviously the best consumer card out there at the moment, even if it is very expensive. For the time being, nVidia holds the performance crown. We'll see what ATI (or anyone else) comes up with in response.
March 8, 2002 3:39:30 PM

Indeed, I think PCB design complexity is a big factor at play here. Imagine the chip containing the GPU and RAMDAC etc. having 128 extra pins (or more: address bus, read/write pins, clock pins, ...). Your graphics card's processor would be as big as your CPU. And think of the chip's silicon design, too. I think these factors would influence the price so much that it would annihilate the savings from the cheaper memory chips.
A separate memory controller could solve some of these problems, but that would also make for a 384+ pin chip (a 128-bit-wide data path to the GPU/RAMDAC plus 2x 128 bits towards the two memory banks). And then again, I don't know whether memory bandwidth is still as limiting a factor as it was a year ago. But I could be totally wrong on that one. I guess the overclockers on this forum might tell us something about it.
Now that I'm talking about it, maybe PCB and chip complexity might be one of the reasons why so few dual channel DDR mobos exist (only one, the nForce, I guess). DDR is 64 bits wide, while RDRAM is only 16 bits wide. Anyone have an opinion on this?
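To put rough numbers on that pin tally (a back-of-the-envelope sketch of my own; a real package would add plenty of power, ground, address, and control pins on top):

```python
# Data pins alone for the hypothetical separate memory controller
# described above: one 128-bit path to the GPU/RAMDAC plus two
# independent 128-bit memory channels.

gpu_path = 128                     # controller <-> GPU/RAMDAC
memory_channels = 2 * 128          # two 128-bit memory banks
print(gpu_path + memory_channels)  # 384 data pins before any control lines
```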

Greetz,
Bikeman

<i>Then again, that's just my opinion</i>
March 8, 2002 3:55:05 PM

Not to mention it's one of the only cards ever to show such a tremendous jump in performance when introduced as the "top of the line". It is up to 30% better than a Ti500 in almost anything, and for a new card that is a big-time performance jump. In many cases it is up to TWICE as fast as a Ti200!!! Imagine games that aren't playable on a Ti200 but suddenly see the light on the Ti4600! That's powerful indeed for a "6 months later" card. Though the price is too much: the Ti4400 in Canada is about $469!

--
For the first time, Hookers are hooked on Phonics!!
March 8, 2002 5:20:30 PM

Well, I just bought and installed the GF4 Ti4600 from VisionTek last night. It installed great; setup was done in 2 minutes or less. Very nice HQ cooling fan, and it's quiet too. The DVI, VGA, and S-Video connections are good, and the card is beautifully laid out and gives the impression of some real fine engineering work.
1st test I did was to run 3DMark2001SE with options set to 1024x768, triple frame buffering, no AA. With the Ti4600 I scored ~8,657 3DMarks. With my original GF3 board from Hercules I had scored ~6,745 3DMarks.
2nd test was C&C Renegade multiplayer. With the GF3 I had to run at almost minimum settings to get multiplayer to run smoothly; high settings caused massive warping and slow frame rates, because Renegade multiplayer is extremely busy with all the buildings, vehicles, and people... always making explosions... With the GF4, however, I was able to run multiplayer at the very HIGHEST graphics settings without a bump; it ran smooth and never warped me around. The colors were also much more brilliant than they were with the GF3, the picture was sharper, and the textures just looked more real. (Not because of the settings - I had run Renegade in single player with full options just fine on the GF3; it's not nearly as intensive in single player, so I know the difference.)
3rd test was the nVidia demos. It ran 'em all fine, but honestly... the demos sucked. Oh well.
Haven't tried the PowerDVD 4.0xp it came with yet... will post my experience with it later.

Overall I highly recommend this card. It's about a 35% technical performance boost over the original GF3. In reality it's much more than that because of the revised memory architecture and the extra video RAM; you can now run smoothly with a lot more detail. Over the Ti500 it's about a 20% boost in performance in current games and looks better, and over the Ti200 it's about a 100% boost and looks better.
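For a rough sense of what those 3DMark scores translate to (a quick sketch of mine; the variable names are made up, and settings obviously affect the exact number):

```python
# williamc's 3DMark2001SE scores from the post above.
gf4_ti4600 = 8657
gf3_original = 6745

gain_percent = (gf4_ti4600 / gf3_original - 1) * 100
print(f"{gain_percent:.0f}% higher score")  # ~28% at 1024x768, no AA
```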
If there are any tests you'd like me to run on it and post the results of, let me know where to get the tests...

My computer is:
Asus P4T
P4 1.7Ghz
Western Digital ATA100 100GB
SB Audigy
VisionTek GF4 TI-4600

<i>Edited by williamc on 03/08/02 02:36 PM.</i>
March 8, 2002 8:26:23 PM

Someone told me nVidia is making a statement (perhaps against both ATI AND Microsoft) by not including the extra pixel shader support. So that would be intentional.

<font color=red><i>I refugee from Guatanamo Bay,
dance around the border like I'm Cassius Clay
</i></font color=red>