ATi to battle the nForce in an Intel vs. AMD war!

<A HREF="http://gamespot.com/gshw/stories/news/0,12836,2769423,00.html" target="_new">http://gamespot.com/gshw/stories/news/0,12836,2769423,00.html</A>

Quote:
ArtX's other team, which developed an integrated graphics chipset with ALi, is working on ATI's upcoming chipset based on the Radeon graphics core that will compete with Nvidia's just-announced nForce.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  1. Have you got any links that are not from gaming sites? My firewall won't let me access game-category sites :(

    <font color=blue>Keep your hands above the covers at all times!</font color=blue>
  2. Well, it's mostly about the GameCube, but the section I picked says ATi is aiming at the nForce. Remember, ATi has a P4 bus license.

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  3. Have you seen the benchmarks for SiS's integrated graphics chipset? I believe it's called the SiS735. Tom did a review; I can't find it at the moment, but the graphics chip could barely do 30 FPS in Quake 3 at 640x480 in 16-bit color. That's pretty sad. I don't think this ArtX team is all that credible.

    -MP Jesse

    "Signatures Still Suck"
  4. Wasn't that on a K6-2 platform?

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  5. No. The SiS735 was designed for the Athlon platform and supports DDR.

    -MP Jesse

    "Signatures Still Suck"
  6. The SiS735 doesn't have integrated graphics; the SiS730S does. Is that what you mean? And you're right, the 730S was for the Athlon platform. Here is a link for the 730S chipset:

    <A HREF="http://www.sis.com.tw/products/slota/730sfea.htm" target="_new">http://www.sis.com.tw/products/slota/730sfea.htm</A>

    I really haven't heard too much about this chipset.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  7. The SiS735 is for high-end Athlon machines.

    Sorry, I was wrong: the SiS north-to-south bridge link is 1.2 GB/s, not 1.0 GB/s, which means it's 400 MB/s faster than the NoForce's.

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  8. Close enough. =) Like I said, I wasn't sure of the exact model number of the chipset. All I know is that it sucked (in the way of 3D graphics).

    -MP Jesse

    "Signatures Still Suck"
  9. How did it compare to Intel's integrated graphics? The integrated graphics of the NoForce chipset, by the time it comes out, will be way behind the typical retail graphics card. As far as I see it, the NoForce is still a vaporware product. Not sure what marketing sector it is meant for. Budget? The chipset costs too much. Performance? The MX isn't a performance item. Professional? A chipset out of the blue, from a company that has never produced a chipset before, being a more reliable performer than Intel, AMD, SiS, VIA, and ALi? I kind of doubt it. I'm beginning to believe we won't see this chipset until next year around this time; by then ATI will probably unveil their chipset, hopefully with a Radeon core.

    I just hope ATI is smarter and lets the integrated graphics work together with an AGP card for a dual-monitor setup. I also hope ATI goes for quality in 2D (maybe a HydraVision design) and doesn't try to load everything into the core at once, but designs in 8x AGP for real power users, plus Ethernet and USB 2.0 or IEEE 1394 support. Keep the costs down, add quality features. As the NoForce chipset ages, those built-in abilities, graphics and sound, become more of a handicap, especially if people don't use them. Since the ArtX team is helping to design this chipset, the 3 MB of SRAM from the Flipper chip design could be used, which allows a whopping 20 GB/sec data transfer rate inside the graphics chip. Still, I hope ATI designs the chip for expandability and reliability instead of cramming it full of high-cost features that would be pointless by the time it is released.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  10. Well, I think if ATi makes a P4 chipset, it will be a value part, i810-style value. Say, for example:

    supports Northwood P4 in Socket 478
    FSB: (4x100) 400 MHz / (4x133) 533 MHz
    RAM: up to 2 GB / 3 DIMMs of PC-100/133/1600/2100/2700
    Expansion: 8x AGP / bus-mastering PCI / PCI-X
    Hard disk: ATA 33/66/100
    I/O: PS/2 / serial / parallel / USB 2.0
    Bridges: ATi northbridge / Intel ICH2 southbridge
    Video: Radeon VE with up to 128 MB of onboard memory
    Sound: 4.1 audio DSP

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  11. Come on, Noko... you're smarter than this.

    <font color=red> As far as I see it the NoForce is still a vaporware product. </font color=red>

    Vaporware product? ASUS, Gigabyte, MSI, and Abit, not to mention Siemens, all have products in production (keyword: in production). This is a far cry from vaporware.

    As for cost, use a parallel comparison: the Tyan Thunder K7. You have to look at what you are getting: sound, graphics, 10/100 LAN, etc. While you have a point about the graphics, you can still drop in your own AGP card. Sure, maybe you won't use the integrated graphics, but look at it this way: when you decide to upgrade your machine, you can still use this board in a secondary machine where graphics are no longer as important; just take out your graphics card and utilize the integrated video.

    <font color=red>A chipset out of the blue, from a company that has never produced a chipset before, being a more reliable performer than Intel, AMD, SiS, VIA, and ALi? I kind of doubt it. </font color=red>

    First valid point, admittedly. However, how many tries has it taken for VIA to get it right? Can it be worse than the recent issues with the 686B southbridge? SiS? We have yet to see about that chipset as well; past experiences with their Socket 7 chipsets have left an even worse taste in my mouth than VIA.

    Then you really start to lose credibility. After casting doubts about Nvidia and their first chipset offering, you go on about what may or may not happen with ATI and their first chipset offering? I mean, wouldn't the same apply? As much as I do like my Radeon, come on, be honest here: who has the better driver team, Nvidia or ATI? Look at ATI's Win2K support (finally getting better). Linux? Hardly! Things Nvidia has gotten right while ATI still struggles. If anything is vaporware, it is an integrated chipset from ATI!

    A little bit of knowledge is a dangerous thing!
  12. Remember, ATi owns ArtX and FireGL, and both of them have made some good, strong drivers.

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  13. ATI just recently bought FireGL. All their experience is in the high-end graphics card design arena. Whether any of that carries over to an integrated chipset solution is a reach, to say the least. I am sure the FireGL people will stay right where they are, designing high-end professional video cards, and would consider working on integrated motherboards laughable.
    ArtX is a bit of a wildcard; they have shown promise, but in terms of actual product they have not had much success to date.

    A little bit of knowledge is a dangerous thing!
  14. OK, all of what you said is certainly plausible. However, stating that the nForce won't be available until this time next year is madness. Three-quarters of the components that make up the nForce are already in mass production for the Xbox. This chip will be available by the end of the summer. It's priced pretty high, which means it'll be for the performance PC. You can always disable the graphics portion of it. In fact, you'll have to if you want to make use of the 128-bit data bus.

    nVidia has already announced they'll be integrating the GeForce 3 into the nForce (like the GF2 MX) by this time next year.

    -MP Jesse

    "Signatures Still Suck"
  15. <font color=red>In fact, you'll have to if you want to make use of the 128-bit data bus.</font color=red>

    NO NO NO NO NO! The nForce is not a true dual-channel memory bus. It is dual-channel only in the sense that it has one channel for the CPU and one for the AGP/GPU. Disabling the integrated video does not supply the CPU with a dual-channel 128-bit memory interface.

    A little bit of knowledge is a dangerous thing!
  16. So you are saying it has no bandwidth advantage to the CPU over any other DDR chipset, right? And you are now saying the boards are in production? Where did you get that information? I guess AnandTech was all wrong, then. Anyway, ATI has already produced an integrated northbridge chip:

    <A HREF="http://www.ati.com/na/pages/technology/hardware/s1-370.html" target="_new">http://www.ati.com/na/pages/technology/hardware/s1-370.html</A>

    This won't be something new for them to do again. This chip was, by the way, the first with T&L, not nVidia's NoForce chipset. It was also 8x AGP. Oh, it also had a 128-bit memory bus. Sounds like nVidia copied ATI's chipset in some ways. Also WHQL certified, which I would think implies some stability. Maybe not a killer chipset, but it is available now. Not vaporware like the NoForce :lol: :lol: .

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b><P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 06/07/01 02:41 AM.</EM></FONT></P>
  17. I think nVidia drivers are overrated; the official 6.5 drivers are not even DX8 drivers. The slew of leaked drivers all have problems. ATI has had official DX8 drivers for some time. The official nVidia drivers don't even fully support the GF3, as in vertex/pixel shading. In other words, there are no official drivers for their released GF3, meaning also no official support. Nvidia does have better Linux drivers, but ATI has better Mac drivers, plus the BeOS community has better ATI drivers than nVidia's. Since ATI owns FireGL now, you can't even say nVidia has better professional drivers either. It is just another nVidia mirage. Please don't get me wrong, nVidia has good drivers, but nowhere near as good as is bragged about.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  18. Then why does nVidia claim it has a 4.2 GB/s system bus? How can you be certain?

    -MP Jesse

    "Signatures Still Suck"
  19. <A HREF="http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg" target="_new">http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg</A>

    AMD's implementation of the Alpha EV6 bus is 64-bit. No matter how many 64-bit or 128-bit buses you have between the memory banks and the northbridge memory controller, the data throughput from the CPU to the northbridge is NOT going to increase without directly increasing the frequency of the FSB. Even though BOTH 64-bit buses are being used, they CANNOT directly affect the CPU -> bus interface unit. With the crossbar arrangement, GPU, CPU, and overall system requests can be serviced much more simultaneously, with no queueing for the network card or graphics unit to wait on the CPU's requests to process first. What this chipset does manage to do is take load off the CPU-to-northbridge bus and decrease the penalty for latency hits.


    A little bit of knowledge is a dangerous thing!
  20. <font color=red> You are now saying the boards are in production? </font color=red>

    Let me clarify this. By saying that they are in production, I mean they currently have working silicon, not lines drawn on paper. Working reference boards are out there as well. This, in my opinion, is well beyond vaporware. Vaporware is an ATI P4 motherboard. As for your reference to the ATI board: does anyone actually have it? Has anyone reviewed it? And if so, do you have links?

    A little bit of knowledge is a dangerous thing!
  21. <font color=red>I think nVidia drivers are overrated; the official 6.5 drivers are not even DX8 drivers. The slew of leaked drivers all have problems. ATI has had official DX8 drivers for some time. The official nVidia drivers don't even fully support the GF3, as in vertex/pixel shading. In other words, there are no official drivers for their released GF3, meaning also no official support.</font color=red>

    OOPS! wrong again! <A HREF="http://www.nvidia.com/Products/Drivers.nsf" target="_new">http://www.nvidia.com/Products/Drivers.nsf</A>

    <font color=red>Since ATI owns FireGL now you can't even say nVidia has better professional drivers either. </font color=red>

    Lol, yeah, nVidia developed theirs, and what did ATI do? Bought theirs?

    <font color=red>The slew of leaked drivers all have problems.</font color=red>

    A practice ATI has chosen to copy, I might add. Better Mac drivers? I won't even touch that one. Sorry, you fail to convince me that the quality of ATI's drivers compares to the quality of nVidia's. Furthermore, has ATI developed a southbridge as well? Nope.




    A little bit of knowledge is a dangerous thing!
  22. Yes, it was seen working by Tom's Hardware:

    Quote:
    <font color=purple>ATi has attacked this problem with their "S1-370 TL" by borrowing lessons learned from the "ALi Aladdin 7" Socket 7 chipset. The Aladdin 7, which has an integrated 3D core from the ArtX subsidiary of ATi, has a 128-bit interface to SDRAM, thus doubling bandwidth. The ATi S1-370 TL also has a 128-bit pipe to main memory, which should greatly improve high resolution 3D performance. At ATi's booth, an S1 was shown spanking the pixels out of Intel's i815e on Quake III demos. The difference was dramatic.</font color=purple>
    <A HREF="http://www6.tomshardware.com/business/00q4/001118/comdex-03.html#ati_enters_the_piranhafilled_integrated_chipset_waters" target="_new">http://www6.tomshardware.com/business/00q4/001118/comdex-03.html#ati_enters_the_piranhafilled_integrated_chipset_waters</A>

    If you do a search on the web you can find some reviews. I don't know if the chip had problems, cost too much, or ATI just got bored with it. For its time it looked pretty good, except it was made obsolete by the changes in the Pentium III. In other words, this will not be new territory for ATI or the ArtX team; they can learn from it and bring out a good chipset.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  23. I stand corrected: on 6/6/01 nVidia finally released DX8 drivers and now supports their GF3. Ahem, DX8 has been out for a while. Yeah, great support. Do you know how many buggy, leaked, god-knows-where-they-came-from drivers I had to install to get my MX400 to work with my monitor? Six!!! None were supported. ATI had DX8 drivers last year. You're right, you can't talk too much about Mac drivers. I didn't say nVidia's drivers were poor, just very overrated. Look at it this way: the GF3 was released without any official drivers except DX7 drivers. Well, thanks for the link; I will upgrade my leaked alpha drivers to the official ones now.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  24. Well, I am still waiting for a driver from ATI that will run the Max Payne demo in 3DMark 2001 without ghost images and artifacts.........hmmmm?

    A little bit of knowledge is a dangerous thing!
  25. The 3102 W2K drivers (official) and the 7115 Win9x SP drivers do not have this issue, at least as reported and by my own testing. Still, the point sprite test with the Radeon is a joke. I have yet to hear of anyone running 3D under Linux on a Radeon, hence the reason for my MX400 purchase for one of my other computers.

    It seems like in the PC business drivers are always a significant issue, maybe because, to compete, the hardware is on deadlines by which all the features can never be implemented, or the bugs never fixed all the way. At Beyond3D.com, Reverend talked to an nVidia engineer who was forthright in saying that 3D textures were scheduled to be part of the GF3 chip, but time limitations prevented that feature from being fully implemented in hardware. The 1.1 pixel shaders may not be implemented in the Radeon 2 chip, just version 1.0; the reason is time, plus the rather new standard introduced with the GF3 chip, which upped the pixel shader version from 1.0 to 1.1 this year. In short, the chips we use, particularly graphics chips, are works in progress with momentary stops for production. 3dfx seemed to wait until the chip was right before release, holding to a higher standard than nVidia and ATI, with a small number of problems. All of this is my opinion, by the way.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  26. Quote:
    <font color=blue><i><b><font color=black>Chipsets from ATI Will Come Out</font color=black> [5:13 pm] Gavric
    Do you still remember Intel and ATI having signed a cross-licensing treaty? In spite of no visible moves, ATI goes on developing its own chipsets. Unsurprisingly, they’ll be designed for Intel’s processors. The latest news is ATI’s roadmap for this year envisages two products code-named A3 and A4.
    A3 is an integrated Socket370 chipset with PC133 SDRAM support. This core logic will be marketed as a Value solution, so it is likely to feature a core with Rage 128 Pro architecture as integrated graphics. A3 is scheduled to start sampling in September. Its mass production is awaited in the fourth quarter.
    The second chipset in development will be intended for Pentium 4 CPUs. Since A4 will support not only PC133 SDRAM but also DDR SDRAM, Intel demands it to appear no earlier than in 2002. That’s why ATI plans to launch A4 in Q1 2002. This chipset will be equipped with integrated graphics as well, but for A4 the developers have selected RADEON VE.
    In comparison with the previous roadmap having leaked away from ATI, the chipsets’ launching dates have been put off for a good quarter, so ATI makes a notable stress not at mainboard chipsets but at its graphics cards – their launching dates are postponed very seldom. Nonetheless, ATI A4 should become the first integrated chipset for Pentium 4 CPUs. It’s not for nothing that Intel granted its license to ATI earlier that to the Taiwanese chipset manufacturers.
    <A HREF="http://www.xbitlabs.com/news/" target="_new">http://www.xbitlabs.com/news/</A>

    </font color=blue></i></b>

    The Pentium 4/VE chipset I might be interested in, depending on whether Northwood's performance is any good. The current Pentium 4 leaves much to be desired, and if Northwood performs the same, then with AMD I will stay, maybe even with a NoForce chipset too.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  27. So what you're saying is that if I make a northbridge-to-southbridge link with 8 times the bandwidth it will ever need, it'll be better than one with 4 times what a northbridge would ever need?

    Let's add this up:

    ATA-100: 100 MB/sec
    PCI 64/66: 533 MB/sec
    USB: not worth mentioning
    Serial: similarly pitiful to USB
    Parallel: see Serial.

    Hmmm... so what you're saying is that if you push the capacity to twice what is needed, it will be better than having 25% more than what could possibly be used? Unfortunately, your argument holds about as much water as a nickel... none. Don't try adding AGP, because that would really show off your ignorance: AGP works off the northbridge.
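    To put actual numbers on it, here's a quick back-of-the-envelope sketch in Python (the link speeds are the rough figures being thrown around in this thread, so treat them as ballpark):

```python
# Worst-case simultaneous load hanging off the southbridge vs. the size
# of the north-south bridge link feeding it. All figures in MB/s.
peripherals = {
    "ATA-100": 100,
    "PCI 64/66": 533,
    # USB 1.1, serial and parallel are negligible next to these
}
worst_case = sum(peripherals.values())      # 633 MB/s

nforce_link = 800    # nForce HyperTransport north-south link
sis_link = 1200      # SiS 735 MuTIOL north-south link

nforce_headroom = nforce_link / worst_case  # ~1.26x, i.e. ~25% spare
sis_headroom = sis_link / worst_case        # ~1.9x, i.e. nearly double
```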

    nVidia built a bit of expandability into the nForce. It's the first version of the chip (I'm sure later versions will be even better, with potentially even more bandwidth between the two bridges as needs dictate), and there's no reason to go beyond what will be out there on motherboards any time within the next year.

    BTW, does the SiS chipset even support PCI 64/66? If not, its bandwidth is REALLY wasted.
    "reliable performer than Intel, AMD, SiS, VIA and ALi."

    You're kidding me, right? OK, Intel and AMD are pretty stable and reliable, but SiS? VIA? ALi? Are you out of your mind? The only reason I ever use anyone other than AMD is that everyone seems to want to use VIA's southbridge, and I still end up having problems with that part of the chipset on systems...

    Anyone trying to push system performance higher is good in my book. All you people saying that dual-band DDR SDRAM isn't worth it and hardware prefetch isn't worth it, while at the same time crying about how Intel's P4 performance is built on higher memory bandwidth and hardware prefetch, need a bit of a reality check.

    I'd upgrade to a board with this chipset just for the increased memory bandwidth and hardware prefetch, and take the LAN (generally a $20 component) and sound (better than the SB Live! from everything I've heard EVERYWHERE, so at least a $60-equivalent component) as a little bonus. Getting backup video in case of emergencies (no, I wouldn't replace a Quadro... duh) is great too. I've lost 3 video cards over the past 7 years, and one of them died in the middle of the night during a large project. If I hadn't still had my old video card, I would have been screwed; the ability to simply switch back and be up and running with little more than a video driver update is great, when added to all the other features that come with this board.
  29. "The nForce is not a true dual-channel memory bus. It is dual-channel only in the sense that it has one channel for the CPU and one for the AGP/GPU."

    Who the hell told you that crap? It can pull from both memory controllers to supply ANY component, whether it's sent to the CPU, video (integrated or add-in), southbridge, or whatever. The only thing it can't do is pull from both channels of the same controller.

    "Disabling the integrated video does not supply the CPU with a dual-channel 128-bit memory interface."

    Actually, whether you have the integrated video or not, you will never get a dual-channel 128-bit memory interface; you'll get two 64-bit memory interfaces that will allow you to pull up to 4.2 GB/sec total, sending to any component, or even the same component (southbridge, video, CPU).

    "A little bit of knowledge is a dangerous thing!"

    You're right, your little bit seems to be spreading a lot of FUD.
  30. You are a complete moron. Read before you stick your foot in your mouth.

    A little bit of knowledge is a dangerous thing!
  31. Perhaps you can explain to me how you could possibly supply 128 bits of data to a CPU that only has a 64-bit interface??

    A little bit of knowledge is a dangerous thing!
  32. Very good point.

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  33. Actually, whether you have the integrated video or not, you will never get a dual-channel 128-bit memory interface; you'll get two 64-bit memory interfaces that will allow you to pull up to 4.2 GB/sec total, sending to any component, or even the same component (southbridge, video, CPU).


    ::doesn't want to get into this but has to::

    This statement doesn't say the CPU can get 128 bits of data; he actually said what you did, ncogneto, but merely stressed that the memory controller can serve more than one 64-bit chunk of data at the same time (albeit to different components).

    ~Matisaro~
    "Friends don't let friends buy Pentiums"
    ~Tbird1.3@1.55~
  34. Seems like it will cut down latency and increase memory performance in the end anyway, which of course is a very good thing. Seems like an ideal setup for an SMP memory configuration, doesn't it?

    Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
  35. Well, that's the i810 for the Pentium 4. Smart move; now Intel doesn't have to make a value chipset when ATi has already done their job.

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  36. All this heated battle over the dominant chipset says one thing to me: prices will go down. So be happy there is an opposing argument and that somebody, be it nVidia, SiS, or ATI, is making a chipset to knock the fruity VIAs out of the market.

    Your Signature Sucks
  37. Ah... I see. Then how do you account for the enormous bandwidth capability of the P4 with RDRAM? It's my understanding that the i850 has a very similar design to the nForce (in terms of memory interface). Is it because RDRAM runs at 600 or 800 MHz?

    -MP Jesse

    "Signatures Still Suck"
  38. AMD's implementation of the Alpha EV6 bus is 64-bit: at 133 MHz, 8 bytes wide (8 bytes = 64 bits), and double data rate (DDR, transferring data on both the rising and falling edges), it is simple math: 133 x 8 x 2 = 2,128 MB/s, or about 2.1 GB/s. No matter how many 64-bit or 128-bit buses you have between the memory banks and the northbridge memory controller, the data throughput from the CPU to the northbridge is NOT going to increase without directly increasing the frequency of the FSB (in this case 133 MHz).

    The Pentium 4's system bus is only clocked at 100 MHz and is also 64 bits wide, but it is 'quad-pumped', using the same principle as AGP 4x. Thus it can transfer 8 bytes x 100 million/s x 4 = 3,200 MB/s.
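    The same arithmetic for all three buses, as a quick Python sketch (peak theoretical figures only, using the numbers above):

```python
# Peak bus bandwidth = clock (MHz) x width (bytes) x transfers per clock.
def peak_mb_s(clock_mhz, width_bytes, transfers_per_clock):
    return clock_mhz * width_bytes * transfers_per_clock

ev6 = peak_mb_s(133, 8, 2)           # Athlon EV6 FSB, double-pumped: 2128 MB/s (~2.1 GB/s)
p4_bus = peak_mb_s(100, 8, 4)        # Pentium 4 bus, quad-pumped: 3200 MB/s
twinbank = 2 * peak_mb_s(133, 8, 2)  # nForce TwinBank, two 64-bit DDR channels: 4256 MB/s aggregate
```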




    A little bit of knowledge is a dangerous thing!
  39. <font color=red>you'll get two 64-bit memory interfaces that will allow you to pull up to 4.2 GB/sec total, sending to any component, or even the same component (southbridge, video, CPU).</font color=red>

    Actually, the way I read this, he is in fact saying that you will have 4.2 GB/s of bandwidth to the CPU. If not, then I apologize. The fact is the CPU-to-northbridge bus will never be able to pull more than 2.1 GB/s of bandwidth without increasing the front side bus.

    this is exactly what I said:

    <font color=green>Disabling the integrated video does not supply the CPU with a dual-channel 128-bit memory interface. </font color=green>

    This is proven right here:<A HREF="http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg" target="_new">http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg</A>

    Furthermore, it is my understanding that a true dual memory channel would be like the Alpha processor, which does have a 128-bit CPU-to-northbridge bus. This, I suppose, is open to debate, depending on one's interpretation of the term "true dual memory channel". Yes, in its simplest terms, the nForce is 128-bit DDR, BUT only in the sense that (CPU <-64bit-> MEM) + (AGP <-64bit-> MEM) = 128 bits.

    Now to sum things up: I believe the nForce chipset is a great step forward and shows a lot of promise. The parallel memory channels from the northbridge to the memory banks, along with the crossbar controller, will put a much lower load on the bus than a normal chipset, since the bus isn't shared as much; there are two of them. It will increase the bandwidth available to the CPU, but not to the level of 4.2 GB/s (double), as a lot of people currently think. Also, the crossbar memory controller is suspiciously like the 4-way memory interleave on VIA's chipsets, and I believe this is what VIA is threatening to take the nForce to court over.
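    To illustrate the cap I keep describing, here's a tiny Python sketch (the link speeds are the approximate peak figures from this thread, and the min() is a deliberate oversimplification, not a real model of the chipset):

```python
# Any single consumer is limited by the smaller of (a) its own link into
# the northbridge and (b) what the memory subsystem supplies in aggregate.
MEM_AGGREGATE = 2 * 2128   # two 64-bit PC2100 DDR channels, MB/s

links = {
    "cpu": 2128,           # EV6 FSB at 133 MHz DDR
    "agp4x": 1066,         # AGP 4x port
    "southbridge": 800,    # HyperTransport link
}

def max_pull(consumer):
    return min(links[consumer], MEM_AGGREGATE)

# The CPU alone tops out at ~2.1 GB/s even with ~4.2 GB/s of memory
# bandwidth behind the crossbar; only several consumers together can
# approach the aggregate figure.
```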


    A little bit of knowledge is a dangerous thing!