upgrade video card on old machine... advice?

Archived from groups: alt.comp.periphs.videocards.nvidia

I have an Athlon 700 MHz machine with an AGP 2x 32 MB card.
The machine has 256 MB of RAM, and I am running Windows XP SP2.
The machine occasionally has problems running a DVD and/or
editing photos (lack of available memory).
I don't play games (except some old ones I play from time to
time... which run fine).

I purchased an NVIDIA FX5200 128 MB / AGP 8x video card for $40 new.
I haven't received it yet.
I am thinking that, compared to what I have now, this should help alleviate
my problems until I decide to drop some serious money on a new PC
(sometime in 2006).

Based on my use for this card, and the minimal cost, am I correct in
believing that this should help with my problems?

(I don't have non-essential background programs running... the machine is
pretty clean that way.)

Thanks for any advice/comments!
DGC
  1.

    By the way, my existing video card is an NVIDIA TNT2 32 MB AGP 2x card.

    Thanks for any advice/comments
    Dom
  2.

    McGrandpa,

    My documentation says the motherboard AGP slot supports AGP 66/133 MHz, 3.3V devices.
    Is the 3.3V the rating you refer to (1.5V signaling)?
    If so, will it support it? I don't know whether the number being higher is
    a bad thing or a good thing.

    THX!
    Dom
  3.

    I'm not optimistic that an AGP 3.0 card like the 5200 will be
    backward compatible with your system. Be sure to check the notches in the card
    against the AGP slot to see whether it would fit physically. Don't force it.

    This link may be confusing, but it may give you some idea of what you're
    facing:

    http://www.ertyu.org/~steven_nikkel/agpcompatibility.html

    If you have an original AGP 1.0 slot (supports 1X and 2X), you may need a
    card no newer than a Geforce 4. (I used a GF4 4200 in an AGP 1.0 mainboard,
    successfully.) Unfortunately, they may be hard to find new.

    It seems like a 32 MB TNT2 card ought to be adequate for viewing DVDs and
    photo editing. (More video memory may be required for 3D gaming, which you
    say you don't do.) For example: a 1600x1200-pixel image in 32-bit color
    requires less than 8 MB of memory to display. 32 MB of video memory
    ought to support several layers of buffering on top of that, which means it
    ought to be adequate for all 2D functions. (I know this is a naive way of
    looking at it, but it ought to be basically correct.)
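
    The arithmetic above can be sketched quickly (a rough illustration only, assuming simple uncompressed 32-bit surfaces with no 3D or texture overhead):

```python
# Back-of-the-envelope framebuffer arithmetic for 2D desktop use.
# Assumes 4 bytes per pixel (32-bit color) and plain uncompressed surfaces.

def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Memory needed to hold one full screen surface at the given mode."""
    return width * height * bytes_per_pixel

single = framebuffer_bytes(1600, 1200)
print(single / 2**20)       # ~7.3 MB for one 1600x1200 32-bit surface

# Even front buffer + back buffer + a couple of off-screen surfaces
# still fit comfortably inside a 32 MB card:
print(4 * single / 2**20)   # ~29.3 MB, under the 32 MB on the card
```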

    I didn't use it much for watching DVDs, but my old desktop machine at work
    had a 32 MB TNT2 graphics card, and it played DVDs smoothly. (It was a 1 GHz
    PIII machine, with 512 MB of RAM. It was a fairly nice machine when it was
    new, in the fall of 2001.)

    I suspect that you need more main RAM. 256 MB isn't a lot for XP. One
    symptom of that would be excessive hard disk activity, indicating heavy use
    of virtual memory. That would give jerky DVD playback and delays in editing
    still images.
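
    A rough sketch of why 256 MB gets tight under XP with a photo editor open (all the figures below are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope working-set estimate for photo editing on 256 MB.
# Every figure here is an illustrative assumption, not a measurement.

MB = 2**20
total_ram = 256 * MB

os_and_services = 120 * MB    # assumed XP SP2 baseline plus antivirus
photo = 3000 * 2000 * 4       # one 6-megapixel image at 32-bit: ~23 MB
undo_levels = 4               # editors often keep full-image undo copies
editor_working_set = photo * (1 + undo_levels)

demand = os_and_services + editor_working_set
print(demand / MB)            # ~234 MB: almost no headroom left, so the
                              # pager starts hitting the disk
```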

    HTH.

    Address scrambled. Replace nkbob with bobkn.

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1110849627.755764.158240@f14g2000cwb.googlegroups.com...
    > MCGrandpa,
    >
    > My doc says the motherbd AGP slot is AGP66/133 mhz 3.3v device support.
    > Is the 3.3v the rating you refer to (1.5v signaling)?
    > If so - will it support it? I dont know if the number being higher is
    > a bad thing or good thing?
    >
    > THX!
    > Dom
    >
  4.

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1110813888.329155.213110@g14g2000cwa.googlegroups.com
    > I have an Athlon 700 Mhz machine with AGP 2x 32 MB card.
    > The machine has 256 MB of RAM, and I am running Windows XP - SP2.
    > The machine occassionally will have problems running a DVD and/or
    > editting photos (lack of available memory).
    > I dont play games (except some old ones I have from time to
    > time...which run fine).
    >
    > I purchased a NVIDIA FX5200 128 MB/AGP 8x video card for $40 new.
    > I haven't received it yet.
    > I am thinking compared to what I have now, this should help elieviate
    > my problems until I decide to drop some serious money on a new PC
    > (sometime in 2006).
    >
    > Based on my use for this card, and the minimal cost, am I correct in
    > believing that this should help with my problems?
    >
    > (I dont have non-essential background pgms running...the machine is
    > pretty clean that way).
    >
    > Thanks for any advise/comments!
    > DGC

    The only problem I see is that the FX5200 is an AGP 4x/8x card only.
    Better check to see whether your motherboard can do the 1.5V signalling. If
    the card won't fit easily into the slot, don't try to force it; that
    means it's not meant to work in that older slot.
  5.

    Funny - I went to the XFX website and there is no specific information
    on this.
    I did a Google search on v1 (2x) and v2 (4x) and learned that only
    "universal" cards will be backward compatible.
    I learned that most Radeons are universal-type cards.

    I didn't see anything about the FX cards being backward compatible.
    If this information is actually available somewhere online, I would
    like to see it.

    Any ideas/comments?
  6.

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1110849627.755764.158240@f14g2000cwb.googlegroups.com
    > MCGrandpa,
    >
    > My doc says the motherbd AGP slot is AGP66/133 mhz 3.3v device
    > support. Is the 3.3v the rating you refer to (1.5v signaling)?
    > If so - will it support it? I dont know if the number being higher is
    > a bad thing or good thing?
    >
    > THX!
    > Dom

    No, the FX5900 I have is 1.5V only, and if you were able to put it in
    the 3.3V slot and turn the power on, you'd burn the card and/or mobo.
    There is usually a little warning note about this right on top of
    everything when you open the card's box. I recall reading, however, that
    SOME manufacturers built some of the lower-end cards to support both
    3.3V and 1.5V. But this is something you really have to be careful of.
    So far, what you *know* is that the motherboard's AGP slot is 3.3V only.
    McG.
  7.

    That's great news.
    Thanks again for all your help and suggestions.
    It looks like I am on the right track with this upgrade after all.

    Dom
  8.

    McGrandpa wrote:

    > The only problem I see is that the FX5200 is a AGP 4x/8x card only.
    > Better check to see if your motherboard can do the 1.5v signalling. If
    > the card won't fit easily into the slot, don't try to force it. That means
    > it's not meant to work in that older slot.

    All GeforceFX cards do both 3.3V and 1.5V...

    Benjamin
  9.

    Bob Knowlden wrote:

    > I'm not optimistic that an AGP 3.0 card like the 5200 will be
    > back-compatible with your system.

    That's no problem (I have an FX5200 in an old computer with an AGP 1x slot
    myself, and I know of several FX5600, FX5700 and two FX5900 cards running in
    AGP 2x boards with 3.3V). All Nvidia GeforceFX GPUs do both 3.3V and 1.5V
    signalling. The card manufacturers don't always list AGP 1x/2x or 3.3V
    support, and some (very few!) cards also don't have the AGP 1x/2x coding on
    the AGP connector, but they definitely do support 3.3V...

    The problem is _not_ putting a new card in an old mobo; the real problem
    is that some older cards (pre-Radeon/pre-Geforce era and some Matrox
    G400's) are coded for AGP 4x but don't support the 1.5V signalling level of
    modern mainboards...

    Benjamin
  10.

    Dom wrote:

    > Funny - I went to the XFX website and there is no specific information
    > on this.

    As I said, some manufacturers simply don't mention it. They only mention that
    the card does AGP 8x because that attracts customers. Most manufacturers
    simply don't care about people with old hardware...

    > I did a google on v1 (2x) and v2 (4x) and learned that only
    > "universal" cards will be backward compatible.

    AGP 2.0 means 3.3V/1.5V capable. The GeforceFX series is AGP 2.0...

    > I learned that most RADEONs are universal type cards.
    >
    > I didn't see anywhere about the FX cards being backwards compatible.
    > If this information is actually available somewhere online, I would
    > like to see it.
    >
    > Any ideas/comments?

    Well, for example Asus mentions it:

    From the little Asus FX5200...
    http://www.asus.com/products4.aspx?modelmenu=2&model=269&l1=2&l2=7&l3=9
    ... to the FX5900Ultra:
    http://www.asus.com/products4.aspx?modelmenu=2&model=259&l1=2&l2=7&l3=5

    And as I already wrote, I myself run an FX5200 (ABit Siluro FX5200DT 128MB) in
    an old HP Kayak dual PII with i440LX chipset and AGP 1x, and I have several
    friends/relatives with different FX cards in AGP 2x computers...

    Benjamin
  11.

    Benjamin Gawert wrote:
    > Dom wrote:
    >
    >
    >>Funny - I went to the XFX website and there is no specific information
    >>on this.
    >
    >
    > As I said, some manufacturer simply don't mention it. They only mention that
    > the card does AGP8x because that attracts customers. Most manufacturers
    > simply don't care about people with old hardware...
    >
    >
    >>I did a google on v1 (2x) and v2 (4x) and learned that only
    >>"universal" cards will be backward compatible.
    >
    >
    > AGP 2.0 means 3.3V/1.5V capable. The GeforceFX series is AGP 2.0...
    >
    >
    >>I learned that most RADEONs are universal type cards.
    >>
    >>I didn't see anywhere about the FX cards being backwards compatible.
    >>If this information is actually available somewhere online, I would
    >>like to see it.
    >>
    >>Any ideas/comments?
    >
    >
    > Well, for example Asus mentions it:
    >
    > From the little Asus FX5200...
    > http://www.asus.com/products4.aspx?modelmenu=2&model=269&l1=2&l2=7&l3=9
    > ... to the FX5900Ultra:
    > http://www.asus.com/products4.aspx?modelmenu=2&model=259&l1=2&l2=7&l3=5
    >
    > And as I already wrote I myself run a FX5200 (ABit Siluro FX5200DT 128MB) in
    > an old HP Kayak Dual PII with i440LX chipset and AGP1x, and I have several
    > friends/relatives with different FX cards in AGP2x computers...
    >
    > Benjamin
    >
    >

    I've got a GFX5600 running in a 440LX board also (my ftp server). It's
    only an AGP 1.0-spec motherboard, but the FX functions perfectly fine.
  12.

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:39ob5fF6319q5U1@individual.net
    > Bob Knowlden wrote:
    >
    >> I'm not optimistic that an AGP 3.0 card like the 5200 will be
    >> back-compatible with your system.
    >
    > That's no problem (I have an FX5200 in an old computer w. AGP1x slot
    > for myself, and I know of several FX5600, FX5700 and two FX5900
    > running in AGP2x boards with 3.3V). All Nvidia GeforceFX GPUs do both
    > 3.3V and 1.5V signalling. Not always the card manufacturers also list
    > AGP1x/2x or 3.3V, and some (very few!) cards also don't have the
    > AGP1x/2x coding on the AGP connector, but they definitely do support
    > 3.3V...
    > The problem is _not_ putting a new card in an old mobo, the real
    > problem indeed is that some older cards (pre-Radeon/pre-Geforce aera
    > and some Matrox G400's) are coded for AGP4x but don't support the
    > 1.5V signalling level of modern mainboards...
    >
    > Benjamin

    That's good to know. So... even a 6800 would work in an ABit BE6-II 1.2
    (440BX) mobo? Cool. That means when I put the new fan on this old
    FX5900-128 it will have a good home waiting for it :)
    McG.
  13.

    Impressive. Thanks for the correction.

    I still doubt that the original poster's life will improve with the addition
    of an FX5200 card, even if it runs perfectly. However, even if there's some
    disappointment, $40 (US) can't be too much of a waste, even if the card has
    a 64-bit memory interface.

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:39ob5fF6319q5U1@individual.net...
    > Bob Knowlden wrote:
    >
    >> I'm not optimistic that an AGP 3.0 card like the 5200 will be
    >> back-compatible with your system.
    >
    > That's no problem (I have an FX5200 in an old computer w. AGP1x slot for
    > myself, and I know of several FX5600, FX5700 and two FX5900 running in
    > AGP2x boards with 3.3V). All Nvidia GeforceFX GPUs do both 3.3V and 1.5V
    > signalling. Not always the card manufacturers also list AGP1x/2x or 3.3V,
    > and some (very few!) cards also don't have the AGP1x/2x coding on the AGP
    > connector, but they definitely do support 3.3V...
    >
    > The problem is _not_ putting a new card in an old mobo, the real problem
    > indeed is that some older cards (pre-Radeon/pre-Geforce aera and some
    > Matrox G400's) are coded for AGP4x but don't support the 1.5V signalling
    > level of modern mainboards...
    >
    > Benjamin
    >
    >
  14.

    McGrandpa wrote:
    > "Dom" <dgcam55@yahoo.com> wrote in message
    > news:1110849627.755764.158240@f14g2000cwb.googlegroups.com
    >
    >>MCGrandpa,
    >>
    >>My doc says the motherbd AGP slot is AGP66/133 mhz 3.3v device
    >>support. Is the 3.3v the rating you refer to (1.5v signaling)?
    >>If so - will it support it? I dont know if the number being higher is
    >>a bad thing or good thing?
    >>
    >>THX!
    >>Dom
    >
    >
    > No, the FX5900 I have is 1.5v only, and if you were able to put it in
    > the 3.3v slot and turn the power on, you'd burn the card or/and mobo.
    > There is usually a little warning note about this right on top of
    > everything when you open the cards box. I recall reading however, that
    > SOME manufacturers built some of the lower end cards to support both
    > 3.3v and 1.5v. But this is something you really have to be careful of.
    > So far, what you *know* is that the motherboards AGP slot is 3.3v only.
    > McG.
    >
    >

    As far as I know, most NVIDIA AGP cards are dual 3.3V/1.5V compatible.
    According to the AGP spec, such cards are supposed to have two notches
    in the contact area of the card edge, while cards that support only one
    voltage will have a notch in one position or the other. The idea is that
    the AGP slot on the motherboard can have protrusions in it which will
    block incompatible AGP cards from being physically inserted into the
    slot. The pictures I've seen of FX5900 cards seem to have both notches
    on them. Some 6600GT cards appear to have only the 1.5V notch, however.

    Apparently some old 3.3V-only cards were made which had the wrong
    notches on them, and which can be inserted into a motherboard which
    supports only 1.5V. This apparently can destroy the chipset on the
    motherboard. I've not heard of any cards that supported only 1.5V and
    which weren't notched properly, however.
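
    The notch-and-key rules described above can be modeled as a tiny compatibility check (a toy sketch; the card list is illustrative, based on the examples in this thread):

```python
# Toy model of AGP connector keying, following the notch rules above.
# A card's edge notches indicate which signalling voltages it accepts;
# a slot's key tab physically blocks cards lacking the matching notch.
# Card examples are illustrative, drawn from the discussion in this thread.

CARD_NOTCHES = {
    "TNT2":       {"3.3V"},          # older 3.3V-only card
    "GeForce FX": {"3.3V", "1.5V"},  # dual-notched "universal" card
    "6600GT":     {"1.5V"},          # 1.5V-only notch
}

def fits(card, slot_voltage):
    """A card seats in a slot only if it is notched for that slot's voltage."""
    return slot_voltage in CARD_NOTCHES[card]

print(fits("GeForce FX", "3.3V"))  # True: dual-notched, goes in old slots
print(fits("6600GT", "3.3V"))      # False: blocked by the 3.3V key tab
```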

    --
    Robert Hancock Saskatoon, SK, Canada
    To email, remove "nospam" from hancockr@nospamshaw.ca
    Home Page: http://www.roberthancock.com/
  15.

    "Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
    news:F6NZd.688633$Xk.437492@pd7tw3no
    > McGrandpa wrote:
    >> "Dom" <dgcam55@yahoo.com> wrote in message
    >> news:1110849627.755764.158240@f14g2000cwb.googlegroups.com
    >>
    >>> MCGrandpa,
    >>>
    >>> My doc says the motherbd AGP slot is AGP66/133 mhz 3.3v device
    >>> support. Is the 3.3v the rating you refer to (1.5v signaling)?
    >>> If so - will it support it? I dont know if the number being higher
    >>> is a bad thing or good thing?
    >>>
    >>> THX!
    >>> Dom
    >>
    >>
    >> No, the FX5900 I have is 1.5v only, and if you were able to put it in
    >> the 3.3v slot and turn the power on, you'd burn the card or/and mobo.
    >> There is usually a little warning note about this right on top of
    >> everything when you open the cards box. I recall reading however,
    >> that SOME manufacturers built some of the lower end cards to support
    >> both 3.3v and 1.5v. But this is something you really have to be
    >> careful
    >> of. So far, what you *know* is that the motherboards AGP slot is
    >> 3.3v only. McG.
    >>
    >>
    >
    > As far as I know, most NVIDIA AGP cards are dual 3.3V/1.5V compatible.
    > According to the AGP spec, such cards are supposed to have two notches
    > in the contact area of the card edge, while cards that support only
    > one voltage will have a notch in one position or the other. The idea
    > is that the AGP slot on the motherboard can have protrusions in it
    > which will block incompatible AGP cards from being physically
    > inserted into the slot. The pictures I've seen of FX5900 cards seem
    > to have both notches on them. Some 6600GT cards appear to have only
    > the 1.5V notch, however.
    > Apparently some old 3.3V-only cards were made which had the wrong
    > notches on them, and which can be inserted into a motherboard which
    > supports only 1.5V. This apparently can destroy the chipset on the
    > motherboard. I've not heard of any cards that supported only 1.5V and
    > which weren't notched properly, however.

    I'll keep that in mind, then. Come to think of it, I haven't seen any of
    those warning slips about voltage in the cards' boxes in a while, just in
    every mobo's box that's got a 1.5V-only AGP slot. I've got a few of
    those.
    That the FXs are 'backward compatible' is nice. After getting the
    6800GT I was starting to get a sad face that the ol' $400 FX5900 wouldn't
    get any use, and it's a perfectly good card. I've got several old ABit
    440BX Slot 1 mobos that could use a good vid card. Quake classic,
    anyone? :o)
    It'd sure be easier if the card makers would make that spec easier to
    find.
    McG.
  16.

    McGrandpa wrote:

    > That's good to know. SO...even a 6800 would work in an ABit BE6-II
    > 1.2 (BX440) mobo?

    If I remember right, the 6800 supports AGP 2x/3.3V (but most cards don't have
    the AGP coding anyway, so they won't fit in the AGP slot), but the AGP->PCIe
    bridge chip used on the 6600(GT) models doesn't...

    Benjamin
  17.

    Bob Knowlden wrote:

    > Impressive. Thanks for the correction.
    >
    > I still doubt that the original poster's life will improve with the
    > addition of an FX5200 card, even if it runs perfectly. However, even
    > if there's some disappointment, $40 (US) can't be too much of a
    > waste, even if the card has a 64 bit memory interface.

    Well, the OP didn't mention what card he has in his old Athlon, but the
    FX5200 is really slow. The 128-bit versions are barely faster than a GF4
    MX440, which should be obtainable for half the price or even less...

    I'd never invest $40 (US) in an old Athlon 700 anymore...

    Benjamin
  18.

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:39q2j6F648ab2U1@individual.net
    > McGrandpa wrote:
    >
    >> That's good to know. SO...even a 6800 would work in an ABit BE6-II
    >> 1.2 (BX440) mobo?
    >
    > If I remember right the 6800 supports AGP2x/3.3V (but most cards
    > don't have the AGP coding anyways so they won't fit in the AGP slot),
    > but the AGP->PCIe brigde chip used on the 6600(GT) models doesn't...
    >
    > Benjamin

    Right - hey, it's a total waste of a card to even think of a 6800 in an old
    crate like the BF6 or BE6-II or BH6. The BE6-II has the Ti4600 in it,
    under Linux, and it's doing great. I was concerned that the FX5900
    wouldn't work because of that voltage thing.
    McG.
  19.

    McGrandpa wrote:

    > That the FX's are 'backward compatible' is nice. After getting the
    > 6800GT I was starting to get a sad face that the ol $400 FX5900
    > wouldn't get any use, and it's a perfectly good card. I've got
    > several old ABit 440BX Slot1 mobos that could use a good vid card. Quake
    > classic anyone? :o)

    Well, if I were you I'd think twice about putting an FX5900 in an old i440BX
    system. Not because of the AGP voltage, but simply because the FX5900 is a
    waste in such a system, and you should still get some good money for the
    card on eBay. If I were you I'd sell the card and get something like a GF2
    or Radeon 7500 for the BX machine...

    Benjamin
  20.

    Dom wrote:

    > thats great news.
    > Thanks again for all your help and suggestions.
    > It looks like I am on the right track with this upgrade afterall.

    But I'd think twice! The FX5200 works in your old system, but it's still
    an ultra-slow card. If I were you I'd go for something like an ATI Radeon
    8500/9000/9200, which should be available in the same price range and offers
    much better performance...

    The only reason I'd prefer an FX5200 over one of these ATI cards is
    if I planned to run Linux...

    Benjamin
  21.

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:39r3psF66bdi6U1@individual.net
    > McGrandpa wrote:
    >
    >> That the FX's are 'backward compatible' is nice. After getting the
    >> 6800GT I was starting to get a sad face that the ol $400 FX5900
    >> wouldn't get any use, and it's a perfectly good card. I've got
    >> several old ABit 440BX Slot1 mobos that could use a good vid card.
    >> Quake classic anyone? :o)
    >
    > Well, if I were You I'd think twice about putting a FX5900 in an old
    > i440BX system. Not because of the AGP voltage, but simply because the
    > FX5900 is a waste in such a system, and You should still get some
    > good money for the card on ebay. If I were You I'd sell the card and
    > get something like a GF2 or Radeon 7500 for the BX machine...
    >
    > Benjamin

    Right. Or perhaps a newer mobo for one of my daughters. I realize it'd
    be a waste in one of the old Slot 1 boards. :)
    McG.
  22.

    I have an NVIDIA Riva TNT2 32 MB used with Win XP & 256 MB RAM (the gfx
    card originally came with the machine back in 2000).
    I ordered the AGP version of the FX5200 (not the PCI), and people are
    stating that this card will work fine in my Athlon 700. I hope this
    is the case (Benjamin & McGrandpa note this).
    If not, I would like to understand why.

    Others are stating that the FX5200 may be slow - I don't play any of the
    new games on this old PC. I only play old games and use office apps,
    which shouldn't be too much of a problem for the new card, given that
    my existing card is so old.
    I do use photo editing software, and watch some video - and that is
    where I am really running into memory problems - and the reason why I
    am seeking to upgrade the card, hoping that the bump in
    memory and a little speed (compared to what I have now) will work.

    Personally, I don't think spending $40 is all that bad if I can squeeze
    out another year or two on the PC.
    To me, if this addresses the problem, I think it's quite a value
    considering what I use the PC for.

    I don't want to dump money on a new machine until
    later in 2006. The big change to 64-bit is slated for that time... and
    when buying new, I will want to get the most I can afford for my $.
  23.

    On 17 Mar 2005 03:47:26 -0800, "Dom" <dgcam55@yahoo.com> wrote:

    >I have a nvidia Riva TNT2 32mb used with Win XP & 256 mb ram (the gfx
    >card originally came with the machine back in 2000).
    >I ordered the AGP version of the FX5200 (not the PCI), and people are
    >stating that this card will work fine in my athlon 700. I hope this
    >is the case (Benjamin & McGrandpa make notes to this).
    >If not, I would like to understand why.
    >
    >Others are stating that the FX5200 may be slow - I dont play any of the
    >new games on this old pc. I only play old games and use office apps -
    >which shouldn't be too much of a problem for the new card - being that
    >my existing card is so old.
    >I do use photo editing software, and watch some video - and that is
    >where I am really running into memory problems - and - the reason why I
    >am seeking to upgrade the card hoping that the bump in
    >memory and alittle speed (compared to what I have now) will work.
    >
    >Personally, I dont think spending $40 is all that bad if I can squeak
    >out another year or two on the pc.
    >To me, if this addressed the problem, I think its quite a value
    >considering what I use the pc for.
    >
    >I don't want to dump money for a new machine until
    >later in 2006. The big change to 64 bit is slated for that time...and
    >when buying new, will want to get the most I can afford for my $.


    Make sure you're getting the 128-bit memory bus version. The
    64-bit version may be faster than an old TNT2, but not by much. The
    128-bit version will get you near GF3 Ti200 performance, but since you're
    still using an old Athlon 700, you may not get the best the card could
    push.
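
    The 64-bit vs. 128-bit difference is straightforward bandwidth arithmetic (the memory clock below is an assumed typical FX5200 figure, not a quoted spec):

```python
# Peak memory bandwidth = bus width (in bytes) x effective memory clock.
# Assumes a typical FX5200 effective memory clock of 400 MHz (200 MHz DDR);
# actual boards vary, so treat these numbers as illustrative only.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a given bus width and clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

narrow = bandwidth_gb_s(64, 400)   # 64-bit version:  3.2 GB/s peak
wide   = bandwidth_gb_s(128, 400)  # 128-bit version: 6.4 GB/s peak
print(narrow, wide)                # the wider bus doubles peak bandwidth
```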
  24.

    Dom wrote:

    > I have a nvidia Riva TNT2 32mb used with Win XP & 256 mb ram (the gfx
    > card originally came with the machine back in 2000).
    > I ordered the AGP version of the FX5200 (not the PCI), and people are
    > stating that this card will work fine in my athlon 700. I hope this
    > is the case (Benjamin & McGrandpa make notes to this).
    > If not, I would like to understand why.
    >
    > Others are stating that the FX5200 may be slow - I dont play any of the
    > new games on this old pc. I only play old games and use office apps -
    > which shouldn't be too much of a problem for the new card - being that
    > my existing card is so old.
    > I do use photo editing software, and watch some video - and that is
    > where I am really running into memory problems - and - the reason why I
    > am seeking to upgrade the card hoping that the bump in
    > memory and alittle speed (compared to what I have now) will work.
    >
    > Personally, I dont think spending $40 is all that bad if I can squeak
    > out another year or two on the pc.
    > To me, if this addressed the problem, I think its quite a value
    > considering what I use the pc for.

    The only things you're going to get out of the 5200 for what you're
    describing would be DXVA (DirectX Video Acceleration) and possibly the
    ability to run at higher resolutions and refresh rates, and for that it's
    probably worthwhile. If you're running out of memory, that's more likely
    your 256 MB in the machine than the video memory.

    > I don't want to dump money for a new machine until
    > later in 2006. The big change to 64 bit is slated for that time...

    ????? The "big change to 64 bit" happened over a year ago. Are you talking
    about the release of Microsoft's 64-bit OS? That has been out for many
    years for the Itanic, and the AMD64 version has been available as a free
    beta for quite some time.

    > and
    > when buying new, will want to get the most I can afford for my $.

    --
    --John
    to email, dial "usenet" and validate
    (was jclarke at eye bee em dot net)
  25.

    > If you're running out of memory, that's more likely
    > your 256 meg in the machine than the video memory.

    John, a question about your comment above. By having only 32 MB of video
    memory (currently), and working with photo editing software, won't I get
    some relief from memory issues by upping to 128 MB? PhotoImpact 7,
    Norton AV, and XP are all that is running on this 256 MB
    machine.
  26.

    Dom wrote:

    > If you're running out of memory, that's more likely
    > your 256 meg in the machine than the video memory.
    >
    > John, question about your comment above. By having only 32 mb of video
    > memory (currently), and working with photoediting software, by upping
    > to 128 mb, wont I get some relief from memory issues at all? 256 mb
    > (photoimpact 7), norton AV, and xp is all that is running on this
    > machine.

    With photo editing software, the RAM on the video board affects your color
    depth and your screen resolution and nothing else.

    32 MB is sufficient for any monitor on the market under $2000 to display
    its max resolution in 32-bit color.

    A newer board may give you a sharper display, but for photo editing that's
    about it.

    Put another stick of RAM in and you'll see a good deal more benefit.

    --
    --John
    to email, dial "usenet" and validate
    (was jclarke at eye bee em dot net)
  27.

    Well... I got the card today and installed it.
    It was a snap to install, and it seems to be running fine.
    There is "some" relief in the photo editing - but not much. Enough to
    keep me satisfied for another year or so until I am ready to buy a new
    machine.
    Increasing my system's RAM beyond 256 MB would probably be most
    beneficial - but I attempted to do that a year or so ago and had some
    wacky system issues with the RAM in the lower two slots (also two 128 MB
    sticks). I wound up returning it because I didn't want to deal with it.
    I spent numerous days trying to figure out the problem and just gave
    up.


    That being the case, I returned the RAM and didn't want to deal with it again.
  28.

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1111060045.972649.280140@f14g2000cwb.googlegroups.com
    > I have a nvidia Riva TNT2 32mb used with Win XP & 256 mb ram (the gfx
    > card originally came with the machine back in 2000).
    > I ordered the AGP version of the FX5200 (not the PCI), and people are
    > stating that this card will work fine in my athlon 700. I hope this
    > is the case (Benjamin & McGrandpa make notes to this).
    > If not, I would like to understand why.
    >
    > Others are stating that the FX5200 may be slow - I don't play any of
    > the new games on this old pc. I only play old games and use office
    > apps - which shouldn't be too much of a problem for the new card -
    > being that my existing card is so old.
    > I do use photo editing software, and watch some video - and that is
    > where I am really running into memory problems - and - the reason why
    > I am seeking to upgrade the card, hoping that the bump in
    > memory and a little speed (compared to what I have now) will work.
    >
    > Personally, I don't think spending $40 is all that bad if I can squeak
    > out another year or two on the pc.
    > To me, if this addresses the problem, I think it's quite a value
    > considering what I use the pc for.
    >
    > I don't want to dump money for a new machine until
    > later in 2006. The big change to 64-bit is slated for that time...and
    > when buying new, will want to get the most I can afford for my $.

    Well, like Benjamin has been saying, the FX's are 'backward compatible'
    with the older 1x/2x AGP slots. For any 2D work (desktop stuff,
    imaging, etc.) you should see some improvement in speed and perhaps even
    quality. I ran Quake classic for over a year on a Diamond Viper
    V770Ultra... a TnT2 Ultra card with 32 megs. I think anything in the FX
    series would be an improvement over the old TnT2 cards.
    Don't worry, I've been busy learning a couple of things new to me too :)
    McG.
  29. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    There was no such warning sheet on the top of the bag.
  30. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    I am sorry... ignore my prior reply... you said taped to the MAIN board
    - not the vid board.
  31. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:39ob5fF6319q5U1@individual.net
    > Bob Knowlden wrote:
    >
    >> I'm not optimistic that an AGP 3.0 card like the 5200 will be
    >> back-compatible with your system.
    >
    > That's no problem (I have an FX5200 in an old computer w. an AGP1x
    > slot myself, and I know of several FX5600, FX5700 and two FX5900
    > cards running in AGP2x boards with 3.3V). All Nvidia GeforceFX GPUs do
    > both 3.3V and 1.5V signalling. The card manufacturers don't always
    > list AGP1x/2x or 3.3V, and some (very few!) cards also don't have the
    > AGP1x/2x coding on the AGP connector, but they definitely do support
    > 3.3V...
    > The problem is _not_ putting a new card in an old mobo; the real
    > problem is that some older cards (pre-Radeon/pre-Geforce era
    > and some Matrox G400's) are coded for AGP4x but don't support the
    > 1.5V signalling level of modern mainboards...
    >
    > Benjamin

    Hence the warning sheet taped to the top of the bag of every main board
    I've bought that has a 1.5v-only AGP slot. They simply warn that the vid
    card must be capable of 1.5v operation.
    McG.
  32. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1111193404.807247.256830@l41g2000cwc.googlegroups.com
    > There was no such warning sheet on the top of the bag.

    Is this a newer 1.5v only motherboard?
  33. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    "Dom" <dgcam55@yahoo.com> wrote in message
    news:1111193523.229606.234570@z14g2000cwz.googlegroups.com
    > I am sorry....ignore my prior reply...you said taped to the MAIN board
    > - not vid board.

    Yes, right. I just read one of the sheets out of one of my Gigabyte
    mobos; it does warn that if you use a 3.3v vid card in the 1.5v AGP
    slot, it will burn the motherboard. But even the old Diamond Viper
    V770Ultra had a jumper to switch between AGP 2X (3.3v) and AGP 4X
    (1.5v). You just had to make sure it was switched.

    Thing is, the AGP 3.0 specification does not require backward support.
    http://members.datafast.net.au/dft0802/specs/agp30.pdf

    Checking the websites at nVidia and eVGA, I can't find even a clue that
    an AGP FX card will work in a 3.3v slot, which would be exclusively AGP
    1.0 spec. In my eVGA User's Guide there is a tiny bit of useful info:

    "AGP 2X/4X: An AGP 2.0 (or higher) compliant motherboard.

    Note:
    Some motherboards violate the AGP specification and therefore some cards
    may not physically fit in some systems.

    All AGP chipsets require special drivers to be installed in order to
    make AGP function correctly. These drivers are available from your
    motherboard or motherboard chipset manufacturer.

    AGP 8X cards require an AGP 3.0 bus slot to function at maximum
    capacity, however they are backwards compatible and will function in an
    AGP 2.0 (4X) bus slot. AGP 4X/8X cards will operate at 1.5v."


    What they don't state is that the AGP 3.0 spec calls for 0.8v
    signalling. The AGP 3.0 spec does call for backward compatibility with
    the AGP 2.0 spec, i.e. 1.5v signalling. From all I can find to read
    tonight, it appears that only AGP 1.0 (AGP 1X) is 3.3v. So a card or
    board that is AGP 1X/2X (if they made one?) would support both 3.3v and
    1.5v. But what I recall seeing even on my Viper V770Ultra is 2X/4X.
    The big thing is whether the motherboard is AGP 2.0 spec or not. If it
    is, then the FX will work. If it's AGP 1.0, then I'd think no, it
    wouldn't.
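
    The voltage logic we've been circling can be sketched in a few lines of
    Python. This is just my own illustration - the voltage sets per AGP
    revision are my reading of the spec discussion above, not an official
    table:

```python
# Rough sketch of AGP signaling-voltage compatibility, as discussed above.
# Assumed mapping (my summary, not from an official table): AGP 1.0 is
# 3.3v only, AGP 2.0 adds 1.5v, AGP 3.0 uses 0.8v data signaling on a
# 1.5v connector.
AGP_SLOT_VOLTAGES = {
    "1.0": {3.3},
    "2.0": {3.3, 1.5},
    "3.0": {1.5, 0.8},
}

def card_fits_slot(card_voltages, slot_voltages):
    """A card works if it shares at least one signaling voltage with the slot."""
    return bool(set(card_voltages) & set(slot_voltages))

# A GeForce FX that does both 3.3v and 1.5v fits even an AGP 1.0 slot:
geforce_fx = {3.3, 1.5}
print(card_fits_slot(geforce_fx, AGP_SLOT_VOLTAGES["1.0"]))  # True
# A hypothetical 1.5v-only card would not:
print(card_fits_slot({1.5}, AGP_SLOT_VOLTAGES["1.0"]))       # False
```

    Which matches what Benjamin has been saying: the danger case is a card
    that only does one voltage meeting a slot that only does the other.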

    I hope some of this is useful to you.
    McG.
  34. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    McGrandpa wrote:
    > "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    >
    >> The problem is _not_ putting a new card in an old mobo, the real
    >> problem indeed is that some older cards (pre-Radeon/pre-Geforce aera
    >> and some Matrox G400's) are coded for AGP4x but don't support the
    >> 1.5V signalling level of modern mainboards...
    >>
    >> Benjamin
    >
    > Hence, the warning sheet taped to the top of the bag of every main
    > board I've bought that has 1.5v only AGP slot. They simply warn that
    > the vid card must be capable of 1.5v operation.

    Of course. Please re-read what I wrote. The warning sticker on
    _mainboards_ is there because modern boards can't run _old_ cards. But
    the problem this thread is about is a _modern_ gfx card in an _old_
    mainboard. And unlike modern mainboards, which only do 1.5V, modern gfx
    cards do both 1.5V and 3.3V...

    Benjamin
  35. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    McGrandpa wrote:

    > I have, and see that the thread states FX will work in an AGP version
    > 1.0 slot, which thread also states the FX will work with 0.8v, 1.5v
    > and 3.3v. Thread also says you and others are using FX's in AGP 1.0
    > slots. I think I'll quit offering suggestions and advice on this issue
    > entirely, until I'm satisfied I know what I need to about the
    > specification AND the actual AGP specs of the motherboards themselves.
    > I have three ABit 440BX Slot1 mobos in here right now, and the only
    > spec I see concerning AGP is "has 1 AGP slot". Likely it's the
    > first AGP spec ;) and that's why it doesn't say which version AGP.
    > Anyway, the technical details aren't the issue, it's whether the newer
    > card works ok in the older motherboard. And they do. Even on my end
    > they have to, because one of my own machines is running a newer
    > Radeon, another is running a Ti4600. These boards have just the
    > plain "AGP" slot.
    >
    > So why did/am I going on about it? I just don't like not *knowing*
    > some of the things that turn out to be important that board makers
    > don't tell us. Sorry to be "Figuring it out" in public.

    It's not you who is to blame, it's the card manufacturers and especially
    the GPU manufacturers like ATI and Nvidia, which don't offer any
    information about this issue to the public. For them, old systems simply
    aren't important because they want you to buy the newest and latest.
    They don't care about people who still run an old Athlon or P3 with
    AGP2x. These types of customers must be a horror for them, because they
    keep their hardware longer than average and don't follow all the hype
    they create.

    The only good resource is the technical documentation from Nvidia,
    which sadly isn't available to everyone. We have it at work because we
    use Nvidia chips for some of our applications. And these spec sheets say
    that the FX line can do both 1.5V and 3.3V.

    Benjamin
  36. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    "Benjamin Gawert" <bgawert@gmx.de> wrote in message
    news:3a50hkF689tvvU1@individual.net
    > McGrandpa wrote:
    >
    >> I have, and see that the thread states FX will work in an AGP version
    >> 1.0 slot, which thread also states the FX will work with 0.8v, 1.5v
    >> and 3.3v. Thread also says you and others are using FX's in AGP 1.0
    >> slots. I think I'll quit offering suggestions and advice on this
    >> issue entirely, until I'm satisfied I know what I need to about the
    >> specification AND the actual AGP specs of the motherboards
    >> themselves. I have three ABit 440BX Slot1 mobos in here right now,
    >> and the only spec I see concerning AGP is "has 1 AGP slot". Likely
    >> it's the first AGP spec ;) and that's why it doesn't say which
    >> version AGP. Anyway, the technical details aren't the issue, it's
    >> whether the newer card works ok in the older motherboard. And they
    >> do. Even on my end they have to, because one of my own machines is
    >> running a newer Radeon, another is running a Ti4600. These boards
    >> have just the plain "AGP" slot.
    >>
    >> So why did/am I going on about it? I just don't like not *knowing*
    >> some of the things that turn out to be important that board makers
    >> don't tell us. Sorry to be "Figuring it out" in public.
    >
    > It's not you who is to blame, it's the card manufacturers and
    > especially the GPU manufacturers like ATI and Nvidia, which don't
    > offer any information about this issue to the public. For them, old
    > systems simply aren't important because they want you to buy the
    > newest and latest. They don't care about people who still run an old
    > Athlon or P3 with AGP2x. These types of customers must be a horror
    > for them, because they keep their hardware longer than average and
    > don't follow all the hype they create.

    Yeah, but the thing is, someone is going to be using the old hardware.
    If I have something that sits around long enough, I'll give it to
    someone who can use it. I have enough 'junk' in the other room to build
    4 or 5 boxes. I'll do a couple for the kids and keep enough later-model
    items for spares. But it's time to clean out the old stuff, and it
    works, so I won't put it in the trash.
    >
    > The only good resource is the technical documentation from Nvidia,
    > which sadly isn't available to everyone. We have it at work because
    > we use Nvidia chips for some of our applications. And these spec
    > sheets say that the FX line can do both 1.5V and 3.3V.
    >
    > Benjamin

    Cool. It'll work out. You used to be able to hunt around on Nvidia's
    site and get the GPU specs, but not now. Have to ask for them I guess :)
    I've found white papers on ATI GPUs and boards. And I have the stuff
    on most of my mobos. It's good to know. Thanks for the info.

    I've enjoyed this discussion Benjamin, you have a fine weekend :)
    McG.
  37. Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

    McGrandpa wrote:

    > Yeah, but thing is, someone is going to be using the old hardware. If I
    > have something that sits around long enough, I'll give it to
    > someone who can use it. I have enough 'junk' in the other room to
    > build 4 or 5 boxes. I'll do a couple for the kids and keep enough
    > later model items for spares. But it's time to clean out the old
    > stuff, and it works, so I won't put it in the trash.

    Same here. I still have some older computers here, like the mentioned P2
    system (an HP Kayak XU with 2x PII-333 and 512MB running NT4). Heck, we
    still use computers dating back to 1972 at work ;-)

    > Cool. It'll work out. Used to, you could hunt around on Nvidias site
    > and get the GPU specs, but not now. Have to ask for them I guess :)

    I might be wrong, but if I remember right Nvidia had the tech docs for
    download back in the Riva128 days. Don't know why they removed them...

    > I've found white papers on ATI GPU's and boards. And I have the
    > stuff on most of my mobos. It's good to know. Thanks for the info.

    IMHO the most open manufacturer in this respect is Intel. You can get a
    sh*tload of information and tech docs from their website about almost
    everything they make or have made, i.e. CPUs, chipsets, network
    adapters, etc.

    > I've enjoyed this discussion Benjamin, you have a fine weekend :)

    I enjoyed that, too. Have a good time!

    Regards from Bavaria/Germany

    Benjamin