
How much better?

Last response: in Graphics & Displays
July 7, 2006 8:42:16 PM

Hi,
How much better is the eVGA E-GEFORCE 7900GT 256M DDR3 PCI-E
in comparison to the ATI Rad. X1600 Pro 512MB PCI-E Sapphire? Is it really worth spending double the price? Perhaps nVidia will have GeForce 8 GPUs and it would be better to wait for those.
Thanks,
Matt


July 7, 2006 8:57:42 PM

Quote:
Hi,
How much better is the eVGA E-GEFORCE 7900GT 256M DDR3 PCI-E
in comparison to the ATI Rad. X1600 Pro 512MB PCI-E Sapphire? Is it really worth spending double the price? Perhaps nVidia will have GeForce 8 GPUs and it would be better to wait for those.
Thanks,
Matt


No personal experience with that specific x1600. Based on the numbers and what I'm told from x1600 users, I'd probably try the GeForce 7600GT before trying that one. The 7900GT is much better--again, the x1600 is supposed to compete with the GeForce 7600 line not the 7900 line.

If you prefer ATI, go with the x1800GTO and that'll be a great deal for the money--it's a tad higher than the x1600 but it has consistently high performance. Read Cleeve's sticky.
July 7, 2006 8:58:05 PM

The 7900GT eats the X1600 for breakfast, lunch, and dinner at the same time. If you're on a budget and you want to upgrade to a next-gen card, a 7600GT would be perfect for you.

7900GT is a lot better than a X1600 and worth getting if you can spend some money.
July 7, 2006 8:58:15 PM

>>How much better is the eVGA E-GEFORCE 7900GT 256M DDR3 PCI-E
in comparision to the ATI Rad. X1600 Pro 512MB PCI-E Sapphire.<<

For starters, the 7900GT uses DDR3 and is a 256-bit system. The ATI X1600 Pro uses DDR2 and is a 128-bit system. Not to mention the 24 pipes the 7900GT has. That's the difference.
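The bus-width point above can be put in rough numbers. A minimal back-of-envelope sketch (the memory clocks below are assumed typical retail values for these cards, not quoted from a spec sheet):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# The clock figures used below are assumptions, not spec-sheet values.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# 7900GT:    256-bit bus, ~1320 MHz effective GDDR3 (assumed)
# X1600 Pro: 128-bit bus,  ~780 MHz effective DDR2  (assumed)
print(f"7900GT:    ~{bandwidth_gb_s(256, 1320):.1f} GB/s")
print(f"X1600 Pro: ~{bandwidth_gb_s(128, 780):.1f} GB/s")
```

Under those assumed clocks, the wider bus alone gives the 7900GT more than triple the memory bandwidth, before even counting its 24 pipes.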
July 7, 2006 8:58:57 PM

You mind telling me if there's a difference between DDR2 and DDR3 on a video card?


:roll:
July 7, 2006 8:59:29 PM

That's budget vs. entry-level high end. You can't even begin to compare the two cards. ATi's comparable cards to the 7900GT are the X1800XT and X1900GT. Nv's comparable cards to the X1600 Pro/XT are the 7600GS and 6600GT.

If you are extremely budget limited, get an X800 or 7600GS. Better yet, spend a little more on a 7600GT. If you can afford it, the X1800XT and 7900GT offer a lot of performance for the money. And with the X1900XT down around $330 AR, it is soon near that $300 range where it will without question rule.

It just depends on your games, resolution/detail requirements, budget and system specs, but without question there should be better choices for you than a 512MB X1600 Pro.
July 7, 2006 9:06:54 PM

I finally won a 7800GT over at Estupid. LOL! Anyway, I'll be using two 7800GTs at 470/1100 sometime next week. I know you planned on going SLI with the EVGA 7800GT you have and ran into the same problem as I did. No stock at EVGA.

I'll be using the two EVGA 7600GTs at 580/1500 in a socket 754 SLI rig with the EPoX board I'm building. I'm using the socket 754 Clawhammer 3400 at 2.2 GHz with 1 MB of L2 cache on this computer. With this MSI Platinum NF4 board and Corsair XMS RAM, the processor doesn't want to run stable when I OC it. I think it's the RAM with this board. I was hoping to bump it to the 2.4/2.6 range with the SLI setup on the 754 EPoX board. Anyway, thanks for your posts. Very informative.
July 7, 2006 9:48:00 PM

Quote:
...7900GT is a lot better than a 7900GT and worth getting if you can spend some money.


Might want to fix that typo, prozac... :lol:  :lol:  :lol: 

Also, the x1800gto is just above the x1600 in price--sure the x1800XT is awesome, but--heck, just look at Cleeve's sticky like I suggested before: Click here for Cleeve's thread
July 7, 2006 10:01:17 PM

My bad. :oops: 


:lol:  :lol:  :lol:  :lol:  :lol:  :lol:  :lol: 
July 7, 2006 11:07:19 PM

Quick ? for ya. Is the X1800GTO a lot better than the X1800GTO? :tongue: :lol: 



*just couldn't resist*
July 7, 2006 11:13:54 PM

Quote:
You mind telling me if there's a diference betwenn DDR2 and DDR3 on a video card?


:roll:


Just off the top of my head I'd like to say, DDR III, likely to be called DDR III SDRAM (Double Data Rate Three Synchronous Dynamic Random Access Memory), is the name of the new DDR memory standard that is being developed as the successor to DDR2 SDRAM.

The memory comes with a promise of a power consumption reduction of 40% compared to current commercial DDR2 modules, due to DDR III's 90 nanometer fabrication technology, allowing for lower operating currents and voltages (1.5V, compared to DDR2's 1.8V or DDR's 2.5V). "Dual-gate" transistors will be used to reduce leakage current.

DDR3 prefetch buffer width is 8 bit, whereas DDR2 is 4 bit, and DDR is 2 bit.

Theoretically, these modules could transfer data at the effective clockrate of 400-800 MHz (for a single clock bandwidth of 800-1600 MHz), compared to DDR2's current range of 200-533 MHz (400-1066 MHz) or DDR's range of 100-300 MHz (200-600 MHz). To date, such bandwidth requirements have been mainly on the graphics market, where vast transfer of information between framebuffers is required.
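The clock ranges quoted above all come down to prefetch depth: the effective transfer rate is the memory core clock multiplied by the prefetch (data is clocked out double-data-rate on a multiplied I/O clock). A minimal sketch of that relationship:

```python
# Effective transfer rate = core clock x prefetch depth. Each internal access
# fetches PREFETCH bits per data pin, which are then streamed out at the
# higher I/O rate -- that's how DDR3 reaches high rates from a modest core.
PREFETCH = {"DDR": 2, "DDR2": 4, "DDR3": 8}

def effective_mt_s(core_clock_mhz, generation):
    """Effective transfer rate in MT/s for a given memory core clock."""
    return core_clock_mhz * PREFETCH[generation]

# The same 200 MHz memory core across the three generations:
for gen in ("DDR", "DDR2", "DDR3"):
    print(gen, effective_mt_s(200, gen))  # 400, 800, 1600
```

So DDR3 doubles the effective rate over DDR2 at the same core clock, exactly as the 8-bit vs. 4-bit prefetch figures above suggest.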
July 7, 2006 11:19:35 PM

Quote:
You mind telling me if there's a diference betwenn DDR2 and DDR3 on a video card?


:roll:


Just off the top of my head I'd like to say, DDR III, likely to be called DDR III SDRAM (Double Data Rate Three Synchronous Dynamic Random Access Memory), is the name of the new DDR memory standard that is being developed as the successor to DDR2 SDRAM.

The memory comes with a promise of a power consumption reduction of 40% compared to current commercial DDR2 modules, due to DDR III's 90 nanometer fabrication technology, allowing for lower operating currents and voltages (1.5V, compared to DDR2's 1.8V or DDR's 2.5V). "Dual-gate" transistors will be used to reduce leakage current.

DDR3 prefetch buffer width is 8 bit, whereas DDR2 is 4 bit, and DDR is 2 bit.

Theoretically, these modules could transfer data at the effective clockrate of 400-800 MHz (for a single clock bandwidth of 800-1600 MHz), compared to DDR2's current range of 200-533 MHz (400-1066 MHz) or DDR's range of 100-300 MHz (200-600 MHz). To date, such bandwidth requirements have been mainly on the graphics market, where vast transfer of information between framebuffers is required.

:roll: Just off the top of your head or googled and quoted from Wikipedia? :oops:  4U
July 7, 2006 11:51:40 PM

>>Just off the top of your head or googled and quoted from Wikipedia? 4U<<

I'm just happy to have a few video cards that are 256-bit, 256 MB DDR3 with a core in the 400 to 500 range. When I got my new ASUS 5950 Ultra (running in this machine as I write, a hot card with three fans on it), I thought the picture quality was phenomenal compared to the FX5200 I had replaced. Granted the 5950 Ultra has 256 MB of plain DDR, it does have a 475 core and 800 MHz DDR. The 7800GT was a huge step up from the ASUS GeForce 5950. Funny, the ASUS doesn't do well on my son's WoW and Counter-Strike. In fact, last time it wouldn't even load up Counter-Strike, if I remember right. I took it out of his machine and he uses an ATI X800GTO. With the 7800GT, the boost in performance of going from 475/800 DDR to 470/1100 DDR3 is quite a lot.

Now I'll be taking my 7800GT and pairing it with another and putting it in my main rig, the AMD 4800X2. That means I'll have to buy a new PCIE card to replace the 7800GT in my Pentium D 805 Smithfield micro build I did recently. I have to check my space in the micro case, Aspire X-Q Pack, but I really want to get an ATI x1800xt. I doubt it will fit in there. If not, I'm thinking X800GTO only because it will fit and that's the best ATI card I see that will fit. Any thoughts on this scenario? Thanks.

The reason I have so many computers, i.e. graphics cards, I take care of every day is we have two offices and home. Three wireless networks, and we are planning on buying expensive digital, graphics-intensive skull x-ray equipment next year. Here is a short list of the video cards I am running every day. Nobody to help me!

7800gt
7600gt
5950 ultra
x800gto
6600 GT
ATI radeon 9200
5700 LE
9600XT
some GeForce 4000 Diamond crapper in the oldest of our dozen or so computers running
Notebook with 9600/9700 ATI Mobility (ATI has never made an updated driver for this!)
Notebook with onboard graphics that steals my RAM and I don't care

I keep a retired 5200FX to troubleshoot with. Oh and there is my Matrox with 4 megs of RAM in my Pentium 100.
I also have a Gateway 450 mhz with a graphics card and for the life of me, I can't remember what that might be.

My background is not in computers, but I find myself figuring them out all day long. This forum gives me some great insight and a place to find answers.
July 7, 2006 11:55:33 PM

Quote:
:roll: Just off the top of your head or googled and quoted from Wikipedia? :oops:  4U


LOL!

But be nice, after all he's only a wee ba-be! The other other White meat!

And the funny thing is, shouldn't he really have said:

"GDDR3 (Graphics Double Data Rate, version 3) is a graphics card-specific memory technology, designed by ATI Technologies.

It has much the same technological base as DDR2, but the power and heat dispersal requirements have been reduced somewhat, allowing for higher-speed memory modules, and simplified cooling systems. Unlike the DDR2 used on graphics cards, GDDR3 is unrelated to the upcoming JEDEC DDR3 specification. This memory uses internal terminators, enabling it to better handle certain graphics demands. To improve bandwidth, GDDR3 memory transfers 4 bits of data per pin in 2 clock cycles.

Despite being designed by ATI, the first card to use the technology was nVidia's GeForce FX 5700 Ultra, where it replaced the DDR2 modules used up to that time. The next card to use GDDR3 was nVidia's GeForce 6800 Ultra, where it was key in maintaining reasonable power requirements compared to the card's predecessor, the GeForce 5950 Ultra. ATI began using the memory on its Radeon X800 cards. GDDR3 is Sony's choice for the PlayStation 3 gaming console's graphics processor, although the main system memory will be comprised of XDR DRAM. Microsoft's Xbox 360 is also shipped with 512 MB of GDDR3 memory, and is helping to pioneer the use of this memory as standard system memory rather than only video memory.

DDR3 memory is a different technology from GDDR3."


http://en.wikipedia.org/wiki/GDDR3

Because as we ALL know, there is no current graphics card shipping with DDRIII on it, only GDDR3.

:twisted:
July 8, 2006 12:11:16 AM

Quote:
Hi,
How much better is the eVGA E-GEFORCE 7900GT 256M DDR3 PCI-E
in comparision to the ATI Rad. X1600 Pro 512MB PCI-E Sapphire.


I'd say it's about 2.140827 x 10^0 times better.

But that's just a rough estimate. :mrgreen:
July 8, 2006 12:15:59 AM

Does this mean buying a video card with DDR3 is out of the question if one with DDR2 is available? Just as good in a work environment, no need for DDR3 in graphics-intensive software like Dolphin?

I have read your recent posts and you certainly have an inside track to modern software and new technology. Have you heard of this technology? I will be at the helm with this software technology and the hardware to obtain the cephalometric x-ray input. I have one friend capable of knowing how to use (I use that term loosely) and maintain it. I will have to learn.

Dolphin Imaging 10

http://www.dolphinimaging.com/new_site/home.html
July 8, 2006 12:26:16 AM

:lol:  LOL, yes that would have been the better response.

Quote:
Because as we ALL know, there is no current graphics card shipping with DDRIII on it, only GDDR3.

Of course, common knowledge, we all know that. Just off the top of my head for starters: "DDR2 and DDR3 memories used on video cards have different characteristics than the DDR2 and DDR3 memories used on the PC – especially the voltage. That’s the reason they are called GDDR2 and GDDR3 (the “G” comes from “Graphics”).

GDDR2 memories continue to work at 2.5 V. Since they run at higher clock rates compared to DDR memories, they generate more heat. This is the reason why only a few video cards used GDDR2 memories – only GeForce FX 5700 Ultra and GeForce FX 5800 Ultra used this kind of memory. Shortly after GeForce FX 5700 Ultra was released many video card manufacturers released a GeForce FX 5700 Ultra using GDDR3 memories, maybe to lower the heat and power consumption effects". :wink:
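The voltage figures quoted in this thread (DDR at 2.5 V, DDR2 at 1.8 V, DDR3's promised 1.5 V) go a long way toward explaining the heat and power differences by themselves, since dynamic power scales with voltage squared. A rough sketch (simplified: it ignores leakage and the process-node improvements mentioned above):

```python
# Back-of-envelope dynamic power scaling: P ~ C * V^2 * f. Holding
# capacitance and clock constant, power scales with voltage squared.
# Voltages are the figures quoted in this thread.
def relative_power(v_volts, v_ref=2.5):
    """Dynamic power relative to a 2.5 V (DDR / GDDR2) baseline, same clock."""
    return (v_volts / v_ref) ** 2

print(f"DDR2 (1.8 V): {relative_power(1.8):.2f}x")  # 0.52x
print(f"DDR3 (1.5 V): {relative_power(1.5):.2f}x")  # 0.36x
```

Which is consistent with why 2.5 V GDDR2 ran so hot at high clocks, and why it was dropped so quickly in favor of GDDR3.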
July 8, 2006 12:41:34 AM

Quote:
I have read your recent post and you certainly have an inside track to modern software and new technology

Yeah, Grape's the man around here, understanding the tech details way better than most people could hope to do. We all do ourselves a favor paying attention to his comments/links. I know he's straightened out my thinking more than a few times. :wink:
July 8, 2006 1:50:42 AM

Quote:
Does this mean buying a video card with DDR3 is out of the question if one with DDR2 is available? Just as good in a work environment, no need for DD3 in a graphics intensive software like Dolphin?


Actually it's more a question of what the memory is called than the utility, etc.

GDDR3 is what you find on cards, and DDR3 is what you'll find in desktops (if it ever makes it there :roll: :lol:  :roll: ), so you would only find GDDR3 in cards, and technically DDRII, or what has become GDDR2. Now the thing is GDDR2 is usually cheaper, but more power hungry and hotter than GDDR3, as well as generally slower, with some overlap, as there will also be with GDDR4.

Now for the application you refer to, it would depend a lot on their method of rendering (which I might be able to figure out if I had more time tonight [going out to the Stampede in 30 mins], but based on the description it looks mainly CPU based [no mention of VPU requirement]). Is it a real-time render/view done with VPU/GPU assist, or is it something that's CPU generated and then just sent to the graphics card as more of a 2D representation of the image generated solely by the CPU? If it's the former, a good fast graphics card, likely benefiting from GDDR3, would be wise, and maybe even required to achieve the desired effect. However, if it's the latter, which I think it is, you might even be able to run it on a lowly Radeon LE or nVidia TNT card.

Quote:
Have you heard of this technology?


Yep, in fact one of my friends from University had been doing a similar thing with 3D mapping for rapid prototyping, which of course they use for dentistry among other things. However, I've never seen that particular software before.

Quote:
I will be at the helm with this software technology and the hardware to obtain the cephalometric x-ray input. I have one friend capable of knowing how to use (I use that term loosely) and maintain it. I will have to learn.


And that's the fun part, although with X-rays and similar hardware it's not a good idea to rely on trial and error. :wink:
July 8, 2006 2:39:42 AM

Quote:
Quick ? for ya. Is the X1800GTO a lot better than the X1800GTO? :tongue: :lol: 



*just couldn't resist*

In fact, the X1800GTO is about twice as fast as an X1800GTO; it's just plain nature. :p 

:lol: 
July 8, 2006 3:00:29 AM

Quote:
Quote:
Does this mean buying a video card with DDR3 is out of the question if one with DDR2 is available? Just as good in a work environment, no need for DDR3 in graphics-intensive software like Dolphin?


Actually it's more a question of what the memory is called than the utility, etc.

GDDR3 is what you find on cards, and DDR3 is what you'll find in desktops (if it ever makes it there :roll: :lol:  :roll: ), so you would only find GDDR3 in cards, and technically DDRII, or what has become GDDR2. Now the thing is GDDR2 is usually cheaper, but more power hungry and hotter than GDDR3, as well as generally slower, with some overlap, as there will also be with GDDR4.

Now for the application you refer to, it would depend a lot on their method of rendering (which I might be able to figure out if I had more time tonight [going out to the Stampede in 30 mins], but based on the description it looks mainly CPU based [no mention of VPU requirement]). Is it a real-time render/view done with VPU/GPU assist, or is it something that's CPU generated and then just sent to the graphics card as more of a 2D representation of the image generated solely by the CPU? If it's the former, a good fast graphics card, likely benefiting from GDDR3, would be wise, and maybe even required to achieve the desired effect. However, if it's the latter, which I think it is, you might even be able to run it on a lowly Radeon LE or nVidia TNT card.

Quote:
Have you heard of this technology?


Yep, in fact one of my friends from University had been doing a similar thing with 3D mapping for rapid prototyping, which of course they use for dentistry among other things. However, I've never seen that particular software before.

Quote:
I will be at the helm with this software technology and the hardware to obtain the cephalometric x-ray input. I have one friend capable of knowing how to use (I use that term loosely) and maintain it. I will have to learn.


I attended the national ADA convention recently and met with three or four representatives of companies like Kodak, Sirona and a few others. The hardware we need is currently in the 80K to 93K range. That's because we need a panoramic x-ray head as well as the cephalometric full skull shot. With the capability of doing both of these views plus the intraoral films your dentist would take during a cavity check-up, the cost soars. We are waiting until next year in hopes the price drops to where not only universities and hospitals can afford it. Our conventional ceph/pan Siemens equipment is 17 years old and still widely used. Only 10% of orthodontists use the digital imaging equipment with software like Dolphin's or Kodak's. Kodak has spent, from what I'm told, a half billion dollars acquiring companies solely for the copyrights to technology in the field. The Kodak system that was priced out to me was 93K. That included the hardware I described and Kodak's own version of the Dolphin-type skull analysis/growth projection software.

The Dolphin software allows me to take a full skull x-ray and have the x-ray image appear through an RJ-45 input into my PC and onto my hard drive. With the click of a mouse I can obtain hundreds of skull measurements instantly. Not only that many measurements, but dozens of different widely used analysis formats used in the industry. I could click on Steiner's measurements or Jarabak's measurements and instantly have hundreds of measurements at my fingertips, complete with the tracing necessary to achieve the measurements. It takes us a couple of hours to manually do a cephalometric tracing of the skull (measure the bones of the face and skull) and draw up a treatment plan to present to the patient. With Dolphin I can input digital close-up color images (photographs we take of the face and teeth, with what is available out there anyway) and integrate the color photos with the skull x-ray and make growth or treatment projections. We can show these to our patients, and the impending course of treatment and all that is involved becomes clear. They can see what their face might look like after we rearrange things through comprehensive treatment while under our care.

At the national ADA convention some of the hardware companies, the companies who sell and service the digital x-ray equipment, were giving away free Dell PCs with purchase of their equipment if we bought it at the show. To my amazement (I was really, really surprised) the Dell they were demonstrating on had onboard graphics. So, you are right in your original assumption. Please explain a little further...

Is it a real-time render/view done with VPU/GPU assist or is it something that's CPU generated and then just sent to the graphics card as more of a 2D representation of the image generated solely by the CPU. If it's the former a good fast graphics card, likely benifiting from GDDR3 would be wise, and maybe even required to achieve the desired effect. However if it's the later which I think it it, you might even be able to run it on a lowely Radeon LE or nVidiaTNT card.

In fact, Kodak was demonstrating their 93K package on notebooks with the latest mobile graphics processors. I think it's too much trouble to carry along PCs with advanced graphics to a convention floor for demonstration purposes when a notebook will do a satisfactory job. I think a high quality graphics solution would be sensational when viewing digital x-rays with the lucrative video options of the software.

Anyway, thanks for taking the time to look into this a bit. It's a good thing you and I understand these things like we do women. LOL!
July 8, 2006 3:17:39 AM

OMG, finally, badge learns there is a quote button!

god i was gettin pissed at the >> blah blah blahblahblahblah <<

no offense btw.
July 8, 2006 6:19:43 AM

Quote:
OMG, finally, badge learns there is a quote button!

god i was gettin pissed at the >> blah blah blahblahblahblah <<

no offense btw.


Yeah, I guess I have it down about perfect now ;) .
July 8, 2006 12:45:40 PM

The 7900 is a more powerful card than the X1600; it has more pixel pipelines.
July 8, 2006 1:21:05 PM

Quote:
The 7900GT eats the X1600 for breakfast, lunch, and dinner at the same time.


Sounds like it has an eating disorder.
July 9, 2006 1:27:09 AM

Quote:
The 7900GT eats the X1600 for breakfast, lunch, and dinner at the same time.


Sounds like it has an eating disorder.

LMAO! At least it isn't FAT (or FAT32) :p  :D  :)  :? :x :evil:  :twisted:
July 9, 2006 1:53:30 AM

Quote:
The 7900GT eats the X1600 for breakfast, lunch, and dinner at the same time.


Sounds like it has an eating disorder.

LMAO! At least it isn't FAT (or FAT32) :p  :D  :)  :? :x :evil:  :twisted:

Excellent point: My GPU has eaten all the other GPUs in the neighborhood and now it's chomping my hard drive!
July 9, 2006 5:53:00 AM

You really don't need 512MB with that X1600... that's just a waste IMO. You only need 512 in higher end cards.

Whether it's worth it or not is really up to you, if you want the performance. But personally, ATI's low and mid-range cards suck. I'd go with an eVGA 7600GT rather than any X1600 card.