Not sure if these are the same card, can anyone help?

April 4, 2010 2:09:21 AM

I'm looking at buying a GTX 275, and my local store has this one available.

http://microcenter.com/single_product_results.phtml?pro...

I also found this one on new egg.

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Both are made by the same company and seem to have almost all the same specs. But I see the Micro Center one lists the memory bandwidth as 129GB/sec, whereas the one on Newegg lists the memory interface as 448-bit. Is what Newegg calls the memory interface actually the memory bus? Or is the card on Newegg just a better card?

Thanks for any help you guys can give me.


April 4, 2010 2:33:13 AM

The memory bus and memory interface are the same thing: essentially, how many bits can travel over the path at once. The Micro Center listing worked out how much data gets pushed through the card per second (the bandwidth), where Newegg did not. They're the same card; I'd go with whichever one is cheaper.
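
If you want to check the math yourself: bandwidth is just the bus width (in bytes) times the effective data rate. A minimal sketch in Python, using the 2304MHz data rate from BFG's spec page (neither store listing shows it):

# Memory bandwidth = bus width in bytes x effective data rate.
# GB here means 10^9 bytes, which is how GPU vendors quote bandwidth.
bus_width_bits = 448      # the "memory interface" / "memory bus"
data_rate_mhz = 2304      # effective GDDR3 data rate, per BFG's spec page
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_mhz * 1e6 / 1e9
print(f"{bandwidth_gb_s:.0f} GB/s")   # -> 129 GB/s, matching Micro Center's listing
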
April 4, 2010 5:16:34 AM

They're almost the same. The Mfr. Model numbers don't match: BFGRGTX275896OC at Micro Center, BFGEGTX275896OCE at Newegg. The specs look identical, though. I went to BFG's website and looked up their GTX 275 models. I'll call the first the "R" model and the second the "E" model, after the letter that follows "BFG" in each model number.

BFG's specs for the "R" model:
http://www.bfgtech.com/bfgrgtx275896oce.aspx
Performance
GPU NVIDIA® GeForce® GTX 275
Core Clock 648MHz (vs. 633MHz standard)
Shader Clock 1440MHz (vs. 1404MHz standard)
Shader Model 4.0
Texture Fill Rate 51.8 Billion/sec.
Processor Cores 240
Memory
Video Memory 896MB
Memory Type GDDR3
Memory Data Rate 2304MHz (vs. 2268MHz standard)
Memory Interface 448-bit
Memory Bandwidth 129GB/sec

BFG's specs for the "E" model:
http://www.bfgtech.com/bfgegtx275896oc2e.aspx
GPU NVIDIA® GeForce® GTX 275
Core Clock 684MHz (vs. 633MHz standard)
Shader Clock 1512MHz (vs. 1404MHz standard)
Shader Model 4.0
Texture Fill Rate 54.7 Billion/sec.
Processor Cores 240
Memory
Video Memory 896MB
Memory Type GDDR3
Memory Data Rate 2430MHz (vs. 2268MHz standard)
Memory Interface 448-bit
Memory Bandwidth 136.1GB/sec

So Newegg got either the model number or the specs wrong, because they don't match up at all.
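
Either way, the gap between the two factory overclocks is small. A quick Python sketch comparing them, using the numbers above:

# Percent difference between the "R" and "E" factory overclocks,
# taken straight from BFG's spec pages quoted above.
specs = {
    "core clock (MHz)":   (648, 684),
    "shader clock (MHz)": (1440, 1512),
    "data rate (MHz)":    (2304, 2430),
    "bandwidth (GB/s)":   (129.0, 136.1),
}
for name, (r, e) in specs.items():
    print(f"{name}: +{(e - r) / r * 100:.1f}%")
# Every line comes out around +5%, a gap you could likely close by
# overclocking the "R" model yourself.
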
April 4, 2010 4:24:07 PM

You're welcome.

I'd contact Newegg to verify the model number and specs of the one they offer. It may actually be an "E" model, which has a higher factory overclock than the actual specs they list. Or you could just skip all that and buy the one from MicroCenter since it's $35 cheaper online. (It's probably even cheaper in-store. I miss living near a MicroCenter.) There should still be some overclocking headroom on that one.
April 5, 2010 4:59:19 PM

I went ahead and bought the Micro Center one, figuring I could easily overclock it to the specs of the more expensive Newegg one if I want.

After installing it, I'm sadly getting the same, if not worse, performance than I was with my 8800 GT. I'm still playing around with all of the settings, but it seems like I should be getting a leaps-and-bounds increase over my old card. The weird part: after noticing very little, if any, FPS increase in games, I started running RivaTuner's hardware monitor in the background to see what my clocks and temps were at in-game.

The tweaks page lists the clocks at their default settings:

ROP: 648
Shader: 1440
Memory: 1152

My hardware monitor lists them at:

ROP: 399.60
Shader: 799.20
Memory: 297.00

Something fishy, no?
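
For what it's worth, the tweaks-page memory number does line up with BFG's spec sheet once you account for GDDR3 moving data twice per clock; it's the hardware-monitor readings that don't match anything:

# RivaTuner shows the real memory clock; GDDR3 is double data rate,
# so the effective rate is twice that.
print(1152 * 2)   # -> 2304, BFG's quoted memory data rate
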
April 5, 2010 5:14:49 PM

Are you using an older CPU, or playing at a low resolution in which performance is usually CPU bound?
April 5, 2010 5:28:12 PM

I'll just list my system specs quickly; maybe some of this helps.

Win 7 64-bit @ 1920x1200
Core 2 Duo E8400, 3.00GHz clocked to a very stable 3.84GHz
8GB PC2-8500
GTX 275
PSU: OCZ700FTY 700W, 12V @ 56A
2x WD1001FALS 7200RPM HDs

Hope some of this helps.
April 5, 2010 7:58:38 PM

That thing should be smoking an 8800GT in that setup...

As for the clock speed readings, the card is downclocking due to its built-in power-saving management. It works like SpeedStep does for your CPU, reducing clocks to save power and lower temps while idle.

A few questions:

What driver version are you using, 197.13?
Is your PCIe x16 slot actually operating in x16 mode? (A GPU-Z validation would show that.)
What are you using to compare its performance? Benchmarking programs, games?
Have you used GPU-Z to monitor the card's readouts to verify the clocks are increasing while it's under load?
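
If you'd rather script that last check than eyeball GPU-Z, here's a rough sketch using NVIDIA's NVML Python bindings (pynvml). Treat it as hypothetical: NVML tooling is newer than this card's era and may not expose every clock on it. Start it, then launch Furmark; the clocks should jump under load:

# Poll the GPU's current clocks once a second. Needs: pip install pynvml
# (NVML ships with NVIDIA's driver; older cards may not be fully supported.)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        sm = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        print(f"core {core} MHz | shader {sm} MHz | memory {mem} MHz")
        time.sleep(1)   # idle clocks should rise once a 3D load starts
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
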
April 5, 2010 11:37:47 PM

I was using Furmark to benchmark, but I've never tried GPU-Z. I'm going out for a while; I'll try it when I get home and get back to you.
April 5, 2010 11:56:10 PM

Though I tend to hate synthetic benchies, 3DMark 06 or Vantage, as well as Unigine's Heaven or Tropics benchmark programs, should show different results.
April 6, 2010 2:32:43 AM

OK, I downloaded GPU-Z and double-checked everything. According to the card tab, the card is running at x16 2.0, and the clocks show up right on it. But under load, the sensors are showing something else.

So I triple-checked it using the hardware monitors in both GPU-Z and RivaTuner, and the clocks never increase from their idle frequencies. They stay low and never fluctuate, under load or at idle.

Did I just get a bogus card? I'm thinking I may have to go back to the store tomorrow and talk to them about it; perhaps they'll just replace it.
April 6, 2010 4:42:31 AM

Is this while it's doing something 3D-intensive? Check GPU-Z's Sensors tab readings while running something like Unigine's Tropics demo or Furmark in a window. Or check the box at the bottom of the Sensors tab to keep GPU-Z refreshing the readings while it's in the background, then go back and review them after a full 3DMark 06 or Vantage run. Clicking each sensor reading toggles through the current, lowest, highest, and average values recorded since the program started.
April 6, 2010 11:26:20 AM

Yep, this is after running both Furmark and Unigine's Tropics demo. The clocks never change.
April 6, 2010 12:27:34 PM

After some advice from an old friend, I uninstalled RivaTuner and removed all of its saved data, and the problem completely went away. Apparently it was somehow locking my clocks at their idle frequencies even when the program itself wasn't running.

Needless to say, I'm a happy camper now.