GTX 650 Ti Boost on PCI-E x16 1.1 (problem)

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
I bought a Gigabyte GTX 650 Ti Boost. My motherboard is quite old, with a PCI-E x16 1.1 slot, and I get poor performance from the graphics card. The CPU is a Core 2 Duo E6700 at 2.66GHz, and the RAM is 4GB DDR2. (I planned to upgrade these later.) I doubt the CPU and RAM are holding back the graphics card this much.
My GPU-Z reads PCI-E 1.1 x16 @ x16 1.1.

I think it should say PCI-E 3.0 x16 @ x16 1.1 while 3D rendering, but it doesn't.
I suggest that for two reasons. First, I have searched the internet, and the forum comments I find say other people's cards do. Second, when I installed the card, Windows did its thing and started looking for drivers. During that time the screen resolution was low and fuzzy. I opened GPU-Z at that point, and it did say PCI-E 3.0 x16 @ x16 1.1.

Now, with a recent Nvidia driver installed, I see PCI-E 1.1 x16 @ x16 1.1.
However I get laggy performance and a weak 3DMark 06 score. (I have been reading that other users suffer from poor performance when their cards get stuck in 1.1 mode.)

I did lots of reading before buying a card, and the following page assured me I should be OK. It fairly clearly states that PCI-E 1.1 at x16 should be fine and not bottleneck graphics performance. http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI...

It says specifically, quote, "Everything down to x16 1.1 and its equivalents (x8 2.0, x4 3.0) provides sufficient gaming performance even with the latest graphics hardware, losing only 5% average in worst-case."
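
For anyone wondering why those three configurations get lumped together, here is my own rough back-of-the-envelope check (using the commonly quoted per-lane figures, not numbers taken from the techpowerup article):

# Approximate per-lane, per-direction PCI-E bandwidth (commonly quoted figures).
# PCI-E 1.x/2.0 use 8b/10b encoding; 3.0 uses 128b/130b, hence ~985 MB/s per lane.
PER_LANE_MB_S = {"1.1": 250, "2.0": 500, "3.0": 985}

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCI-E link in GB/s."""
    return PER_LANE_MB_S[gen] * lanes / 1000

for gen, lanes in [("1.1", 16), ("2.0", 8), ("3.0", 4)]:
    print(f"x{lanes} {gen}: ~{link_bandwidth_gb_s(gen, lanes):.1f} GB/s")
# x16 1.1: ~4.0 GB/s, x8 2.0: ~4.0 GB/s, x4 3.0: ~3.9 GB/s -> roughly the same pipe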

As I have x16 1.1, I expected a good result. Any ideas anyone please?
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015


Thank you for your replies, RobCrezz and nukemaster.

My original card, which I still have, is a GT 320, and my 3DMark 06 score with it was 7150.
The 3DMark 06 score for the 650 Ti Boost is 9170.

I had foreseen that the C2D E6700 CPU would hold back gaming performance. However I did not imagine it would do so this much. I read an article in a December 2010 copy of Computer Shopper about upgrading 'other components'.
Anyway, this is the crux of it:
"If you're upgrading your graphics card there are a few things to consider. First, if your computer is more than a couple of years old or you have a slow processor, you may not get the best performance from a graphics card. In fact, when we tested a range of processors a year ago using an ATI Radeon 4870, we found a big difference in performance. Using Call of Duty 4, we got just 38.5 fps using a Celeron Dual Core E1200 processor; upgrading to a Core 2 Quad Q8200, we got 58.3 fps."

When you look at those results, the Celeron Dual Core is pulling about 66% of the gaming performance of the Quad in that example. The Celeron E1200 is 1.6GHz, and the Quad Q8200 is 2.33GHz.
My E6700 is more powerful than the Celeron E1200, so in that test it would do better.
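
Just to show where that percentage comes from, here is my quick check of the magazine's numbers (nothing more than the division):

celeron_fps = 38.5   # Computer Shopper's Call of Duty 4 result with the Celeron E1200
quad_fps = 58.3      # the same test with the Core 2 Quad Q8200
print(f"{celeron_fps / quad_fps:.0%}")   # ~66%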

In my test I am getting only about a third of what the Ti Boost should be capable of. As you can see, I researched everything I could to determine whether I was making the right buying choice. It had to be worth getting a good mid-range card for this PC. I had planned to move the graphics card on later to a better board and CPU.

I may have my facts wrong, but I thought I had covered everything. You folks may be right. Part of my other reasoning was that I thought the 9800 GT pulled about 12500 in 3DMark 06, and I think at the time it would have been benchmarked with a Core 2 Duo. I imagine those 9800 GT tests would have been done with a better Core 2 Duo than I have, but still I expected more.



 
The only other thing I can recommend is that you watch video card usage with something like Afterburner to see how hard it is working.

I wonder how overclocked some of those systems on the 3DMark website are.

This shows the PCI-E scaling all the way back to 1.1, so you should lose very little on such a card. The closest test I can do would be 2.0 x8 with a 670 (my 650 Ti is kind of a pain to remove from its system for testing).
http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html
 
Solution

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015


Thank you again for replying.

Yes, I actually saw this page when I was doing my research to see if the card would be OK for PCI-E x16 1.1: http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html

It does conclude that x16 1.1 loses only 5% performance with a GTX 680. That agrees with the quote from the techpowerup page I added further up this page.

Very conclusive, and helpful to remember this link. It also leaves the CPU as the prime suspect. However, when I look up the 3DMark 06 of the GTX 280M paired with a 3.0GHz Core 2 Duo, it makes 11750 marks. It seems odd that my graphics card is making so many fewer marks for just 0.3GHz less on the CPU. (I looked for this example because it was the only high-power card I could imagine paired with a C2D.) I am still not ruling out that it's the CPU killing my marks though, like you folks tell me.

I need to explore why GPU-Z reads PCI-E 1.1 x16 @ x16 instead of PCI-E 3.0 x16 @ x16. I have been digging at this, and apparently it's not unusual for GPU-Z to show this. I have been reading that the reading switches up when the card is put under 3D rendering load, then returns to PCI-E 1.1 x16 @ x16 at idle.

As I said at the beginning, before my drivers were installed GPU-Z definitely showed PCI-E 3.0 x16 @ x16 1.1. If I hovered my mouse over it, a message read the following: "This graphics card reports that it supports: PCI-Express x16 v3.0." After I installed the drivers it reports: "PCI-Express x16 v1.1." I know my slot is 1.1, but the card should still say it supports 3.0.

This user had similar issues. http://www.tomshardware.com/answers/id-1731210/stuck-pci-x16-mode.html

He describes the same thing. Stuck in 1.1 and clocks not clocking up to full.

I have found something that backs up this idea as well. I am looking at nTune.
It says my core clock is 549MHz (it should be 1033MHz). It says the RAM is 3004MHz (it should be 6008MHz).

Thanks again nukemaster, and for sure don't go uninstalling your hardware for my benefit. Keep it safe.




 
OK. GPU-Z gets the specs from a database with card information. This is why it shows PCI-E 3.0 x16 @ x16 1.1 before drivers.

Once the drivers are installed, the drop should be normal. Your card cannot go past 1.1 in that slot, so it will never say anything else again.

I can confirm that any Kepler-based cards I have used on other systems do the same thing as yours. My 2.0 system will say 2.0 when loaded and 1.1 at idle. It will never say 3.0 again with the drivers installed.

The memory listing of 3004 is just because of the way they list it. It is effectively 6008.
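
If it helps, here is a small sketch of how those numbers relate on a 650 Ti Boost (reference memory spec used for illustration; I am assuming nTune is showing the GDDR5 write-clock rate rather than the effective transfer rate):

command_clock_mhz = 1502                 # what GPU-Z normally reports as the memory clock
write_clock_mhz = command_clock_mhz * 2  # 3004 - the figure nTune appears to be showing
effective_mts = command_clock_mhz * 4    # 6008 - the advertised "effective" GDDR5 rate
print(write_clock_mhz, effective_mts)    # 3004 6008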

The core speed will vary depending on how hard the card has to work. I have played some very light games that leave the card at lower speeds, well under the rated speed. If your GPU usage is low, the card will not jump to or stay at a high clock speed.

If you set Prefer Maximum Performance in the Nvidia Control Panel (3D Settings -> Power management mode -> Prefer Maximum Performance), you should be able to get it up to a higher clock speed (if it was being held back) when gaming or benching.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
Sorry if my posts are large. I tried to cover everything to save others from having to ask for info if they want to help.
You're right about the measurements; I'm convinced now that they are correct.
I've run extensive monitoring of the card and it seems to be doing what it should. All clocks are running at normal speed. (nTune just seems to have got it wrong.) It does point right at the CPU being the bottleneck. However I still can't get over the article I read in the 2010 Computer Shopper. It seemed to suggest I should get more push with my CPU, since they used a lower-powered C2D Celeron than this E6700. However that is speculation on other factors which I can't be sure of, e.g. what my system would do with the game software they were running. I think it was a version of Battlefield they measured fps with.

I'd like to add one tiny note to something you said. I have downloaded and installed Nvidia Inspector. Here it reads the interface as: PCI-E 3.0 x16 @ x16 1.1.
Within GPU-Z (before drivers) the first part, PCI-E 3.0 x16, represents "The graphics card reports that it supports: PCI-E 3.0 x16", whereas the part after the @ shows "it is currently running at: PCI-Express x16 v1.1".
I think it would be the same in Nvidia Inspector, because the format is identical. It just doesn't offer the additional info if you hover over it.

I think it is safe to give up now and assume it's a CPU bottleneck. Still, I am stunned by my example of the GTX 280M. It can pull 2000 more points in 3DMark 06 with just a 3.0GHz dual core.
Thank you too for the 'Power Settings' hint. I had explored that option and have it set to 'high performance'.

I can't tell you how disappointed I am with my result though, as I imagine you guessed. Nor can I say how thankful I am that you tried to help. Saying thanks over the internet rarely seems to say it all, does it?

 
Nvidia Inspector shows the info GPU-Z does before driver install. I think GPU-Z limits that info once the driver is installed to prevent users from wondering why their PCI-E 1.1 or 2.0 board is not running at 3.0.

I have come across many users wanting to get a new board and CPU just for 3.0, and that was with slower cards than the 650 Ti Boost.

I can say that I do notice quite a boost with the GTX 650 Ti in my 2600K system (I was very surprised, to be honest, for a card that takes so little power and runs silent) vs my i5 750 system (it was in while my GTX 670 was in for RMA).

I have to wonder if they made changes to 3DMark after those reviews or something. I mean, a 650 is a HUGE upgrade from a 280M.

I still have a Q6600 on which it is rather easy to disable some cores and adjust the speed to match E6700 speeds and get some 670 numbers, if you wish. It's just that the 650 Ti is in my SFF media center, which is also the NAS for other systems in the house, which makes it more difficult to just remove it for testing.

The system I am on now loses all my files/desktop/save games/music/etc. if that system is shut down. While that seems like a bad idea, it has made central backups much easier.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
I totally agree with you about the 650 Tis being fabulous cards. I was waiting for a particular model of 650 Ti to come in stock on Amazon. Anyway, a well-priced 650 Ti Boost came in stock, so I jumped at it.

No no!! Don't do any changing of your Q6600, or any of your systems. I don't need any further benchmarks. Whatever you do, don't mess your systems up. That might sound ungrateful, but it's not meant to.

Actually I have worked out exactly what is happening, so I don't need any more results you see. It's kind of complicated to explain, but I will try to make it simple, because it's a simple idea.

For now, tonight, let me leave this post, because I urgently wanted to get the message out: 'please don't do anything to alter your systems for results for me'.

It's 5.27am here in the UK and I am beaten for tonight. I will post tomorrow what I have learned and why I think everything is OK, plus as good an explanation as I can of how I arrived at this conclusion.

Other than that, I think I am myself going to be looking to get a Q6600 or Q6700 (socket 775).

Thanks for now!
 


Overclocking will do wonders IMHO. My friend has a 6600 @ 3.4 with a P5Q PRO and 4GB of RAM I gave him, running as his backup/home server; it was a PCI-E 2.0 slot though.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
Hi there then,
Nice little rig you have. Mine is quite similar. It started as a single-core P4 PC from Acer. However, I think I have a 50% chance I may be able to get a quad to work. A Q6600, Q6700, QX6700, or QX6800 might work in it. The spec sheet says up to dual cores, but the chipset supports Core 2 Quad; I verified that for sure.

However I think it doesn't run the (45nm) E7000 series, because I tried one. It's possible the CPU I tried was faulty, but I can't verify that. I guess then it won't run the E5000 or E8000 series either. That is likely a limitation in the BIOS, down to one of a couple of things: either it doesn't recognise the reduced 45nm fabrication size, or that range of CPUs simply isn't in the BIOS's support list. The E7000 series was released after the release date of the PC's BIOS, whereas most of the Q CPUs I mentioned here were released before the PC's BIOS date.

The Q and QX series CPUs use the same 65nm fabrication tech, so I can hope one might work. My PC's voltage circuitry is specced to supply enough power for any of them. If one works, I think it would be a worthy upgrade, even considering the PC runs DDR2 system RAM. Don't you think?



Anyway, back to the Ti Boost. It appears a mini apology is needed from me first about this CPU/GPU issue. But you'll see how I made my mistake.

The GPU arrived only recently and the first thing I did was benchmark. I got an average of about 9200 in 3DMark 06. Anyway, I went off and launched a game which I previously played on medium settings. There was maybe a little performance improvement with the Ti Boost, but it was barely noticeable. At that point I started scouring the internet for info, because I thought the CPU would be OK, as I said. After lots of searching I posted here, on Tom's Hardware.

Later I returned to the game and, just out of curiosity, I added 2x anti-aliasing. (That game really needed some AA.) I noticed the frame rate didn't seem to drop. Basically this one game was running at the same frame rate with the 650 Ti Boost as it did with my GT 320. After playing a while I went back to the graphics options and just maxed everything, AA, the lot. When playing the game again, the frame rate was the same as ever.

Off to Crysis then. Exactly the same thing happened. To recap: with my GT 320 I could play Crysis on medium-and-a-half settings. The FPS would dip below 25 sometimes, but generally hovered at 25 to 30. I got much the same with the Ti Boost at the same settings. Anyway, I turned the graphics up to high and found it still played at the same frame rate, but looked better. I found I could max Crysis right up to 4x AA without losing any frame rate. Still running at 25-30 fps, but looking lots better.

The conclusion then is that my PC is not bottlenecked by 3D rendering (until the graphics limit of the card is reached). Rather it is bottlenecked by something probably related to the CPU. It must be how the CPU is putting the game together, or there is a particular function it is lagging at.

The E6700 is a bit of a whizz for desktop work. Clearly though it is a weak CPU for gaming by today's standards. All the games I have tried so far do just the same: I can make them look fab, but not go any faster. Anyway, this is a big post, so I should stop. There are more things I wanted to add generally, but later for that, or you might get bored and give up.

Finally though, I don't OC any of my kit, so the E6700 will stay stock. If I can get a quad in, it should make a good difference.
 
The big supporting factor for some of the newer CPUs was also the jump to a 1333 FSB, up from 1066. Some chipsets just did not deal with it well. My Asus P5W DH officially supported 1333 CPUs (while I am almost sure Intel never officially rated the 975X chipset for it), but was actually unstable with them.

The Q6600 will be slower in any games that cannot use more than 2 threads (and that is quite a few, even to this day). You have to watch out for this.

http://www.techspot.com/review/36-intel-core2-quad-q6600/page5.html

I do not want you to expect too much. If you had the ability to overclock, it could catch the X6800.

Here is a review with some of the first-generation i5/i7 chips that shows the Q6600 in the lineup as well.

http://www.anandtech.com/show/2832/16

Again, if overclocking is an option, Q6600s tend to hit 3.0-3.2 without even needing more voltage or cooling.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
Those links you added are just the right ones. I mean, I was aware that a quad CPU's advantages are dependent on software being optimised, but I didn't understand it to be as clear-cut as your links show. Brilliant links. Plus I understand that later-generation quads are more powerful than earlier gens at identical frequency, so there are 3DMarks to be gained there. Thank you for the heads up.

It looks like an excellent idea at this point to see what happens if I run this E6700 at 3GHz. That would give me an idea of what I can expect from an X6800 (which is legitimately overclockable as well). I'm sure this PC's BIOS has no OC options. There are ways to do it from the desktop though, and I already searched and found one called SysTool. Sadly Win 7 wants digitally signed drivers, so it won't run. (Incidentally, there are dual-core Pentium Ds that go up to 3.6GHz. Their only downsides are an 800MHz FSB and a 130W power requirement.)

Overall, it does all point to me being better off moving to a new board. Staying with this PC though, it depends what I could gain. I am still banking on luck that this PC will run a quad core. According to your links, the Q6600 could make some things go slightly slower. That leaves the Q6700, QX6700, and QX6800. In all reality though, probably half of my software would run at the same speed as on the E6700 and half faster.

I worry I may only just be able to contain the heat I think quads make, too. The QXs run at a whopping 130W. It's got to be worth a try to get my paws on one though. I want to test whether it will work in this PC, whether I can cool it, and what its performance is like.

What 3DMark 06 score do you pull with your Q6600 and Galaxy GTX, please? That seems like an interesting place to start. Could you run it for me and post a score? I know about 3DMark v100 and 3DMark 11, but I am much more familiar with 06.

 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
I clocked my 2.66GHz E6700 up to 3GHz, and I saw more improvement than I anticipated. I think if I find an even higher-clockable dual core like the X6800, I'd be almost in the clear. It did leave me wondering though about a Q6700 and what I would gain. My 3DMark score would increase, but what would happen in, say, Crysis or Warhead?

Adding to my cooling worries about a high-performance CPU, I should say I have a mod cooler. However, I couldn't fit anything as efficient as a Coolermaster Hyper 212; most mod coolers were too wide for my set-up. I can improve the case fans though.

Anyway, just in case anyone's interested, I checked up on what operations the CPU handles in games. I own some games that specifically separate the settings affecting CPU, GPU, and GPU RAM. The CPU runs stuff like the following: [max corpses, max moving corpses, max debris, enable dynamic decals, disable ragdoll effects, debris casts shadow, max effective sounds, cascaded step factor, near shadow distance, new triangles per frame, max active sounds, simple character destructions].

I am CPU hunting now, and I found this on eBay. They all sold quickly, and I wasn't sure a server CPU would have been as good for me. http://cgi.ebay.co.uk/ws/eBayISAPI.dll?ViewItem&item=161116734806&ssPageName=ADME:B:SS:GB:3160
I will keep watching for a bargain like that though. I still think a CPU upgrade is worth it. I appreciate that newer-gen chips are more powerful, and a new board would be best. Still, if I can fit even an old quad it should give a new lease of life to all my games and make me happy, especially if I find one cheap.

Thank you for staying in contact on this. It's been fun.

 
The only thing I can find on Crysis. It seems to show it as not being multithreaded.
http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=79&Itemid=1&limit=1&limitstart=19

This is not to say the extra cores are not good for keeping background programs running smoothly.

The server chips for LGA 775 are just rebranded normal CPUs that in most cases are the top quality of the batch.

As for cooling, would a Freezer 7 Pro fit? It only has a 92mm fan on it.
http://www.arctic.ac/en/p/detail?sArticle=4.%3F
 


I am a bit wary of his mobo; I think he might run into problems OCing.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
I have been thinking specifically about Crysis recently as an example, trying to understand what to expect from a quad. Using my CPU at 3.0GHz I can pretty much keep Crysis Warhead above 25fps. Crysis fairly flies on high settings in DX9, and I want to figure out how to get it to run on very high settings in DX9. I don't know how much of the frame rate change at very high would be dependent on the CPU functions.

My aim then must be 3.0GHz. Or, like you say, a 2.66GHz quad would allow other cores to take care of Windows processes, thus leaving two cores for the game. That's good thinking, because I had overlooked other resources cutting into gaming performance.

I think a quad upgrade might take care of Crysis Warhead then. Another game I have, called Orion: Dino Horde, struggles. That's the first game I tried when I found I could push up the graphics and not lose fps. It struggles the same with the Ti Boost as it did with the GT 320. I have Arcania: Gothic 4 as well, which I have to work out how to get going fast enough. Its system requirements are a minimum 2.8GHz dual core, or a quad core. With the E6700 at stock I dip to about 14fps, and lower in the odd place. It improves by about five fps generally when overclocking to 3GHz. It still has places where it dips to low fps though, and only gains about two fps there.

I suppose if it recommends a quad core, that might mean it's optimised for quad CPUs.

I now wish I'd had the confidence to step forward and get one of those server CPUs. They sold at a rate of one a day because they were a good price. I would have found it easy to resell at that price had it not worked for me. That seller does have bargains, I noticed; E6700s for £20*.

Regarding the CPU cooler, I have the Freezer 7 Pro. That's the only one I could find that would fit; I have about 5mm between the CPU cooler and the PSU. I can get slightly better temps when I put the case fan on max. It faces the CPU cooler vent, which is in the perfect position. I wish I could upload a pic for fun, only I don't see an option to add a photo. (I think the reason I have a narrow space there is because this was once a stock Acer PC. There may be another model of CPU cooler that will fit, but I hunted far and wide when I bought mine.)

I appreciate that moving to a more modern DDR3 board would make things easier, and I appreciate your help too. You see, I think that if this PC can support a quad it will remain useful for some time to come. At the very least, an X6800 (a legitimately overclockable CPU) should push me close to being in the clear for most of my games.
 

You might as well get a P7P55D PRO board and a Xeon X3440 with 8GB of DDR3, and a good case later on.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
@ The quad 6660 Inside,

It's like I said: I do appreciate that moving to a DDR3 board is an option. But a quad could open this PC up.
(Plus I am generally not an overclocker, but I tried it this once to see what effect it would have.)


@ nukemaster, (or anyone),

A sort of edit afterthought about quad compatibility made me look up core types.

The E6700 is a Conroe core.
Pentium D dual cores are Presler core.
These chips work, as would previous-generation chips like the E4000 series.

The E7400 that didn't work is a Wolfdale.
Quads that fit socket 775 with a 1066MHz FSB are Kentsfield. (All of the quads that would fit, except the Q6700, were released before the date of this PC's BIOS.)


I am wondering about BIOS compatibility. I know the chipset will support quads; I just had the afterthought about how the BIOS chooses which CPUs it will support. Is it by core type, maybe? I thought the E7400 I tried didn't work because the BIOS wouldn't recognise the smaller fabrication technology.

Anyway I am going to see what I can find online about it.




 
Intel just took two Conroes to make the Q6600 :)

It was not even a single die.

As for BIOS support, your guess may be as good as any. Sometimes you get lucky and a board supports newer hardware (many times AMD boards would just call it an unknown CPU but still work), and other times you get nothing.
 

U6b36ef

Distinguished
Dec 9, 2010
588
1
19,015
That's encouraging. Ohhh now I should have bought the Xeon server CPU I saw. (It hurts...)

I was actually astonished when I looked up benchmarks for the server CPUs on the 'cpubenchmark' website. They outperformed their 'normal' CPU equivalents. For example: X3230 2.66GHz = 3,715 CPU score; Q6700 2.66GHz = 3,356 CPU score.

It's really astonishing when you look further through the benchmarks: newer-generation chips outperforming older chips at the same speed, with improvements in architecture behind it. In fact, improved architecture made the E6700 run cooler and use less power than the single-core Pentium D 631 that came in this PC. Both chips are fabricated on 65nm tech.

I found this page. It says CPUs sometimes get rejected by BIOSes because they fall outside the voltage range the board and BIOS can supply. http://www.yale.edu/pclt/PCHW/bios_and_cpu_support.htm
The 45nm E7400 I tried was within the same voltage range as the E6700, though. (?)

Anyway, when you go looking for Q CPUs on eBay, sellers have the stock Intel cooler for sale with them. The Freezer 7 Pro is better than that, so I might have a chance of cooling one.