1.12 GB of RAM? WTF? SLI!!!! *shakes stick*

pimpsmurf

Distinguished
Dec 8, 2007
43
0
18,540
I recently upgraded from a TERRIBLE 7900GT (one card) to an 8600GT x2 SLI setup.

The 7900 had serious overheating issues, and I love my new cards, but unfortunately Windows now says I have 1.12 GB of RAM, while the BIOS reports 2 GB (as it should).

This is the only hardware change I've made, and I tried moving the RAM around to different slots (thanks, P5N32-SLI crap MB) and the BIOS still shows 2 GB. Is this normal? I've never used SLI before, and I'm kind of sad that with World of Warcraft running I now have to wait for things to swap around when I start IE or something of that nature. Anyone have any insight? I'm at the point where an E6750 with a new MB/RAM is looking like my only way out!!!

Thanks!
-PimpSmurf
 

metrazol

Distinguished
Apr 16, 2007
94
0
18,630
Putting 8600GTs in SLI was a crappy idea to begin with.

Try using CPU-Z to see what it says about your RAM.
 

MikosNZ

Distinguished
Nov 22, 2007
84
0
18,630
Take one 8600GT out and see whether your system goes back to 2 GB of RAM. If not, something may have been damaged during installation.



Yup, it's amazing what a little bit of marketing will convince people to part with their $$ for. One good card is always better than two average ones.

 

rammedstein

Distinguished
Jun 5, 2006
1,071
0
19,280
Wow, this is the second time in a few weeks someone has told me about / done something like this; an acquaintance was trying to tell me that his 7800GTX was worse than his 8600M, I swear. Sorry mate, but the lag you're seeing in WoW isn't from a lack of RAM but probably from the inferior cards; the RAM issue just highlights it. Stick your old 7900GT back in, and make sure you have airflow, a clean fan, and that the fan actually works...
 

metrazol

Distinguished
Apr 16, 2007
94
0
18,630


I don't know how people who do their own builds can just plunk down the cash without doing a bit of research first. I can understand someone reading a few threads and reviews and still deciding to buy hardware at a less-than-optimal price/performance point if they're a fanboy or stupidly rich, but just leaping into SLI with 8600s is a different level of mind-boggling.
 

x_2fast4u_x

Distinguished
Nov 22, 2007
286
0
18,780
[To get us out of the flame war.] So I believe your problem is either 1) broken hardware or 2) virtual memory. See, Windows can only recognize so much system memory; what's your OS? Because no matter what others may tell you, a single 8600 is no better than a single 7900; that's been proven time and time again. And dual 8600s in SLI is a bad idea because the performance isn't worth the cost; you would have been better off with a single 8800GT. Hope this helps.
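If you want to compare what the OS itself reports against the BIOS figure without third-party tools, something like the following standard-library sketch works; the Windows branch calls the real Win32 `GlobalMemoryStatusEx` API via ctypes, and the field layout shown is the documented `MEMORYSTATUSEX` structure (treat this as a rough diagnostic, not a polished tool):

```python
import ctypes
import os
import struct
import sys

def os_reported_ram_bytes():
    """Best-effort total physical RAM as the OS reports it."""
    if sys.platform == "win32":
        # Win32 GlobalMemoryStatusEx fills a MEMORYSTATUSEX struct.
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [
                ("dwLength", ctypes.c_uint32),
                ("dwMemoryLoad", ctypes.c_uint32),
                ("ullTotalPhys", ctypes.c_uint64),
                ("ullAvailPhys", ctypes.c_uint64),
                ("ullTotalPageFile", ctypes.c_uint64),
                ("ullAvailPageFile", ctypes.c_uint64),
                ("ullTotalVirtual", ctypes.c_uint64),
                ("ullAvailVirtual", ctypes.c_uint64),
                ("ullAvailExtendedVirtual", ctypes.c_uint64),
            ]
        status = MEMORYSTATUSEX()
        status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
        return status.ullTotalPhys
    # POSIX fallback (Linux, etc.).
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")

pointer_bits = struct.calcsize("P") * 8  # a 32-bit process caps out at 4 GiB
print(pointer_bits, os_reported_ram_bytes() / 1024 ** 3)
```

If the number printed here matches Windows' System Properties figure but not the BIOS, the missing RAM is being hidden from the OS rather than failing in hardware.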
 

pimpsmurf

Distinguished
Dec 8, 2007
43
0
18,540
CPU-Z sees both of the RAM sticks (identical).

1) My 7900GT CO is from eVGA, and they discontinued it because the RAM was crap and overheats very easily (leave WoW /AFK and come back to your screen going apeshit with graphical errors; alt-tab, let the board cool down, and then everything is fine).

2) I had some cash and wanted an upgrade (to be rid of my 7900 constantly ****ing itself), and Best Buy had two 8600GTs (BFG Tech), so I got 'em.

3) Going back to the 7900 isn't an option because it now has twenty .308 holes in it.

4) I'm thinking I'll take one of the 8600GT cards back and get a G35 MB with an E6750 and some DDR3-1600 (CAS 7-6-6-18).

Does that sound like a good idea? I can't afford to return both cards and get an 8800. I'll also be selling my watercooling rig (and shooting the Asus P5N32-SLI Deluxe POS MB).

-PimpSmurf
 

Rolenio

Distinguished
Jul 20, 2007
58
0
18,630

Nice shooting, sir! :lol:

But if you're on a tight budget, why DDR3? I think G35 doesn't support DDR3 anyway.
The 8600GT isn't that bad (I have one myself), but I really think you should go for something better, at least an ATI 3850.
 

geotech

Distinguished
Jul 25, 2006
192
0
18,680
Terrible 7900?

I have a 7600GT and it's awesome. Obviously I can't play the latest games, but HL2 runs fully maxed at 60 fps, and Crysis at 1440x900 on medium/low settings gets 20 fps! It takes whatever I throw at it. The 7-series is underrated.

(until Christmas rolls around and I get my 3850)
 

russki

Distinguished
Feb 1, 2006
548
0
18,980
That's a weird issue. Yes, hardware often eats up a lot of address space under the 4 GB limit (an architectural limit of 32-bit addressing, reserved for memory mapping and such), but it really shouldn't happen with only 2 GB installed if everything is properly configured.
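To make that accounting concrete, here is a minimal sketch of the 4 GiB limit; the reservation sizes below are purely illustrative assumptions, not values read from any real system:

```python
# Rough sketch of 32-bit address-space accounting.
# Reservation sizes are illustrative assumptions only.

GiB = 1024 ** 3
MiB = 1024 ** 2

def visible_ram(installed, reservations):
    """RAM a 32-bit OS can report: installed memory, capped by what is
    left of the 4 GiB address space after hardware reservations
    (video memory apertures, PCI MMIO ranges, etc.)."""
    address_space = 4 * GiB
    usable = address_space - sum(reservations)
    return min(installed, usable)

# Hypothetical example: 2 GiB installed, two 256 MiB cards in SLI,
# plus 384 MiB of other chipset/PCI reservations in the low 4 GiB.
installed = 2 * GiB
reservations = [256 * MiB, 256 * MiB, 384 * MiB]  # illustrative

print(visible_ram(installed, reservations) / GiB)  # → 2.0
```

On this reasoning, reservations only start hiding RAM once installed memory plus reservations exceed 4 GiB, which is why a healthy 32-bit system with 2 GB should still report the full 2 GB.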
 

pimpsmurf

Distinguished
Dec 8, 2007
43
0
18,540
Yup, turns out it wasn't the 8600s after all. I'm running on one 8600 now (returned the other) on an E6750/P5K-E @ 3.5 GHz, and I'm getting 35+ fps in Shattrath with everything cranked up. =)

 

MrCommunistGen

Distinguished
Jun 7, 2005
1,042
0
19,310
Why did you *shoot* the card? 1) A 7900GT is faster than an 8600GT (it's not even close). 2) You could have SOLD the card to someone and gotten money out of it.
Did you ever stop to consider that the problem might have been airflow in your case? Also, if you'd spent the money from the motherboard and processor on a real graphics card, you'd be getting even better frame rates.

-mcg
 

qmalik

Distinguished
Oct 20, 2007
382
0
18,780
"8600GT x 2 SLI setup."

What a waste of money... crap performance for that price. I would return those two cards if you still have the chance and get one 8800GT for the same price.
 

pogsnet

Distinguished
Aug 15, 2007
417
0
18,780
You can replace the 7900GT's original heatsink/fan with an aftermarket VGA cooler like a Zalman. It improves performance, and the card can be overclocked too.
 

pimpsmurf

Distinguished
Dec 8, 2007
43
0
18,540
1) It's an eVGA 7900GT 256 MB CO Superclocked board.
2) It's not on the market anymore because the memory would overheat and cause serious issues. I tried an aftermarket heatsink/fan and it still overheated and caused insane graphical issues in 3D apps; I eventually had to underclock the board.
3) I was joking about it being shot; it's actually boxed up and being returned under the lifetime warranty for repair (it's a known issue... the tech support guy at eVGA said "that card is a piece of ****". No lie.)
4) I did return one of the 8600s, and the single BFG Tech 8600GT is humming along nicely at 1680x1050 with 8xAA and everything maxed out in WoW. Minimum of 35 fps anywhere in Shattrath. It's beautiful.
5) It's not a waste of money just because it's not what you would have bought. I wanted SLI for bragging rights and the ability to do 16xAA (which looks good on my board). The only games I play are F.E.A.R. (single player) and WoW. Maybe I'll pick up another 3D game (been thinking about the Microsoft/Activision MechWarrior-type game).

I <3 my 8600. It doesn't seem to be better than the 7900GT at DX9, but since I plan to go to Vista Ultimate x64 (DX10) and overclock the board even more, I expect performance to be much better than the 7900 (which doesn't support DX10 at all).

I need to figure out what temps I should stay under on the board before I OC it more. It's at 565 MHz core / 700 MHz memory ATM.

-PimpSmurf
 

pimpsmurf

Distinguished
Dec 8, 2007
43
0
18,540
"You can replace 7900GT's original heatsink/fan like ZALMAN for video card. Makes performance better and can be overclocked too. "

http://www.legionhardware.com/document.php?id=657&p=1
The 8600 can be overclocked quite a bit itself.

http://www.legionhardware.com/document.php?id=693&p=5
The 8600GT 256 MB beats the 7900GT 512 MB in the UT3 beta.

It appears that in other games the 7900GT is slightly ahead of the 8600GT at lower resolutions, and the 8600GT is ahead of the 7900GT at higher AA/AF/resolutions. That's how I like to play anyway. Of course, the 7900GT would probably have been better for WoW for now. Maybe. Maybe I should benchmark it, but that's kinda hard to do now ;)
 

rodney_ws

Splendid
Dec 29, 2005
3,819
0
22,810

Something tells me you haven't seen any DX10 benchmarks... what is playable in DX9 mode is virtually unplayable in DX10 mode on many DX10-capable cards... so as for your argument that the 7900 can't push DX10 games? Well, neither can that 8600, to be perfectly honest. Nvidia really missed the boat with its 8600 series, and people who read the reviews BEFORE they bought the cards knew it. Enjoy your card :)
 
