
Great news

Last response: in Graphics & Displays
November 5, 2008 4:29:28 PM

Hi guys, after a lot of research I managed a good BIOS update, and guess what: my 8600GT, which only had 16 stream processors and a 64-bit interface, now has 112 SPs and a 192-bit interface.


November 5, 2008 4:43:36 PM

You're right, that is good. Good on you.
November 5, 2008 5:02:11 PM

Yup yup, congrats on figuring out the issue by yourself.

I hate how people don't use their brains and just post random problems on forums expecting to be spoon-fed.
November 5, 2008 5:14:28 PM

That's some good work... With Photoshop! :non:  :lol: 
November 5, 2008 5:28:43 PM

Tell us the difference in FPS. That's just as important :) 
November 5, 2008 6:33:41 PM

Hey, come on, I did not use Photoshop or any other imaging software. I just flashed a P.O.V. card with a custom-made BIOS.

By the way, there is a huge jump in FEAR: from 50 fps @ 1280x1024 with 4xAF and 2xAA to 98 fps.
November 5, 2008 6:59:24 PM

orangegator said:
That's some good work... With Photoshop! :non:  :lol: 

Could be.

@OP: Mind posting a GPUZ validated link?
November 5, 2008 7:29:25 PM

Shadow703793 said:
Could be.

@OP: Mind posting a GPUZ validated link?


I bet he makes up some lame excuse why he can't post a validation link. Or, he just doesn't respond at all.
November 6, 2008 9:34:21 AM

orangegator -

learn to respect someone's hard work
November 6, 2008 3:51:22 PM

Nice work! Well done, but your card's still DDR2, which I think might be a bottleneck.
November 6, 2008 4:07:42 PM

The Nvidia G84 chip physically has only 32 stream processors, a 128-bit memory bus, and 289M transistors. You cannot unlock what is physically not there.

The G92 has 128 stream processors, a 256-bit bus, and 754M transistors. If you told me you had unlocked a G92 chip to enable all of these, then I'd believe you.

But I do give you props for somehow being able to hack the BIOS so that it tricks GPU-Z. Or for hacking GPU-Z.
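The physical-limit argument above can be turned into a quick sanity check. This is just an illustrative Python sketch (the `known_dies` table and the function name are made up for the example; the spec numbers are the ones quoted in this post): a BIOS flash can at best disable units, so every claimed figure has to fit inside what the die actually ships with.

```python
# Die specs as quoted in this post (illustrative only)
known_dies = {
    "G84": {"stream_processors": 32,  "bus_width_bits": 128},
    "G92": {"stream_processors": 128, "bus_width_bits": 256},
}

# What the OP's GPU-Z screenshot claims
claimed = {"stream_processors": 112, "bus_width_bits": 192}

def physically_possible(die, claim):
    """A BIOS flash can only disable units, never add them,
    so every claimed figure must be <= what the die ships with."""
    limits = known_dies[die]
    return all(claim[key] <= limits[key] for key in claim)

print(physically_possible("G84", claimed))  # False: exceeds the G84's units
print(physically_possible("G92", claimed))  # True: fits within a cut-down G92
```

By this check, the claimed specs could only come from a G92-class die, never from a G84.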
November 6, 2008 6:29:42 PM

no, this is really an old 8600gt, and due to the unlock I got a huge performance boost.
November 6, 2008 6:33:58 PM

No, the first G84s had more processors than those of today. I guess Nvidia just revised the G84 to cut the extra locked performance potential.
November 6, 2008 8:15:00 PM

vicky797 said:
no, this is really an old 8600gt, and due to the unlock I got a huge performance boost.


How about you run 3dmark06 and post the validation link of your results?
November 6, 2008 10:30:22 PM

^ lol, still not convinced are you? :) 
November 6, 2008 11:00:01 PM

Photoshopped, or simply messed-up DeviceID info.

Since when did the G84 have a 192-bit bus and anything more than 32 SPs / 4 ROPs?
Was that revision 2, 3, 4, or pi?

http://www.beyond3d.com/content/reviews/11/2
http://techreport.com/articles.x/12285

The specs posted are G92-like, which is like saying I successfully modded my HD3850 into an HD4850 by downloading a new BIOS.

Here's a proper GPU-Z for your device (I chose it for its acronym ;) ):
http://www.techpowerup.com/gpuz/9fcuk/
November 7, 2008 5:51:40 AM

Oh, for god's sake, I did not mess with any device ID info or anything like that. The specs given above are the result of a BIOS flash posted at techpowerup.com.
November 7, 2008 5:57:12 AM

It's A2.
November 7, 2008 6:12:33 AM

Dude, seriously stop.

This is so lame it should be removed by an admin pronto.

It's bad enough that so many people are already misled by marketing, fanboys, and the media; the last thing anybody needs is this guy.

The only way this is possible is by making a lot of modifications to the graphics PCB, which would simply not be worth it.

So vicky797, if this is a prank, stop right now, because you would have to do a lot, and I mean a lot, to prove this little feat of yours: how you magically turned a value GPU into a pretty powerful card.



November 7, 2008 6:30:03 AM

Whatever you think you have done, you haven't. The chip, even if it is an A2, just doesn't have the architecture the screenshots are showing.

One A2 revision review

http://www.legitreviews.com/article/486/1/

Mactronix
November 7, 2008 10:24:26 AM

OK, that's it. Thread closed officially.
November 7, 2008 11:58:58 AM

Why don't you run 3DMark 2003/2005/2006? I am not calling you a liar; just show us how fast it is and let's see if it compares to the real deal.

I still think your card's memory bus is 64-bit; you can't add more physical pins with a BIOS flash. Even the 9500 ----> 9700 mod is still 128-bit; it just unlocks the other 4 pipes. You unlocked additional pipes/shaders, which resulted in a huge performance boost.
November 7, 2008 2:44:14 PM

vicky797 said:
oh for god`s sake, i did not mess with any device id info or something like that, the specs given above are a result of a bios flash posted at tecgpowerup.com.


It doesn't really matter where it came from; the result is improper reporting of information. The end result is that the BIOS flash messes up the tool. Regardless of motivation or cause, that's what's happening, not growing transistors (how are you going to fit 80 more shaders, more ROPs, and a larger memory controller on the same die without actually taking the chip and re-etching it?). Just like orangegator points out, what you did was break the validation software, not enable stuff that wasn't there.

What I didn't pick up on initially in your original post is that you have something that is completely out of whack, and I usually pick up on fake 'new unreleased part info' rumours.

How do you have a 192-bit memory interface with 512MB of VRAM?

If you can't figure out why that doesn't work from a design perspective, then you will never understand people's issues with your post/thread.
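The 192-bit/512MB objection comes down to simple arithmetic: a 192-bit bus is six 32-bit channels, and the VRAM has to split evenly across them in chip densities that actually exist. Here is a rough illustrative Python sketch (the function name and its assumptions, namely one 32-bit GDDR chip per channel and power-of-two megabyte densities, are mine and not from the thread; it also ignores the mixed-density configurations that appeared on some later cards):

```python
def plausible_vram(bus_width_bits, total_mb, chip_bits=32):
    """Rough check: can total_mb be built on this bus width,
    assuming one chip per channel and power-of-two chip sizes?"""
    channels = bus_width_bits // chip_bits   # 192-bit bus -> 6 channels
    per_chip_mb = total_mb / channels        # memory must split evenly
    if not per_chip_mb.is_integer():
        return False
    n = int(per_chip_mb)
    # GDDR chips of the era shipped in power-of-two MB densities (32, 64, 128, ...)
    return n > 0 and (n & (n - 1)) == 0

print(plausible_vram(192, 384))  # True:  6 x 64 MB chips
print(plausible_vram(192, 512))  # False: 512/6 MB per chip doesn't exist
print(plausible_vram(128, 512))  # True:  4 x 128 MB chips (a real 8600GT config)
```

That is why 192-bit cards of the time came with 192MB, 384MB, or 768MB, never 512MB.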

Quote:
its A2


Which actually confirms no physical changes.

A0 - A1 - A2 = Same design different spins.
B0 - B1 - B2 = Design revision (without base# change [like GF84 -> GF8x]), and thus a REV2 of the product.

Most A2s became the GF8500, not magically transforming G84s that were G92s 'living among us'.

The number that usually comes in between them describes the cut-down of the part, with a higher number meaning higher on the 'full feature' list. IIRC the G84 600 was the GT and GTS, and the 300 was the GF8500.


vicky797 said:
OK, that's it. Thread closed officially.


Yes, it is now.

Whether or not you purposely made misleading statements in the original thread is not really the issue, other than for your own pride. The important thing is that what you say is happening is physically impossible from just a BIOS flash. That you don't recognize the issues people are highlighting shows that you don't understand what's going on, so your assumptions in the first post and your declarations of these huge changes are misleading to noobs with as little or less knowledge than yourself, who may think they can go out, buy a cheap GF8600, and magically turn it into a GF8800GT/GS. That's why you're getting the huge negative reaction to your post; just look at all the early readers amped to find out about this. I'm just sorry I didn't get here sooner.

Don't take it personally, but what you're claiming is incorrect. It could be an honest mistake, but it's a mistake nonetheless.
November 7, 2008 2:52:10 PM

TheGreatGrapeApe said:
[snip]

BTW, you'll notice in that TechReport review I posted up above that they are specifically reviewing the A2 spin (so yours is the same as that one):
http://techreport.com/r.x/geforce-8600/g84-chip2.jpg
