Can't Disable Integrated GPU in BIOS, Help?

December 21, 2011 5:38:12 PM

Hey. I recently purchased an Acer Aspire 5560G-SB448 for my girlfriend.
The product page boasted an A8-3500M with an integrated Radeon HD 6620G GPU as well as a dedicated Radeon HD 6650M GPU.

The problem is, I can't seem to get any of my games to use the Radeon HD 6650M.

They all seem to be using the integrated graphics.

I've tried going into the BIOS to switch from the integrated GPU to the dedicated one, but it only gives me two options: integrated and switchable graphics.

I've tried using the switchable graphics option in AMD's Vision Engine (Catalyst) Control Center. Supposedly, you can designate which GPU certain programs use, but unfortunately this does not seem to work at all. It gives you two options per program: "Power Saving" graphics (the integrated GPU) or "High Performance" graphics (the dedicated Radeon HD card). But even if I set, say, Skyrim to "High Performance" graphics, it still uses only the integrated one.

I've even tried updating the BIOS and every other AMD and Acer driver. No luck.

At this point I've been wondering if maybe I should just disable the integrated graphics through Device Manager, but then again, I'm not sure whether the Radeon HD 6650M would kick into gear or whether my screen would just go black. I don't want to screw up her laptop a day after receiving it, lol.

This is really disappointing, considering the main reason I purchased this laptop was its graphics capabilities, which apparently turned out to be pretty much bullshit.

Can anyone help me out? Thanks.
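
Edit: for anyone checking the same thing, a quick sanity test (just a rough sketch, assuming Python is installed and the standard Windows wmic tool is available) is to list the video controllers Windows actually sees. Both the 6620G and the 6650M should show up, and neither should report an error, before any switchable-graphics profile can hand work to the dedicated card:

    import subprocess

    # List every video controller Windows has registered, with its driver
    # version and status. Both the integrated Radeon HD 6620G and the
    # discrete Radeon HD 6650M should appear here without an error status.
    output = subprocess.check_output(
        ["wmic", "path", "win32_VideoController",
         "get", "Name,DriverVersion,Status"],
        universal_newlines=True,
    )
    print(output)
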
December 21, 2011 5:55:18 PM

Have you tried calling Acer and having them tell you how to fix it?
December 21, 2011 7:56:26 PM

I contacted their support department twice: once through their online chat support, and once by submitting a question to their tech support.

Both came back with the answer that I could switch graphics settings in the BIOS, which I had already tried. When I informed them that I could only switch to integrated graphics through the BIOS, they told me that my question was "out of their scope of knowledge" and that I'd have to call Answers by Acer, which is a pay-per-call service.

Here are their fees: http://support.acer.com/answers.aspx

Unfortunately, I do not have $99 to purchase a package just so I can ask them a question. Honestly, $100 to get support seems ludicrous to me.
December 21, 2011 8:40:00 PM

I agree, the prices are too high.

Just one question: How do you know that games don't use the dedicated card?

Also, did you check any settings in the Catalyst Control Center?
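
One quick way to check which GPU actually does the rendering (a rough sketch, assuming Python with the pygame and PyOpenGL packages installed) is to open a tiny OpenGL window and print the renderer string the driver reports. Assign the script to "Power Saving" and then to "High Performance" in the Catalyst Control Center and run it under each profile; the reported renderer should change from the 6620G to the 6650M if switchable graphics is really taking effect:

    import pygame
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

    # Open a small OpenGL window so a real rendering context exists,
    # then ask the driver which GPU is backing it.
    pygame.init()
    pygame.display.set_mode((320, 240), pygame.OPENGL | pygame.DOUBLEBUF)

    print("Vendor:  ", glGetString(GL_VENDOR).decode())
    print("Renderer:", glGetString(GL_RENDERER).decode())

    pygame.quit()
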
December 21, 2011 10:43:18 PM

Normally, if you add a graphics card, the motherboard's onboard graphics are disabled automatically without having to change anything in the BIOS; it was only on old motherboards that you had to disable the onboard graphics manually. If you have the monitor connected to the graphics card's connector rather than the onboard graphics connector and you get a display, then you are certainly using the dedicated graphics card.
December 21, 2011 10:53:00 PM

As far as I know, you can't disable the integrated graphics, as it is on the CPU itself; this is meant as a power-saving feature. The only thing you can do is assign the game you want to play to the dedicated GPU in the graphics properties panel.
December 21, 2011 11:11:26 PM

I have a similar model with the AMD switchable graphics. If you use Windows power management and select "High performance", it should enable the GPU after a reboot. Mine also has a button on the case just above the keyboard, marked with a "P", which switches the GPU and CPU into power-saving mode. You may have to reboot and use either or both of these options to get the results you want, as both can misbehave if other programs are running. Hope this helps.

Furthermore, do not disable the onboard graphics via Device Manager, because from my experience the dedicated GPU DOES NOT kick in. Expensive mistake.
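
If you'd rather flip the Windows power plan from a prompt than through the Control Panel, something like this should do it (a small sketch, assuming the stock "High performance" plan is still installed; SCHEME_MIN is the documented powercfg alias for it):

    import subprocess

    # Switch Windows to the built-in "High performance" power plan.
    subprocess.check_call(["powercfg", "/setactive", "SCHEME_MIN"])

    # Print the now-active plan so the change can be verified.
    subprocess.check_call(["powercfg", "/getactivescheme"])
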
December 22, 2011 8:51:15 AM

Thanks for the replies.

I am aware of the Catalyst Control Center and its switchable graphics features.
So to test whether or not my graphics card was being used, I ran three 3DMark 11 tests:

One with the settings on "Power Saving" mode.
One with the settings on "High Performance" mode.
And one with the settings on "High Performance", but with Dual Graphics enabled, where the integrated and the discrete GPUs are supposed to work together.

Interestingly enough, I got exactly the same result in Power Saving mode and High Performance mode, which leads me to believe that either High Performance mode makes no difference at all, or my Radeon card just isn't being utilized. My score for both of those tests was P1217 3DMarks.

But with Dual Graphics enabled, I did notice an increase in performance, although not a huge one. My score for that test was P1685 3DMarks.
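
For scale (just spelling out the arithmetic on those two scores), the Dual Graphics run is only about 38% faster than the identical Power Saving / High Performance runs:

    # Comparing the reported 3DMark 11 scores.
    power_saving = 1217    # identical score under "Power Saving" and "High Performance"
    dual_graphics = 1685   # score with Dual Graphics (6620G + 6650M) enabled

    uplift = (dual_graphics / float(power_saving) - 1) * 100
    print("Dual Graphics uplift: %.0f%%" % uplift)   # roughly 38%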

So I'm not sure what to make of this.

For instance, I have the same processor as the user in this video (http://www.youtube.com/watch?v=oF-DWaXU73Q), but I have more RAM and a better discrete graphics card. Yet they get playable FPS in Skyrim (even while recording with Fraps), while I can't even manage decent FPS on low settings.

:( 
April 11, 2012 7:33:49 AM

Turnupthedaft said:
[quote of the 3DMark 11 test post from December 22, 2011 above]

To Turnupthedaft, about having the same GPU as the guy in the video but getting low FPS on low settings: sorry to hear that. :/ Maybe what I did will work for you. I don't know what it is about the LCD screens on laptops, but they seem terrible for gaming. I can barely play Skyrim on medium settings on my laptop with its AMD Radeon HD 6310 GPU, but when I use the VGA output to an HD TV I can play Skyrim at MAX settings on everything all day long with nice FPS. Sometimes it's the screen keeping laptops from reaching their actual FPS. Laptops need better LCD screens, lol.
April 11, 2012 9:31:32 AM

Can you disable the one you do not want in Windows, namely in Control Panel\Device Manager?
April 11, 2012 9:43:33 AM

Nice necro, it's a 4-month-old post.
April 21, 2012 2:44:32 AM

Hey guys, I still have the laptop, but I've given up on trying to game on it.
I tried contacting Acer multiple times, but their customer service is complete ***, so I gave up on that.

I don't remember if anything appeared in the processes list, but I'll look.

And as for plugging my laptop into a monitor to get better graphical performance... I've never heard of that, but hey, I might as well try it.

Thanks for the advice guys.
April 25, 2012 4:36:01 AM

Hi Turnupthedaft, I've been having a similar issue with my new desktop PC, which uses two monitors. The issue was caused by having my primary monitor plugged into the motherboard and my secondary monitor plugged into the discrete GPU; when I switched the monitors around, my problems disappeared.

It works because plugging the primary monitor directly into the dedicated GPU bypasses the motherboard's integrated GPU and forces the dedicated GPU to always be running. I wonder if you could plug a monitor into your laptop and then make that monitor the primary display.

My system specs: Core i5 2500k, ASRock z77 Pro4, Radeon HD 7850.

I figured out this solution after reading these two paragraphs from TechSpot:

"The software [Virtu] has two settings: i-Mode and d-Mode. The former *requires your monitor to be wired to the motherboard* (the Sandy Bridge graphics engine) and offers nearly zero overhead."

"Conversely, Virtu's d-Mode lets you connect your display directly to a graphics card, but the Sandy Bridge graphics engine is always powered on, whereas it can be powered off in the i-Mode. That sacrifice will be worthwhile for some enthusiasts, as d-Mode's main advantage is that it allows you to use multi-GPU technology such as CrossfireX and SLI, while this isn't possible with the i-Mode."

Source: http://www.techspot.com/review/395-asrock-z68-extreme4/...
April 28, 2012 1:19:19 AM

That actually makes sense. I haven't tried plugging the laptop into a monitor yet, but I will do so soon!
Thank you for the advice, and to the other poster in the thread who recommended it.

Hopefully that will do the trick, and then I can just buy my girlfriend a flatscreen monitor for the laptop. =]