
How to have windows experience index show 680m and not intel 4000

Last response in: Laptops & Notebooks
August 31, 2012 3:24:32 PM

Hi! My MSI GT70 gaming laptop comes with the latest NVIDIA GTX 680M video card, but most programs, including the Windows Experience Index, only use the on-board Intel HD 4000 chip. I want to change the setup so the high-end video card handles all graphics instead of the Intel chip. How can I do that? (MSI was not particularly helpful.)
August 31, 2012 4:42:42 PM

Go into the Control Panel and look for the power settings; simply change to the highest-performance power plan and the GPU should switch.
August 31, 2012 8:06:56 PM

It is set to the highest setting (High Performance). Also, in the NVIDIA control panel I selected the 680M under "Global Settings" as the preferred graphics processor, but no use. When I asked MSI, they said that the Intel chip remains the main hardware and it cannot be changed, but I've seen previews on the web where the NVIDIA card was indicated as the display adapter under the Windows Experience Index, so I assume there must be a way to change it. I just don't have a clue how. :-(
August 31, 2012 8:50:28 PM

My basic "issue" with this is that I've paid a lot of money for this gaming laptop to have the best and latest NVIDIA graphics card in it, only to find out that they intentionally made it unavailable under the pretext of "providing more battery life" with the Intel chip. Well, that would be fine if there were an option to use it in power-saving mode and then switch to full power when the user wants maximum performance. But there is no such thing, and even with "Turbo Mode", everything stays the same. In short, this "setup" SUCKS big time! :-(
November 16, 2012 5:25:13 AM

You need to go to the NVIDIA control panel and change the setting under the "Manage 3D Settings" tab. If there isn't a drop-down menu right there that lets you select between "High-performance NVIDIA processor" and "Integrated graphics", keep looking until you find it.

Also, on the top bar under "Desktop", select the "Add 'Run with graphics processor' to context menu" option. This lets you right-click any application or shortcut and, under the "Run with graphics processor" fly-out menu, choose whether to use your NVIDIA GPU or the integrated graphics. Have fun!
November 16, 2012 3:18:08 PM

Have you tried going into the BIOS and making the 680M your primary/default adapter? It is an option that will force the scenario you are looking for in some setups.
May 5, 2013 4:30:58 PM

Try going into Device Manager and disabling your Intel HD 4000. It'll ask you to reboot; then run the test again.
May 5, 2013 7:33:59 PM

BobMcBobberson said:
Try going into Device Manager and disabling your Intel HD 4000. It'll ask you to reboot; then run the test again.


I would recommend you not do this. The system uses the HD 4000 chip for just about everything, and the integrated chip does much more than just graphics: it provides the laptop's backlight and brightness controls, etc. Disabling it will force the NVIDIA card to do that work, but it isn't really designed for it. Most modern laptops will automatically switch over to the best card available when a game needs more power. Just because Windows doesn't report it doesn't mean it isn't being done.
May 6, 2013 7:13:58 AM

Windows automatically defaults to detecting the Intel HD 4000. I suppose you can attempt to uninstall the Intel HD 4000 drivers, but I do not know what the consequences are.

My suggestion is to just leave it alone.
May 6, 2013 9:27:17 AM

Like I said, I just went through this exact same thing. If you turn off or disable the HD 4000, you will not like the outcome. Let the system decide what it needs. If you play a higher-end game on ultra and it works, then the system is doing its job and switching, because obviously the HD 4000 can't handle it. If you disable or uninstall it, though, you won't be happy.


jaguarskx said:
Windows automatically defaults to detecting the Intel HD 4000. I suppose you can attempt to uninstall the Intel HD 4000 drivers, but I do not know what the consequences are.

My suggestion is to just leave it alone.


September 30, 2013 12:14:50 PM

If you still need help: go to Device Manager > Display adapters, right-click on the HD 4000 and uninstall it. That's it.
December 3, 2013 9:58:01 PM

I made it work on Windows 8.0 with a GTX 765M and HD 4600 combo by creating a new shortcut on my desktop with this path:

C:\Windows\explorer.exe shell:::{78F3955E-3B90-4184-BD14-5397C15F1EFC}

Any name you like.

Then right-click the shortcut, select "Run with graphics processor", and choose the NVIDIA option.
Then run it as normal; mine uses the NVIDIA card.

Hope this helps,

Gallium-V
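For anyone who wants to script that step, here is a minimal sketch. The launcher filename is my own arbitrary choice; only the explorer.exe command and the shell GUID come from the post above.

```python
# Sketch: write a one-line .bat launcher that opens the Windows
# Experience Index page via the shell namespace GUID quoted above.
# Right-clicking the resulting file then exposes the NVIDIA
# "Run with graphics processor" context-menu entry.
from pathlib import Path

WEI_GUID = "{78F3955E-3B90-4184-BD14-5397C15F1EFC}"
command = r"C:\Windows\explorer.exe shell:::" + WEI_GUID

launcher = Path("run_wei.bat")  # hypothetical name; pick any you like
launcher.write_text(command + "\n")
print(launcher.read_text().strip())
```

Double-clicking the .bat runs WEI against whichever GPU is the default; the point of making it a file is that the NVIDIA context-menu entry can then send it to the dedicated card.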
December 18, 2013 10:16:01 AM

Gallium-V said:
I made it work on Windows 8.0 with a GTX 765M and HD 4600 combo by creating a new shortcut on my desktop with this path:

C:\Windows\explorer.exe shell:::{78F3955E-3B90-4184-BD14-5397C15F1EFC}

Any name you like.

Then right-click the shortcut, select "Run with graphics processor", and choose the NVIDIA option.
Then run it as normal; mine uses the NVIDIA card.

Hope this helps,

Gallium-V

(Yes, I know this is a pretty old post, but I'll throw in because there seem to be a lot of people who still have trouble running WEI for their dedicated GPU.)
First, to anyone looking for this solution: your machine will use the dedicated GPU when it needs to. Why are you so focused on the WEI scores? Their primary value is to communicate to neophytes that there are multiple bottlenecks affecting overall performance. (Remember in the old days when consumers really only cared about raw processor speed? Very little attention was ever paid to the size of the L2 cache, the timings of the RAM, the controller used for the Southbridge, or the fact that legacy PCI devices share a bus and contend with each other for processing time.)

The way I use the WEI scores is to grade performance degradation (I'm a software developer). I don't care what the top end of my system is when it's not under load, but I really DO care how it's going to perform when I'm doing my thing. I'm being facetious in posing that as a question, though; I already know why you want to run WEI with the NVidia card... to show off :)

I'll show you the scores for my Dell at the bottom. Notice that the difference in the scores doesn't really mean anything. When I open Unity, ZBrush, or Visual Studio, they always run with the Quadro because I have them registered to do so in the NVidia Control Panel.

To Gallium:
I keep the W7 God Mode link to WEI in my Administration folder, but it never runs against the NVidia card unless I go into the NVidia Control Panel and change the default graphics device to my Quadro. Then I choose {Run with graphics processor | High-performance NVidia processor} and it runs with the Quadro.

I can attest that this method will work correctly, at least in Windows 7.

My WEI scores:

Top line is before switching to run with my NVidia card. Bottom line is after switching:
Processor=7.7|Memory(RAM)=7.9|Graphics=6.4|Gaming Graphics=6.4|Primary hard disk=7.9|Overall=6.4
Processor=7.7|Memory(RAM)=7.9|Graphics=7.9|Gaming Graphics=7.9|Primary hard disk=7.9|Overall=7.7

(Diagnostics were run with 4 instances of Visual Studio Ultimate 2013, Unity 4.3.2, Chrome (x7), Chrome Metro with 10 live tiles, SQL Server Management Studio 2012, Paint.NET, and ZBrush 2 contending for resources. I feel the WEI diagnostics are much more valuable when performed on a system under normal load.)

Machine specs: i7-3740QM with 6MB L3 cache, 16GB RAM (1600MHz), 256GB SSD, Intel HD 4000 integrated graphics (top), NVidia Quadro K5000M (bottom); $800 for the rig, plus $450 for the Quadro (after selling the punier NVidia card that came with the machine)
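A side note on the Overall numbers in those two runs: the WEI base score is simply the lowest of the five subscores, which is why a single weak component caps the whole rating. A quick sketch using the figures above:

```python
# The WEI base score is the lowest subscore, so the Intel HD 4000's
# 6.4 graphics scores cap the first run, while the second run is
# capped by the 7.7 processor score.
def wei_base_score(subscores):
    return min(subscores.values())

before = {"Processor": 7.7, "Memory": 7.9, "Graphics": 6.4,
          "Gaming Graphics": 6.4, "Primary hard disk": 7.9}
after = {**before, "Graphics": 7.9, "Gaming Graphics": 7.9}

print(wei_base_score(before))  # 6.4
print(wei_base_score(after))   # 7.7
```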
January 11, 2014 4:27:45 PM

Gallium-V said:
I made it work on Windows 8.0 with a GTX 765M and HD 4600 combo by creating a new shortcut on my desktop with this path:

C:\Windows\explorer.exe shell:::{78F3955E-3B90-4184-BD14-5397C15F1EFC}

Any name you like.

Then right-click the shortcut, select "Run with graphics processor", and choose the NVIDIA option.
Then run it as normal; mine uses the NVIDIA card.

Hope this helps,

Gallium-V


Worked like a charm on my Windows 7 64-bit machine (Intel HD 3000 and GT 555M)!
February 14, 2014 4:52:05 AM

I encounter lots of laptop users worrying that their system, while containing a discrete/dedicated NVIDIA graphics processor, seems to only ever run on the integrated Intel HD graphics chip. The emphasis here is on 'seems'. The system will use the NVIDIA card when it needs to.

Most laptops nowadays come with an NVIDIA feature called Optimus. With Optimus, the system uses the integrated Intel graphics for tasks and applications that don't require the power of the dedicated NVIDIA GPU (for those, the integrated graphics suffice just fine), while using the NVIDIA GPU for more graphics-intensive applications that do require the extra processing power the dedicated card has to offer.

This is done to save battery life and to reduce hardware wear; even when not running on battery, the feature is still good for the hardware's longevity.

To those people I say: do not worry. I assure you, your laptop will use the NVIDIA card for tasks and applications that require the extra power and speed. For all other applications, the integrated Intel graphics will be used and will be perfectly sufficient (otherwise the NVIDIA card would be used instead). So why wear out your dedicated NVIDIA GPU on applications that don't need that much power?

In other words, the OS decides how much power an application needs and then chooses which of the two GPUs to run it on accordingly: the integrated one or the dedicated one.

Of course, you can always go to your NVIDIA control panel and select the NVIDIA GPU as the preferred processor for running applications (by default it's set to auto-select; you can keep it on auto-select, or set it to the Intel GPU or the NVIDIA GPU). Note that setting this option to the dedicated NVIDIA GPU will not make everything always run on it; it is merely a preference, and the OS doesn't always abide by it. It also depends on how a given application is coded or configured; some applications are written to use the integrated graphics on Optimus systems regardless of the settings in your NVIDIA control panel.

Furthermore, NVIDIA has configured certain applications never to run on the dedicated graphics on Optimus systems, but always on the integrated graphics. For those applications, setting them to run on the dedicated GPU in the NVIDIA control panel, or launching them on the dedicated GPU through the context menu, will not help; they will always run on the integrated Intel graphics, regardless of what you do. Two such applications are the Google Chrome and Mozilla Firefox browsers, and most likely Internet Explorer as well.
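The selection rules described above can be sketched as a toy model. This is purely illustrative, not NVIDIA's actual implementation: the pinned-app list, the `demanding` flag, and the profile structure are simplifications of the behavior the post describes.

```python
# Toy model of the Optimus GPU-selection rules described above:
# 1) some apps are pinned to integrated graphics no matter what,
# 2) a per-app profile overrides the global preference,
# 3) "auto" picks the dedicated GPU only for demanding workloads.
PINNED_TO_INTEGRATED = {"chrome.exe", "firefox.exe"}  # per the post above

def pick_gpu(app, demanding, global_pref="auto", app_profiles=None):
    if app in PINNED_TO_INTEGRATED:
        return "integrated"                             # rule 1: pin wins
    pref = (app_profiles or {}).get(app, global_pref)   # rule 2: profile first
    if pref == "auto":
        return "nvidia" if demanding else "integrated"  # rule 3: auto-select
    return pref

print(pick_gpu("chrome.exe", demanding=True, global_pref="nvidia"))  # integrated
print(pick_gpu("game.exe", demanding=True))                          # nvidia
print(pick_gpu("notepad.exe", demanding=False))                      # integrated
```

The point the model makes concrete: even a global "prefer NVIDIA" setting never moves a pinned application off the integrated chip.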