Graphics Card detected, but not being used.

Mar 4, 2018
So, I've recently gotten a new computer, but when I tried playing games on it, I got an exceptional amount of lag. After a lot of research, I realised that the PC was using the processor's integrated graphics, not the 1070 Ti. The card is definitely connected: I can see its fans running, Device Manager shows it, and one game (Minecraft) detects it and runs as well as it should. However, dxdiag doesn't show it; it shows the processor's graphics instead.
I've tried reinstalling the graphics driver, installing an older version of it, and disabling the processor's graphics (when I do that, dxdiag says "Microsoft Basic Display Adapter", and only then does Minecraft use the 1070 Ti). I've also tried setting the default GPU in the BIOS to "PCIe", then to "Auto" (previously it was "CPU Graphics"). Nothing has worked and I'm getting really worried.
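As a quick way to double-check which adapter Windows is actually reporting, the built-in command-line tools can dump the same information dxdiag shows in its GUI (a minimal sketch; the output file name here is just an example):

```shell
:: Dump the full DirectX diagnostics to a text file instead of opening the GUI
dxdiag /t dxdiag_output.txt

:: Once the file is written, show the display adapter line(s) it recorded
findstr /C:"Card name" dxdiag_output.txt

:: List every video controller Windows knows about (integrated and discrete)
wmic path win32_VideoController get name
```

If the 1070 Ti appears in the `wmic` list but not as the "Card name" in the dxdiag dump, the card is installed and detected but is not the adapter driving the display.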
PC SPECS
Motherboard: Asus Prime B250M-A
CPU: Intel Core i7-7700
GPU: PNY GeForce GTX 1070 Ti
RAM: 8GB DDR4
Storage: 1TB HDD
OS: Windows 10 Home
 
Solution
Do you have your monitor connected to the graphics card or to the motherboard? If it's plugged into the motherboard's video output, the display is being driven by the processor's integrated graphics; move the cable to one of the ports on the 1070 Ti itself.