Disabling your integrated graphics card is fairly simple. However, in the majority of cases, you do not have to disable it yourself.
If you have a dedicated graphics card installed, the system automatically handles the disabling/switching of graphics cards for you.
If you do NOT have a dedicated graphics card installed, however, disabling the integrated graphics card will result in severely reduced graphics performance.
In the following text, I will talk in detail about how to disable integrated graphics, whether you should disable it at all, and how the system generally handles the switching if you have both integrated and dedicated GPUs.
How to Disable the Integrated Graphics Card?
There are essentially two basic ways to disable the integrated graphics card on your PC.
- Through the Device Manager
- Through BIOS
Method 1: Disabling Integrated Graphics Card Through Device Manager
The first and more intuitive way to disable the integrated graphics card is to use the Device Manager.
Here are the steps for this method:
Step 1: Click Search on the Windows Taskbar
Step 2: Search for “Device Manager“. You can also open it through Control Panel -> Device Manager, or by pressing Win + R and running “devmgmt.msc“
Step 3: Once the Device Manager Window opens, expand the “Display Adapters” section and check what graphics card it displays.
In my case, since I have both an integrated and a dedicated GPU, the section displays two graphics cards.
Step 4: Select the integrated graphics card and press the Disable button. Make sure you press the DISABLE button and NOT the Uninstall (X) button.
Step 5: A prompt will appear with a warning. Select “Yes”
Subsequently, the screen may go blank for a few seconds.
When the display comes back, you will notice a disabled icon next to the iGPU you disabled.
Also Read: Can You Replace an Integrated Graphics Card?
How to Tell Which GPU Is Integrated and Which Is Dedicated?
If you have both a dedicated and an integrated GPU installed, it may be hard for newcomers to tell which of the two is integrated.
For instance, I have both an Intel HD Graphics 630 and an NVIDIA GeForce GTX 1050 Ti on my system. But which of the two is the iGPU?
If you have basic knowledge of the graphics card market, you can quickly tell that the Intel HD Graphics 630 is the iGPU, whereas the NVIDIA GeForce GTX 1050 Ti is the dedicated GPU.
Essentially, ALL Intel GPUs, i.e., Intel HD Graphics, Intel UHD Graphics, and Intel Iris graphics, are iGPUs, whereas all NVIDIA GeForce GPUs are dedicated graphics cards.
With AMD, things can get a bit trickier for the uninitiated, as AMD makes both dedicated and integrated GPUs.
If you are new and unsure which GPU is integrated, it is better to just Google the make and model of the GPU as shown in the Device Manager.
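The rules of thumb above can be sketched as a small helper function. Note that this is a hypothetical, name-based heuristic for illustration; the vendor keywords are assumptions based on common consumer naming, not an exhaustive database.

```python
def classify_gpu(name: str) -> str:
    """Return 'integrated', 'dedicated', or 'unknown' for a GPU name.

    Hypothetical heuristic mirroring the rules of thumb above --
    the keyword lists are illustrative assumptions, not a full database.
    """
    n = name.lower()
    # Intel HD/UHD/Iris parts are integrated graphics
    if any(k in n for k in ("intel hd", "intel uhd", "intel iris")):
        return "integrated"
    # Consumer NVIDIA GeForce parts are dedicated cards
    if "nvidia" in n or "geforce" in n:
        return "dedicated"
    # AMD makes both, so the name alone is not enough -- look the model up
    if "amd" in n or "radeon" in n:
        return "unknown"
    return "unknown"

print(classify_gpu("Intel HD Graphics 630"))       # integrated
print(classify_gpu("NVIDIA GeForce GTX 1050 Ti"))  # dedicated
print(classify_gpu("AMD Radeon Vega 8"))           # unknown -- search the model
```

For an AMD name, the function deliberately gives up, since only a model lookup can settle it.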
Also Read: Does Ryzen Have Integrated Graphics?
Would Disabling iGPU in Device Manager Automatically Switch Your System to Dedicated GPU Permanently?
Often people believe that disabling the iGPU through the Device Manager will permanently switch their system to the dedicated GPU.
That is hardly the case.
When you disable the iGPU (on laptops), your system switches to software-based video processing done by the CPU via a driver called the Microsoft Basic Display Driver.
This results in far lower performance!
To permanently switch your PC from the iGPU to the dedicated GPU, you have to do it through the BIOS.
Method 2: Disabling Integrated Graphics Through BIOS
The other way to disable the integrated graphics card is through the BIOS. This method is a bit more difficult for newbies.
For starters, note that the BIOS interface can differ significantly from one PC to another, so the BIOS menus shown on my PC may not look like yours at all.
Additionally, certain BIOS versions, particularly those on laptops, are so heavily stripped of settings that you may not find a menu for disabling the iGPU at all.
Step 1: Access BIOS
On PC startup, press the appropriate key to access the BIOS. This key differs between PCs; it is often the “Delete”, “F10”, or “F12” key.
Step 2: Search for Settings Regarding Display
Once in the BIOS, look for settings regarding Integrated Graphics, Integrated Video, Integrated VGA, or graphics settings in general.
These settings may be located under a label such as Onboard Devices, Built-in Devices, or something along those lines.
You may have to go into the “Advanced” section to find them.
Step 3: Disable the iGPU / Select Discrete Graphics
Once you have located the right settings, disable the iGPU and then save and exit BIOS.
In some cases, the settings may instead let you select between “Auto”, “Discrete” (dedicated), or “Integrated” graphics. Choose “Discrete”.
Note that tampering with BIOS settings can cause unwanted issues, so it is not recommended, particularly if you are unsure whether you have located the right setting for the iGPU.
Also Read: Do I Need Integrated Graphics?
Is It Safe to Disable Integrated Graphics?
It is not recommended to disable the integrated graphics.
On desktops this isn’t much of an issue as the iGPU automatically gets disabled when the dedicated GPU is plugged in.
On laptops, however, disabling the integrated graphics will cause software-based video processing to take over via the Microsoft Basic Display Driver.
Since the Microsoft Basic Display Driver uses the CPU for video processing, you will see far lower performance.
However, if you do have a working dedicated GPU also installed, then it is generally safe to disable the iGPU on a laptop if it is done through BIOS. It is often not necessary though.
Also Read: Is Integrated Graphics Card Good Enough?
Do You Need to Disable the Integrated Graphics Card if You Have a Dedicated Graphics Card?
As mentioned earlier, it isn’t necessary to disable the iGPU on a laptop if you have a dedicated graphics card, because the system handles the switching between graphics cards automatically.
Again, on desktops this isn’t much of a concern, as the BIOS disables the iGPU when a dedicated GPU is plugged in. There is no dynamic switching of GPUs on desktops.
On laptops for instance, in the NVIDIA Control Panel, you can select one of three “Preferred Graphics Processor” settings:
- Auto-Select – the default and recommended option
- High-Performance NVIDIA Processor – uses the dedicated GPU for all applications
- Integrated Graphics – uses the iGPU for all applications
Here you can see that with the “Auto-Select” option, the system automatically decides when to use the integrated GPU and when to use the dedicated GPU depending upon your workload.
So, for instance, when doing less graphics-intensive work like browsing, watching YouTube videos, or writing reports, the system uses the integrated graphics card.
However, when gaming or running heavier editing and design software, it automatically switches to the dedicated graphics card.
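This switching logic can be sketched as a toy policy function. The mode names mirror the NVIDIA Control Panel options listed above, but the application names and the "heavy apps" list are made-up examples, not anything NVIDIA actually uses.

```python
# A toy sketch of Optimus-style GPU selection. The mode names mirror the
# NVIDIA Control Panel options above; the app lists are made-up examples.
HEAVY_APPS = {"game.exe", "blender.exe", "premiere.exe"}  # assumed examples

def pick_gpu(app: str, mode: str = "auto") -> str:
    """Return which GPU would handle the app under a given preference mode."""
    if mode == "high-performance":   # dedicated GPU for everything
        return "dedicated"
    if mode == "integrated":         # iGPU for everything
        return "integrated"
    # "auto": heavy workloads go to the dedicated GPU, light ones to the iGPU
    return "dedicated" if app in HEAVY_APPS else "integrated"

print(pick_gpu("chrome.exe"))                      # integrated
print(pick_gpu("game.exe"))                        # dedicated
print(pick_gpu("chrome.exe", "high-performance"))  # dedicated
```

The real driver decides per-application based on profiles and load, but the shape of the decision is the same: a per-app override on top of a light/heavy default.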
This arrangement saves on your energy bills and, on laptops in particular, preserves precious battery life, because keeping the iGPU disabled and the dedicated GPU running all the time can be taxing on the battery.
The Motherboard’s Video Ports Will Not Work with the iGPU Disabled (Applicable to Desktops)
You may have guessed it already, but the video output ports located on the back of your motherboard are driven by the iGPU.
If you disable the iGPU, the motherboard’s video ports will NOT work.
It is also worth mentioning that on desktops, the motherboard’s video ports get automatically disabled when a dedicated GPU is installed.
There is a good reason for this. A monitor connected to the motherboard’s video-out port receives its graphics processing from the iGPU. Therefore, if you were to game on this monitor, it would lag seriously even with a powerful dedicated GPU installed.
To experience the power of the dedicated graphics card, you need to have the monitor connected to the video output ports of the dedicated graphics card.
Also Read: Can I Upgrade My Laptop Graphics Card?
You Can Also Have Both Integrated and Dedicated GPUs Enabled At The Same Time on Desktops
It is also possible to have both iGPU and the dedicated GPU working at the same time on desktops.
This allows you to use both the motherboard’s video ports and the graphics card’s video ports at the same time.
This setup is excellent for multi-monitor office work. However, for gaming and intensive professional work, you will face the same issue as highlighted above, i.e., monitors connected to the motherboard’s video ports will perform far worse than those connected to the dedicated graphics card’s video ports.
In other words, if you are a gamer, you will get much higher frames per second on a monitor connected to the dedicated GPU than on one connected to the motherboard’s video port.
To enable both dedicated and iGPU at the same time, I have written a comprehensive tutorial here:
Does Disabling Integrated Graphics Card Improve Performance?
On desktops, disabling the iGPU often has no effect, since the monitor is connected to the dedicated GPU’s output port anyway.
On laptops, disabling the iGPU may offer some improvements, such as more vibrant colors and sharper visuals. Sometimes the improvements are minor and go unnoticed. The downside is that it will cost you battery life: dedicated GPUs consume more power and can thus drain your battery quicker.
Hence, it is not recommended to disable the integrated graphics card on laptops.
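To see why the battery hit matters, here is a back-of-the-envelope estimate. All wattage and capacity figures are assumed round numbers for illustration, not measurements from any specific laptop.

```python
# Rough battery-life estimate -- all figures below are assumed round
# numbers for illustration, not measured values for any specific laptop.
BATTERY_WH = 50.0   # assumed laptop battery capacity
BASE_W = 10.0       # assumed draw of screen, CPU, etc. under a light load
IGPU_W = 5.0        # assumed extra draw with the iGPU active
DGPU_W = 25.0       # assumed extra draw with the dedicated GPU always on

hours_igpu = BATTERY_WH / (BASE_W + IGPU_W)
hours_dgpu = BATTERY_WH / (BASE_W + DGPU_W)

print(f"iGPU:  ~{hours_igpu:.1f} h")  # ~3.3 h
print(f"dGPU:  ~{hours_dgpu:.1f} h")  # ~1.4 h
```

Even with generous assumptions, forcing the dedicated GPU to run for light work can cut the runtime of a charge by more than half.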
What Happens if You Disable All Of Your Graphics Cards?
On laptops, if you disable both the iGPU and the dedicated GPU (if you have one), your screen will NOT go blank. Instead, software-based rendering done by the CPU will take over.
Software-based video rendering is far weaker and may result in unbearable performance lag.
On desktops, software-based rendering will also take over, but your monitor will need to be connected to the motherboard’s video output port for it to work.