So I just bought this new sexy 28-inch 4K monitor. I hooked it up to my laptop (an MSI GE70 running Windows 7) using an HDMI cable, and the maximum resolution available was 1920x1080. The HDMI cable DOES support 4K, as does the HDMI port on my computer. When I tried using DisplayPort it went up to some resolution in the 2K range but still could not reach 3840x2160. I talked to support for both the monitor (Samsung) and the laptop (MSI) for a while and still haven't gotten the problem solved.

I believe the issue might be that the computer is using the integrated Intel HD Graphics 4000 instead of the dedicated NVIDIA GeForce GTX 660M. It uses the dedicated card for graphics-intensive applications such as games, but for everything else it seems to use the integrated card. I tried to solve this by forcing the computer to use the NVIDIA card: I changed settings in the NVIDIA Control Panel to prefer the NVIDIA processor. This didn't work, so I tried to go into the BIOS to force it to use the NVIDIA card, but there was no option for choosing the primary GPU. Does anyone know if I am on the right track here, or if I am missing something? Has anyone encountered this problem before? It's getting pretty annoying.

(To check the resolution you are currently getting: right-click the desktop and select Screen Resolution, Personalize, or Properties, or open the Settings option, which leads to the display settings. Your current resolution is shown in the 'Resolution' drop-down menu.)

I suspect, from your description, that you're using NVIDIA Optimus. It uses the low-power onboard graphics for basic tasks and switches to the discrete GPU when you fire up a game. If I understand Optimus correctly, it actually routes the output from the discrete GPU through the integrated graphics; in other words, it's actually the Intel HD Graphics that is sending the video to the display. You should be able to disable Optimus and (hopefully) have the GTX 660M itself send the signal via DisplayPort, which should expose the full 4K options.

So you won't get what you want over HDMI; you'll need DisplayPort. A dual-link DVI connection (as opposed to single-link) can handle 2560x1440 at 60Hz, but not 3840x2160, which is the typical 4K resolution. This means that unless you have two dual-link DVI outputs and your monitor can accept two inputs (few can), you cannot drive a 4K screen at 60Hz using DVI. If you're curious, you can run 3840x2400 at… 1440p G-Sync could blow your mind though and is well worth a think.

Yeah, I know that HDMI only goes up to 30Hz at 4K resolution. Unfortunately this laptop doesn't have DisplayPort, but 30Hz should be fine. I'll do some research into disabling Optimus though. That is why I was trying to figure out how to force the computer to use the NVIDIA card: I believe the problem is that it is using the integrated card instead of the NVIDIA card.
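The cable limits mentioned in the replies come down to pixel-clock arithmetic. The sketch below is mine, not from the thread, and uses rough assumed numbers: roughly 20% blanking overhead on top of the visible pixels, a 340 MHz maximum TMDS clock for HDMI 1.4, and 2 x 165 MHz = 330 MHz for dual-link DVI.

```python
# Rough sketch of the link-bandwidth arithmetic behind the 4K limits
# discussed above. Assumptions (not from the thread): ~20% blanking
# overhead, HDMI 1.4 capped at a 340 MHz TMDS clock, dual-link DVI
# capped at 2 x 165 MHz = 330 MHz.

BLANKING_OVERHEAD = 1.2        # visible pixels -> total pixels, approximate
HDMI_1_4_MAX_HZ = 340e6        # assumed max TMDS pixel clock for HDMI 1.4
DUAL_LINK_DVI_MAX_HZ = 330e6   # assumed: 2 links x 165 MHz each

def pixel_clock(width, height, refresh_hz):
    """Approximate pixel clock a video mode needs, in Hz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD

def fits(width, height, refresh_hz, link_max_hz):
    """Does the mode fit within the link's pixel-clock budget?"""
    return pixel_clock(width, height, refresh_hz) <= link_max_hz

if __name__ == "__main__":
    for w, h, r in [(3840, 2160, 60), (3840, 2160, 30), (2560, 1440, 60)]:
        clk_mhz = pixel_clock(w, h, r) / 1e6
        print(f"{w}x{h}@{r}Hz needs ~{clk_mhz:.0f} MHz: "
              f"HDMI 1.4 {'OK' if fits(w, h, r, HDMI_1_4_MAX_HZ) else 'no'}, "
              f"dual-link DVI {'OK' if fits(w, h, r, DUAL_LINK_DVI_MAX_HZ) else 'no'}")
```

Even with the crude 20% estimate, this reproduces the thread's conclusions: 4K at 60Hz lands near 600 MHz and exceeds both links, 4K at 30Hz fits under HDMI 1.4's budget, and 2560x1440 at 60Hz fits under dual-link DVI's.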