Nvidia resolution set lower by default than what's supported by my TV?

Looking around in my display adapter settings, I came upon the menu for Adapter > List All Modes. The modes listed were all the resolutions and refresh rates supported by my TV. It was set to 3840x2160 @ 60Hz, but then I saw that 4096x2160 @ 60Hz was also supported. I switched to it and it looks good; in fact, the picture looks crisper than it did before. Why would Nvidia not select the best available resolution supported by my TV? To be honest, I didn't even know my TV supported 4096x2160. Is that a weird resolution or something?
Okay, after using the 4096x2160 resolution for a few hours I decided to switch back to the native 4K resolution. While 4096x2160 made things like desktop icons and graphics sharper, it actually made videos look like crap: watching a 1080p YouTube video looked like watching a 480p video. When I switched back to the native resolution, videos went back to looking better. I also did a little research and found out the major reason it uses 3840x2160 is that this is the resolution used by most 4K video content; I was getting black bars on the sides of videos when in 4096x2160. So long story short, I now get why 4096x2160 was not chosen as the native resolution even though it's supported.
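For anyone curious about why the black bars show up, here's a rough sketch of the arithmetic (nothing Nvidia-specific; the numbers are just the standard UHD and DCI 4K resolutions):

```python
# Quick aspect-ratio / scaling check for the two modes.
modes = {
    "UHD (3840x2160)": (3840, 2160),    # what most 4K TVs and 4K video use, 16:9
    "DCI 4K (4096x2160)": (4096, 2160), # cinema standard, roughly 17:9
}
source = (1920, 1080)  # a 1080p YouTube video, 16:9

for name, (w, h) in modes.items():
    scale_w = w / source[0]
    scale_h = h / source[1]
    print(f"{name}: aspect {w / h:.3f}, "
          f"scale from 1080p = {scale_w:.3f}x wide, {scale_h:.3f}x tall")

# UHD is an exact 2x scale of 1080p in both directions (clean integer scaling),
# while DCI 4K needs ~2.133x horizontally vs 2x vertically, so a 16:9 video
# ends up with black bars on the sides (pillarboxing) or gets blurrier scaling.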