So, looking around in my display adapter settings, I came across the menu at Adapter > List All Modes. The modes listed were all the resolutions and refresh rates supported by my TV. It was set to 3840x2160 60Hz, but then I saw that 4096x2160 60Hz was also supported. I switched to that and it looks good. In fact, the picture looks crisper than it did before. Why wouldn't Nvidia select the best available resolution supported by my TV? To be honest, I didn't even know my TV supported 4096x2160. Is that a weird resolution or something?
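
To sanity-check how different the two modes actually are, here's a quick Python sketch that reduces each one to its simplest aspect ratio (the helper function name is just mine, not anything from Nvidia's tooling):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    g = gcd(width, height)
    return width // g, height // g, width / height

for w, h in [(3840, 2160), (4096, 2160)]:
    rw, rh, ratio = aspect_ratio(w, h)
    print(f"{w}x{h} -> {rw}:{rh} (~{ratio:.2f}:1)")
```

So 3840x2160 comes out to the usual 16:9 (~1.78:1), while 4096x2160 reduces to 256:135 (~1.90:1), i.e. slightly wider than 16:9.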