LG Monitor Resolution Not Correcting
Windows suddenly does not recognize the monitor and is stuck on the Generic Non-PnP driver at a low resolution. I came across this problem from a customer who brought his computer to my attention. He has a dual monitor setup, one HDMI output to the TV and the other to the LG 23″ monitor.

I am using a KVM switch to work between two computers. One is running Vista and the other runs Windows 7. Both use the same monitor. The correct resolution for the monitor is 1280x960.

Apr 02, 2014: Monitor not showing correct resolution. My monitor (Acer S240ML) has stopped showing its correct resolution of 1920x1080 and is now natively showing 1024x768. The day before it was working perfectly; that night I plugged my PlayStation 4 into the HDMI port, and the morning after it was showing the resolution it is at now.
I am unable to get the 1680 x 1050 resolution recommended for my external monitor. My computer identifies the external monitor as a Generic Non-PnP device and will not allow me to update to another driver. I have tried disabling the built-in laptop monitor, but I am still not given the resolution option I need.

Overscan compensation is not a good option. On Samsung TVs, if you go to the Source option, then Tools, and rename the HDMI source to PC, it gets rid of the overscan and the quality becomes a lot better at the native screen resolution.

After upgrading my PC to Windows 10, I found that I could no longer set my Vizio VMM26 monitor to display the correct resolution. Instead of my monitor's native resolution of 1920x1200 (16:10), ...
I got a new monitor, an HP LA2205, with a recommended resolution of 1680 x 1050, but when I try to set the resolution via Display Properties, the 1680 x 1050 option does not appear in the list of modes.
Graphics card: ATI Radeon HD 3450
OS: Windows XP SP3
What could be the issue? Thanks.
Connor W

5 Answers
What graphics card is your computer using? My first thought is that your graphics card can't deliver that resolution, or that the correct drivers aren't installed.
So you should first check whether your computer is even capable of delivering that resolution. (You can also update your question with information about your computer/graphics card - then fellow SuperUsers can help you determine if the hardware is good enough.)
If the hardware is good, then check if the correct software is in place and properly configured. Tell us if you're using Windows, and which Windows version that is.
Update: Well, your card certainly can provide this resolution. Next, download the latest ATI display driver (here is a direct link) and see if that helps!
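If you'd rather check this from a script than click through Display Properties, here's a minimal Python sketch (my addition, assuming the third-party pywin32 package is installed) that lists every mode the currently installed driver exposes. If 1680x1050 isn't in that list, the driver, not the monitor, is the first thing to fix.

```python
# List the display modes the installed video driver reports to Windows.
# Assumes: pip install pywin32
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except win32api.error:
        break  # no more modes
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz} Hz")
```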

Also check whether your monitor has been correctly detected: look in Device Manager > Monitors. If it has been detected correctly, it will show up under the name and model number of your monitor; if not, it will show up as Generic PnP Monitor. In that case you will also need to install the drivers that came with your monitor so that it can be detected correctly along with its resolution.
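If you want the same check from a script instead of Device Manager, a short sketch (my addition, assuming the third-party wmi and pywin32 packages are installed) prints how Windows has identified each attached monitor; "Generic PnP Monitor" or "Generic Non-PnP Monitor" here means no model-specific monitor driver has been matched.

```python
# Print the name Windows has assigned to each detected monitor.
# Assumes: pip install wmi pywin32
import wmi

for monitor in wmi.WMI().Win32_DesktopMonitor():
    print(monitor.Name, "-", monitor.PNPDeviceID)
```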
rzlines

Had a similar problem and wasted a day fighting it. Hopefully this will help someone else in the same predicament. My configuration was:

- Windows 7 Ultimate, 32-bit
- HP dc7700 with built-in Intel graphics adapter
- Brand new Dell U2410 LCD monitor connected to the computer via a VGA cable
The problem was that I could select certain resolutions on the monitor (e.g. 1792x1344), but when I tried to set the monitor to its native resolution (1920x1200), I would get a black screen with a less than helpful message telling me to set the resolution to one of the supported resolutions such as '1920x1200 @ 60Hz', which is exactly what I was doing in the first place.
I believe one of the following two things fixed the problem:
Turning the power off on both the computer and the monitor, then powering on the monitor first and the computer second. (Simple, so try that first.)
Booting the system in Safe Mode, uninstalling (but NOT deleting!) the driver for the graphics adapter (in my case it was labeled as an Intel chipset), shutting the computer down and starting Windows in normal mode. When the computer booted, it recognized the graphics adapter as 'new' hardware and reinstalled the driver. After that I was able to set the monitor resolution to 1920x1200. Note that the monitor was connected to the computer during all of this.
Earlier in the process, I did confirm the 1920x1200 resolution worked on the monitor by connecting it to another computer (Windows XP) using the same VGA cable.
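If you want to confirm which graphics driver Windows actually ended up with after the reinstall, a quick sketch along these lines (my addition, again assuming the third-party wmi and pywin32 packages) saves a trip into Device Manager:

```python
# Show the active graphics adapter, its driver version and current mode.
# Assumes: pip install wmi pywin32
import wmi

for gpu in wmi.WMI().Win32_VideoController():
    print(gpu.Name)
    print("  driver version:", gpu.DriverVersion)
    print("  current mode:  ", gpu.VideoModeDescription)
```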
Are you using DVI or VGA cables to connect the screen? I will assume VGA.
If so, are you using a display switch box? If you are, remove it. Otherwise, are you using the VGA cable that came with the screen, or possibly a very old one? My thought here is that the DDC pins (effectively the plug-and-play channel for resolution detection) aren't connected.
If the monitor is connected via VGA, you should try connecting it via DVI instead.
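One way to confirm that the monitor's EDID (the identification data carried over those DDC pins) is actually reaching Windows is to read the EDID blocks Windows caches in the registry. Here is a standard-library Python sketch of that idea (my addition, not from the original answer); note that stale entries from previously connected monitors can linger under this key, so treat the output as a hint rather than proof.

```python
# Read cached EDID blocks from the registry and print each monitor's
# reported native resolution (first Detailed Timing Descriptor).
import winreg

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def native_mode(edid):
    """Decode the first Detailed Timing Descriptor (EDID bytes 54..71)."""
    dtd = edid[54:72]
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_khz

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
    for i in range(winreg.QueryInfoKey(display)[0]):
        model = winreg.EnumKey(display, i)
        with winreg.OpenKey(display, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                try:
                    params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                if len(edid) < 128:
                    continue  # not a full EDID block
                w, h, clk = native_mode(bytes(edid))
                print(f"{model}: native {w}x{h}, pixel clock {clk / 1000:.1f} MHz")
```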
Otherwise, you can try using PowerStrip.
See this tutorial on how to customize monitor resolution settings with PowerStrip, which says:
Two primary software components, the video driver and the monitor driver, affect the quality, resolution, and color depth of the image on your screen. As I mentioned, PowerStrip doesn’t replace your video driver; it supplements it. So Windows still uses the original video driver. One of the ways that you can use PowerStrip to supplement the video driver is to create custom resolutions.
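A custom resolution ultimately comes down to a timing calculation: the pixel clock has to cover the whole frame, active pixels plus blanking, at the desired refresh rate. Here is a rough sketch of that arithmetic (my addition; the blanking totals below are assumed, reduced-blanking-style values used purely for illustration, and PowerStrip or the driver's own custom-resolution dialog will compute the real timings for you):

```python
# Back-of-the-envelope pixel clock needed for a 1920x1200 @ 60 Hz mode.
h_active, v_active = 1920, 1200
h_total, v_total = 2080, 1235   # active pixels + assumed horizontal/vertical blanking
refresh_hz = 60

pixel_clock_hz = h_total * v_total * refresh_hz
print(f"Required pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")
# Roughly 154 MHz, within what a single-link DVI output (165 MHz) can drive.
```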
See also this PowerStrip guide, from which I quote:
PowerStrip is very powerful, but also quite capable of causing your computer and display to stop 'communicating' properly; this could cause the display to start rolling, doubling or generally freaking out. It is important that you don't freak out as well. It is recommended that you know how to start up your computer in 'Safe' mode and 'VGA' mode, uninstall and reinstall drivers, and get around in 'Device Manager' if you wish to use this software.
I would also recommend backing up your data and creating a system restore point before installing and using it.

I just got a new LG 29UM67 monitor to get into 21:9, but I can't seem to get it to switch to the 2560x1080 resolution. I have a GTX 760 and am connecting to it via HDMI. When I check the resolution settings in Windows and in the Nvidia options, 2560x1080 is not listed. If I try to set a custom resolution of 2560x1080, the picture does not fill the monitor.
What other things can I try to get it to display the correct resolution? Do I have to use DisplayPort for it to work?
Thanks for any help you guys can provide.