Setting proper resolution on widescreen LCD

quinterro

I now have an Acer/eMachines E181H widescreen LCD monitor. The resolution is 1366 x 768. On my PowerMac G4, it shows a max resolution of 1280x720. Of course the picture isn't clear. Is there a way to set it to the actual resolution even if it does not appear in the Monitors preference panel?

NOTE: My Windows/Linux box gets closer, but it's still off. In Windows the max resolution shown is 1360x768 - just enough off to make parts of the screen blurry. :(

 
I hope your luck is better, but I have never gotten 1366x768 to display correctly on any machine.

Last time I said that I got some hooey, but someone had mentioned getting it to display correctly on a Mac.

 
There's a utility called SwitchRes that might give you the extra screen-resolution option.

But it also depends on the video card in your G4 - does it have enough VRAM to display at your desired resolution? It should, but if it's an older card there's a possibility it might not.
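
For what it's worth, the framebuffer math suggests 16 MB should be plenty either way. A rough back-of-the-envelope check in Python (assuming the usual 4 bytes per pixel for millions of colors - the real figure depends on the card):

    # Rough framebuffer size: width x height x bytes per pixel.
    # 4 bytes/pixel is an assumption for 32-bit "millions of colors".
    def framebuffer_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / (1024 * 1024)

    for width, height in ((1280, 1024), (1366, 768)):
        print(f"{width}x{height}: {framebuffer_mb(width, height):.1f} MB")

That's only about 4-5 MB per frame, so VRAM shouldn't be the bottleneck here.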

 
Although this may not help with any Macs, I had some issues getting my roommate's television, also 1366x768, to work properly with my Dell OptiPlex GX270. It would display either 1024x768 centered or, for whatever reason, 800x600 stretched.

The odd-but-it-worked solution, at least as I thought of it at the time, was to update the graphics drivers.

All of that having been said, I'm going to put in another good word for SwitchRes. You can manually set a lot of things: you can use it to overdrive single-scan displays (although that's not really recommended), or hook up an LCD and run it at native resolution through an otherwise disagreeable VGA adapter or Mac-to-VGA cable that was designed (or pinned, or whatever) for 640x480@60Hz and nothing else.

 
I'll give SwitchRes a try. I did find out this morning that Ubuntu does 1366x768 just fine. On the Windows side the machine is using the latest drivers from ATI, but it's a legacy video card (Radeon 9600PRO).

 
Forgot to mention - the G4 has the original Rage128 AGP card with 16 MB RAM. It handled 1280x1024 at full color depth just fine.

 
Some (most?) LCD panels have an option to display the resolution that your computer supplies in their native pitch and just be blank in the parts not covered. While the image won't cover the whole screen under these circumstances, this option usually yields better image quality than whatever unnatural stretching and interpolating the monitor does to put the non-native resolution on the whole screen.

This can also come in useful on an older failing screen. I have a beautiful IBM 18.1" LCD panel from about 2001 which has some kind of clouding/dimming/cruft around the edges. But if I run it at 1152 x (mumble) and tell it to just use its native pitch, then I get a fine image with a small black border, and I'm not sacrificing much by not having 1280x1024.

 
Weirdness - on the PC, using the built-in drivers for the 9600, it showed the resolution correctly. After installing the latest ATI drivers it now shows 1360x768. So far I haven't found a setting to show the current resolution unscaled.

Had to reload the PC - apparently Ubuntu does not like being cloned to a different type of drive (80GB ATA to 250GB SATA). GRUB wouldn't show the boot menu....

 
Some (most?) LCD panels have an option to display the resolution that your computer supplies in their native pitch
Just as an aside, I haven't seen that option on very many LCD monitors. Perhaps it's more common on high-end ones but on cheap ones... nope. Which is annoying.

(The "no scale" option seems to likewise be disappearing from laptop computer BIOSes... which is doubly annoying now that widescreens are standard but most lesser resolutions you might want to run will be 4x3, leading to distorted screen output. At the very least a "preserve aspect ratio" bit would be nice.)

 
The G4 has some issues with starting properly. Sometimes it will go to the desktop, other times it will stay on the gray screen.

I'll try it again tonight.

 
G4 has the original Rage128 AGP card with 16 MB RAM. It handled 1280x1024 at full color depth just fine.
That card has been much maligned by many, but in my 466DA it does 1080p at 24-bit without breaking a sweat on my HP 2X.X 1080p LCD monitor (with HDMI input!), and the monitor displays all my old Mac's screens, letterboxed on the sides, very clearly - scaled to 1080 x whatever at the proper aspect ratio.

The only irritating thing I've found about last summer's Ubuntu Netbook Remix is that the Mini did 720p just fine on the 32" LCD TV upstairs running under WinXP Home, but Ubuntu won't do 720p or 480p (which the TV autoscales to 720p), so it's less clear at the resolution Ubuntu offers than it was runnin' under the Crud Creeping outta Redmond. =8-P

 
I finally was able to try SwitchResX on my G4 after replacing the hard disk and installing Tiger. It came close, but no cigar - it too will only do 1360x768. When adding a custom resolution, entering 1366 changed to 1360 after moving to another field.

I also tried DisplayResX. It attempts to determine the resolutions by itself, and won't let you add any extras without paying for the program.

The only possible cause I can see at the moment is that both the G4 and the PC have ATI video cards. I do have another PC with an old Vanta card in it. I'll try that out later today.

 
entering 1366 changed to 1360 after moving to another field.
1366/8 = 170.75

1360/8 = 170

That's your problem, and whoever insisted on marketing these non-divisible-by-8 resolutions for computers needs to be beaten with a rubber hose.

So you either kludge it, wasting quite a bit of video memory across the entire display for nothing, or the image gets stretched.
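
You can check any width against the divide-by-8 rule in a couple of lines of Python (the width list is just examples):

    # Which common panel widths divide evenly by 8?
    for width in (1024, 1280, 1360, 1366, 1368, 1440, 1680, 1920):
        print(width, width / 8, "OK" if width % 8 == 0 else "<- problem")

1366 is the only one in that list that doesn't come out even.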

I dunno - if you spent money on this thing, I would send it back. You may figure it out now, but what about six months from now on totally different software?

If you got it for nothing or next to nothing, do the best you can, and beware of any screen resolution you can't divide by 8.

 
I had a problem with my monitor not being able to display its full rez on any machine I tried; then I switched out the VGA cable for a better one and it worked fine after that.

 
Interesting...

I bought it off the clearance table, so it had no cables. Looking through my stash of parts at home I found a Male-Male VGA cable.

I'll try another cable later on and see if it helps.

 
Wikipedia's article on EDID talks about the "divide by 8" issue, specifically in reference to this resolution:

http://en.wikipedia.org/wiki/Extended_display_identification_data#Limitations

A replacement VGA cable isn't going to do anything. (A bad VGA cable is the suspect if all a monitor will do is something *really* lowest-common-denominator like 640x480 or 800x600. You're getting close to the right resolution, which indicates the card is reading the data channel and interpreting it as best it can.) More Googling suggests that older ATI cards in particular have problems with non-divisible-by-8 resolutions, up to not being able to do them *at all*, at least out of their VGA ports.
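
If I'm reading that Wikipedia section right, the problem is that an EDID "standard timing" entry stores the horizontal resolution in a single byte as width/8 minus 31, so only multiples of 8 are even representable. A minimal Python sketch of that encoding:

    # EDID standard timings store horizontal resolution as one byte:
    # stored = width / 8 - 31, so the driver can only ever read back
    # a multiple of 8. Integer division shows what happens to 1366.
    def edid_round_trip(width):
        stored = width // 8 - 31       # byte written into the EDID
        return (stored + 31) * 8       # width the driver decodes

    for width in (1360, 1366, 1368):
        print(f"panel says {width}, driver sees {edid_round_trip(width)}")

That matches what SwitchResX and the ATI drivers are doing: ask for 1366, get 1360 back.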

Unfortunately I suspect Osgeld is probably right and you may really be stuffed. Does the monitor have a DVI port, and do you have a DVI-equipped machine you could try it on? That might be the only chance you have.

 
I'm afraid not - there is only a VGA connector on the bugger. I did find out that if you set the resolution lower than the native resolution, it will center the picture. Also, Nvidia graphics cards don't seem to have the issue.

 