Diffstat (limited to 'man/intel.man')
-rw-r--r--   man/intel.man   87
1 files changed, 87 insertions, 0 deletions
diff --git a/man/intel.man b/man/intel.man
index 8a8b7a09..aac0efa6 100644
--- a/man/intel.man
+++ b/man/intel.man
@@ -200,6 +200,15 @@
 LVDS-connected display on the other hand is extremely washed out (e.g. white
 on a lighter white), trying this option might clear the problem.
 .TP
+.BI "Option \*qLVDSFixedMode\*q \*q" boolean \*q
+Use a fixed set of timings for the LVDS output, independent of the
+normal xorg-specified timings.  The default value if left unspecified
+is true, which is what you want for a normal LVDS-connected LCD panel.
+If you are not sure about this, leave it at its default, which allows
+the driver to figure out the correct fixed panel timings automatically.
+See the HARDWARE LVDS FIXED TIMINGS AND SCALING section below for more
+information.
+.TP
 .BI "Option \*qXvMC\*q \*q" boolean \*q
 Enable XvMC driver. Current support MPEG2 MC on 915/945 and G33 series.
 User should provide absolute path to libIntelXvMC.so in XvMCConfig file.
@@ -295,6 +304,84 @@
 sections with these outputs for configuration.  Associating Monitor
 sections with each output can be helpful if you need to ignore a
 specific output, for example, or statically configure an extended
 desktop monitor layout.
+.SH HARDWARE LVDS FIXED TIMINGS AND SCALING
+
+Following here is a discussion that should shed some light on the
+nature and reasoning behind the LVDSFixedMode option.
+
+Unlike a CRT display, an LCD has a "native" resolution corresponding
+to its actual pixel geometry.  Under all normal circumstances a
+graphics controller should output that resolution (and timings) to
+the display.  Anything else and the image might not fill the display,
+might not be centered, or might have information missing; all manner
+of strange effects can happen if an LCD panel is not fed the
+resolution and timings it expects.
+
+However, there are cases where one might want to run an LCD panel at
+an effective resolution other than the native one.  For this reason,
+GPUs which drive LCD panels typically include a hardware scaler to
+match the user-configured frame buffer size to the actual size of the
+panel.  Thus when a 1280x1024 panel is "set" to only 1024x768, the
+GPU happily configures a 1024x768 frame buffer, but it scans the
+buffer out in such a way that the image is scaled to 1280x1024 and in
+fact sends 1280x1024 to the panel.  This is normally invisible to the
+user; when a "fuzzy" LCD image is seen, this kind of scaling is
+usually the reason.
+
+In order to make this magic work, the driver logically has to be
+configured with two sets of monitor timings: the set specified (or
+otherwise determined) as the normal xorg "mode", and the "fixed"
+timings that are actually sent to the monitor.  But with xorg it is
+only possible to specify the first, user-driven set, not the second,
+fixed set.  So how does the driver figure out the correct fixed panel
+timings?  Normally it attempts to detect them, using a number of
+strategies.  First it attempts to read EDID data from whatever is
+connected to the LVDS port.  Failing that, it checks whether the LVDS
+output is already configured (perhaps previously by the video BIOS)
+and adopts those settings if found.  Failing that, it scans the video
+BIOS ROM, looking for an embedded mode table from which it can infer
+the proper timings.  If even that fails, the driver gives up, prints
+the message "Couldn't detect panel mode. Disabling panel" to the X
+server log, and shuts down the LVDS output.
+
+Under most circumstances the detection scheme works.  However, there
+are cases when it can go awry.  For example, if you have a panel
+without EDID support and it isn't integral to the motherboard
+(i.e. not a laptop), then odds are the driver is either not going to
+find anything suitable to use or it is going to find something
+flat-out wrong, leaving a messed-up display.  Remember that it is the
+fixed timings being discussed here, not the user-specified timings,
+which can always be set in xorg.conf in the worst case.  So when this
+process goes awry there seems to be little recourse.  This sort of
+scenario can happen in some embedded applications.
+
+The LVDSFixedMode option is present to deal with this.  The option
+normally enables the detection strategy described above, and since it
+defaults to true, this is in fact what normally happens.  However, if
+the detection fails to do the right thing, the LVDSFixedMode option
+can instead be set to false, which disables all the magic.  With
+LVDSFixedMode set to false, the detection steps are skipped and the
+driver proceeds without a fixed mode timing.  This causes the
+hardware scaler to be disabled, and the timings actually used fall
+back to those configured via the usual xorg mechanisms.
+
+Having LVDSFixedMode set to false means that whatever is used for the
+monitor's mode (e.g. a modeline setting) is precisely what is sent to
+the device connected to the LVDS port.  It also means that the user
+now has to determine the correct mode to use, but that is really no
+different from the work of correctly configuring an old-school CRT,
+and the alternative if detection fails is a useless display.
+
+In short, leave LVDSFixedMode alone (thus set to true) and normal
+fixed mode detection will take place, which in most cases is exactly
+what is needed.  Set LVDSFixedMode to false and the user has full
+control over the resolution and timings sent to the LVDS-connected
+device, through the usual means in xorg.
+
 .SH "SEE ALSO"
 __xservername__(__appmansuffix__), __xconfigfile__(__filemansuffix__), xorgconfig(__appmansuffix__), Xserver(__appmansuffix__), X(__miscmansuffix__)
 .SH AUTHORS
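
To illustrate the scaling case described in the new section: selecting a
non-native mode on the LVDS output at run time, for example with xrandr
(the output name LVDS and the 1024x768 mode are only examples; use
whatever xrandr reports for your system), leaves the panel driven at its
fixed native timings while the hardware scaler resizes the smaller frame
buffer to fill the screen:

    xrandr --output LVDS --mode 1024x768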
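
A minimal xorg.conf sketch of the manual-timings path described in the new
section, with LVDSFixedMode set to false.  The section identifiers, the
Monitor-LVDS output name and the 800x600 Modeline are illustrative
placeholders; the real timing values should come from the panel's
datasheet:

    Section "Monitor"
        Identifier "Panel"
        # Standard VESA 800x600 @ 60 Hz timings; replace with the panel's
        # actual values, since they are sent to the panel unmodified when
        # fixed-mode detection is disabled.
        Modeline "800x600"  40.00  800 840 968 1056  600 601 605 628  +hsync +vsync
    EndSection

    Section "Device"
        Identifier "Card0"
        Driver     "intel"
        # Associate the Monitor section above with the LVDS output and skip
        # fixed-mode detection, so the Modeline goes to the panel as-is.
        Option     "Monitor-LVDS"  "Panel"
        Option     "LVDSFixedMode" "false"
    EndSection

With a configuration along these lines, the mode requested through the
usual xorg mechanisms is exactly what reaches the LVDS-connected device.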