Age | Commit message | Author |
|
|
|
|
|
Add example dual head config, add info on bug reporting.
|
|
The Intel xorg driver tries mightily to determine the native fixed
panel mode settings for the LVDS output. It does this through various
means, including scanning video BIOS tables and noticing whether the
pipe in question has already been set up by somebody else (and
adopting those timings). This strategy works well for, say, a laptop
where the LCD panel is an integral part of the machine. But in other
applications, where the display is unrelated to the system's BIOS or
other software, the BIOS will likely have no clue how to configure
the LVDS output. Worse still, the BIOS can simply "get it wrong",
leaving the pipe misconfigured; the Intel driver can then notice
this, adopt the same settings, and leave a messed-up display.
All of this complexity normally happens behind the scenes,
independently of the mode timings the user might otherwise specify.
The driver has a concept of a fixed, i.e. "native", mode, and a
separate user-specified mode. If the resolutions of the two don't
match, the driver will in theory arrange for scaling to take place
while adhering to the actual native mode of the panel. Said another
way: if the user asks for 800x600 but the driver mistakenly (see
above) thinks the native mode is 640x480, then 640x480 is the mode
set, with scaling to an 800x600 frame buffer. If the driver gets the
wrong native mode, the result is a miserable mess with no way for the
user to override what the driver thinks is right.
This patch provides a means to override the driver: a new option,
"LVDSFixedMode", which defaults to true (the normal,
probe-what-I-need behavior). When set to false, all the guessing is
skipped and the driver assumes no fixed, i.e. "native", mode for the
display device; instead it directly sets the timings specified by the
user. This provides an escape hatch for situations where the driver
can't correctly figure out the right mode.
Under most scenarios, of course, this option should not be needed.
But when the Intel video BIOS is hopelessly fouled up with respect to
the LVDS output, it lets the user get a working display in spite of
the BIOS situation.
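A minimal xorg.conf sketch of the override (the identifiers are
arbitrary, and the Modeline shows standard VESA 800x600@60 timings
only as a stand-in for the panel's real values):

    Section "Monitor"
        Identifier "Panel"
        # Replace with the panel's actual timings
        Modeline "800x600" 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync
    EndSection

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        # Skip native-mode guessing; use the user-specified timings directly
        Option     "LVDSFixedMode" "false"
    EndSection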
Signed-off-by: Mike Isely <isely@pobox.com>
|
|
The Intel driver appears to be coded to work only with displays
expecting 18-bit pixels. However, I have an application using an LCD
display that expects pixel data in 24-bit format. The difference is
only 2 bits in a single GPU register. This patch implements that
change, controlled by a new driver option, "LVDS24Bit". The default
value is false, which preserves the previous behavior. When set to
true, 24-bit panels should work (at least the one I'm testing here
does).
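Enabling it would look like this in xorg.conf (the Device identifier
is arbitrary):

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        # Panel expects 24-bit pixel data instead of the default 18-bit
        Option     "LVDS24Bit" "true"
    EndSection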
Fd.o bug #15201
Signed-off-by: Mike Isely <isely@pobox.com>
|
|
|
|
They should be listed as lower case, since that's what you'd pass to xrandr.
|
|
Default XvMC to disabled.
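Re-enabling it would presumably be a one-line Device-section entry
(the boolean option name "XvMC" is assumed from the driver's man
page):

    Option "XvMC" "true"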
|
|
Basic support for panel fitting.
|
|
Conflicts:
man/intel.man
src/i830_driver.c
|
|
|
|
On some platforms, the firmware may read and write GPU registers on
lid close, at suspend/resume time, or during various SMM events. If
one of the graphics pipes is disabled at that time, the GPU may hang
due to the programming dependencies of the various registers.
This patch adds a quirk to force the driver to keep pipe A enabled
when necessary, either through user configuration in xorg.conf or via
a platform-specific quirk. Leaving the pipe enabled comes at a power
cost, however, so the quirk should only be enabled when strictly
necessary.
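The xorg.conf form of the override would look something like this
(assuming the option is spelled "ForceEnablePipeA", as documented in
the driver man page):

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        # Keep pipe A running even when otherwise unused, at some power cost
        Option     "ForceEnablePipeA" "true"
    EndSection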
Fixes https://bugs.freedesktop.org/show_bug.cgi?id=11432.
|
|
|
|
Add descriptions for the LVDS and TV output properties, and mention
the EDID property in a new output configuration section.
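The properties in question can be inspected on a running server with
xrandr:

    $ xrandr --prop

which lists each output's properties, including the EDID blob,
alongside its mode list.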
|
|
This commit fixes backlight support for several platforms.
Except on recent machines supporting the IGD OpRegion specification,
backlight control is rather platform specific. In some cases, we can
program the native backlight control registers directly without any
trouble. On others, we need to use the legacy backlight control
register. On still others, we need a combination of the two. And on
some platforms, none of the above will work, so we go through the
kernel backlight interface, which provides a platform specific driver
for backlight control.
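A rough C sketch of the method selection this implies (the enum
values and probe functions are illustrative placeholders, not the
driver's actual identifiers):

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical platform probes; the real driver keys these off
     * chipset generation and firmware capabilities. */
    static bool has_opregion(void)     { return false; }
    static bool native_regs_work(void) { return false; }
    static bool needs_legacy_reg(void) { return false; }
    static bool needs_combo(void)      { return false; }

    enum backlight_method {
        BL_OPREGION, /* IGD OpRegion-specified control */
        BL_NATIVE,   /* native backlight control registers */
        BL_LEGACY,   /* legacy backlight control register */
        BL_COMBO,    /* combination of native and legacy */
        BL_KERNEL    /* kernel backlight class driver */
    };

    static enum backlight_method pick_backlight_method(void)
    {
        if (has_opregion())     return BL_OPREGION;
        if (native_regs_work()) return BL_NATIVE;
        if (needs_legacy_reg()) return BL_LEGACY;
        if (needs_combo())      return BL_COMBO;
        return BL_KERNEL; /* none of the above: defer to the kernel */
    }

    int main(void)
    {
        printf("backlight method: %d\n", pick_backlight_method());
        return 0;
    }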
|
|
|
|
Conflicts:
src/i830_dri.c
src/i830_memory.c
|
|
|
|
Reported by "A. Costa" <agcosta@gis.net> in
http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=432061
|
|
This is a step towards being able to expose buffer objects to DRI
clients through the screen private, instead of requiring them to use
the fake buffer object type.
This fails in two ways. First, the kernel memory manager is not currently
suitable for doing the physical allocations we need, so we still use AGP for
those. Additionally, the DRI lock can't be initialized early enough for us, so
these buffer object allocations fail. This will be fixed by improving the
DRM interface.
|
|
|
|
|
|
|
|
|
|
"true" in your xorg.conf). Should save ~0.5W during typical 2D usage.
|
|
|
|
It had been necessary to allow more than a small amount of memory to
be allocated, but now the old, small allocations that people had
configured are getting in the way.
|
|
With the fixes to 2D frame buffer allocation that allow up to 65536
lines of 2D frame buffer in XAA mode, the old linear allocation hacks
are no longer necessary.
|
|
|
|
Install it as an alias to intel.4x, since we're still letting people
load the driver as "i810".
|
|
|
|
|
|
While here, update a few other bits as well.
|
|
The artifacts seemed to occur only when EXA was falling back to
software for the front buffer.
|
|
|
|
Need to bump the DRI DDX version minor for the added SAREA fields.
|
|
|
|
The driver now installs as intel_drv.so, with a symlink to
i810_drv.so to ensure existing configurations continue to work. The
manual page is updated to reflect the name change and to add
attributions for recent work.
|
|
|
|
The cachelines are used for two things: the XAA pixmap cache and XV
memory. Only the XAA pixmap cache is referenced through an offset
pointing at the beginning of the front buffer during rendering, and
XAA only uses the 2D BLT engine, which has a vertical limit of 65536
lines; the pixmap cache is now limited to that vertical extent.
Additionally, the previous cachelines allocation was too small for
our advertised XV limits, so video at those limits would fail with
BadAlloc. XAA now allocates approximately the same amount of
offscreen memory as EXA: three times the screen size, plus one packed
HD video frame.
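For concreteness (assuming a 1280x1024 screen at 32 bpp and a packed
2-bytes-per-pixel 1920x1080 video frame, e.g. YUY2): 3 x 1280 x 1024
x 4 bytes is roughly 15 MB for the pixmap cache, plus 1920 x 1080 x 2
bytes, roughly 4 MB, for the video, or about 19 MB of offscreen
memory in total.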
|
|
Conflicts:
man/i810.man
src/Makefile.am
src/i830.h
src/i830_driver.c
src/i830_rotate.c
src/i830_video.c
|
|
Conflicts:
src/i830.h
src/i830_cursor.c
src/i830_dri.c
src/i830_driver.c
src/i830_video.c
|
|
Some code is duplicated from the new libdrm; once that code has been
released with the X server, it can be removed.
See the man page for the new options and for backwards compatibility
with 3D drivers.
|
|
|
|
This reverts most of the mergedfb code. This will instead be done in device-
independent RandR code.
Conflicts:
src/Makefile.am
src/i810_driver.c
src/i810_reg.h
src/i830.h
src/i830_cursor.c
src/i830_driver.c
src/i830_modes.c
src/i830_video.c
|
|
|
|
Because we aren't using the BIOS to set modes any more, what the BIOS thinks is
present is probably even less important than before.
|
|
|
|
|
|
Conflicts:
man/i810.man
src/Makefile.am
src/i830_accel.c
src/i830_dga.c
src/i830_driver.c
|