Hi,
yesterday I carelessly installed the current updates for Debian Squeeze, among them updates for the X server from 1.6.5 to 1.7.4. This also replaced version 2.9.0 of the Intel video driver with version 2.9.1.
To make it work with the old setup back then, I had to patch the Intel driver with this patch: xv-intel.patch.txt
The graphics chip is an onboard Intel 915G.
With that, output worked via xineliboutput and vdr-sxfe over the VGA port and a VGA2SCART cable to the CRT TV.
After a lot of fiddling I have now managed to patch and compile the new 2.9.1 driver. I thought everything would work again, but no such luck. Here is an excerpt from Xorg.0.log:
(II) LoadModule: "intel"
(II) Loading /usr/lib/xorg/modules/drivers/intel_drv.so
(II) Module intel: vendor="X.Org Foundation"
compiled for 1.7.4, module version = 2.9.1
Module class: X.Org Video Driver
ABI class: X.Org Video Driver, version 6.0
(II) intel: Driver for Intel Integrated Graphics Chipsets: i810,
i810-dc100, i810e, i815, i830M, 845G, 852GM/855GM, 865G, 915G,
E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM, Pineview G,
965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33, GM45,
4 Series, G45/G43, Q45/Q43, G41, B43, Clarkdale, Arrandale
(II) Primary Device is: PCI 00@00:02:0
(WW) VGA arbiter: cannot open kernel arbiter, no multi-card support
drmOpenDevice: node name is /dev/dri/card0
drmOpenDevice: open result is 10, (OK)
drmOpenByBusid: Searching for BusID pci:0000:00:02.0
drmOpenDevice: node name is /dev/dri/card0
drmOpenDevice: open result is 10, (OK)
drmOpenByBusid: drmOpenMinor returns 10
drmOpenByBusid: drmGetBusid reports pci:0000:00:02.0
(**) intel(0): Depth 24, (--) framebuffer bpp 32
(==) intel(0): RGB weight 888
(==) intel(0): Default visual is TrueColor
(II) intel(0): Integrated Graphics Chipset: Intel(R) 915G
(--) intel(0): Chipset: "915G"
(II) intel(0): Output VGA1 using monitor section Monitor0
(**) intel(0): Option "PreferredMode" "720x576_50i"
(II) intel(0): Output DVI1 has no monitor section
(II) intel(0): EDID for output VGA1
(II) intel(0): Not using mode "720x576_50i" (interlace mode not supported)
(II) intel(0): Not using mode "800x520_50i" (interlace mode not supported)
(II) intel(0): Not using mode "1440x576_50i" (interlace mode not supported)
(II) intel(0): Not using mode "800x600" (vrefresh out of range)
(II) intel(0): Not using default mode "640x350" (vrefresh out of range)
(II) intel(0): Not using default mode "320x175" (doublescan mode not supported)
[...]
(II) intel(0): Not using default mode "1024x768" (doublescan mode not supported)
(II) intel(0): No remaining probed modes for output VGA1
(II) intel(0): EDID for output DVI1
(II) intel(0): Output VGA1 connected
(II) intel(0): Output DVI1 disconnected
(WW) intel(0): Unable to find initial modes
(EE) intel(0): Output VGA1 enabled but has no modes
(==) intel(0): video overlay key set to 0x101fe
(==) intel(0): sync fields activated
(==) intel(0): vertical scale fine tuning set to 0
(==) intel(0): Y/RGB vertical phase set to 0x0
(==) intel(0): UV vertical phase set to 0x0
(==) intel(0): scheduling priority not set explicitly
(==) intel(0): sync fields debug deactivated: 0
(EE) intel(0): No modes.
(II) UnloadModule: "intel"
(EE) Screen(s) found, but none have a usable configuration.
Fatal server error:
no screens found
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
Please also check the log file at "/var/log/Xorg.0.log" for additional information.
My xorg.conf, which used to work fine, looks like this:
Section "ServerLayout"
Identifier "X.org Configured"
Screen 0 "Screen0" 0 0
Option "BlankTime" "0"
Option "StandbyTime" "0"
Option "SuspendTime" "0"
Option "OffTime" "0"
EndSection
Section "Module"
Load "extmod"
Load "xtrap"
Load "dbe"
Load "record"
Load "dri"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Monitor Vendor"
ModelName "Monitor Model"
HorizSync 15-16
VertRefresh 50-51
Modeline "720x576_50i" 13.875 720 744 808 888 576 580 585 625 -hsync -vsync interlace
Modeline "800x520_50i" 17.00 800 856 936 1088 520 548 553 625 -hsync -vsync interlace
ModeLine "1440x576_50i" 27.75 1440 1488 1609 1769 576 580 585 625 -hsync -vsync interlace
Option "PreferredMode" "720x576_50i"
# Option "Ignore" "true"
EndSection
Section "Monitor"
Identifier "Monitor1"
VendorName "some other Monitor Vendor"
ModelName "some other Monitor Model"
Option "Ignore" "true"
EndSection
Section "Monitor"
Identifier "Monitor2"
VendorName "some other Monitor Vendor"
ModelName "some other Monitor Model"
Option "Ignore" "true"
EndSection
Section "Device"
### Available Driver options are:-
### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
### [arg]: arg optional
#Option "NoAccel" # [<bool>]
#Option "SWcursor" # [<bool>]
#Option "ColorKey" # <i>
#Option "CacheLines" # <i>
#Option "Dac6Bit" # [<bool>]
#Option "DRI" # [<bool>]
#Option "NoDDC" # [<bool>]
#Option "ShowCache" # [<bool>]
#Option "XvMCSurfaces" # <i>
#Option "PageFlip" # [<bool>]
Identifier "Card0"
Driver "intel"
VendorName "Intel Corporation"
BoardName "82945G/GZ Integrated Graphics Controller"
Option "monitor-VGA" "Monitor0"
Option "monitor-LVDS" "Monitor1"
Option "monitor-TMDS-1" "Monitor2"
Option "ForceMinDotClock" "12MHz"
# Option "ModeDebug" "True"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Card0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Viewport 0 0
Depth 24
EndSubSection
EndSection
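For reference, the timing numbers in the 720x576_50i modeline can be sanity-checked with a quick sketch (values copied straight from the Modeline above; plain arithmetic, nothing driver-specific):

```python
# Check that the 720x576_50i modeline really lands inside the
# HorizSync 15-16 kHz / VertRefresh 50-51 Hz ranges declared in Monitor0.
pixel_clock_hz = 13.875e6   # 13.875 MHz dot clock from the Modeline
htotal = 888                # total pixels per scanline
vtotal = 625                # total lines per full (two-field) frame

hsync_khz = pixel_clock_hz / htotal / 1e3               # line rate in kHz
field_rate_hz = 2 * pixel_clock_hz / (htotal * vtotal)  # interlaced: 2 fields per frame

print(f"hsync = {hsync_khz} kHz, vrefresh = {field_rate_hz} Hz")
# → hsync = 15.625 kHz, vrefresh = 50.0 Hz
```

So the mode itself fits the stated sync ranges; the driver rejects it solely because of the interlace flag.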
Why does it suddenly say "interlace mode not supported" now, and nothing works anymore?
Desperately asking for help...
Robert