Good morning!
I am trying to install yaVDR ansible under Focal on an "old" machine with a GT430 graphics card. The machine has no SAT cards of its own; it runs as a streaming client to my "main" VDR.
Before running the customized playbook, I installed the old NVIDIA drivers with sudo apt -y install nvidia-utils-390. When I run the playbook, I get the following error:
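If it helps, these are the checks I would run to see whether the kernel driver (and not just the utilities) is actually in place. This is a sketch; I am assuming that nvidia-utils-390 only ships the userspace tools and that the kernel module comes from the nvidia-driver-390 package:

```shell
# List which NVIDIA 390 packages are actually installed (pipeline succeeds
# even if dpkg finds nothing)
dpkg -l 'nvidia-*390*' 2>/dev/null | awk '/^ii/ {print $2}'

# Check whether the nvidia kernel module is loaded at all
lsmod 2>/dev/null | grep -q '^nvidia ' && echo "nvidia kernel module loaded" \
                                       || echo "nvidia kernel module NOT loaded"
```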
TASK [yavdr-xorg : start x-verbose@.service] ****************************************
fatal: [localhost]: FAILED! => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python3"
},
"changed": false
}
MSG:
Unable to start service x-verbose@vt7.service: Job for x-verbose@vt7.service failed because the control process exited with error code.
See "systemctl status x-verbose@vt7.service" and "journalctl -xe" for details.
RUNNING HANDLER [Restart Samba] *****************************************************
PLAY RECAP **************************************************************************
localhost : ok=98 changed=9 unreachable=0 failed=1 skipped=18 rescued=0 ignored=0
Afterwards I get sound on the TV, but no picture. My customized host_vars/localhost has the following content:
---
# file: host_vars/localhost
# this allows to set up streamdev-client - set the server, port and number of devices
streamdev_client_remote_ip: "192.168.1.60"
streamdev_client_remote_port: 2004
streamdev_client_num_provided_systems: 5
# copy channels.conf from a local file
vdr_channels_conf: /home/mod/yavdr-conf/var/lib/vdr/channels.conf
# choose the vdr output plugin (defaults to vdr-plugin-softhddevice resp. vdr-plugin>
# vdr_output_plugin: vdr-plugin-softhddevice
# add the vdr plugins you want to install
vdr_plugins:
- vdr-plugin-devstatus
- vdr-plugin-markad
- vdr-plugin-epgsearch
- vdr-plugin-live
- vdr-plugin-osdteletext
- vdr-plugin-skindesigner
- vdr-plugin-streamdev-client
- vdr-plugin-svdrpservice
- vdr-plugin-tvguideng
- vdr-plugin-vnsiserver
- vdr-plugin-weatherforecast
# set the name of the output plugin (as used by vdrctl) - this defaults to softhddev>
selected_frontend: softhddevice
# set the package name of the output plugin - this defaults to vdr-plugin-softhddevi>
# vdr_output_plugin: vdr-plugin-softhddevice-cuvid
# IP (range) filter for vdr and plugins (this must be an array):
vdr_allowed_hosts:
- 192.168.1.0/24
# hosts and subnets for svdrphosts.conf (overrides vdr_allowed_hosts):
vdr_svdrphosts:
- 192.168.1.0/24
# hosts and subnets for allowed_hosts.conf of xineliboutput (overrides vdr_allowed_h>
xineliboutput_allowed_hosts:
- 192.168.1.0/24
# hosts and subnets for allowed_hosts.conf of vnsiserver (overrides vdr_allowed_host>
vnsiserver_allowed_hosts:
- 192.168.1.0/24
# hosts and subnets for streamdevhosts.conf (overrides vdr_allowed_hosts):
streamdev_server_allowed_hosts:
- 192.168.1.0/24
# dictionary of directories for (shared) files. Automatically exported via NFS and S>
media_dirs:
audio: /srv/audio
video: /srv/video
pictures: /srv/picture
files: /srv/files
backups: /srv/backups
recordings: '{{ vdr.recdir }}'
nfs:
insecure: false # set to true for OS X clients or if you plan to use libnfs as un>
samba:
workgroup: YAVDR
windows_compatible: '{{ vdr.safe_dirnames }}' # set to true to disable unix exten>
# additional packages you want to install
extra_packages:
- bpython
- bpython3
- htop
- tree
- vim
- w-scan
- t2scan
- vdrpbd
# choose which channellogos to download from the github.com/Jasmeet181 mediaportal-*>
# currently suported langugages/regions are: au, be, cz, de, es, ie, il, it, nordic,>
#
channellogo_languages:
# - au
# - be
# - cz
- de
# - es
# - ie
# - il
# - it
# - nordic
# - nz
# - ru
# - uk
# - us
systemctl status x-verbose@vt7.service returns:
mod@htpc-gz:~/yavdr-ansible/host_vars$ systemctl status x-verbose@vt7.service
● x-verbose@vt7.service - X with verbose logging on vt7
Loaded: loaded (/etc/systemd/system/x-verbose@.service; static; vendor preset: enabled)
Active: failed (Result: exit-code) since Fri 2020-09-11 10:01:50 UTC; 6min ago
Process: 178735 ExecStart=/usr/bin/x-daemon -logverbose 6 -noreset vt7 -config /etc/X11/xorg-verbose.conf (code=exited, status=1/FAILURE)
Sep 11 10:01:50 htpc-gz x-daemon[178737]: (EE)
Sep 11 10:01:50 htpc-gz x-daemon[178737]: Please consult the The X.Org Foundation support
Sep 11 10:01:50 htpc-gz x-daemon[178737]: at http://wiki.x.org
Sep 11 10:01:50 htpc-gz x-daemon[178737]: for help.
Sep 11 10:01:50 htpc-gz x-daemon[178737]: (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
Sep 11 10:01:50 htpc-gz x-daemon[178737]: (EE)
Sep 11 10:01:50 htpc-gz x-daemon[178737]: (EE) Server terminated with error (1). Closing log file.
Sep 11 10:01:50 htpc-gz systemd[1]: x-verbose@vt7.service: Control process exited, code=exited, status=1/FAILURE
Sep 11 10:01:50 htpc-gz systemd[1]: x-verbose@vt7.service: Failed with result 'exit-code'.
Sep 11 10:01:50 htpc-gz systemd[1]: Failed to start X with verbose logging on vt7.
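The status output points at the X server's own log file, /var/log/Xorg.0.log. This is the sketch I would use to pull the error lines out of it (the path is taken from the message above):

```shell
# Print the X server's (EE) error lines, or a note if there are none
# or the log file is missing/unreadable
LOG=/var/log/Xorg.0.log
{ [ -r "$LOG" ] && grep '(EE)' "$LOG"; } || echo "no (EE) lines found in $LOG"
```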
And journalctl -xe shows:
--
-- A stop job for unit nvidia-persistenced.service has begun execution.
--
-- The job identifier is 1000144.
Sep 11 10:12:14 htpc-gz systemd-udevd[178840]: nvidia_uvm: Process '/sbin/create-uvm-dev-node' failed with exit code 1.
Sep 11 10:12:14 htpc-gz kernel: resource sanity check: requesting [mem 0x000c0000-0x000fffff], which spans more than PCI Bus 0000:00 [mem 0x000d0000-0x000d3f>
Sep 11 10:12:14 htpc-gz kernel: caller os_map_kernel_space.part.0+0x100/0x140 [nvidia] mapping multiple BARs
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204186]: Received signal 15
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204186]: PID file unlocked.
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204186]: PID file closed.
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204186]: The daemon no longer has permission to remove its runtime data directory /var/run/nvidia-persistenced
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204186]: Shutdown (204186)
Sep 11 10:12:15 htpc-gz systemd[1]: nvidia-persistenced.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: http://www.ubuntu.com/support
--
-- The unit nvidia-persistenced.service has successfully entered the 'dead' state.
Sep 11 10:12:15 htpc-gz systemd[1]: Stopped NVIDIA Persistence Daemon.
-- Subject: A stop job for unit nvidia-persistenced.service has finished
-- Defined-By: systemd
-- Support: http://www.ubuntu.com/support
--
-- A stop job for unit nvidia-persistenced.service has finished.
--
-- The job identifier is 1000144 and the job result is done.
Sep 11 10:12:15 htpc-gz systemd-udevd[178833]: nvidia: Process '/sbin/modprobe -r nvidia-modeset' failed with exit code 1.
Sep 11 10:12:15 htpc-gz kernel: [drm] [nvidia-drm] [GPU ID 0x00000100] Unloading driver
Sep 11 10:12:15 htpc-gz kernel: nvidia-modeset: Unloading
Sep 11 10:12:15 htpc-gz kernel: nvidia-uvm: Unloaded the UVM driver in 8 mode
Sep 11 10:12:15 htpc-gz kernel: nvidia-nvlink: Unregistered the Nvlink Core, major device number 242
Sep 11 10:12:15 htpc-gz systemd[1]: Starting NVIDIA Persistence Daemon...
-- Subject: A start job for unit nvidia-persistenced.service has begun execution
-- Defined-By: systemd
-- Support: http://www.ubuntu.com/support
--
-- A start job for unit nvidia-persistenced.service has begun execution.
--
-- The job identifier is 1000147.
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204204]: Verbose syslog connection opened
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204204]: Now running with user ID 112 and group ID 121
Sep 11 10:12:15 htpc-gz nvidia-persistenced[204204]: Started (204204)
Sep 11 10:12:15 htpc-gz systemd[1]: Started NVIDIA Persistence Daemon.
-- Subject: A start job for unit nvidia-persistenced.service has finished successfully
-- Defined-By: systemd
-- Support: http://www.ubuntu.com/support
--
-- A start job for unit nvidia-persistenced.service has finished successfully.
--
-- The job identifier is 1000147.
Sep 11 10:12:15 htpc-gz systemd[1]: Stopping NVIDIA Persistence Daemon...
-- Subject: A stop job for unit nvidia-persistenced.service has begun execution
-- Defined-By: systemd
-- Support: http://www.ubuntu.com/support
--
-- A stop job for unit nvidia-persistenced.service has begun execution.
--
-- The job identifier is 1000259.
Could it be that my NVIDIA drivers are not working the way I expected? How can I fix this if I want to keep using the GT430 card?
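For reference, this is what I would try next if the driver installation itself is the problem. My assumption (please correct me): nvidia-utils-390 contains only the tools, while the nvidia-driver-390 metapackage pulls in the actual kernel driver, and 390 is the last branch that still supports Fermi cards like the GT430:

```shell
# Remove the partial NVIDIA installation, then install the legacy 390 branch
# (nvidia-driver-390 should pull in the kernel module, unlike nvidia-utils-390)
sudo apt purge 'nvidia-*'
sudo apt install -y nvidia-driver-390
sudo reboot
```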
Thanks for your hints!