KpuCko

Proxmox cluster failing

Oct 25th, 2023
Thanks for the fast reply, here is all the info you require.

[CODE]root@sofx1010pve3302.home.lan:~# pvecm status
Error: Corosync config '/etc/pve/corosync.conf' does not exist - is this node part of a cluster?

root@sofx1010pve3303.home.lan:~# pvecm status
Error: Corosync config '/etc/pve/corosync.conf' does not exist - is this node part of a cluster?

root@sofx1010pve3307:~# pvecm status
Cluster information
-------------------
Name:             Proxmox
Config Version:   1
Transport:        knet
Secure auth:      on

Quorum information
------------------
Date:             Wed Oct 25 10:14:52 2023
Quorum provider:  corosync_votequorum
Nodes:            1
Node ID:          0x00000001
Ring ID:          1.a
Quorate:          Yes

Votequorum information
----------------------
Expected votes:   1
Highest expected: 1
Total votes:      1
Quorum:           1
Flags:            Quorate

Membership information
----------------------
    Nodeid      Votes Name
0x00000001          1 192.168.30.7 (local)
root@sofx1010pve3307:~#
[/CODE]

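For quick side-by-side comparison of nodes, the quorum fields in the `pvecm status` output above can be pulled out with a one-liner. A minimal sketch, fed here from a pasted copy of the working node's output rather than a live cluster:

```shell
# Extract the quorum summary fields from `pvecm status` output.
# Sample input is a copy of node sofx1010pve3307's output above;
# on a live node, pipe `pvecm status` into the same awk instead.
pvecm_output='Quorate:          Yes
Expected votes:   1
Total votes:      1'
printf '%s\n' "$pvecm_output" \
    | awk -F': *' '/^(Quorate|Expected votes|Total votes):/ {print $1 "=" $2}'
```

On the two broken nodes this obviously yields nothing, since `pvecm status` errors out before printing any quorum section.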
[CODE]root@sofx1010pve3302.home.lan:~# cat /etc/corosync/corosync.conf
cat: /etc/corosync/corosync.conf: No such file or directory

root@sofx1010pve3303.home.lan:~# cat /etc/corosync/corosync.conf
cat: /etc/corosync/corosync.conf: No such file or directory

root@sofx1010pve3307:~# cat /etc/corosync/corosync.conf
logging {
  debug: off
  to_syslog: yes
}

nodelist {
  node {
    name: sofx1010pve3307
    nodeid: 1
    quorum_votes: 1
    ring0_addr: 192.168.30.7
  }
}

quorum {
  provider: corosync_votequorum
}

totem {
  cluster_name: Proxmox
  config_version: 1
  interface {
    linknumber: 0
  }
  ip_version: ipv4-6
  link_mode: passive
  secauth: on
  version: 2
}

root@sofx1010pve3307:~#
[/CODE]

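For context on why both errors appear together: on a standard Proxmox setup the authoritative copy lives on the pmxcfs mount at /etc/pve/corosync.conf, and pve-cluster mirrors it to the node-local /etc/corosync/corosync.conf. `pvecm` complains when the /etc/pve copy is gone, while corosync's ConditionPathExists skips startup when the local copy is gone. A small sketch to report which copies exist on each node (paths are the standard Proxmox ones, an assumption about this setup):

```shell
# Report which corosync config copies are present on this node.
# /etc/pve/corosync.conf   - cluster-wide copy, served by pmxcfs
# /etc/corosync/corosync.conf - local mirror that corosync actually reads
check_corosync_conf() {
    for f in "$@"; do
        if [ -e "$f" ]; then
            echo "present: $f"
        else
            echo "MISSING: $f"
        fi
    done
}
check_corosync_conf /etc/pve/corosync.conf /etc/corosync/corosync.conf
```

On the two broken nodes both lines should read MISSING, matching the `cat` failures above.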
[CODE]root@sofx1010pve3302.home.lan:~# systemctl status corosync.service
○ corosync.service - Corosync Cluster Engine
     Loaded: loaded (/lib/systemd/system/corosync.service; enabled; preset: enabled)
     Active: inactive (dead)
  Condition: start condition failed at Wed 2023-10-25 10:16:31 EEST; 249ms ago
             └─ ConditionPathExists=/etc/corosync/corosync.conf was not met
       Docs: man:corosync
             man:corosync.conf
             man:corosync_overview

Oct 25 10:14:05 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:21 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:37 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:54 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:10 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:26 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:42 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:59 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:16:15 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:16:31 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>

root@sofx1010pve3303.home.lan:~# systemctl status corosync.service
○ corosync.service - Corosync Cluster Engine
     Loaded: loaded (/lib/systemd/system/corosync.service; enabled; preset: enabled)
     Active: inactive (dead)
  Condition: start condition failed at Wed 2023-10-25 10:16:31 EEST; 5s ago
             └─ ConditionPathExists=/etc/corosync/corosync.conf was not met
       Docs: man:corosync
             man:corosync.conf
             man:corosync_overview

Oct 25 10:14:05 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:21 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:37 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:14:54 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:10 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:26 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:42 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:15:59 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:16:15 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:16:31 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>

root@sofx1010pve3307:~# systemctl status corosync.service
● corosync.service - Corosync Cluster Engine
     Loaded: loaded (/lib/systemd/system/corosync.service; enabled; preset: enabled)
     Active: active (running) since Wed 2023-10-25 09:37:49 EEST; 38min ago
       Docs: man:corosync
             man:corosync.conf
             man:corosync_overview
   Main PID: 2445 (corosync)
      Tasks: 9 (limit: 76866)
     Memory: 157.1M
     CGroup: /system.slice/corosync.service
             └─2445 /usr/sbin/corosync -f

Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [QB    ] server name: quorum
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [TOTEM ] Configuring link 0
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [TOTEM ] Configured link number 0: local addr: 192.168.30.7, port=5405
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [KNET  ] link: Resetting MTU for link 0 because host 1 joined
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [QUORUM] Sync members[1]: 1
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [QUORUM] Sync joined[1]: 1
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [TOTEM ] A new membership (1.a) was formed. Members joined: 1
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [QUORUM] Members[1]: 1
Oct 25 09:37:49 sofx1010pve3307 corosync[2445]:   [MAIN  ] Completed service synchronization, ready to provide service.
Oct 25 09:37:49 sofx1010pve3307 systemd[1]: Started corosync.service - Corosync Cluster Engine.
root@sofx1010pve3307:~#
[/CODE]

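Note that corosync did not crash on the two broken nodes; systemd skipped it because ConditionPathExists was unmet, which is why the unit shows "inactive (dead)" rather than "failed". If the surviving node's config turns out to be the right one, a recovery could look roughly like the sketch below. This is an assumption and NOT a confirmed fix; verify the config before copying anything, and the guard refuses to overwrite a node that still has one:

```shell
# Hedged recovery sketch: put a known-good corosync.conf in place, but
# only if the destination is absent, so an intact node is never clobbered.
restore_conf() {
    src="$1"; dst="$2"
    if [ -e "$dst" ]; then
        echo "refusing to overwrite existing $dst" >&2
        return 1
    fi
    cp "$src" "$dst" && echo "restored $dst"
}
# On a broken node one might then run something like (adapt to your setup):
#   scp root@192.168.30.7:/etc/corosync/corosync.conf /tmp/corosync.conf
#   restore_conf /tmp/corosync.conf /etc/corosync/corosync.conf
#   systemctl restart corosync pve-cluster
```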
Here are the outputs from just the nodes where the UI fails to start:

[CODE]root@sofx1010pve3302.home.lan:~# systemctl status pveproxy.service pvedaemon.service
● pveproxy.service - PVE API Proxy Server
     Loaded: loaded (/lib/systemd/system/pveproxy.service; enabled; preset: enabled)
     Active: active (running) since Wed 2023-10-25 10:03:50 EEST; 13min ago
    Process: 2467 ExecStartPre=/usr/bin/pvecm updatecerts --silent (code=exited, status=0/SUCCESS)
    Process: 2480 ExecStart=/usr/bin/pveproxy start (code=exited, status=0/SUCCESS)
   Main PID: 2529 (pveproxy)
      Tasks: 4 (limit: 38288)
     Memory: 141.3M
     CGroup: /system.slice/pveproxy.service
             ├─2529 pveproxy
             ├─2530 "pveproxy worker"
             ├─2531 "pveproxy worker"
             └─2532 "pveproxy worker"

Oct 25 10:03:49 sofx1010pve3302.home.lan systemd[1]: Starting pveproxy.service - PVE API Proxy Server...
Oct 25 10:03:50 sofx1010pve3302.home.lan pveproxy[2529]: starting server
Oct 25 10:03:50 sofx1010pve3302.home.lan pveproxy[2529]: starting 3 worker(s)
Oct 25 10:03:50 sofx1010pve3302.home.lan pveproxy[2529]: worker 2530 started
Oct 25 10:03:50 sofx1010pve3302.home.lan pveproxy[2529]: worker 2531 started
Oct 25 10:03:50 sofx1010pve3302.home.lan pveproxy[2529]: worker 2532 started
Oct 25 10:03:50 sofx1010pve3302.home.lan systemd[1]: Started pveproxy.service - PVE API Proxy Server.

● pvedaemon.service - PVE API Daemon
     Loaded: loaded (/lib/systemd/system/pvedaemon.service; enabled; preset: enabled)
     Active: active (running) since Wed 2023-10-25 10:03:49 EEST; 13min ago
    Process: 2313 ExecStart=/usr/bin/pvedaemon start (code=exited, status=0/SUCCESS)
   Main PID: 2462 (pvedaemon)
      Tasks: 4 (limit: 38288)
     Memory: 209.0M
     CGroup: /system.slice/pvedaemon.service
             ├─2462 pvedaemon
             ├─2463 "pvedaemon worker"
             ├─2464 "pvedaemon worker"
             └─2465 "pvedaemon worker"

Oct 25 10:03:48 sofx1010pve3302.home.lan systemd[1]: Starting pvedaemon.service - PVE API Daemon...
Oct 25 10:03:49 sofx1010pve3302.home.lan pvedaemon[2462]: starting server
Oct 25 10:03:49 sofx1010pve3302.home.lan pvedaemon[2462]: starting 3 worker(s)
Oct 25 10:03:49 sofx1010pve3302.home.lan pvedaemon[2462]: worker 2463 started
Oct 25 10:03:49 sofx1010pve3302.home.lan pvedaemon[2462]: worker 2464 started
Oct 25 10:03:49 sofx1010pve3302.home.lan pvedaemon[2462]: worker 2465 started
Oct 25 10:03:49 sofx1010pve3302.home.lan systemd[1]: Started pvedaemon.service - PVE API Daemon.

root@sofx1010pve3303.home.lan:~# systemctl status pveproxy.service pvedaemon.service
● pveproxy.service - PVE API Proxy Server
     Loaded: loaded (/lib/systemd/system/pveproxy.service; enabled; preset: enabled)
     Active: active (running) since Wed 2023-10-25 10:03:50 EEST; 14min ago
    Process: 1402 ExecStartPre=/usr/bin/pvecm updatecerts --silent (code=exited, status=0/SUCCESS)
    Process: 1408 ExecStart=/usr/bin/pveproxy start (code=exited, status=0/SUCCESS)
   Main PID: 1416 (pveproxy)
      Tasks: 4 (limit: 38288)
     Memory: 141.5M
     CGroup: /system.slice/pveproxy.service
             ├─1416 pveproxy
             ├─1417 "pveproxy worker"
             ├─1418 "pveproxy worker"
             └─1419 "pveproxy worker"

Oct 25 10:03:48 sofx1010pve3303.home.lan systemd[1]: Starting pveproxy.service - PVE API Proxy Server...
Oct 25 10:03:50 sofx1010pve3303.home.lan pveproxy[1416]: starting server
Oct 25 10:03:50 sofx1010pve3303.home.lan pveproxy[1416]: starting 3 worker(s)
Oct 25 10:03:50 sofx1010pve3303.home.lan pveproxy[1416]: worker 1417 started
Oct 25 10:03:50 sofx1010pve3303.home.lan pveproxy[1416]: worker 1418 started
Oct 25 10:03:50 sofx1010pve3303.home.lan pveproxy[1416]: worker 1419 started
Oct 25 10:03:50 sofx1010pve3303.home.lan systemd[1]: Started pveproxy.service - PVE API Proxy Server.

● pvedaemon.service - PVE API Daemon
     Loaded: loaded (/lib/systemd/system/pvedaemon.service; enabled; preset: enabled)
     Active: active (running) since Wed 2023-10-25 10:03:48 EEST; 14min ago
    Process: 1262 ExecStart=/usr/bin/pvedaemon start (code=exited, status=0/SUCCESS)
   Main PID: 1397 (pvedaemon)
      Tasks: 4 (limit: 38288)
     Memory: 208.8M
     CGroup: /system.slice/pvedaemon.service
             ├─1397 pvedaemon
             ├─1398 "pvedaemon worker"
             ├─1399 "pvedaemon worker"
             └─1400 "pvedaemon worker"

Oct 25 10:03:48 sofx1010pve3303.home.lan systemd[1]: Starting pvedaemon.service - PVE API Daemon...
Oct 25 10:03:48 sofx1010pve3303.home.lan pvedaemon[1397]: starting server
Oct 25 10:03:48 sofx1010pve3303.home.lan pvedaemon[1397]: starting 3 worker(s)
Oct 25 10:03:48 sofx1010pve3303.home.lan pvedaemon[1397]: worker 1398 started
Oct 25 10:03:48 sofx1010pve3303.home.lan pvedaemon[1397]: worker 1399 started
Oct 25 10:03:48 sofx1010pve3303.home.lan pvedaemon[1397]: worker 1400 started
Oct 25 10:03:48 sofx1010pve3303.home.lan systemd[1]: Started pvedaemon.service - PVE API Daemon.
[/CODE]

Again, journal excerpts from only the failing nodes (newest entries first):

[CODE]Oct 25 10:18:57 sofx1010pve3302.home.lan pacemakerd[6277]:  notice: Additional logging available in /var/log/pacemaker/pacemaker.log
Oct 25 10:18:57 sofx1010pve3302.home.lan systemd[1]: Started pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:18:57 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:18:57 sofx1010pve3302.home.lan systemd[1]: Stopped pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:18:57 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Scheduled restart job, restart counter is at 56.
Oct 25 10:18:56 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Failed with result 'exit-code'.
Oct 25 10:18:56 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Main process exited, code=exited, status=69/UNAVAILABLE
Oct 25 10:18:56 sofx1010pve3302.home.lan pacemakerd[6229]:  crit: Could not connect to Corosync CMAP: CS_ERR_LIBRARY
Oct 25 10:18:55 sofx1010pve3302.home.lan pvestatd[2408]: status update time (10.183 seconds)
Oct 25 10:18:55 sofx1010pve3302.home.lan pvestatd[2408]: storage 'truenas-nfs' is not online
Oct 25 10:18:45 sofx1010pve3302.home.lan pvestatd[2408]: status update time (10.183 seconds)
Oct 25 10:18:45 sofx1010pve3302.home.lan pvestatd[2408]: storage 'truenas-nfs' is not online
Oct 25 10:18:41 sofx1010pve3302.home.lan pacemakerd[6229]:  notice: Additional logging available in /var/log/pacemaker/pacemaker.log
Oct 25 10:18:41 sofx1010pve3302.home.lan systemd[1]: Started pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:18:41 sofx1010pve3302.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:18:41 sofx1010pve3302.home.lan systemd[1]: Stopped pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:18:41 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Scheduled restart job, restart counter is at 55.
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Failed with result 'exit-code'.
Oct 25 10:18:40 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Main process exited, code=exited, status=69/UNAVAILABLE
Oct 25 10:18:40 sofx1010pve3302.home.lan pacemakerd[6164]:  crit: Could not connect to Corosync CMAP: CS_ERR_LIBRARY
Oct 25 10:18:35 sofx1010pve3302.home.lan pvestatd[2408]: status update time (10.183 seconds)

Oct 25 10:20:17 sofx1010pve3303.home.lan systemd[1]: pacemaker.service: Failed with result 'exit-code'.
Oct 25 10:20:17 sofx1010pve3303.home.lan systemd[1]: pacemaker.service: Main process exited, code=exited, status=69/UNAVAILABLE
Oct 25 10:20:17 sofx1010pve3303.home.lan pacemakerd[5284]:  crit: Could not connect to Corosync CMAP: CS_ERR_LIBRARY
Oct 25 10:20:16 sofx1010pve3303.home.lan pvestatd[1343]: status update time (10.166 seconds)
Oct 25 10:20:16 sofx1010pve3303.home.lan pvestatd[1343]: storage 'truenas-nfs' is not online
Oct 25 10:20:06 sofx1010pve3303.home.lan pvestatd[1343]: status update time (10.166 seconds)
Oct 25 10:20:06 sofx1010pve3303.home.lan pvestatd[1343]: storage 'truenas-nfs' is not online
Oct 25 10:20:02 sofx1010pve3303.home.lan pacemakerd[5284]:  notice: Additional logging available in /var/log/pacemaker/pacemaker.log
Oct 25 10:20:02 sofx1010pve3303.home.lan systemd[1]: Started pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:20:02 sofx1010pve3303.home.lan systemd[1]: corosync.service - Corosync Cluster Engine was skipped because of an unmet condition check (ConditionPathExists=/etc/corosync/corosync.c>
Oct 25 10:20:02 sofx1010pve3303.home.lan systemd[1]: Stopped pacemaker.service - Pacemaker High Availability Cluster Manager.
Oct 25 10:20:02 sofx1010pve3303.home.lan systemd[1]: pacemaker.service: Scheduled restart job, restart counter is at 60.
Oct 25 10:20:02 sofx1010pve3303.home.lan CRON[5263]: pam_unix(cron:session): session closed for user root
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd-logind[758]: Removed session 18.
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd-logind[758]: Session 18 logged out. Waiting for processes to exit.
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd[1]: session-18.scope: Deactivated successfully.
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: pam_unix(sshd:session): session closed for user root
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: Disconnected from user root 192.168.30.2 port 36358
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: Received disconnect from 192.168.30.2 port 36358:11: disconnected by user
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: pam_env(sshd:session): deprecated reading of user environment enabled
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd[1]: Started session-18.scope - Session 18 of User root.
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd-logind[758]: New session 18 of user root.
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: pam_unix(sshd:session): session opened for user root(uid=0) by (uid=0)
Oct 25 10:20:01 sofx1010pve3303.home.lan sshd[5260]: Accepted publickey for root from 192.168.30.2 port 36358 ssh2: RSA SHA256:rbXel5Ru72ZLUZGrasjIV8XP4pn95nU/7r8Qmgx8lJ0
Oct 25 10:20:01 sofx1010pve3303.home.lan CRON[5262]: pam_unix(cron:session): session closed for user root
Oct 25 10:20:01 sofx1010pve3303.home.lan CRON[5265]: (root) CMD (/usr/local/sbin/check_interfaces_realtime.sh)
Oct 25 10:20:01 sofx1010pve3303.home.lan CRON[5264]: (root) CMD (unison profile-var-lib-vz.prf >/dev/null 2>&1)
Oct 25 10:20:01 sofx1010pve3303.home.lan CRON[5262]: pam_unix(cron:session): session opened for user root(uid=0) by (uid=0)
Oct 25 10:20:01 sofx1010pve3303.home.lan CRON[5263]: pam_unix(cron:session): session opened for user root(uid=0) by (uid=0)
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd[1]: pacemaker.service: Failed with result 'exit-code'.
Oct 25 10:20:01 sofx1010pve3303.home.lan systemd[1]: pacemaker.service: Main process exited, code=exited, status=69/UNAVAILABLE
Oct 25 10:20:01 sofx1010pve3303.home.lan pacemakerd[5189]:  crit: Could not connect to Corosync CMAP: CS_ERR_LIBRARY
Oct 25 10:19:56 sofx1010pve3303.home.lan pvestatd[1343]: status update time (10.167 seconds)
[/CODE]
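The journal shows pacemaker stuck in a restart loop (counters 55/56 on 3302, 60 on 3303): it exits with status 69 (EX_UNAVAILABLE) each time it cannot reach corosync's CMAP, so the loop is a symptom of the missing config rather than a separate failure. The counter can be pulled out of a saved excerpt to gauge how long the loop has been running; a small sketch over sample lines copied from the log above (the `journalctl` pipeline in the comment is an assumption about a live node):

```shell
# Extract the most recent pacemaker restart counter from a journal excerpt.
# On a live node one might instead pipe in:
#   journalctl -u pacemaker | grep 'restart counter'
log='Oct 25 10:18:41 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Scheduled restart job, restart counter is at 55.
Oct 25 10:18:57 sofx1010pve3302.home.lan systemd[1]: pacemaker.service: Scheduled restart job, restart counter is at 56.'
printf '%s\n' "$log" \
    | awk '/restart counter is at/ {n = $NF; sub(/\.$/, "", n)} END {print "last restart counter: " n}'
```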