=====================
= Umbrel debug info =
=====================

Umbrel version
--------------
0.5.4

Flashed OS version
------------------
v0.5.4

Raspberry Pi Model
------------------
Revision : d03115
Serial : 100000008278a770
Model : Raspberry Pi 4 Model B Rev 1.5

Firmware
--------
May 9 2023 12:16:34
Copyright (c) 2012 Broadcom
version 30aa0d70ab280427ba04ebc718c81d4350b9d394 (clean) (release) (start)

Temperature
-----------
temp=45.7'C

Throttling
----------
throttled=0x0

Memory usage
------------
        total   used   free   shared  buff/cache  available
Mem:     7.8G   2.4G   951M     392M        4.5G       4.9G
Swap:    4.1G   2.0M   4.1G

total: 30.7%
immich: 11.9%
electrs: 7%
lightning: 5%
thunderhub: 4.4%
bitcoin: 3.7%
bluewallet: 1.1%
tailscale: 0.9%
system: 0%

Memory monitor logs
-------------------
2023-12-08 00:15:57 Memory monitor running!
2023-12-08 00:20:26 Memory monitor running!
2023-12-13 20:03:35 Memory monitor running!
2023-12-14 11:11:11 Memory monitor running!
2023-12-14 11:31:25 Memory monitor running!
2023-12-14 22:29:22 Memory monitor running!
2023-12-24 12:50:12 Memory monitor running!
2023-12-25 01:53:15 Memory monitor running!
2023-12-25 05:03:56 Memory monitor running!
2023-12-28 08:35:34 Memory monitor running!

Filesystem information
----------------------
Filesystem   Size  Used  Avail  Use%  Mounted on
/dev/root     15G  4.1G   9.8G   30%  /
/dev/sda1    916G  694G   176G   80%  /home/umbrel/umbrel

Startup service logs
--------------------
-- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
Jan 05 06:28:08 umbrel passwd[16431]: pam_unix(passwd:chauthtok): password changed for umbrel

External storage service logs
-----------------------------
-- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
-- No entries --

External storage SD card update service logs
--------------------------------------------
-- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
-- No entries --

Karen logs
----------


0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:04 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
100 1038 0 0 100 1038 0 162 0:00:06 0:00:06 --:--:-- 162
100 1038 0 0 100 1038 0 153 0:00:06 0:00:06 --:--:-- 153
100 1184 100 146 100 1038 20 144 0:00:07 0:00:07 --:--:-- 165
{"message":"Successfully uploaded backup 1704410333945.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
=============================
====== Backup success =======
=============================
Got signal: backup
karen is getting triggered!
Deriving keys...
Creating backup...
Adding random padding...
1+0 records in
1+0 records out
213 bytes copied, 0.000224459 s, 949 kB/s
Creating encrypted tarball...
backup/
backup/channel.backup
backup/.padding
Uploading backup...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed

0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:06 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:06 --:--:-- 0
100 1258 0 0 100 1258 0 167 0:00:07 0:00:07 --:--:-- 167
100 1404 100 146 100 1258 16 143 0:00:09 0:00:08 0:00:01 160
100 1404 100 146 100 1258 16 143 0:00:09 0:00:08 0:00:01 375
{"message":"Successfully uploaded backup 1704416703556.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
=============================
====== Backup success =======
=============================
Got signal: backup
karen is getting triggered!
Deriving keys...
Creating backup...
Adding random padding...
1+0 records in
1+0 records out
6486 bytes (6.5 kB, 6.3 KiB) copied, 0.000326217 s, 19.9 MB/s
Creating encrypted tarball...
backup/
backup/channel.backup
backup/.padding
Uploading backup...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed

0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
100 7546 0 0 100 7546 0 1094 0:00:06 0:00:06 --:--:-- 1093
100 7546 0 0 100 7546 0 955 0:00:07 0:00:07 --:--:-- 955
100 7692 100 146 100 7546 18 947 0:00:08 0:00:07 0:00:01 965
{"message":"Successfully uploaded backup 1704420533897.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
=============================
====== Backup success =======
=============================
Got signal: change-password
karen is getting triggered!
New password: Retype new password: passwd: password updated successfully
Got signal: debug
karen is getting triggered!

Docker containers
-----------------
NAMES                        STATUS
immich_server_1              Up 7 days
immich_microservices_1       Up 7 days
bitcoin_server_1             Up 7 days
electrs_app_1                Up 7 days
bluewallet_lndhub_1          Up 3 seconds
immich_machine-learning_1    Up 7 days
immich_redis_1               Up 7 days
immich_postgres_1            Up 7 days
immich_app_proxy_1           Up 7 days
bitcoin_tor_1                Up 7 days
bitcoin_i2pd_daemon_1        Up 7 days
bitcoin_app_proxy_1          Up 7 days
bitcoin_bitcoind_1           Up 28 seconds
bluewallet_redis_1           Up 7 days
bluewallet_app_proxy_1       Up 7 days
electrs_app_proxy_1          Up 7 days
electrs_tor_1                Up 7 days
electrs_electrs_1            Up 31 seconds
thunderhub_web_1             Up 7 days
thunderhub_app_proxy_1       Up 7 days
lightning_app_1              Up 7 days
lightning_lnd_1              Up 7 days
lightning_app_proxy_1        Up 7 days
lightning_tor_1              Up 7 days
tailscale_web_1              Up 7 days
nginx                        Up 7 days
dashboard                    Up 7 days
manager                      Up 7 days
tor_proxy                    Up 7 days
auth                         Up 7 days

Umbrel logs
-----------

Attaching to manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/storage HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/info HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/system/get-update HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:54 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:54 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:56 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:56 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager

Tor Proxy logs
--------

Attaching to tor_proxy
tor_proxy | Jan 05 04:28:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
tor_proxy | Jan 05 04:29:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
tor_proxy | Jan 05 04:30:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
tor_proxy | Jan 05 04:31:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
tor_proxy | Jan 05 04:34:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
tor_proxy | Jan 05 04:38:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
tor_proxy | Jan 05 04:42:56.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
tor_proxy | Jan 05 04:44:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
tor_proxy | Jan 05 04:51:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
tor_proxy | Jan 05 05:06:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.

App logs
--------

bitcoin

Attaching to bitcoin_server_1, bitcoin_tor_1, bitcoin_i2pd_daemon_1, bitcoin_app_proxy_1, bitcoin_bitcoind_1
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
bitcoind_1 | 2024-01-05T06:31:01Z * Using 715.0 MiB for transaction index database
bitcoind_1 | 2024-01-05T06:31:01Z * Using 625.6 MiB for basic block filter index database
bitcoind_1 | 2024-01-05T06:31:01Z * Using 8.0 MiB for chain state database
bitcoind_1 | 2024-01-05T06:31:01Z * Using 4371.4 MiB for in-memory UTXO set (plus up to 286.1 MiB of unused mempool space)
bitcoind_1 | 2024-01-05T06:31:01Z init message: Loading block index…
bitcoind_1 | 2024-01-05T06:31:01Z Assuming ancestors of block 000000000000000000035c3f0d31e71a5ee24c5aaf3354689f65bd7b07dee632 have valid signatures.
bitcoind_1 | 2024-01-05T06:31:01Z Setting nMinimumChainWork=000000000000000000000000000000000000000044a50fe819c39ad624021859
bitcoind_1 | 2024-01-05T06:31:01Z Opening LevelDB in /data/.bitcoin/blocks/index
bitcoind_1 | 2024-01-05T06:31:02Z Opened LevelDB successfully
bitcoind_1 | 2024-01-05T06:31:02Z Using obfuscation key for /data/.bitcoin/blocks/index: 0000000000000000
i2pd_daemon_1 | 06:12:51@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 06:14:30@94/error - ElGamal decrypt hash doesn't match
i2pd_daemon_1 | 06:14:30@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 06:15:16@966/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 06:18:58@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 06:19:45@619/error - Tunnels: Can't select next hop for reeXDOC6E2F0pHe2jahSZgziYcJsWyoW-v1nNVsVHxI=
i2pd_daemon_1 | 06:19:45@619/error - Tunnels: Can't create inbound tunnel, no peers available
i2pd_daemon_1 | 06:24:49@677/error - SSU2: RelayIntro unknown router to introduce
i2pd_daemon_1 | 06:24:52@619/error - Tunnel: Tunnel with id 470115970 already exists
i2pd_daemon_1 | 06:26:20@619/error - Tunnel: Tunnel with id 77984737 already exists
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:57 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
server_1 |
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:31:02 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
server_1 |
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:31:03 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
server_1 |
server_1 | umbrel-middleware
tor_1 | Jan 05 04:02:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:03:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:04:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:05:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.
tor_1 | Jan 05 04:06:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:10:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:22:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.
tor_1 | Jan 05 04:24:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:29:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
tor_1 | Jan 05 04:40:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.

bluewallet

Attaching to bluewallet_lndhub_1, bluewallet_redis_1, bluewallet_app_proxy_1
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ENOTFOUND] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
app_proxy_1 | [HPM] Error occurred while proxying request 100.90.16.136:3008/ to http://bluewallet_lndhub_1:3008/ [ENOTFOUND] (https://nodejs.org/api/errors.html#errors_common_system_errors)
lndhub_1 | at Object.onReceiveStatus (/lndhub/node_modules/@grpc/grpc-js/src/client-interceptors.ts:389:48)
lndhub_1 | at /lndhub/node_modules/@grpc/grpc-js/src/call-stream.ts:276:24
lndhub_1 | at processTicksAndRejections (node:internal/process/task_queues:78:11) {
lndhub_1 | code: 2,
lndhub_1 | details: 'wallet locked, unlock it to enable full RPC access',
lndhub_1 | metadata: Metadata {
lndhub_1 | internalRepr: Map(1) { 'content-type' => [Array] },
lndhub_1 | options: {}
lndhub_1 | }
lndhub_1 | }
redis_1 | 7:C 28 Dec 2023 08:37:11.951 # Configuration loaded
redis_1 | 7:M 28 Dec 2023 08:37:11.953 * monotonic clock: POSIX clock_gettime
redis_1 | 7:M 28 Dec 2023 08:37:12.084 * Running mode=standalone, port=6379.
redis_1 | 7:M 28 Dec 2023 08:37:12.084 # Server initialized
redis_1 | 7:M 28 Dec 2023 08:37:12.084 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1 | 7:M 28 Dec 2023 08:37:12.312 * Loading RDB produced by version 6.2.2
redis_1 | 7:M 28 Dec 2023 08:37:12.313 * RDB age 272066 seconds
redis_1 | 7:M 28 Dec 2023 08:37:12.313 * RDB memory usage when created 0.82 Mb
redis_1 | 7:M 28 Dec 2023 08:37:12.313 * DB loaded from disk: 0.162 seconds
redis_1 | 7:M 28 Dec 2023 08:37:12.313 * Ready to accept connections

electrs

Attaching to electrs_app_1, electrs_app_proxy_1, electrs_tor_1, electrs_electrs_1
electrs_1 |
electrs_1 | Caused by:
electrs_1 | 0: bitcoind RPC polling failed
electrs_1 | 1: daemon not available
electrs_1 | 2: JSON-RPC error: transport error: Couldn't connect to host: Connection refused (os error 111)
electrs_1 | Starting electrs 0.10.1 on aarch64 linux with Config { network: Bitcoin, db_path: "/data/db/bitcoin", daemon_dir: "/data/.bitcoin", daemon_auth: CookieFile("/data/.bitcoin/.cookie"), daemon_rpc_addr: 10.21.21.8:8332, daemon_p2p_addr: 10.21.21.8:8333, electrum_rpc_addr: 0.0.0.0:50001, monitoring_addr: 127.0.0.1:4224, wait_duration: 10s, jsonrpc_timeout: 15s, index_batch_size: 10, index_lookup_limit: None, reindex_last_blocks: 0, auto_reindex: true, ignore_mempool: false, sync_once: false, skip_block_download_wait: false, disable_electrum_rpc: false, server_banner: "Umbrel Electrs (0.10.1)", signet_magic: f9beb4d9, args: [] }
electrs_1 | [2024-01-05T06:31:00.195Z INFO electrs::metrics::metrics_impl] serving Prometheus metrics on 127.0.0.1:4224
electrs_1 | [2024-01-05T06:31:00.196Z INFO electrs::server] serving Electrum RPC on 0.0.0.0:50001
electrs_1 | [2024-01-05T06:31:00.746Z INFO electrs::db] "/data/db/bitcoin": 203 SST files, 44.597110734 GB, 5.570348022 Grows
electrs_1 | [2024-01-05T06:31:11.343Z INFO electrs::chain] loading 820191 headers, tip=00000000000000000000775074fa77acd1870a49f45f5b31c11f22f5fbd04c2e
tor_1 | Jan 05 04:50:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
tor_1 | Jan 05 04:51:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
tor_1 | Jan 05 04:53:28.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 65.21.94.13:5443 while fetching consensus directory.
tor_1 | Jan 05 04:55:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 65.21.94.13:5443 while fetching consensus directory.
tor_1 | Jan 05 05:02:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
tor_1 | Jan 05 05:07:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
tor_1 | Jan 05 05:08:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
tor_1 | Jan 05 05:09:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
tor_1 | Jan 05 05:10:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
tor_1 | Jan 05 05:14:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://10.21.22.4:3006
app_proxy_1 | Waiting for 10.21.22.4:3006 to open...
app_proxy_1 | Electrs is now ready...
app_proxy_1 | Listening on port: 2102
app_1 | > [email protected] dev:backend
app_1 | > npm run start -w umbrel-electrs-backend
app_1 |
app_1 |
app_1 | > [email protected] start
app_1 | > node ./bin/www
app_1 |
app_1 | Thu, 28 Dec 2023 08:46:26 GMT morgan deprecated morgan(options): use morgan("default", options) instead at app.js:28:9
app_1 | Thu, 28 Dec 2023 08:46:26 GMT morgan deprecated default format: use combined format at app.js:28:9
app_1 | Listening on port 3006

immich

Attaching to immich_server_1, immich_microservices_1, immich_machine-learning_1, immich_redis_1, immich_postgres_1, immich_app_proxy_1
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] ConfigModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] BullModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] BullModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] TypeOrmModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] InfraModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] DomainModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] MicroservicesModule dependencies initialized
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [NestApplication] Nest application successfully started
microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [ImmichMicroservice] Immich Microservices is listening on http://[::1]:3002 [v1.91.4] [PRODUCTION]
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | Validating token: 26b8a8ad218c ...
app_proxy_1 | [HPM] Client disconnected
redis_1 | 1:M 05 Jan 2024 06:26:00.029 * 100 changes in 300 seconds. Saving...
redis_1 | 1:M 05 Jan 2024 06:26:00.031 * Background saving started by pid 2280
machine-learning_1 | [12/28/23 08:49:15] CRITICAL WORKER TIMEOUT (pid:20)
machine-learning_1 | [12/28/23 08:49:52] ERROR Worker (pid:20) was sent code 134!
machine-learning_1 | [12/28/23 08:49:52] INFO Booting worker with pid: 27
machine-learning_1 | [12/28/23 08:51:52] CRITICAL WORKER TIMEOUT (pid:27)
machine-learning_1 | [12/28/23 08:52:04] ERROR Worker (pid:27) was sent SIGKILL! Perhaps out of
machine-learning_1 | memory?
machine-learning_1 | [12/28/23 08:52:04] INFO Booting worker with pid: 35
redis_1 | 2280:C 05 Jan 2024 06:26:01.058 * DB saved on disk
redis_1 | 2280:C 05 Jan 2024 06:26:01.059 * RDB: 1 MB of memory used by copy-on-write
redis_1 | 1:M 05 Jan 2024 06:26:01.137 * Background saving terminated with success
redis_1 | 1:M 05 Jan 2024 06:31:02.001 * 100 changes in 300 seconds. Saving...
redis_1 | 1:M 05 Jan 2024 06:31:02.002 * Background saving started by pid 2281
redis_1 | 2281:C 05 Jan 2024 06:31:02.281 * DB saved on disk
redis_1 | 2281:C 05 Jan 2024 06:31:02.282 * RDB: 1 MB of memory used by copy-on-write
redis_1 | 1:M 05 Jan 2024 06:31:02.305 * Background saving terminated with success
machine-learning_1 | [12/28/23 08:53:35] INFO Created in-memory cache with unloading after 300s
machine-learning_1 | of inactivity.
machine-learning_1 | [12/28/23 08:53:35] INFO Initialized request thread pool with 4 threads.
postgres_1 | [2023-12-28T08:38:37Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26034/segments/3c07d095-5df3-48e7-a560-e3c4d88cd0c8".
postgres_1 | [2023-12-28T08:38:37Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26034/segments/991dcf5f-55c9-4fdd-ab09-6d74798708cc".
postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/2b314b8c-2227-4017-ae6c-f1b864725150".
postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/c3e43117-506b-4f88-824e-1001692776fc".
postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/553d265b-4514-438b-80b8-e6ac0a5dde1e".
postgres_1 | 2023-12-28 08:43:11.200 UTC [14] LOG: database system was not properly shut down; automatic recovery in progress
postgres_1 | 2023-12-28 08:43:11.753 UTC [14] LOG: redo starts at 0/8576C40
postgres_1 | 2023-12-28 08:43:11.812 UTC [14] LOG: invalid record length at 0/8578238: wanted 24, got 0
postgres_1 | 2023-12-28 08:43:11.812 UTC [14] LOG: redo done at 0/8578210 system usage: CPU: user: 0.00 s, system: 0.00 s, elapsed: 0.05 s
postgres_1 | 2023-12-28 08:43:25.860 UTC [1] LOG: database system is ready to accept connections
server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [RouterExplorer] Mapped {/api/person/:id/assets, GET} route
server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [RouterExplorer] Mapped {/api/person/:id/merge, POST} route
server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [NestApplication] Nest application successfully started
server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [ImmichServer] Immich Server is listening on http://[::1]:3001 [v1.91.4] [PRODUCTION]
server_1 | [Nest] 7 - 12/29/2023, 10:53:36 AM LOG [CommunicationRepository] Websocket Connect: yx3cLQQTl7YcG-RZAAAB
server_1 | [Nest] 7 - 12/29/2023, 10:53:40 AM LOG [CommunicationRepository] Websocket Disconnect: yx3cLQQTl7YcG-RZAAAB
server_1 | [Nest] 7 - 12/30/2023, 2:15:24 AM LOG [CommunicationRepository] Websocket Connect: A_53G3FuzsHkm76OAAAD
server_1 | [Nest] 7 - 12/30/2023, 2:16:36 AM LOG [CommunicationRepository] Websocket Disconnect: A_53G3FuzsHkm76OAAAD
server_1 | [Nest] 7 - 01/03/2024, 12:29:26 AM LOG [CommunicationRepository] Websocket Connect: 5maeMNidv1YBST3ZAAAF
server_1 | [Nest] 7 - 01/03/2024, 12:32:37 AM LOG [CommunicationRepository] Websocket Disconnect: 5maeMNidv1YBST3ZAAAF

lightning

Attaching to lightning_app_1, lightning_lnd_1, lightning_app_proxy_1, lightning_tor_1
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
app_proxy_1 | Validating token: 26f1639082be ...
lnd_1 | 2024-01-05 06:29:48.081 [ERR] RPCS: [/lnrpc.Lightning/ListChannels]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:29:48.081 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:29:56.912 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:29:56.912 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:30:56.919 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:30:56.920 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:31:00.233 [ERR] RPCS: [/lnrpc.Lightning/ListChannels]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:31:00.233 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:31:00.234 [ERR] RPCS: [/lnrpc.Lightning/SubscribeInvoices]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2024-01-05 06:31:00.234 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
tor_1 | Jan 05 04:01:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 04:02:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
tor_1 | Jan 05 04:03:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
tor_1 | Jan 05 04:04:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 04:07:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 04:09:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 04:19:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 04:55:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
tor_1 | Jan 05 05:17:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
tor_1 | Jan 05 05:47:44.000 [notice] No circuits are opened. Relaxed timeout for circuit 12051 (a Hidden service: Uploading HS descriptor 4-hop circuit in state doing handshakes with channel state open) to 60000ms. However, it appears the circuit has timed out anyway. [7 similar message(s) suppressed in last 11280 seconds]

tailscale

Attaching to tailscale_web_1
web_1 | 2024/01/05 06:30:58 monitor: RTM_DELROUTE: src=, dst=fe80::7860:98ff:fe6a:3fa/128, gw=, outif=169899, table=255
web_1 | 2024/01/05 06:30:58 monitor: RTM_DELROUTE: src=, dst=ff00::/8, gw=, outif=169899, table=255
web_1 | 2024/01/05 06:30:58 [RATELIMIT] format("monitor: %s: src=%v, dst=%v, gw=%v, outif=%v, table=%v")
web_1 | 2024/01/05 06:31:00 Accept: TCP{100.125.14.138:60944 > 100.90.16.136:81} 52 tcp ok
web_1 | 2024/01/05 06:31:09 Accept: TCP{100.125.14.138:60948 > 100.90.16.136:2100} 52 tcp ok
web_1 | 2024/01/05 06:31:19 Accept: TCP{100.125.14.138:60957 > 100.90.16.136:2100} 52 tcp ok
web_1 | 2024/01/05 06:31:20 [RATELIMIT] format("monitor: %s: src=%v, dst=%v, gw=%v, outif=%v, table=%v") (3 dropped)
web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=fe80::/64, gw=, outif=169907, table=254
web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=fe80::bc17:1dff:fe80:3826/128, gw=, outif=169907, table=255
web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=ff00::/8, gw=, outif=169907, table=255

thunderhub

Attaching to thunderhub_web_1, thunderhub_app_proxy_1
web_1 | ],
web_1 | level: 'error',
web_1 | message: 'Error connecting to node',
web_1 | timestamp: '2024-01-05T06:30:56.924Z'
web_1 | }
web_1 | {
web_1 | message: 'No node available for balance pushes',
web_1 | level: 'error',
web_1 | timestamp: '2024-01-05T06:30:56.927Z'
web_1 | }
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
app_proxy_1 | Validating token: 1d47af867241 ...
================
==== Result ====
================
==== END =====