- =====================
- = Umbrel debug info =
- =====================
- Umbrel version
- --------------
- 0.5.4
- Flashed OS version
- -----------------
- v0.5.4
- Raspberry Pi Model
- ------------------
- Revision : d03115
- Serial : 100000008278a770
- Model : Raspberry Pi 4 Model B Rev 1.5
- Firmware
- --------
- May 9 2023 12:16:34
- Copyright (c) 2012 Broadcom
- version 30aa0d70ab280427ba04ebc718c81d4350b9d394 (clean) (release) (start)
- Temperature
- -----------
- temp=45.7'C
- Throttling
- ----------
- throttled=0x0
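Both readings above come from the Raspberry Pi firmware and can be reproduced on the node itself (vcgencmd ships with Raspberry Pi OS):

    vcgencmd measure_temp    # SoC temperature; 45.7'C is comfortably within range
    vcgencmd get_throttled   # bitmask of under-voltage/throttling events; 0x0 means none since boot

A nonzero mask (e.g. 0x50000) would indicate past under-voltage or frequency capping and usually points at the power supply.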
- Memory usage
- ------------
-                total        used        free      shared  buff/cache   available
- Mem:            7.8G        2.4G        951M        392M        4.5G        4.9G
- Swap:           4.1G        2.0M        4.1G
- total: 30.7%
- immich: 11.9%
- electrs: 7%
- lightning: 5%
- thunderhub: 4.4%
- bitcoin: 3.7%
- bluewallet: 1.1%
- tailscale: 0.9%
- system: 0%
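The table is plain "free -h" output; the per-app percentages are an aggregation of container memory usage. A rough sketch of how such a breakdown can be computed - grouping by the app-id prefix that the container names in the Docker containers section share - though this is an illustration, not necessarily Umbrel's exact method:

    free -h
    # roll container memory up per app, assuming names look like <app>_<service>_1
    docker stats --no-stream --format '{{.Name}} {{.MemPerc}}' \
      | awk -F'[_ %]' '{sum[$1] += $(NF-1)} END {for (a in sum) printf "%s: %.1f%%\n", a, sum[a]}'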
- Memory monitor logs
- -------------------
- 2023-12-08 00:15:57 Memory monitor running!
- 2023-12-08 00:20:26 Memory monitor running!
- 2023-12-13 20:03:35 Memory monitor running!
- 2023-12-14 11:11:11 Memory monitor running!
- 2023-12-14 11:31:25 Memory monitor running!
- 2023-12-14 22:29:22 Memory monitor running!
- 2023-12-24 12:50:12 Memory monitor running!
- 2023-12-25 01:53:15 Memory monitor running!
- 2023-12-25 05:03:56 Memory monitor running!
- 2023-12-28 08:35:34 Memory monitor running!
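The monitor only logs a heartbeat here, and the timestamps line up with reboots rather than a fixed interval, so the message is most likely printed at monitor startup. Not Umbrel's actual implementation, but a minimal sketch of a monitor of this shape:

    # illustrative only: announce startup, then periodically sample available memory
    echo "$(date '+%Y-%m-%d %H:%M:%S') Memory monitor running!"
    while true; do
        avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
        # a real monitor would act here (e.g. restart the hungriest app) when avail_kb gets too low
        sleep 60
    done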
- Filesystem information
- ----------------------
- Filesystem Size Used Avail Use% Mounted on
- /dev/root 15G 4.1G 9.8G 30% /
- /dev/sda1 916G 694G 176G 80% /home/umbrel/umbrel
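Standard "df -h" output: the root filesystem is healthy, but the external drive holding /home/umbrel/umbrel is at 80% and worth watching. A cheap watchdog for that (the 90% threshold is arbitrary, for illustration):

    usage=$(df --output=pcent /home/umbrel/umbrel | tail -1 | tr -dc '0-9')
    [ "$usage" -gt 90 ] && echo "data drive at ${usage}% - consider pruning or a larger disk"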
- Startup service logs
- --------------------
- -- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
- Jan 05 06:28:08 umbrel passwd[16431]: pam_unix(passwd:chauthtok): password changed for umbrel
- External storage service logs
- -----------------------------
- -- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
- -- No entries --
- External storage SD card update service logs
- --------------------------------------------
- -- Logs begin at Wed 2024-01-03 19:10:26 UTC, end at Fri 2024-01-05 06:30:49 UTC. --
- -- No entries --
- Karen logs
- ----------
- 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:04 --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
- 100 1038 0 0 100 1038 0 162 0:00:06 0:00:06 --:--:-- 162
- 100 1038 0 0 100 1038 0 153 0:00:06 0:00:06 --:--:-- 153
- 100 1184 100 146 100 1038 20 144 0:00:07 0:00:07 --:--:-- 165
- {"message":"Successfully uploaded backup 1704410333945.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
- =============================
- ====== Backup success =======
- =============================
- Got signal: backup
- karen is getting triggered!
- Deriving keys...
- Creating backup...
- Adding random padding...
- 1+0 records in
- 1+0 records out
- 213 bytes copied, 0.000224459 s, 949 kB/s
- Creating encrypted tarball...
- backup/
- backup/channel.backup
- backup/.padding
- Uploading backup...
- % Total % Received % Xferd Average Speed Time Time Time Current
- Dload Upload Total Spent Left Speed
- 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:06 --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:06 --:--:-- 0
- 100 1258 0 0 100 1258 0 167 0:00:07 0:00:07 --:--:-- 167
- 100 1404 100 146 100 1258 16 143 0:00:09 0:00:08 0:00:01 160
- 100 1404 100 146 100 1258 16 143 0:00:09 0:00:08 0:00:01 375
- {"message":"Successfully uploaded backup 1704416703556.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
- =============================
- ====== Backup success =======
- =============================
- Got signal: backup
- karen is getting triggered!
- Deriving keys...
- Creating backup...
- Adding random padding...
- 1+0 records in
- 1+0 records out
- 6486 bytes (6.5 kB, 6.3 KiB) copied, 0.000326217 s, 19.9 MB/s
- Creating encrypted tarball...
- backup/
- backup/channel.backup
- backup/.padding
- Uploading backup...
- % Total % Received % Xferd Average Speed Time Time Time Current
- Dload Upload Total Spent Left Speed
- 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
- 0 0 0 0 0 0 0 0 --:--:-- 0:00:05 --:--:-- 0
- 100 7546 0 0 100 7546 0 1094 0:00:06 0:00:06 --:--:-- 1093
- 100 7546 0 0 100 7546 0 955 0:00:07 0:00:07 --:--:-- 955
- 100 7692 100 146 100 7546 18 947 0:00:08 0:00:07 0:00:01 965
- {"message":"Successfully uploaded backup 1704420533897.tar.gz.pgp for backup ID a5294e2026327e7bc8a23cc318c1a622d86371cd88eebd4a70318e026616b1a5"}
- =============================
- ====== Backup success =======
- =============================
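The backup cycles above show karen's flow clearly: derive keys, copy the channel backup, pad it with random bytes (the "1+0 records" lines are dd writing a single block of random size, which keeps the encrypted upload's size from leaking channel changes), tar and PGP-encrypt, then upload with curl. A sketch of that sequence under stated assumptions - the source path, key handling, and endpoint below are illustrative, not Umbrel's actual values:

    # illustrative reconstruction, not karen's actual script
    mkdir -p backup
    cp /path/to/channel.backup backup/channel.backup          # source path is an assumption
    dd if=/dev/urandom of=backup/.padding bs=$((RANDOM % 10240 + 1)) count=1
    out="$(date +%s%3N).tar.gz.pgp"                           # millisecond timestamp, as in the upload names
    tar -czvf - backup/ | gpg --encrypt --recipient "$BACKUP_KEY" > "$out"
    curl -F "file=@$out" "$BACKUP_ENDPOINT"                   # endpoint and auth are assumptions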
- Got signal: change-password
- karen is getting triggered!
- New password: Retype new password: passwd: password updated successfully
- Got signal: debug
- karen is getting triggered!
- Docker containers
- -----------------
- NAMES STATUS
- immich_server_1 Up 7 days
- immich_microservices_1 Up 7 days
- bitcoin_server_1 Up 7 days
- electrs_app_1 Up 7 days
- bluewallet_lndhub_1 Up 3 seconds
- immich_machine-learning_1 Up 7 days
- immich_redis_1 Up 7 days
- immich_postgres_1 Up 7 days
- immich_app_proxy_1 Up 7 days
- bitcoin_tor_1 Up 7 days
- bitcoin_i2pd_daemon_1 Up 7 days
- bitcoin_app_proxy_1 Up 7 days
- bitcoin_bitcoind_1 Up 28 seconds
- bluewallet_redis_1 Up 7 days
- bluewallet_app_proxy_1 Up 7 days
- electrs_app_proxy_1 Up 7 days
- electrs_tor_1 Up 7 days
- electrs_electrs_1 Up 31 seconds
- thunderhub_web_1 Up 7 days
- thunderhub_app_proxy_1 Up 7 days
- lightning_app_1 Up 7 days
- lightning_lnd_1 Up 7 days
- lightning_app_proxy_1 Up 7 days
- lightning_tor_1 Up 7 days
- tailscale_web_1 Up 7 days
- nginx Up 7 days
- dashboard Up 7 days
- manager Up 7 days
- tor_proxy Up 7 days
- auth Up 7 days
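This table is ordinary "docker ps" output and can be regenerated with:

    docker ps --format "table {{.Names}}\t{{.Status}}"

Note that nearly everything has been up for 7 days, while bluewallet_lndhub_1 (3 seconds), bitcoin_bitcoind_1 (28 seconds) and electrs_electrs_1 (31 seconds) have just restarted - which matches the connection-refused and wallet-locked errors in the app logs further down.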
- Umbrel logs
- -----------
- Attaching to manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/storage HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/info HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:51 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/system/get-update HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:52 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:54 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:54 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Fri, 05 Jan 2024 06:30:56 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:56 GMT] "GET /v1/account/token?token=26f1639082be146a42a0d09aa6afc8841aa9c53eb30eb601883f1480b4d5f354 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- Tor Proxy logs
- --------------
- Attaching to tor_proxy
- tor_proxy | Jan 05 04:28:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
- tor_proxy | Jan 05 04:29:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
- tor_proxy | Jan 05 04:30:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
- tor_proxy | Jan 05 04:31:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
- tor_proxy | Jan 05 04:34:55.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
- tor_proxy | Jan 05 04:38:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
- tor_proxy | Jan 05 04:42:56.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
- tor_proxy | Jan 05 04:44:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 163.172.182.26:444 while fetching consensus directory.
- tor_proxy | Jan 05 04:51:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
- tor_proxy | Jan 05 05:06:57.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 194.32.107.206:443 while fetching consensus directory.
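This warning means Tor fetched a consensus document it could not validate. It is often transient, but a tor build old enough that its baked-in directory-authority list has rotated is a common culprit, so checking the version is a cheap first step (tor_proxy is the container name from the Docker list above):

    docker exec tor_proxy tor --version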
- App logs
- --------
- bitcoin
- Attaching to bitcoin_server_1, bitcoin_tor_1, bitcoin_i2pd_daemon_1, bitcoin_app_proxy_1, bitcoin_bitcoind_1
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- bitcoind_1 | 2024-01-05T06:31:01Z * Using 715.0 MiB for transaction index database
- bitcoind_1 | 2024-01-05T06:31:01Z * Using 625.6 MiB for basic block filter index database
- bitcoind_1 | 2024-01-05T06:31:01Z * Using 8.0 MiB for chain state database
- bitcoind_1 | 2024-01-05T06:31:01Z * Using 4371.4 MiB for in-memory UTXO set (plus up to 286.1 MiB of unused mempool space)
- bitcoind_1 | 2024-01-05T06:31:01Z init message: Loading block index…
- bitcoind_1 | 2024-01-05T06:31:01Z Assuming ancestors of block 000000000000000000035c3f0d31e71a5ee24c5aaf3354689f65bd7b07dee632 have valid signatures.
- bitcoind_1 | 2024-01-05T06:31:01Z Setting nMinimumChainWork=000000000000000000000000000000000000000044a50fe819c39ad624021859
- bitcoind_1 | 2024-01-05T06:31:01Z Opening LevelDB in /data/.bitcoin/blocks/index
- bitcoind_1 | 2024-01-05T06:31:02Z Opened LevelDB successfully
- bitcoind_1 | 2024-01-05T06:31:02Z Using obfuscation key for /data/.bitcoin/blocks/index: 0000000000000000
- i2pd_daemon_1 | 06:12:51@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 06:14:30@94/error - ElGamal decrypt hash doesn't match
- i2pd_daemon_1 | 06:14:30@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 06:15:16@966/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 06:18:58@94/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 06:19:45@619/error - Tunnels: Can't select next hop for reeXDOC6E2F0pHe2jahSZgziYcJsWyoW-v1nNVsVHxI=
- i2pd_daemon_1 | 06:19:45@619/error - Tunnels: Can't create inbound tunnel, no peers available
- i2pd_daemon_1 | 06:24:49@677/error - SSU2: RelayIntro unknown router to introduce
- i2pd_daemon_1 | 06:24:52@619/error - Tunnel: Tunnel with id 470115970 already exists
- i2pd_daemon_1 | 06:26:20@619/error - Tunnel: Tunnel with id 77984737 already exists
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:30:57 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- server_1 |
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:31:02 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- server_1 |
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Fri, 05 Jan 2024 06:31:03 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
- server_1 |
- server_1 | umbrel-middleware
- tor_1 | Jan 05 04:02:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:03:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:04:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:05:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.
- tor_1 | Jan 05 04:06:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:10:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:22:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.
- tor_1 | Jan 05 04:24:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:29:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 198.100.153.7:9001 while fetching consensus directory.
- tor_1 | Jan 05 04:40:49.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 148.251.83.53:8443 while fetching consensus directory.
- bluewallet
- Attaching to bluewallet_lndhub_1, bluewallet_redis_1, bluewallet_app_proxy_1
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ENOTFOUND] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 192.168.0.2:3008/ to http://bluewallet_lndhub_1:3008/ [ECONNREFUSED] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- app_proxy_1 | [HPM] Error occurred while proxying request 100.90.16.136:3008/ to http://bluewallet_lndhub_1:3008/ [ENOTFOUND] (https://nodejs.org/api/errors.html#errors_common_system_errors)
- lndhub_1 | at Object.onReceiveStatus (/lndhub/node_modules/@grpc/grpc-js/src/client-interceptors.ts:389:48)
- lndhub_1 | at /lndhub/node_modules/@grpc/grpc-js/src/call-stream.ts:276:24
- lndhub_1 | at processTicksAndRejections (node:internal/process/task_queues:78:11) {
- lndhub_1 | code: 2,
- lndhub_1 | details: 'wallet locked, unlock it to enable full RPC access',
- lndhub_1 | metadata: Metadata {
- lndhub_1 | internalRepr: Map(1) { 'content-type' => [Array] },
- lndhub_1 | options: {}
- lndhub_1 | }
- lndhub_1 | }
- redis_1 | 7:C 28 Dec 2023 08:37:11.951 # Configuration loaded
- redis_1 | 7:M 28 Dec 2023 08:37:11.953 * monotonic clock: POSIX clock_gettime
- redis_1 | 7:M 28 Dec 2023 08:37:12.084 * Running mode=standalone, port=6379.
- redis_1 | 7:M 28 Dec 2023 08:37:12.084 # Server initialized
- redis_1 | 7:M 28 Dec 2023 08:37:12.084 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
- redis_1 | 7:M 28 Dec 2023 08:37:12.312 * Loading RDB produced by version 6.2.2
- redis_1 | 7:M 28 Dec 2023 08:37:12.313 * RDB age 272066 seconds
- redis_1 | 7:M 28 Dec 2023 08:37:12.313 * RDB memory usage when created 0.82 Mb
- redis_1 | 7:M 28 Dec 2023 08:37:12.313 * DB loaded from disk: 0.162 seconds
- redis_1 | 7:M 28 Dec 2023 08:37:12.313 * Ready to accept connections
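The overcommit warning at 08:37:12 is self-describing, and the fix it suggests has to be applied on the host (the Raspberry Pi), not inside the container:

    sudo sysctl vm.overcommit_memory=1                                 # apply immediately
    echo 'vm.overcommit_memory = 1' | sudo tee -a /etc/sysctl.conf    # persist across reboots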
- electrs
- Attaching to electrs_app_1, electrs_app_proxy_1, electrs_tor_1, electrs_electrs_1
- electrs_1 |
- electrs_1 | Caused by:
- electrs_1 | 0: bitcoind RPC polling failed
- electrs_1 | 1: daemon not available
- electrs_1 | 2: JSON-RPC error: transport error: Couldn't connect to host: Connection refused (os error 111)
- electrs_1 | Starting electrs 0.10.1 on aarch64 linux with Config { network: Bitcoin, db_path: "/data/db/bitcoin", daemon_dir: "/data/.bitcoin", daemon_auth: CookieFile("/data/.bitcoin/.cookie"), daemon_rpc_addr: 10.21.21.8:8332, daemon_p2p_addr: 10.21.21.8:8333, electrum_rpc_addr: 0.0.0.0:50001, monitoring_addr: 127.0.0.1:4224, wait_duration: 10s, jsonrpc_timeout: 15s, index_batch_size: 10, index_lookup_limit: None, reindex_last_blocks: 0, auto_reindex: true, ignore_mempool: false, sync_once: false, skip_block_download_wait: false, disable_electrum_rpc: false, server_banner: "Umbrel Electrs (0.10.1)", signet_magic: f9beb4d9, args: [] }
- electrs_1 | [2024-01-05T06:31:00.195Z INFO electrs::metrics::metrics_impl] serving Prometheus metrics on 127.0.0.1:4224
- electrs_1 | [2024-01-05T06:31:00.196Z INFO electrs::server] serving Electrum RPC on 0.0.0.0:50001
- electrs_1 | [2024-01-05T06:31:00.746Z INFO electrs::db] "/data/db/bitcoin": 203 SST files, 44.597110734 GB, 5.570348022 Grows
- electrs_1 | [2024-01-05T06:31:11.343Z INFO electrs::chain] loading 820191 headers, tip=00000000000000000000775074fa77acd1870a49f45f5b31c11f22f5fbd04c2e
- tor_1 | Jan 05 04:50:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
- tor_1 | Jan 05 04:51:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
- tor_1 | Jan 05 04:53:28.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 65.21.94.13:5443 while fetching consensus directory.
- tor_1 | Jan 05 04:55:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 65.21.94.13:5443 while fetching consensus directory.
- tor_1 | Jan 05 05:02:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
- tor_1 | Jan 05 05:07:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
- tor_1 | Jan 05 05:08:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
- tor_1 | Jan 05 05:09:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
- tor_1 | Jan 05 05:10:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 51.68.204.139:9001 while fetching consensus directory.
- tor_1 | Jan 05 05:14:26.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 5.9.14.25:993 while fetching consensus directory.
- app_proxy_1 | yarn run v1.22.19
- app_proxy_1 | $ node ./bin/www
- app_proxy_1 | [HPM] Proxy created: / -> http://10.21.22.4:3006
- app_proxy_1 | Waiting for 10.21.22.4:3006 to open...
- app_proxy_1 | Electrs is now ready...
- app_proxy_1 | Listening on port: 2102
- app_1 | > [email protected] dev:backend
- app_1 | > npm run start -w umbrel-electrs-backend
- app_1 |
- app_1 |
- app_1 | > [email protected] start
- app_1 | > node ./bin/www
- app_1 |
- app_1 | Thu, 28 Dec 2023 08:46:26 GMT morgan deprecated morgan(options): use morgan("default", options) instead at app.js:28:9
- app_1 | Thu, 28 Dec 2023 08:46:26 GMT morgan deprecated default format: use combined format at app.js:28:9
- app_1 | Listening on port 3006
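The electrs error chain above (bitcoind RPC polling failed -> daemon not available -> connection refused) is electrs waiting out the bitcoind restart seen in the container list; it retries until the daemon answers, then resumes loading headers. To probe bitcoind's RPC by hand the same way electrs does - address and cookie path taken from the Config line above, as seen from inside the electrs container:

    curl --silent --user "$(cat /data/.bitcoin/.cookie)" \
         -H 'content-type: text/plain;' \
         --data '{"jsonrpc":"1.0","id":"check","method":"getblockchaininfo","params":[]}' \
         http://10.21.21.8:8332/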
- immich
- Attaching to immich_server_1, immich_microservices_1, immich_machine-learning_1, immich_redis_1, immich_postgres_1, immich_app_proxy_1
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] ConfigModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] BullModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] BullModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] TypeOrmModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] InfraModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] DomainModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [InstanceLoader] MicroservicesModule dependencies initialized
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [NestApplication] Nest application successfully started
- microservices_1 | [Nest] 7 - 12/28/2023, 8:52:27 AM LOG [ImmichMicroservice] Immich Microservices is listening on http://[::1]:3002 [v1.91.4] [PRODUCTION]
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | Validating token: 26b8a8ad218c ...
- app_proxy_1 | [HPM] Client disconnected
- redis_1 | 1:M 05 Jan 2024 06:26:00.029 * 100 changes in 300 seconds. Saving...
- redis_1 | 1:M 05 Jan 2024 06:26:00.031 * Background saving started by pid 2280
- machine-learning_1 | [12/28/23 08:49:15] CRITICAL WORKER TIMEOUT (pid:20)
- machine-learning_1 | [12/28/23 08:49:52] ERROR Worker (pid:20) was sent code 134!
- machine-learning_1 | [12/28/23 08:49:52] INFO Booting worker with pid: 27
- machine-learning_1 | [12/28/23 08:51:52] CRITICAL WORKER TIMEOUT (pid:27)
- machine-learning_1 | [12/28/23 08:52:04] ERROR Worker (pid:27) was sent SIGKILL! Perhaps out of memory?
- machine-learning_1 | [12/28/23 08:52:04] INFO Booting worker with pid: 35
- redis_1 | 2280:C 05 Jan 2024 06:26:01.058 * DB saved on disk
- redis_1 | 2280:C 05 Jan 2024 06:26:01.059 * RDB: 1 MB of memory used by copy-on-write
- redis_1 | 1:M 05 Jan 2024 06:26:01.137 * Background saving terminated with success
- redis_1 | 1:M 05 Jan 2024 06:31:02.001 * 100 changes in 300 seconds. Saving...
- redis_1 | 1:M 05 Jan 2024 06:31:02.002 * Background saving started by pid 2281
- redis_1 | 2281:C 05 Jan 2024 06:31:02.281 * DB saved on disk
- redis_1 | 2281:C 05 Jan 2024 06:31:02.282 * RDB: 1 MB of memory used by copy-on-write
- redis_1 | 1:M 05 Jan 2024 06:31:02.305 * Background saving terminated with success
- machine-learning_1 | [12/28/23 08:53:35] INFO Created in-memory cache with unloading after 300s of inactivity.
- machine-learning_1 | [12/28/23 08:53:35] INFO Initialized request thread pool with 4 threads.
- postgres_1 | [2023-12-28T08:38:37Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26034/segments/3c07d095-5df3-48e7-a560-e3c4d88cd0c8".
- postgres_1 | [2023-12-28T08:38:37Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26034/segments/991dcf5f-55c9-4fdd-ab09-6d74798708cc".
- postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/2b314b8c-2227-4017-ae6c-f1b864725150".
- postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/c3e43117-506b-4f88-824e-1001692776fc".
- postgres_1 | [2023-12-28T08:38:41Z INFO vectors::utils::clean] Find directory "pg_vectors/indexes/26033/segments/553d265b-4514-438b-80b8-e6ac0a5dde1e".
- postgres_1 | 2023-12-28 08:43:11.200 UTC [14] LOG: database system was not properly shut down; automatic recovery in progress
- postgres_1 | 2023-12-28 08:43:11.753 UTC [14] LOG: redo starts at 0/8576C40
- postgres_1 | 2023-12-28 08:43:11.812 UTC [14] LOG: invalid record length at 0/8578238: wanted 24, got 0
- postgres_1 | 2023-12-28 08:43:11.812 UTC [14] LOG: redo done at 0/8578210 system usage: CPU: user: 0.00 s, system: 0.00 s, elapsed: 0.05 s
- postgres_1 | 2023-12-28 08:43:25.860 UTC [1] LOG: database system is ready to accept connections
- server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [RouterExplorer] Mapped {/api/person/:id/assets, GET} route
- server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [RouterExplorer] Mapped {/api/person/:id/merge, POST} route
- server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [NestApplication] Nest application successfully started
- server_1 | [Nest] 7 - 12/28/2023, 8:52:29 AM LOG [ImmichServer] Immich Server is listening on http://[::1]:3001 [v1.91.4] [PRODUCTION]
- server_1 | [Nest] 7 - 12/29/2023, 10:53:36 AM LOG [CommunicationRepository] Websocket Connect: yx3cLQQTl7YcG-RZAAAB
- server_1 | [Nest] 7 - 12/29/2023, 10:53:40 AM LOG [CommunicationRepository] Websocket Disconnect: yx3cLQQTl7YcG-RZAAAB
- server_1 | [Nest] 7 - 12/30/2023, 2:15:24 AM LOG [CommunicationRepository] Websocket Connect: A_53G3FuzsHkm76OAAAD
- server_1 | [Nest] 7 - 12/30/2023, 2:16:36 AM LOG [CommunicationRepository] Websocket Disconnect: A_53G3FuzsHkm76OAAAD
- server_1 | [Nest] 7 - 01/03/2024, 12:29:26 AM LOG [CommunicationRepository] Websocket Connect: 5maeMNidv1YBST3ZAAAF
- server_1 | [Nest] 7 - 01/03/2024, 12:32:37 AM LOG [CommunicationRepository] Websocket Disconnect: 5maeMNidv1YBST3ZAAAF
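The machine-learning worker being "sent SIGKILL! Perhaps out of memory?" fits the memory pressure shown at the top (immich alone accounts for 11.9%). To confirm whether the kernel OOM killer was involved, on the host:

    sudo dmesg -T | grep -iE 'out of memory|killed process'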
- lightning
- Attaching to lightning_app_1, lightning_lnd_1, lightning_app_proxy_1, lightning_tor_1
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- app_proxy_1 | Validating token: 26f1639082be ...
- lnd_1 | 2024-01-05 06:29:48.081 [ERR] RPCS: [/lnrpc.Lightning/ListChannels]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:29:48.081 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:29:56.912 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:29:56.912 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:30:56.919 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:30:56.920 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:31:00.233 [ERR] RPCS: [/lnrpc.Lightning/ListChannels]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:31:00.233 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:31:00.234 [ERR] RPCS: [/lnrpc.Lightning/SubscribeInvoices]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2024-01-05 06:31:00.234 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- tor_1 | Jan 05 04:01:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 04:02:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
- tor_1 | Jan 05 04:03:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
- tor_1 | Jan 05 04:04:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 04:07:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 04:09:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 04:19:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 04:55:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 128.238.62.37:443 while fetching consensus directory.
- tor_1 | Jan 05 05:17:36.000 [warn] Received http status code 404 ("Consensus not signed by sufficient number of requested authorities") from server 185.32.222.237:9444 while fetching consensus directory.
- tor_1 | Jan 05 05:47:44.000 [notice] No circuits are opened. Relaxed timeout for circuit 12051 (a Hidden service: Uploading HS descriptor 4-hop circuit in state doing handshakes with channel state open) to 60000ms. However, it appears the circuit has timed out anyway. [7 similar message(s) suppressed in last 11280 seconds]
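The repeated wallet-locked errors from lnd_1 above - and the ThunderHub "Error connecting to node" and LNDHub ECONNREFUSED errors in the neighbouring app logs - all trace back to lnd having restarted and waiting for its wallet password. Umbrel normally unlocks it automatically; to inspect or unlock by hand (lncli ships in standard lnd images; lightning_lnd_1 is the container name from the Docker list):

    docker exec -it lightning_lnd_1 lncli state     # reports LOCKED / UNLOCKED / RPC_ACTIVE ...
    docker exec -it lightning_lnd_1 lncli unlock    # prompts for the wallet password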
- tailscale
- Attaching to tailscale_web_1
- web_1 | 2024/01/05 06:30:58 monitor: RTM_DELROUTE: src=, dst=fe80::7860:98ff:fe6a:3fa/128, gw=, outif=169899, table=255
- web_1 | 2024/01/05 06:30:58 monitor: RTM_DELROUTE: src=, dst=ff00::/8, gw=, outif=169899, table=255
- web_1 | 2024/01/05 06:30:58 [RATELIMIT] format("monitor: %s: src=%v, dst=%v, gw=%v, outif=%v, table=%v")
- web_1 | 2024/01/05 06:31:00 Accept: TCP{100.125.14.138:60944 > 100.90.16.136:81} 52 tcp ok
- web_1 | 2024/01/05 06:31:09 Accept: TCP{100.125.14.138:60948 > 100.90.16.136:2100} 52 tcp ok
- web_1 | 2024/01/05 06:31:19 Accept: TCP{100.125.14.138:60957 > 100.90.16.136:2100} 52 tcp ok
- web_1 | 2024/01/05 06:31:20 [RATELIMIT] format("monitor: %s: src=%v, dst=%v, gw=%v, outif=%v, table=%v") (3 dropped)
- web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=fe80::/64, gw=, outif=169907, table=254
- web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=fe80::bc17:1dff:fe80:3826/128, gw=, outif=169907, table=255
- web_1 | 2024/01/05 06:31:20 monitor: RTM_DELROUTE: src=, dst=ff00::/8, gw=, outif=169907, table=255
- thunderhub
- Attaching to thunderhub_web_1, thunderhub_app_proxy_1
- web_1 | ],
- web_1 | level: 'error',
- web_1 | message: 'Error connecting to node',
- web_1 | timestamp: '2024-01-05T06:30:56.924Z'
- web_1 | }
- web_1 | {
- web_1 | message: 'No node available for balance pushes',
- web_1 | level: 'error',
- web_1 | timestamp: '2024-01-05T06:30:56.927Z'
- web_1 | }
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- app_proxy_1 | Validating token: 1d47af867241 ...
- ================
- ==== Result ====
- ================
- ==== END ====