- =====================
- = Umbrel debug info =
- =====================
- Umbrel version
- --------------
- 0.5.4
- Memory usage
- ------------
- total used free shared buff/cache available
- Mem: 33G 6.5G 19G 119M 8.0G 26G
- Swap: 1.0G 0B 1.0G
- total: 19.6%
- llama-gpt: 12.3%
- penpot: 3.2%
- lightning: 1.5%
- thunderhub: 1.4%
- ipfs-podcasting: 1.2%
- n8n: 0.8%
- torq: 0.7%
- lnplus: 0.4%
- bitcoin: 0.4%
- lightning-shell: 0.3%
- snort: 0.2%
- ln-visualizer: 0.2%
- helipad: 0.2%
- system: 0%
- Memory monitor logs
- -------------------
- 2023-08-15 22:35:32 Memory monitor running!
- 2023-08-16 20:16:08 Memory monitor running!
- 2023-08-16 22:51:23 Memory monitor running!
- 2023-08-25 15:29:06 Memory monitor running!
- 2023-08-30 17:46:26 Memory monitor running!
- 2023-08-30 18:27:41 Memory monitor running!
- 2023-08-30 22:47:11 Memory monitor running!
- 2023-08-30 22:54:56 Memory monitor running!
- 2023-08-30 23:06:06 Memory monitor running!
- 2023-08-30 23:18:52 Memory monitor running!
- Filesystem information
- ----------------------
- Filesystem Size Used Avail Use% Mounted on
- /dev/sda2 1.8T 1.4T 316G 82% /
- Karen logs
- ----------
- Pulling web ... extracting (100.0%)
- Pulling web ... extracting (100.0%)
- Pulling web ... pull complete
- Pulling web ... extracting (100.0%)
- Pulling web ... extracting (100.0%)
- Pulling web ... pull complete
- Pulling web ... extracting (100.0%)
- Pulling web ... extracting (100.0%)
- Pulling web ... pull complete
- Pulling web ... extracting (100.0%)
- Pulling web ... extracting (100.0%)
- Pulling web ... pull complete
- Pulling web ... extracting (100.0%)
- Pulling web ... extracting (100.0%)
- Pulling web ... pull complete
- Pulling web ... digest: sha256:4ea6aafee8ddd092b2...
- Pulling web ... status: downloaded newer image fo...
- Pulling web ... done
- Starting app lightning-shell...
- Creating lightning-shell_app_proxy_1 ...
- Creating lightning-shell_web_1 ...
- Creating lightning-shell_app_proxy_1 ... done
- Creating lightning-shell_web_1 ... done
- Saving app lightning-shell in DB...
- Successfully installed app lightning-shell
- Got signal: app-uninstall-photoprism
- karen is getting triggered!
- Removing images for app photoprism...
- Stopping photoprism_app_proxy_1 ...
- Stopping photoprism_db_1 ...
- Stopping photoprism_web_1 ...
- Stopping photoprism_web_1 ... done
- Stopping photoprism_app_proxy_1 ... done
- Stopping photoprism_db_1 ... done
- Removing photoprism_app_proxy_1 ...
- Removing photoprism_db_1 ...
- Removing photoprism_web_1 ...
- Removing photoprism_db_1 ... done
- Removing photoprism_web_1 ... done
- Removing photoprism_app_proxy_1 ... done
- Network umbrel_main_network is external, skipping
- Removing image getumbrel/app-proxy:v0.5.2@sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6
- Failed to remove image for service app_proxy: 409 Client Error for http+docker://localhost/v1.41/images/getumbrel/app-proxy:v0.5.2%40sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6?force=False&noprune=False: Conflict ("conflict: unable to remove repository reference "getumbrel/app-proxy:v0.5.2@sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6" (must force) - container 16679417de62 is using its referenced image ca53932ff8f1")
- Removing image mariadb:10.5.12@sha256:dfcba5641bdbfd7cbf5b07eeed707e6a3672f46823695a0d3aba2e49bbd9b1dd
- Removing image photoprism/photoprism:230625@sha256:3b6a64d86abb566b5314dc7b168476e421ca7322b9102c1bd9c79834c6bc6756
- Deleting app data for app photoprism...
- Removing app photoprism from DB...
- Successfully uninstalled app photoprism
- Got signal: debug
- karen is getting triggered!
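[note] The 409 Conflict above is expected: getumbrel/app-proxy:v0.5.2 is shared by every app's *_app_proxy_1 container, so Docker refuses to delete the image while container 16679417de62 still references it. Nothing is broken. If you want to confirm which running containers hold the reference, a minimal check (the image tag comes straight from the error line):

    docker ps -a --filter "ancestor=getumbrel/app-proxy:v0.5.2" --format "{{.ID}} {{.Names}}"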
- Docker containers
- -----------------
- NAMES STATUS
- lightning-shell_web_1 Up 12 minutes
- lightning-shell_app_proxy_1 Up 12 minutes
- ipfs-podcasting_app_proxy_1 Up 30 minutes
- ipfs-podcasting_web_1 Up 30 minutes
- bitcoin_server_1 Up 2 hours
- bitcoin_i2pd_daemon_1 Up 2 hours
- bitcoin_tor_1 Restarting (1) 35 seconds ago
- bitcoin_app_proxy_1 Up 2 hours
- bitcoin_bitcoind_1 Restarting (1) 23 seconds ago
- penpot_penpot-frontend_1 Up 2 hours
- penpot_penpot-backend_1 Up 2 hours
- penpot_penpot-redis_1 Up 2 hours
- penpot_app_proxy_1 Up 2 hours
- penpot_penpot-postgres_1 Up 2 hours
- penpot_penpot-exporter_1 Up 2 hours
- llama-gpt_llama-gpt-ui_1 Up 2 hours
- llama-gpt_llama-gpt-api_1 Up 2 hours
- llama-gpt_app_proxy_1 Up 2 hours
- torq_web_1 Up 2 hours
- torq_db_1 Up 2 hours
- torq_app_proxy_1 Up 2 hours
- lnplus_app_proxy_1 Up 2 hours
- lnplus_web_1 Up 2 hours
- ln-visualizer_app_proxy_1 Up 2 hours
- thunderhub_app_proxy_1 Up 2 hours
- ln-visualizer_web_1 Up About an hour
- thunderhub_web_1 Up 2 hours
- ln-visualizer_api_1 Restarting (1) 50 seconds ago
- helipad_web_1 Up 2 hours
- helipad_app_proxy_1 Up 2 hours
- lightning_lnd_1 Up 2 hours
- n8n_app_proxy_1 Up 2 hours
- lightning_app_proxy_1 Up 2 hours
- n8n_server_1 Up 2 hours
- lightning_app_1 Up 2 hours
- lightning_tor_1 Restarting (1) 39 seconds ago
- snort_app_proxy_1 Up 2 hours
- snort_web_1 Up 2 hours
- nginx Up 2 hours
- dashboard Up 2 hours
- manager Up 2 hours
- auth Up 2 hours
- tor_proxy Up 2 hours
- magical_mccarthy Up 2 hours
- grafana-grafana-1 Up 2 hours
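[note] Four containers are stuck in a restart loop here: bitcoin_tor_1, bitcoin_bitcoind_1, lightning_tor_1 and ln-visualizer_api_1. Their failure reasons appear in the app logs further down; to watch a loop directly, a minimal sketch:

    # Tail the most recent output of a restart-looping container
    docker logs --tail 50 bitcoin_bitcoind_1
    # Restart count and last error recorded by Docker
    docker inspect --format '{{.RestartCount}} {{.State.Error}}' bitcoin_bitcoind_1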
- Umbrel logs
- -----------
- Attaching to manager
- manager | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:49 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:49 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:50 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:51 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
- manager |
- manager | umbrel-manager
- manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- manager |
- manager | umbrel-manager
- Tor Proxy logs
- --------------
- Attaching to tor_proxy
- tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 10% (conn_done): Connected to a relay
- tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 14% (handshake): Handshaking with a relay
- tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 15% (handshake_done): Handshake with a relay done
- tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 75% (enough_dirinfo): Loaded enough directory info to build circuits
- tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 80% (ap_conn): Connecting to a relay to build circuits
- tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 85% (ap_conn_done): Connected to a relay to build circuits
- tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 89% (ap_handshake): Finishing handshake with a relay to build circuits
- tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 90% (ap_handshake_done): Handshake finished with a relay to build circuits
- tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 95% (circuit_create): Establishing a Tor circuit
- tor_proxy | Aug 30 22:18:57.000 [notice] Bootstrapped 100% (done): Done
- App logs
- --------
- bitcoin
- Attaching to bitcoin_server_1, bitcoin_i2pd_daemon_1, bitcoin_tor_1, bitcoin_app_proxy_1, bitcoin_bitcoind_1
- bitcoind_1 | Error: Settings file could not be written:
- bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
- bitcoind_1 | Error: Settings file could not be written:
- bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
- bitcoind_1 | Error: Settings file could not be written:
- bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
- bitcoind_1 | Error: Settings file could not be written:
- bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
- bitcoind_1 | Error: Settings file could not be written:
- bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
- i2pd_daemon_1 | 22:57:10@116/error - Tunnel: Tunnel with id 2112003669 already exists
- i2pd_daemon_1 | 22:59:07@116/error - Tunnel: Tunnel with id 4248552299 already exists
- i2pd_daemon_1 | 23:01:50@925/error - SSU2: RelayIntro unknown router to introduce
- i2pd_daemon_1 | 23:06:57@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 23:16:41@116/error - Tunnels: Can't select next hop for m21UyuFnkjv~H0Fq7LO~2WE~Dg8AOM8zISoKKVa5lxM=
- i2pd_daemon_1 | 23:16:41@116/error - Tunnels: Can't create inbound tunnel, no peers available
- i2pd_daemon_1 | 23:16:42@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 23:16:52@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 23:17:21@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- i2pd_daemon_1 | 23:53:42@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:47 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- server_1 |
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:50 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- server_1 |
- server_1 | umbrel-middleware
- server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:53 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- server_1 |
- server_1 | umbrel-middleware
- tor_1 | Aug 30 23:52:16.156 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
- tor_1 | Aug 30 23:52:16.156 [err] Reading config failed--see warnings above.
- tor_1 | Aug 30 23:53:16.605 [notice] Tor 0.4.7.8 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1n, Zlib 1.2.11, Liblzma N/A, Libzstd N/A and Glibc 2.31 as libc.
- tor_1 | Aug 30 23:53:16.605 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://support.torproject.org/faq/staying-anonymous/
- tor_1 | Aug 30 23:53:16.605 [notice] Read configuration file "/etc/tor/torrc".
- tor_1 | Aug 30 23:53:16.607 [warn] You have a ControlPort set to accept connections from a non-local address. This means that programs not running on your computer can reconfigure your Tor. That's pretty bad, since the controller protocol isn't encrypted! Maybe you should just listen on 127.0.0.1 and use a tool like stunnel or ssh to encrypt remote connections to your control port.
- tor_1 | Aug 30 23:53:16.607 [warn] CookieAuthFileGroupReadable is set, but will have no effect: you must specify an explicit CookieAuthFile to have it group-readable.
- tor_1 | Aug 30 23:53:16.607 [warn] Permissions on directory /data/app-bitcoin-p2p are too permissive.
- tor_1 | Aug 30 23:53:16.607 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
- tor_1 | Aug 30 23:53:16.607 [err] Reading config failed--see warnings above.
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
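[note] bitcoind cannot write /data/.bitcoin/settings.json.tmp and Tor rejects /data/app-bitcoin-p2p as "too permissive"; together these point at wrong ownership or permissions on the app's data directory rather than a full disk (the filesystem above still has 316G free). A hedged sketch; the host-side paths below assume the stock Umbrel 0.5.x layout and may differ on your install:

    # Host path of the container's /data/.bitcoin mount (assumed layout)
    ls -la ~/umbrel/app-data/bitcoin/data/bitcoin
    # Tor requires hidden-service directories to be owner-only
    chmod 700 ~/umbrel/app-data/bitcoin/data/app-bitcoin-p2p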
- helipad
- Attaching to helipad_web_1, helipad_app_proxy_1
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- web_1 | source: None,
- web_1 | }
- web_1 | ** get_last_boost_index_from_db() -> [3]
- web_1 | ** get_last_boost_index_from_db() -> [3]
- web_1 | ** Supplied index from call: [3]
- web_1 | ** Supplied boostcount from call: [100]
- web_1 | ** Supplied index from call: [3]
- web_1 | ** Supplied boostcount from call: [100]
- web_1 | lnd::Lnd::list_invoices failed: status: Unknown, message: "wallet locked, unlock it to enable full RPC access", details: [], metadata: MetadataMap { headers: {"content-type": "application/grpc"} }
- web_1 | Current index: 3
- ipfs-podcasting
- Attaching to ipfs-podcasting_app_proxy_1, ipfs-podcasting_web_1
- app_proxy_1 | $ node ./bin/www
- app_proxy_1 | [HPM] Proxy created: / -> http://ipfs-podcasting_web_1:8675
- app_proxy_1 | Waiting for ipfs-podcasting_web_1:8675 to open...
- app_proxy_1 | IPFS Podcasting is now ready...
- app_proxy_1 | Listening on port: 8675
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- web_1 | Bottle v0.12.21 server starting up (using WSGIRefServer())...
- web_1 | Listening on http://0.0.0.0:8675/
- web_1 | Hit Ctrl-C to quit.
- web_1 |
- web_1 | 10.21.0.2 - - [30/Aug/2023 23:27:07] "GET / HTTP/1.1" 200 4865
- lightning
- Attaching to lightning_lnd_1, lightning_app_proxy_1, lightning_app_1, lightning_tor_1
- app_1 | umbrel-lightning
- app_1 | Waiting for LND...
- app_1 | Checking LND status...
- app_1 | Waiting for LND...
- app_1 | ::ffff:10.21.0.7 - - [Wed, 30 Aug 2023 23:53:29 GMT] "GET /v1/lnd/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
- app_1 |
- app_1 | umbrel-lightning
- app_1 | Checking LND status...
- app_1 | [backup-monitor] Checking channel backup...
- app_1 | [backup-monitor] Sleeping...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- lnd_1 | 2023-08-30 23:53:16.198 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:25.200 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:25.242 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:26.493 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:34.244 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:34.286 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:43.287 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:43.334 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:52.335 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
- lnd_1 | 2023-08-30 23:53:52.378 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
- tor_1 | Aug 30 23:52:12.789 [notice] Read configuration file "/etc/tor/torrc".
- tor_1 | Aug 30 23:52:12.791 [warn] Permissions on directory /data/app-lightning-rest are too permissive.
- tor_1 | Aug 30 23:52:12.791 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
- tor_1 | Aug 30 23:52:12.791 [err] Reading config failed--see warnings above.
- tor_1 | Aug 30 23:53:13.219 [notice] Tor 0.4.7.8 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1n, Zlib 1.2.11, Liblzma N/A, Libzstd N/A and Glibc 2.31 as libc.
- tor_1 | Aug 30 23:53:13.219 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://support.torproject.org/faq/staying-anonymous/
- tor_1 | Aug 30 23:53:13.219 [notice] Read configuration file "/etc/tor/torrc".
- tor_1 | Aug 30 23:53:13.222 [warn] Permissions on directory /data/app-lightning-rest are too permissive.
- tor_1 | Aug 30 23:53:13.222 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
- tor_1 | Aug 30 23:53:13.222 [err] Reading config failed--see warnings above.
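[note] Every "wallet locked, unlock it to enable full RPC access" error here, and the matching failures in helipad, thunderhub, ln-visualizer and torq below, has one cause: LND is running but its wallet has not been unlocked since the last restart. Assuming auto-unlock is not configured and lncli's default paths match this container, a minimal manual unlock:

    # Prompts for the wallet password inside the lightning app's LND container
    docker exec -it lightning_lnd_1 lncli unlock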
- lightning-shell
- Attaching to lightning-shell_web_1, lightning-shell_app_proxy_1
- web_1 | [2023/08/30 23:41:17:1140] N: LWS: 4.3.0-a5aae04, NET CLI SRV H1 H2 WS ConMon IPv6-absent
- web_1 | [2023/08/30 23:41:17:1206] N: elops_init_pt_uv: Using foreign event loop...
- web_1 | [2023/08/30 23:41:17:1207] N: ++ [wsi|0|pipe] (1)
- web_1 | [2023/08/30 23:41:17:1207] N: ++ [vh|0|netlink] (1)
- web_1 | [2023/08/30 23:41:17:1208] N: ++ [vh|1|default||7681] (2)
- web_1 | [2023/08/30 23:41:17:1208] N: [null wsi]: lws_socket_bind: source ads 0.0.0.0
- web_1 | [2023/08/30 23:41:17:1209] N: ++ [wsi|1|listen|default||7681] (2)
- web_1 | [2023/08/30 23:41:17:1209] N: Listening on port: 7681
- web_1 | [2023/08/30 23:41:17:6791] N: ++ [wsisrv|0|adopted] (1)
- web_1 | [2023/08/30 23:41:17:6833] N: -- [wsisrv|0|adopted] (0) 4.271ms
- app_proxy_1 | yarn run v1.22.19
- app_proxy_1 | $ node ./bin/www
- app_proxy_1 | [HPM] Proxy created: / -> http://lightning-shell_web_1:7681
- app_proxy_1 | Waiting for lightning-shell_web_1:7681 to open...
- app_proxy_1 | Lightning Shell is now ready...
- app_proxy_1 | Listening on port: 7681
- llama-gpt
- Attaching to llama-gpt_llama-gpt-ui_1, llama-gpt_llama-gpt-api_1, llama-gpt_app_proxy_1
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- llama-gpt-ui_1 | [INFO wait] docker-compose-wait - Everything's fine, the application can now start!
- llama-gpt-ui_1 | [INFO wait] --------------------------------------------------------
- llama-gpt-ui_1 |
- llama-gpt-ui_1 | > [email protected] start
- llama-gpt-ui_1 | > next start
- llama-gpt-ui_1 |
- llama-gpt-ui_1 | ready - started server on 0.0.0.0:3000, url: http://localhost:3000
- llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
- llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
- llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
- llama-gpt-api_1 | Try increasing RLIMIT_MLOCK ('ulimit -l' as root).
- llama-gpt-api_1 | llama_new_context_with_model: kv self size = 2048.00 MB
- llama-gpt-api_1 | AVX = 1 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
- llama-gpt-api_1 | INFO: Started server process [1]
- llama-gpt-api_1 | INFO: Waiting for application startup.
- llama-gpt-api_1 | INFO: Application startup complete.
- llama-gpt-api_1 | INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
- llama-gpt-api_1 | INFO: 10.21.0.24:39130 - "GET /v1/models HTTP/1.1" 200 OK
- llama-gpt-api_1 | INFO: 10.21.0.24:56984 - "GET /v1/models HTTP/1.1" 200 OK
- llama-gpt-api_1 | INFO: 10.21.0.24:60378 - "GET /v1/models HTTP/1.1" 200 OK
- ln-visualizer
- Attaching to ln-visualizer_app_proxy_1, ln-visualizer_web_1, ln-visualizer_api_1
- api_1 | at Object.onReceiveStatus (/usr/local/app/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:336:141)
- api_1 | at Object.onReceiveStatus (/usr/local/app/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181)
- api_1 | at /usr/local/app/node_modules/@grpc/grpc-js/build/src/call-stream.js:160:78
- api_1 | at processTicksAndRejections (internal/process/task_queues.js:77:11) {
- api_1 | code: 2,
- api_1 | details: 'wallet locked, unlock it to enable full RPC access',
- api_1 | metadata: [Metadata]
- api_1 | }
- api_1 | }
- api_1 | ]
- web_1 | 2023/08/30 22:19:33 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | 2023/08/30 22:19:59 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | 2023/08/30 22:20:51 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | 2023/08/30 22:21:51 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | 2023/08/30 22:22:52 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
- app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | LnVisualizer is now ready...
- app_proxy_1 | Listening on port: 5646
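[note] The nginx "host not found in upstream" loop occurs because ln-visualizer_web_1 starts before ln-visualizer_api_1 is resolvable on the Docker network, and the api container keeps restarting on the locked-wallet error shown above. Once LND is unlocked, restarting the app should clear both; a sketch using Umbrel's app script (path assumes a default ~/umbrel install; if your version lacks a restart verb, run stop then start):

    ~/umbrel/scripts/app restart ln-visualizer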
- lnplus
- Attaching to lnplus_app_proxy_1, lnplus_web_1
- web_1 | from /gems/ruby/3.0.0/gems/rack-2.2.4/lib/rack/sendfile.rb:110:in `call'
- web_1 | from /gems/ruby/3.0.0/gems/actionpack-7.0.3.1/lib/action_dispatch/middleware/host_authorization.rb:131:in `call'
- web_1 | from /gems/ruby/3.0.0/gems/railties-7.0.3.1/lib/rails/engine.rb:530:in `call'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/configuration.rb:252:in `call'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/request.rb:77:in `block in handle_request'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/thread_pool.rb:340:in `with_force_shutdown'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/request.rb:76:in `handle_request'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/server.rb:441:in `process_client'
- web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/thread_pool.rb:147:in `block in spawn_thread'
- web_1 | /gems/ruby/3.0.0/gems/actionpack-7.0.3.1/lib/action_dispatch/middleware/debug_exceptions.rb:72: raise exception
- app_proxy_1 | yarn run v1.22.19
- app_proxy_1 | $ node ./bin/www
- app_proxy_1 | [HPM] Proxy created: / -> http://lnplus_web_1:3777
- app_proxy_1 | Waiting for lnplus_web_1:3777 to open...
- app_proxy_1 | Lightning Network+ is now ready...
- app_proxy_1 | Listening on port: 3777
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- n8n
- Attaching to n8n_app_proxy_1, n8n_server_1
- app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | n8n is now ready...
- app_proxy_1 | Listening on port: 5678
- server_1 | http://natron.local:5678/
- server_1 |
- server_1 | Stopping n8n...
- server_1 | License manager not initialized
- server_1 | n8n ready on 0.0.0.0, port 5678
- server_1 | Initializing n8n process
- server_1 | Version: 0.234.1
- server_1 |
- server_1 | Editor is now accessible via:
- server_1 | http://natron.local:5678/
- penpot
- Attaching to penpot_penpot-frontend_1, penpot_penpot-backend_1, penpot_penpot-redis_1, penpot_app_proxy_1, penpot_penpot-postgres_1, penpot_penpot-exporter_1
- app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Penpot is now ready...
- app_proxy_1 | Listening on port: 9001
- penpot-backend_1 | [2023-08-30 22:19:23.372] I app.storage.tmp - hint="started tmp file cleaner"
- penpot-backend_1 | [2023-08-30 22:19:23.379] I app.worker - hint="registry initialized", tasks=13
- penpot-backend_1 | [2023-08-30 22:19:23.383] I app.worker - hint="cron: started", tasks=8
- penpot-backend_1 | [2023-08-30 22:19:23.430] I app.worker - hint="dispatcher: started"
- penpot-backend_1 | [2023-08-30 22:19:23.432] I app.worker - hint="monitor: started", name="default"
- penpot-backend_1 | [2023-08-30 22:19:23.435] I app.worker - hint="worker: started", worker-id=0, queue="default"
- penpot-backend_1 | [2023-08-30 22:19:23.437] I app.worker - hint="worker: started", worker-id=0, queue="webhooks"
- penpot-backend_1 | [2023-08-30 22:19:23.438] I app.srepl - msg="initializing repl server", name="prepl", port=6063, host="localhost"
- penpot-backend_1 | [2023-08-30 22:19:23.443] I app.main - hint="welcome to penpot", flags="login-with-password,backend-api-doc,backend-worker,registration,prepl-server", worker?=true, version="1.18.3-9885-g353de39d4"
- penpot-backend_1 | [2023-08-30 23:00:00.106] I app.tasks.file-xlog-gc - hint="task finished", min-age="72h", total=0
- penpot-redis_1 | 1:M 30 Aug 2023 22:15:35.763 # Error trying to save the DB, can't exit.
- penpot-redis_1 | 1:M 30 Aug 2023 22:15:35.763 # Errors trying to shut down the server. Check the logs for more information.
- penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
- penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # Redis version=7.0.11, bits=64, commit=00000000, modified=0, pid=1, just started
- penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
- penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.583 * monotonic clock: POSIX clock_gettime
- penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 * Running mode=standalone, port=6379.
- penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 # Server initialized
- penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 # WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition. Being disabled, it can can also cause failures without low memory condition, see https://github.com/jemalloc/jemalloc/issues/1328. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
- penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.614 * Ready to accept connections
- penpot-exporter_1 | INF [app.core] msg="initializing", public-uri="http://penpot-frontend", version="1.18.3-9885-g353de39d4"
- penpot-exporter_1 | INF [app.browser] hint="initializing browser pool", opts=#js {:max 5, :min 0, :testOnBorrow true, :evictionRunIntervalMillis 5000, :numTestsPerEvictionRun 5, :acquireTimeoutMillis 10000, :idleTimeoutMillis 10000}
- penpot-exporter_1 | INF [app.http] hint="welcome to penpot", module="exporter", version="1.18.3-9885-g353de39d4"
- penpot-exporter_1 | INF [app.http] hint="starting http server", port=6061
- penpot-exporter_1 | INF [app.redis] hint="redis connection established", uri="redis://penpot-redis/0"
- penpot-exporter_1 | INF [app.core] msg="initializing", public-uri="http://penpot-frontend", version="1.18.3-9885-g353de39d4"
- penpot-exporter_1 | INF [app.browser] hint="initializing browser pool", opts=#js {:max 5, :min 0, :testOnBorrow true, :evictionRunIntervalMillis 5000, :numTestsPerEvictionRun 5, :acquireTimeoutMillis 10000, :idleTimeoutMillis 10000}
- penpot-exporter_1 | INF [app.http] hint="welcome to penpot", module="exporter", version="1.18.3-9885-g353de39d4"
- penpot-exporter_1 | INF [app.http] hint="starting http server", port=6061
- penpot-exporter_1 | INF [app.redis] hint="redis connection established", uri="redis://penpot-redis/0"
- penpot-postgres_1 | 2023-08-30 22:19:03.462 UTC [1] LOG: listening on IPv6 address "::", port 5432
- penpot-postgres_1 | 2023-08-30 22:19:03.498 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
- penpot-postgres_1 | 2023-08-30 22:19:03.543 UTC [15] LOG: database system was shut down at 2023-08-30 22:15:35 UTC
- penpot-postgres_1 | 2023-08-30 22:19:03.591 UTC [1] LOG: database system is ready to accept connections
- penpot-postgres_1 | 2023-08-30 22:24:03.609 UTC [13] LOG: checkpoint starting: time
- penpot-postgres_1 | 2023-08-30 22:24:03.724 UTC [13] LOG: checkpoint complete: wrote 4 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.105 s, sync=0.003 s, total=0.116 s; sync files=3, longest=0.002 s, average=0.001 s; distance=5 kB, estimate=5 kB
- penpot-postgres_1 | 2023-08-30 23:04:04.470 UTC [13] LOG: checkpoint starting: time
- penpot-postgres_1 | 2023-08-30 23:04:04.588 UTC [13] LOG: checkpoint complete: wrote 2 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.103 s, sync=0.003 s, total=0.119 s; sync files=2, longest=0.002 s, average=0.002 s; distance=5 kB, estimate=5 kB
- penpot-postgres_1 | 2023-08-30 23:34:05.186 UTC [13] LOG: checkpoint starting: time
- penpot-postgres_1 | 2023-08-30 23:34:05.298 UTC [13] LOG: checkpoint complete: wrote 2 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.104 s, sync=0.001 s, total=0.113 s; sync files=2, longest=0.001 s, average=0.001 s; distance=5 kB, estimate=5 kB
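[note] The Redis WARNING above states its own fix: memory overcommit is disabled on the host. Applying exactly what the log suggests, on the Umbrel host as root:

    # Immediate effect
    sudo sysctl vm.overcommit_memory=1
    # Persist across reboots
    echo 'vm.overcommit_memory = 1' | sudo tee -a /etc/sysctl.conf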
- snort
- Attaching to snort_app_proxy_1, snort_web_1
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET / HTTP/1.1" 200 789 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /main.7c424e09525cb4aa655e.css HTTP/1.1" 200 55874 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /main.7c424e09525cb4aa655e.js HTTP/1.1" 200 1202175 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /e85d84dcfe3b365aaaa3.woff2 HTTP/1.1" 200 37780 "http://192.168.178.101:52027/main.7c424e09525cb4aa655e.css" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /pow.js HTTP/1.1" 200 9853 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /favicon.ico HTTP/1.1" 200 3290 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:30 +0000] "GET /1041fe76ab47400d0920.svg HTTP/1.1" 200 77333 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:31 +0000] "GET /nostrich_512.png HTTP/1.1" 200 540613 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:43 +0000] "GET /nostrich_512.png HTTP/1.1" 200 540613 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:55 +0000] "GET /p/npub1uerlt3yee5gy44kr97th45cypnv8s7lwnqll0t3hvady99g7ffdsgj99m0 HTTP/1.1" 200 789 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
- thunderhub
- Attaching to thunderhub_app_proxy_1, thunderhub_web_1
- web_1 | ],
- web_1 | level: 'error',
- web_1 | message: 'Error connecting to node',
- web_1 | timestamp: '2023-08-30T23:53:26.494Z'
- web_1 | }
- web_1 | {
- web_1 | message: 'No node available for healthcheck ping',
- web_1 | level: 'error',
- web_1 | timestamp: '2023-08-30T23:53:26.495Z'
- web_1 | }
- app_proxy_1 | [HPM] Upgrading to WebSocket
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | [HPM] Client disconnected
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- app_proxy_1 | Validating token: bdaf587c825d ...
- torq
- Attaching to torq_web_1, torq_db_1, torq_app_proxy_1
- app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
- app_proxy_1 | Retrying...
- app_proxy_1 | Torq is now ready...
- app_proxy_1 | Listening on port: 7028
- web_1 | {"level":"info","protocol":"grpc","grpc.component":"client","grpc.service":"lnrpc.Lightning","grpc.method":"ListPeers","grpc.method_type":"unary","grpc.start_time":"2023-08-30T23:53:07Z","grpc.request.deadline":"2023-08-30T23:55:07Z","grpc.code":"Unknown","grpc.error":"rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","grpc.time_ms":"0.778","message":"finished call"}
- web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:53:07Z","message":"Failed to import peer status."}
- web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:53:07Z","message":"LND import Peers for nodeId: 1"}
- web_1 | {"level":"info","time":"2023-08-30T23:53:07Z","message":"LndServiceChannelEventStream terminated for nodeId: 1"}
- web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream boot attempt for nodeId: 1."}
- web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream service booted for nodeId: 1"}
- web_1 | {"level":"info","protocol":"grpc","grpc.component":"client","grpc.service":"lnrpc.Lightning","grpc.method":"ListPeers","grpc.method_type":"unary","grpc.start_time":"2023-08-30T23:54:08Z","grpc.request.deadline":"2023-08-30T23:56:08Z","grpc.code":"Unknown","grpc.error":"rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","grpc.time_ms":"0.668","message":"finished call"}
- web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:54:08Z","message":"Failed to import peer status."}
- web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:54:08Z","message":"LND import Peers for nodeId: 1"}
- web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream terminated for nodeId: 1"}
- db_1 |
- db_1 | PostgreSQL Database directory appears to contain a database; Skipping initialization
- db_1 |
- db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: starting PostgreSQL 14.5 on x86_64-pc-linux-musl, compiled by gcc (Alpine 11.2.1_git20220219) 11.2.1 20220219, 64-bit
- db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
- db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: listening on IPv6 address "::", port 5432
- db_1 | 2023-08-30 22:19:01.248 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
- db_1 | 2023-08-30 22:19:01.253 UTC [12] LOG: database system was shut down at 2023-08-30 22:15:35 UTC
- db_1 | 2023-08-30 22:19:01.268 UTC [1] LOG: database system is ready to accept connections
- db_1 | 2023-08-30 22:19:01.276 UTC [18] LOG: TimescaleDB background worker launcher connected to shared catalogs
- ================
- ==== Result ====
- ================
- ==== END =====