=====================
= Umbrel debug info =
=====================

Umbrel version
--------------
0.5.4

Memory usage
------------
              total        used        free      shared  buff/cache   available
Mem:            33G        6.5G         19G        119M        8.0G         26G
Swap:          1.0G          0B        1.0G

total: 19.6%
llama-gpt: 12.3%
penpot: 3.2%
lightning: 1.5%
thunderhub: 1.4%
ipfs-podcasting: 1.2%
n8n: 0.8%
torq: 0.7%
lnplus: 0.4%
bitcoin: 0.4%
lightning-shell: 0.3%
snort: 0.2%
ln-visualizer: 0.2%
helipad: 0.2%
system: 0%

Memory monitor logs
-------------------
2023-08-15 22:35:32 Memory monitor running!
2023-08-16 20:16:08 Memory monitor running!
2023-08-16 22:51:23 Memory monitor running!
2023-08-25 15:29:06 Memory monitor running!
2023-08-30 17:46:26 Memory monitor running!
2023-08-30 18:27:41 Memory monitor running!
2023-08-30 22:47:11 Memory monitor running!
2023-08-30 22:54:56 Memory monitor running!
2023-08-30 23:06:06 Memory monitor running!
2023-08-30 23:18:52 Memory monitor running!

Filesystem information
----------------------
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda2       1.8T  1.4T  316G  82% /
/dev/sda2       1.8T  1.4T  316G  82% /

Karen logs
----------

Pulling web ... extracting (100.0%)
Pulling web ... extracting (100.0%)
Pulling web ... pull complete
Pulling web ... extracting (100.0%)
Pulling web ... extracting (100.0%)
Pulling web ... pull complete
Pulling web ... extracting (100.0%)
Pulling web ... extracting (100.0%)
Pulling web ... pull complete
Pulling web ... extracting (100.0%)
Pulling web ... extracting (100.0%)
Pulling web ... pull complete
Pulling web ... extracting (100.0%)
Pulling web ... extracting (100.0%)
Pulling web ... pull complete
Pulling web ... digest: sha256:4ea6aafee8ddd092b2...
Pulling web ... status: downloaded newer image fo...
Pulling web ... done
Starting app lightning-shell...
Creating lightning-shell_app_proxy_1 ...
Creating lightning-shell_web_1 ...
Creating lightning-shell_app_proxy_1 ... done
Creating lightning-shell_web_1 ... done
Saving app lightning-shell in DB...
Successfully installed app lightning-shell
Got signal: app-uninstall-photoprism
karen is getting triggered!
Removing images for app photoprism...
Stopping photoprism_app_proxy_1 ...
Stopping photoprism_db_1 ...
Stopping photoprism_web_1 ...
Stopping photoprism_web_1 ... done
Stopping photoprism_app_proxy_1 ... done
Stopping photoprism_db_1 ... done
Removing photoprism_app_proxy_1 ...
Removing photoprism_db_1 ...
Removing photoprism_web_1 ...
Removing photoprism_db_1 ... done
Removing photoprism_web_1 ... done
Removing photoprism_app_proxy_1 ... done
Network umbrel_main_network is external, skipping
Removing image getumbrel/app-proxy:v0.5.2@sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6
Failed to remove image for service app_proxy: 409 Client Error for http+docker://localhost/v1.41/images/getumbrel/app-proxy:v0.5.2%40sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6?force=False&noprune=False: Conflict ("conflict: unable to remove repository reference "getumbrel/app-proxy:v0.5.2@sha256:a2e3e0ddfcf84838bf0ba66f4b839ec958832d51f0ac9ace47962459c838b2d6" (must force) - container 16679417de62 is using its referenced image ca53932ff8f1")
Removing image mariadb:10.5.12@sha256:dfcba5641bdbfd7cbf5b07eeed707e6a3672f46823695a0d3aba2e49bbd9b1dd
Removing image photoprism/photoprism:230625@sha256:3b6a64d86abb566b5314dc7b168476e421ca7322b9102c1bd9c79834c6bc6756
Deleting app data for app photoprism...
Removing app photoprism from DB...
Successfully uninstalled app photoprism
Got signal: debug
karen is getting triggered!

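The "Failed to remove image for service app_proxy" 409 above appears to be benign: getumbrel/app-proxy is a shared image, and every other installed app still runs an app_proxy container from it, so Docker refuses to delete it while container 16679417de62 holds a reference. The uninstall itself completed ("Successfully uninstalled app photoprism"). A quick way to confirm which containers are keeping the image alive (assuming a standard Docker CLI on the host):

    docker ps --filter ancestor=getumbrel/app-proxy:v0.5.2 --format '{{.Names}}'
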
Docker containers
-----------------
NAMES                           STATUS
lightning-shell_web_1           Up 12 minutes
lightning-shell_app_proxy_1     Up 12 minutes
ipfs-podcasting_app_proxy_1     Up 30 minutes
ipfs-podcasting_web_1           Up 30 minutes
bitcoin_server_1                Up 2 hours
bitcoin_i2pd_daemon_1           Up 2 hours
bitcoin_tor_1                   Restarting (1) 35 seconds ago
bitcoin_app_proxy_1             Up 2 hours
bitcoin_bitcoind_1              Restarting (1) 23 seconds ago
penpot_penpot-frontend_1        Up 2 hours
penpot_penpot-backend_1         Up 2 hours
penpot_penpot-redis_1           Up 2 hours
penpot_app_proxy_1              Up 2 hours
penpot_penpot-postgres_1        Up 2 hours
penpot_penpot-exporter_1        Up 2 hours
llama-gpt_llama-gpt-ui_1        Up 2 hours
llama-gpt_llama-gpt-api_1       Up 2 hours
llama-gpt_app_proxy_1           Up 2 hours
torq_web_1                      Up 2 hours
torq_db_1                       Up 2 hours
torq_app_proxy_1                Up 2 hours
lnplus_app_proxy_1              Up 2 hours
lnplus_web_1                    Up 2 hours
ln-visualizer_app_proxy_1       Up 2 hours
thunderhub_app_proxy_1          Up 2 hours
ln-visualizer_web_1             Up About an hour
thunderhub_web_1                Up 2 hours
ln-visualizer_api_1             Restarting (1) 50 seconds ago
helipad_web_1                   Up 2 hours
helipad_app_proxy_1             Up 2 hours
lightning_lnd_1                 Up 2 hours
n8n_app_proxy_1                 Up 2 hours
lightning_app_proxy_1           Up 2 hours
n8n_server_1                    Up 2 hours
lightning_app_1                 Up 2 hours
lightning_tor_1                 Restarting (1) 39 seconds ago
snort_app_proxy_1               Up 2 hours
snort_web_1                     Up 2 hours
nginx                           Up 2 hours
dashboard                       Up 2 hours
manager                         Up 2 hours
auth                            Up 2 hours
tor_proxy                       Up 2 hours
magical_mccarthy                Up 2 hours
grafana-grafana-1               Up 2 hours

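Four of the containers above are crash-looping rather than running: bitcoin_bitcoind_1, bitcoin_tor_1, lightning_tor_1 and ln-visualizer_api_1 all report "Restarting (1) ... seconds ago". The per-app logs below show why, but as a general first step for any restart-looping container it is worth tailing its recent output directly (assuming SSH access to the host and a standard Docker CLI):

    docker logs --tail 50 bitcoin_bitcoind_1
    docker logs --tail 50 bitcoin_tor_1
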
Umbrel logs
-----------

Attaching to manager
manager | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:49 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:49 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:50 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:51 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.15 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/account/token?token=bdaf587c825d1739a420d08f67a16a2a1881311346a9827f318bf8117dbb3e04 HTTP/1.1" 200 16 "-" "app-proxy/0.0.1"
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Wed, 30 Aug 2023 23:53:52 GMT] "GET /v1/system/debug-result HTTP/1.0" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
manager |
manager | umbrel-manager

Tor Proxy logs
--------------

Attaching to tor_proxy
tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 10% (conn_done): Connected to a relay
tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 14% (handshake): Handshaking with a relay
tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 15% (handshake_done): Handshake with a relay done
tor_proxy | Aug 30 22:18:55.000 [notice] Bootstrapped 75% (enough_dirinfo): Loaded enough directory info to build circuits
tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 80% (ap_conn): Connecting to a relay to build circuits
tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 85% (ap_conn_done): Connected to a relay to build circuits
tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 89% (ap_handshake): Finishing handshake with a relay to build circuits
tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 90% (ap_handshake_done): Handshake finished with a relay to build circuits
tor_proxy | Aug 30 22:18:56.000 [notice] Bootstrapped 95% (circuit_create): Establishing a Tor circuit
tor_proxy | Aug 30 22:18:57.000 [notice] Bootstrapped 100% (done): Done

App logs
--------

bitcoin

Attaching to bitcoin_server_1, bitcoin_i2pd_daemon_1, bitcoin_tor_1, bitcoin_app_proxy_1, bitcoin_bitcoind_1
bitcoind_1 | Error: Settings file could not be written:
bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
bitcoind_1 | Error: Settings file could not be written:
bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
bitcoind_1 | Error: Settings file could not be written:
bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
bitcoind_1 | Error: Settings file could not be written:
bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
bitcoind_1 | Error: Settings file could not be written:
bitcoind_1 | - Error: Unable to open settings file /data/.bitcoin/settings.json.tmp for writing
i2pd_daemon_1 | 22:57:10@116/error - Tunnel: Tunnel with id 2112003669 already exists
i2pd_daemon_1 | 22:59:07@116/error - Tunnel: Tunnel with id 4248552299 already exists
i2pd_daemon_1 | 23:01:50@925/error - SSU2: RelayIntro unknown router to introduce
i2pd_daemon_1 | 23:06:57@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 23:16:41@116/error - Tunnels: Can't select next hop for m21UyuFnkjv~H0Fq7LO~2WE~Dg8AOM8zISoKKVa5lxM=
i2pd_daemon_1 | 23:16:41@116/error - Tunnels: Can't create inbound tunnel, no peers available
i2pd_daemon_1 | 23:16:42@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 23:16:52@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 23:17:21@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
i2pd_daemon_1 | 23:53:42@820/error - Garlic: Can't handle ECIES-X25519-AEAD-Ratchet message
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:47 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
server_1 |
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:50 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
server_1 |
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.16 - - [Wed, 30 Aug 2023 23:53:53 GMT] "GET /v1/bitcoind/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
server_1 |
server_1 | umbrel-middleware
tor_1 | Aug 30 23:52:16.156 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
tor_1 | Aug 30 23:52:16.156 [err] Reading config failed--see warnings above.
tor_1 | Aug 30 23:53:16.605 [notice] Tor 0.4.7.8 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1n, Zlib 1.2.11, Liblzma N/A, Libzstd N/A and Glibc 2.31 as libc.
tor_1 | Aug 30 23:53:16.605 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://support.torproject.org/faq/staying-anonymous/
tor_1 | Aug 30 23:53:16.605 [notice] Read configuration file "/etc/tor/torrc".
tor_1 | Aug 30 23:53:16.607 [warn] You have a ControlPort set to accept connections from a non-local address. This means that programs not running on your computer can reconfigure your Tor. That's pretty bad, since the controller protocol isn't encrypted! Maybe you should just listen on 127.0.0.1 and use a tool like stunnel or ssh to encrypt remote connections to your control port.
tor_1 | Aug 30 23:53:16.607 [warn] CookieAuthFileGroupReadable is set, but will have no effect: you must specify an explicit CookieAuthFile to have it group-readable.
tor_1 | Aug 30 23:53:16.607 [warn] Permissions on directory /data/app-bitcoin-p2p are too permissive.
tor_1 | Aug 30 23:53:16.607 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
tor_1 | Aug 30 23:53:16.607 [err] Reading config failed--see warnings above.
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...

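Two separate failures keep this app's containers restarting. bitcoind_1 exits because it cannot open /data/.bitcoin/settings.json.tmp for writing; with 316G free on the disk, that points at ownership or permissions on the mounted Bitcoin data directory rather than a full filesystem. A hedged fix sketch, assuming Umbrel's default layout where this volume lives under ~/umbrel/app-data/bitcoin (verify the actual path on this install first):

    # assumption: data dir is ~/umbrel/app-data/bitcoin and the service user is 'umbrel'
    sudo chown -R umbrel:umbrel ~/umbrel/app-data/bitcoin

tor_1 fails differently: "Permissions on directory /data/app-bitcoin-p2p are too permissive" followed by "Failed to configure rendezvous options" is Tor refusing to use a hidden-service directory that is readable by others, and it aborts config parsing as a result. Tightening the host directories that map to /data/app-bitcoin-p2p (and, for the lightning app below, /data/app-lightning-rest) to mode 700 should let both tor containers start; a sketch assuming they live under ~/umbrel/tor/data:

    # assumption: hidden-service dirs are under ~/umbrel/tor/data
    sudo chmod 700 ~/umbrel/tor/data/app-bitcoin-p2p
    sudo chmod 700 ~/umbrel/tor/data/app-lightning-rest
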
helipad

Attaching to helipad_web_1, helipad_app_proxy_1
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
web_1 | source: None,
web_1 | }
web_1 | ** get_last_boost_index_from_db() -> [3]
web_1 | ** get_last_boost_index_from_db() -> [3]
web_1 | ** Supplied index from call: [3]
web_1 | ** Supplied boostcount from call: [100]
web_1 | ** Supplied index from call: [3]
web_1 | ** Supplied boostcount from call: [100]
web_1 | lnd::Lnd::list_invoices failed: status: Unknown, message: "wallet locked, unlock it to enable full RPC access", details: [], metadata: MetadataMap { headers: {"content-type": "application/grpc"} }
web_1 | Current index: 3

ipfs-podcasting

Attaching to ipfs-podcasting_app_proxy_1, ipfs-podcasting_web_1
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://ipfs-podcasting_web_1:8675
app_proxy_1 | Waiting for ipfs-podcasting_web_1:8675 to open...
app_proxy_1 | IPFS Podcasting is now ready...
app_proxy_1 | Listening on port: 8675
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
web_1 | Bottle v0.12.21 server starting up (using WSGIRefServer())...
web_1 | Listening on http://0.0.0.0:8675/
web_1 | Hit Ctrl-C to quit.
web_1 |
web_1 | 10.21.0.2 - - [30/Aug/2023 23:27:07] "GET / HTTP/1.1" 200 4865

lightning

Attaching to lightning_lnd_1, lightning_app_proxy_1, lightning_app_1, lightning_tor_1
app_1 | umbrel-lightning
app_1 | Waiting for LND...
app_1 | Checking LND status...
app_1 | Waiting for LND...
app_1 | ::ffff:10.21.0.7 - - [Wed, 30 Aug 2023 23:53:29 GMT] "GET /v1/lnd/info/status HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0"
app_1 |
app_1 | umbrel-lightning
app_1 | Checking LND status...
app_1 | [backup-monitor] Checking channel backup...
app_1 | [backup-monitor] Sleeping...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
lnd_1 | 2023-08-30 23:53:16.198 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:25.200 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:25.242 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:26.493 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:34.244 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:34.286 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:43.287 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:43.334 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:52.335 [ERR] RPCS: [/lnrpc.Lightning/ChannelBalance]: wallet locked, unlock it to enable full RPC access
lnd_1 | 2023-08-30 23:53:52.378 [ERR] RPCS: [/lnrpc.Lightning/ListInvoices]: wallet locked, unlock it to enable full RPC access
tor_1 | Aug 30 23:52:12.789 [notice] Read configuration file "/etc/tor/torrc".
tor_1 | Aug 30 23:52:12.791 [warn] Permissions on directory /data/app-lightning-rest are too permissive.
tor_1 | Aug 30 23:52:12.791 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
tor_1 | Aug 30 23:52:12.791 [err] Reading config failed--see warnings above.
tor_1 | Aug 30 23:53:13.219 [notice] Tor 0.4.7.8 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1n, Zlib 1.2.11, Liblzma N/A, Libzstd N/A and Glibc 2.31 as libc.
tor_1 | Aug 30 23:53:13.219 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://support.torproject.org/faq/staying-anonymous/
tor_1 | Aug 30 23:53:13.219 [notice] Read configuration file "/etc/tor/torrc".
tor_1 | Aug 30 23:53:13.222 [warn] Permissions on directory /data/app-lightning-rest are too permissive.
tor_1 | Aug 30 23:53:13.222 [warn] Failed to parse/validate config: Failed to configure rendezvous options. See logs for details.
tor_1 | Aug 30 23:53:13.222 [err] Reading config failed--see warnings above.

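lnd_1 itself is running; every "[ERR] RPCS: ... wallet locked, unlock it to enable full RPC access" line just means the wallet has not been unlocked since LND last restarted, so all non-unlock RPCs are rejected. The same locked wallet is behind the helipad, thunderhub, torq and ln-visualizer errors elsewhere in this dump. Unlocking once, from the Umbrel dashboard or from a shell, should clear the whole class of errors. A sketch assuming Umbrel 0.5.x's app wrapper script at ~/umbrel/scripts/app:

    # prompts for the wallet password, then restores full RPC access
    ~/umbrel/scripts/app compose lightning exec lnd lncli unlock
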
lightning-shell

Attaching to lightning-shell_web_1, lightning-shell_app_proxy_1
web_1 | [2023/08/30 23:41:17:1140] N: LWS: 4.3.0-a5aae04, NET CLI SRV H1 H2 WS ConMon IPv6-absent
web_1 | [2023/08/30 23:41:17:1206] N: elops_init_pt_uv: Using foreign event loop...
web_1 | [2023/08/30 23:41:17:1207] N: ++ [wsi|0|pipe] (1)
web_1 | [2023/08/30 23:41:17:1207] N: ++ [vh|0|netlink] (1)
web_1 | [2023/08/30 23:41:17:1208] N: ++ [vh|1|default||7681] (2)
web_1 | [2023/08/30 23:41:17:1208] N: [null wsi]: lws_socket_bind: source ads 0.0.0.0
web_1 | [2023/08/30 23:41:17:1209] N: ++ [wsi|1|listen|default||7681] (2)
web_1 | [2023/08/30 23:41:17:1209] N: Listening on port: 7681
web_1 | [2023/08/30 23:41:17:6791] N: ++ [wsisrv|0|adopted] (1)
web_1 | [2023/08/30 23:41:17:6833] N: -- [wsisrv|0|adopted] (0) 4.271ms
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://lightning-shell_web_1:7681
app_proxy_1 | Waiting for lightning-shell_web_1:7681 to open...
app_proxy_1 | Lightning Shell is now ready...
app_proxy_1 | Listening on port: 7681

llama-gpt

Attaching to llama-gpt_llama-gpt-ui_1, llama-gpt_llama-gpt-api_1, llama-gpt_app_proxy_1
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
llama-gpt-ui_1 | [INFO wait] docker-compose-wait - Everything's fine, the application can now start!
llama-gpt-ui_1 | [INFO wait] --------------------------------------------------------
llama-gpt-ui_1 |
llama-gpt-ui_1 | > [email protected] start
llama-gpt-ui_1 | > next start
llama-gpt-ui_1 |
llama-gpt-ui_1 | ready - started server on 0.0.0.0:3000, url: http://localhost:3000
llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
llama-gpt-ui_1 | making request to http://llama-gpt-api:8000/v1/models
llama-gpt-api_1 | Try increasing RLIMIT_MLOCK ('ulimit -l' as root).
llama-gpt-api_1 | llama_new_context_with_model: kv self size = 2048.00 MB
llama-gpt-api_1 | AVX = 1 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
llama-gpt-api_1 | INFO: Started server process [1]
llama-gpt-api_1 | INFO: Waiting for application startup.
llama-gpt-api_1 | INFO: Application startup complete.
llama-gpt-api_1 | INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
llama-gpt-api_1 | INFO: 10.21.0.24:39130 - "GET /v1/models HTTP/1.1" 200 OK
llama-gpt-api_1 | INFO: 10.21.0.24:56984 - "GET /v1/models HTTP/1.1" 200 OK
llama-gpt-api_1 | INFO: 10.21.0.24:60378 - "GET /v1/models HTTP/1.1" 200 OK

ln-visualizer

Attaching to ln-visualizer_app_proxy_1, ln-visualizer_web_1, ln-visualizer_api_1
api_1 | at Object.onReceiveStatus (/usr/local/app/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:336:141)
api_1 | at Object.onReceiveStatus (/usr/local/app/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:299:181)
api_1 | at /usr/local/app/node_modules/@grpc/grpc-js/build/src/call-stream.js:160:78
api_1 | at processTicksAndRejections (internal/process/task_queues.js:77:11) {
api_1 | code: 2,
api_1 | details: 'wallet locked, unlock it to enable full RPC access',
api_1 | metadata: [Metadata]
api_1 | }
api_1 | }
api_1 | ]
web_1 | 2023/08/30 22:19:33 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | 2023/08/30 22:19:59 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | 2023/08/30 22:20:51 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | 2023/08/30 22:21:51 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | 2023/08/30 22:22:52 [emerg] 7#7: host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
web_1 | nginx: [emerg] host not found in upstream "ln-visualizer_api_1" in /etc/nginx/nginx.conf:22
app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'ln-visualizer_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | LnVisualizer is now ready...
app_proxy_1 | Listening on port: 5646

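The api_1 restart loop here is downstream of the locked LND wallet (its gRPC error carries details: 'wallet locked, unlock it to enable full RPC access'), and the nginx "host not found in upstream" lines in web_1 are a knock-on effect: while api_1 is down, its hostname does not resolve on the Docker network. web_1 did eventually come up ("Up About an hour" in the container list), so once the wallet is unlocked the whole app should settle on its own. Since only the tail of the stack trace made it into this dump, the full error can be read directly (assuming a standard Docker CLI):

    docker logs --tail 100 ln-visualizer_api_1
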
lnplus

Attaching to lnplus_app_proxy_1, lnplus_web_1
web_1 | from /gems/ruby/3.0.0/gems/rack-2.2.4/lib/rack/sendfile.rb:110:in `call'
web_1 | from /gems/ruby/3.0.0/gems/actionpack-7.0.3.1/lib/action_dispatch/middleware/host_authorization.rb:131:in `call'
web_1 | from /gems/ruby/3.0.0/gems/railties-7.0.3.1/lib/rails/engine.rb:530:in `call'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/configuration.rb:252:in `call'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/request.rb:77:in `block in handle_request'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/thread_pool.rb:340:in `with_force_shutdown'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/request.rb:76:in `handle_request'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/server.rb:441:in `process_client'
web_1 | from /gems/ruby/3.0.0/gems/puma-5.6.4/lib/puma/thread_pool.rb:147:in `block in spawn_thread'
web_1 | /gems/ruby/3.0.0/gems/actionpack-7.0.3.1/lib/action_dispatch/middleware/debug_exceptions.rb:72: raise exception
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / -> http://lnplus_web_1:3777
app_proxy_1 | Waiting for lnplus_web_1:3777 to open...
app_proxy_1 | Lightning Network+ is now ready...
app_proxy_1 | Listening on port: 3777
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...

n8n

Attaching to n8n_app_proxy_1, n8n_server_1
app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'n8n_server_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | n8n is now ready...
app_proxy_1 | Listening on port: 5678
server_1 | http://natron.local:5678/
server_1 |
server_1 | Stopping n8n...
server_1 | License manager not initialized
server_1 | n8n ready on 0.0.0.0, port 5678
server_1 | Initializing n8n process
server_1 | Version: 0.234.1
server_1 |
server_1 | Editor is now accessible via:
server_1 | http://natron.local:5678/

penpot

Attaching to penpot_penpot-frontend_1, penpot_penpot-backend_1, penpot_penpot-redis_1, penpot_app_proxy_1, penpot_penpot-postgres_1, penpot_penpot-exporter_1
app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'penpot_penpot-frontend_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Penpot is now ready...
app_proxy_1 | Listening on port: 9001
penpot-backend_1 | [2023-08-30 22:19:23.372] I app.storage.tmp - hint="started tmp file cleaner"
penpot-backend_1 | [2023-08-30 22:19:23.379] I app.worker - hint="registry initialized", tasks=13
penpot-backend_1 | [2023-08-30 22:19:23.383] I app.worker - hint="cron: started", tasks=8
penpot-backend_1 | [2023-08-30 22:19:23.430] I app.worker - hint="dispatcher: started"
penpot-backend_1 | [2023-08-30 22:19:23.432] I app.worker - hint="monitor: started", name="default"
penpot-backend_1 | [2023-08-30 22:19:23.435] I app.worker - hint="worker: started", worker-id=0, queue="default"
penpot-backend_1 | [2023-08-30 22:19:23.437] I app.worker - hint="worker: started", worker-id=0, queue="webhooks"
penpot-backend_1 | [2023-08-30 22:19:23.438] I app.srepl - msg="initializing repl server", name="prepl", port=6063, host="localhost"
penpot-backend_1 | [2023-08-30 22:19:23.443] I app.main - hint="welcome to penpot", flags="login-with-password,backend-api-doc,backend-worker,registration,prepl-server", worker?=true, version="1.18.3-9885-g353de39d4"
penpot-backend_1 | [2023-08-30 23:00:00.106] I app.tasks.file-xlog-gc - hint="task finished", min-age="72h", total=0
penpot-redis_1 | 1:M 30 Aug 2023 22:15:35.763 # Error trying to save the DB, can't exit.
penpot-redis_1 | 1:M 30 Aug 2023 22:15:35.763 # Errors trying to shut down the server. Check the logs for more information.
penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # Redis version=7.0.11, bits=64, commit=00000000, modified=0, pid=1, just started
penpot-redis_1 | 1:C 30 Aug 2023 22:16:34.582 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.583 * monotonic clock: POSIX clock_gettime
penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 * Running mode=standalone, port=6379.
penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 # Server initialized
penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.601 # WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition. Being disabled, it can can also cause failures without low memory condition, see https://github.com/jemalloc/jemalloc/issues/1328. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
penpot-redis_1 | 1:M 30 Aug 2023 22:16:34.614 * Ready to accept connections
penpot-exporter_1 | INF [app.core] msg="initializing", public-uri="http://penpot-frontend", version="1.18.3-9885-g353de39d4"
penpot-exporter_1 | INF [app.browser] hint="initializing browser pool", opts=#js {:max 5, :min 0, :testOnBorrow true, :evictionRunIntervalMillis 5000, :numTestsPerEvictionRun 5, :acquireTimeoutMillis 10000, :idleTimeoutMillis 10000}
penpot-exporter_1 | INF [app.http] hint="welcome to penpot", module="exporter", version="1.18.3-9885-g353de39d4"
penpot-exporter_1 | INF [app.http] hint="starting http server", port=6061
penpot-exporter_1 | INF [app.redis] hint="redis connection established", uri="redis://penpot-redis/0"
penpot-exporter_1 | INF [app.core] msg="initializing", public-uri="http://penpot-frontend", version="1.18.3-9885-g353de39d4"
penpot-exporter_1 | INF [app.browser] hint="initializing browser pool", opts=#js {:max 5, :min 0, :testOnBorrow true, :evictionRunIntervalMillis 5000, :numTestsPerEvictionRun 5, :acquireTimeoutMillis 10000, :idleTimeoutMillis 10000}
penpot-exporter_1 | INF [app.http] hint="welcome to penpot", module="exporter", version="1.18.3-9885-g353de39d4"
penpot-exporter_1 | INF [app.http] hint="starting http server", port=6061
penpot-exporter_1 | INF [app.redis] hint="redis connection established", uri="redis://penpot-redis/0"
penpot-postgres_1 | 2023-08-30 22:19:03.462 UTC [1] LOG: listening on IPv6 address "::", port 5432
penpot-postgres_1 | 2023-08-30 22:19:03.498 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
penpot-postgres_1 | 2023-08-30 22:19:03.543 UTC [15] LOG: database system was shut down at 2023-08-30 22:15:35 UTC
penpot-postgres_1 | 2023-08-30 22:19:03.591 UTC [1] LOG: database system is ready to accept connections
penpot-postgres_1 | 2023-08-30 22:24:03.609 UTC [13] LOG: checkpoint starting: time
penpot-postgres_1 | 2023-08-30 22:24:03.724 UTC [13] LOG: checkpoint complete: wrote 4 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.105 s, sync=0.003 s, total=0.116 s; sync files=3, longest=0.002 s, average=0.001 s; distance=5 kB, estimate=5 kB
penpot-postgres_1 | 2023-08-30 23:04:04.470 UTC [13] LOG: checkpoint starting: time
penpot-postgres_1 | 2023-08-30 23:04:04.588 UTC [13] LOG: checkpoint complete: wrote 2 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.103 s, sync=0.003 s, total=0.119 s; sync files=2, longest=0.002 s, average=0.002 s; distance=5 kB, estimate=5 kB
penpot-postgres_1 | 2023-08-30 23:34:05.186 UTC [13] LOG: checkpoint starting: time
penpot-postgres_1 | 2023-08-30 23:34:05.298 UTC [13] LOG: checkpoint complete: wrote 2 buffers (0.0%); 0 WAL file(s) added, 0 removed, 0 recycled; write=0.104 s, sync=0.001 s, total=0.113 s; sync files=2, longest=0.001 s, average=0.001 s; distance=5 kB, estimate=5 kB

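Most of this section is healthy startup chatter; the one actionable line is Redis's overcommit warning, which spells out its own fix: enable kernel memory overcommit so background saves cannot fail under memory pressure (the earlier "Error trying to save the DB, can't exit." lines at shutdown may be the same issue surfacing). On the host, exactly as the warning suggests:

    # apply immediately, then persist across reboots
    sudo sysctl vm.overcommit_memory=1
    echo 'vm.overcommit_memory = 1' | sudo tee -a /etc/sysctl.conf
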
snort

Attaching to snort_app_proxy_1, snort_web_1
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET / HTTP/1.1" 200 789 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /main.7c424e09525cb4aa655e.css HTTP/1.1" 200 55874 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /main.7c424e09525cb4aa655e.js HTTP/1.1" 200 1202175 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /e85d84dcfe3b365aaaa3.woff2 HTTP/1.1" 200 37780 "http://192.168.178.101:52027/main.7c424e09525cb4aa655e.css" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /pow.js HTTP/1.1" 200 9853 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:29 +0000] "GET /favicon.ico HTTP/1.1" 200 3290 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:30 +0000] "GET /1041fe76ab47400d0920.svg HTTP/1.1" 200 77333 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:31 +0000] "GET /nostrich_512.png HTTP/1.1" 200 540613 "http://192.168.178.101:52027/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:43 +0000] "GET /nostrich_512.png HTTP/1.1" 200 540613 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"
web_1 | 10.21.0.8 - - [30/Aug/2023:22:27:55 +0000] "GET /p/npub1uerlt3yee5gy44kr97th45cypnv8s7lwnqll0t3hvady99g7ffdsgj99m0 HTTP/1.1" 200 789 "-" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/117.0" "::ffff:192.168.178.10"

thunderhub

Attaching to thunderhub_app_proxy_1, thunderhub_web_1
web_1 | ],
web_1 | level: 'error',
web_1 | message: 'Error connecting to node',
web_1 | timestamp: '2023-08-30T23:53:26.494Z'
web_1 | }
web_1 | {
web_1 | message: 'No node available for healthcheck ping',
web_1 | level: 'error',
web_1 | timestamp: '2023-08-30T23:53:26.495Z'
web_1 | }
app_proxy_1 | [HPM] Upgrading to WebSocket
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | [HPM] Client disconnected
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...
app_proxy_1 | Validating token: bdaf587c825d ...

torq

Attaching to torq_web_1, torq_db_1, torq_app_proxy_1
app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Error wating for port: "The address 'torq_web_1' cannot be found"
app_proxy_1 | Retrying...
app_proxy_1 | Torq is now ready...
app_proxy_1 | Listening on port: 7028
web_1 | {"level":"info","protocol":"grpc","grpc.component":"client","grpc.service":"lnrpc.Lightning","grpc.method":"ListPeers","grpc.method_type":"unary","grpc.start_time":"2023-08-30T23:53:07Z","grpc.request.deadline":"2023-08-30T23:55:07Z","grpc.code":"Unknown","grpc.error":"rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","grpc.time_ms":"0.778","message":"finished call"}
web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:53:07Z","message":"Failed to import peer status."}
web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:53:07Z","message":"LND import Peers for nodeId: 1"}
web_1 | {"level":"info","time":"2023-08-30T23:53:07Z","message":"LndServiceChannelEventStream terminated for nodeId: 1"}
web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream boot attempt for nodeId: 1."}
web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream service booted for nodeId: 1"}
web_1 | {"level":"info","protocol":"grpc","grpc.component":"client","grpc.service":"lnrpc.Lightning","grpc.method":"ListPeers","grpc.method_type":"unary","grpc.start_time":"2023-08-30T23:54:08Z","grpc.request.deadline":"2023-08-30T23:56:08Z","grpc.code":"Unknown","grpc.error":"rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","grpc.time_ms":"0.668","message":"finished call"}
web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:54:08Z","message":"Failed to import peer status."}
web_1 | {"level":"error","error":"get list of peers from lnd for nodeId: 1: rpc error: code = Unknown desc = wallet locked, unlock it to enable full RPC access","time":"2023-08-30T23:54:08Z","message":"LND import Peers for nodeId: 1"}
web_1 | {"level":"info","time":"2023-08-30T23:54:08Z","message":"LndServiceChannelEventStream terminated for nodeId: 1"}
db_1 |
db_1 | PostgreSQL Database directory appears to contain a database; Skipping initialization
db_1 |
db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: starting PostgreSQL 14.5 on x86_64-pc-linux-musl, compiled by gcc (Alpine 11.2.1_git20220219) 11.2.1 20220219, 64-bit
db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
db_1 | 2023-08-30 22:19:01.220 UTC [1] LOG: listening on IPv6 address "::", port 5432
db_1 | 2023-08-30 22:19:01.248 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
db_1 | 2023-08-30 22:19:01.253 UTC [12] LOG: database system was shut down at 2023-08-30 22:15:35 UTC
db_1 | 2023-08-30 22:19:01.268 UTC [1] LOG: database system is ready to accept connections
db_1 | 2023-08-30 22:19:01.276 UTC [18] LOG: TimescaleDB background worker launcher connected to shared catalogs
================
==== Result ====
================
==== END =====