- 2019-12-12 19:31:12 kubewatch [10 TMainThread] aes-1.0.0-ea1-dirty DEBUG: looking up ID for namespace default
- 2019-12-12 19:31:12 kubewatch [10 TMainThread] aes-1.0.0-ea1-dirty DEBUG: cluster ID URL is d6e_id://3b46007d-d543-45e0-ae75-12bd56085474/default
- 2019-12-12 19:31:14 kubewatch [10 TMainThread] aes-1.0.0-ea1-dirty DEBUG: cluster ID is dfe1b58a-d423-5de7-a98c-dcb4b6d918d6 (from namespace default)
- 2019-12-12 19:31:14 AMBASSADOR INFO starting with environment:
- 2019-12-12 19:31:14 AMBASSADOR INFO ====
- AMBASSADOR_ADMIN_K3D_NODEPORT_PORT=tcp://10.43.152.229:8877
- AMBASSADOR_ADMIN_K3D_NODEPORT_PORT_8877_TCP=tcp://10.43.152.229:8877
- AMBASSADOR_ADMIN_K3D_NODEPORT_PORT_8877_TCP_ADDR=10.43.152.229
- AMBASSADOR_ADMIN_K3D_NODEPORT_PORT_8877_TCP_PORT=8877
- AMBASSADOR_ADMIN_K3D_NODEPORT_PORT_8877_TCP_PROTO=tcp
- AMBASSADOR_ADMIN_K3D_NODEPORT_SERVICE_HOST=10.43.152.229
- AMBASSADOR_ADMIN_K3D_NODEPORT_SERVICE_PORT=8877
- AMBASSADOR_ADMIN_K3D_NODEPORT_SERVICE_PORT_AMBASSADOR_ADMIN=8877
- AMBASSADOR_ADMIN_URL=http://127.0.0.1:8877
- AMBASSADOR_CLUSTER_ID=dfe1b58a-d423-5de7-a98c-dcb4b6d918d6
- AMBASSADOR_CONFIG_BASE_DIR=/ambassador
- AMBASSADOR_INTERNAL_URL=https://127.0.0.1:8443
- AMBASSADOR_K3D_NODEPORT_PORT=tcp://10.43.126.4:80
- AMBASSADOR_K3D_NODEPORT_PORT_443_TCP=tcp://10.43.126.4:443
- AMBASSADOR_K3D_NODEPORT_PORT_443_TCP_ADDR=10.43.126.4
- AMBASSADOR_K3D_NODEPORT_PORT_443_TCP_PORT=443
- AMBASSADOR_K3D_NODEPORT_PORT_443_TCP_PROTO=tcp
- AMBASSADOR_K3D_NODEPORT_PORT_80_TCP=tcp://10.43.126.4:80
- AMBASSADOR_K3D_NODEPORT_PORT_80_TCP_ADDR=10.43.126.4
- AMBASSADOR_K3D_NODEPORT_PORT_80_TCP_PORT=80
- AMBASSADOR_K3D_NODEPORT_PORT_80_TCP_PROTO=tcp
- AMBASSADOR_K3D_NODEPORT_SERVICE_HOST=10.43.126.4
- AMBASSADOR_K3D_NODEPORT_SERVICE_PORT=80
- AMBASSADOR_K3D_NODEPORT_SERVICE_PORT_HTTP=80
- AMBASSADOR_K3D_NODEPORT_SERVICE_PORT_HTTPS=443
- AMBASSADOR_NAMESPACE=default
- AMBASSADOR_REDIS_PORT=tcp://10.43.14.215:6379
- AMBASSADOR_REDIS_PORT_6379_TCP=tcp://10.43.14.215:6379
- AMBASSADOR_REDIS_PORT_6379_TCP_ADDR=10.43.14.215
- AMBASSADOR_REDIS_PORT_6379_TCP_PORT=6379
- AMBASSADOR_REDIS_PORT_6379_TCP_PROTO=tcp
- AMBASSADOR_REDIS_SERVICE_HOST=10.43.14.215
- AMBASSADOR_REDIS_SERVICE_PORT=6379
- AMBASSADOR_URL=https://ambassador.default.svc.cluster.local
- 2019-12-12 19:31:14 AMBASSADOR INFO ====
- 2019-12-12 19:31:14 AMBASSADOR INFO launching worker process 'ambex': 'ambex' '-ads' '8003' '/ambassador/envoy'
- 2019-12-12 19:31:14 AMBASSADOR INFO ambex is PID 286
- 2019-12-12 19:31:14 AMBASSADOR INFO launching worker process 'diagd': 'diagd' '/ambassador/snapshots' '/ambassador/bootstrap-ads.json' '/ambassador/envoy/envoy.json' '--notices' '/ambassador/notices.json' '--kick' 'kill -HUP 1'
- time="2019-12-12T19:31:14Z" level=info msg="Ambex -no-version- starting..."
- time="2019-12-12T19:31:14Z" level=info msg=Listening port=8003
- time="2019-12-12T19:31:14Z" level=info msg="Wrote PID" file=ambex.pid pid=286
- time="2019-12-12T19:31:14Z" level=info msg="Pushing snapshot v0"
- 2019-12-12 19:31:14 AMBASSADOR INFO diagd is PID 293
- 2019-12-12 19:31:14 diagd aes-1.0.0-ea1-dirty [P293TMainThread] INFO: thread count 33, listening on 0.0.0.0:8877
- 2019-12-12 19:31:15 diagd aes-1.0.0-ea1-dirty [P293TMainThread] INFO: BOOT: Scout result {"latest_version": "0.86.1", "application": "ambassador", "notices": [{"level": "INFO", "message": "Upgrade available! to Ambassador version 0.86.1"}], "cached": false, "timestamp": 1576179075.247466}
- [2019-12-12 19:31:15 +0000] [293] [INFO] Starting gunicorn 19.9.0
- [2019-12-12 19:31:15 +0000] [293] [INFO] Listening at: http://0.0.0.0:8877 (293)
- [2019-12-12 19:31:15 +0000] [293] [INFO] Using worker: threads
- [2019-12-12 19:31:15 +0000] [306] [INFO] Booting worker with pid: 306
- 2019-12-12 19:31:15 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: starting Scout checker
- 2019-12-12 19:31:15 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: starting event watcher
- 2019-12-12 19:31:17 AMBASSADOR INFO diagd running
- 2019-12-12 19:31:17 AMBASSADOR INFO launching worker process 'watt': 'watt' '--port' '8002' '--notify' 'python /ambassador/post_update.py --watt ' '-s' 'service' '-s' 'ingresses' '-s' 'AuthService' '-s' 'Mapping' '-s' 'Module' '-s' 'RateLimitService' '-s' 'TCPMapping' '-s' 'TLSContext' '-s' 'TracingService' '-s' 'ConsulResolver' '-s' 'KubernetesEndpointResolver' '-s' 'KubernetesServiceResolver' '-s' 'Host' '-s' 'LogService' '--watch' 'python /ambassador/watch_hook.py'
- 2019-12-12 19:31:17 AMBASSADOR INFO watt is PID 316
- 2019/12/12 19:31:17 starting watt...
- 2019/12/12 19:31:17 kubebootstrap: starting
- 2019/12/12 19:31:17 consulwatchman: starting
- 2019/12/12 19:31:17 kubewatchman: starting
- 2019/12/12 19:31:17 aggregator: starting
- 2019/12/12 19:31:17 invoker: starting
- 2019/12/12 19:31:17 api: starting
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "service" in namespace "*"
- 2019/12/12 19:31:17 api: snapshot server listening on: :8002
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "ingresses" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "AuthService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "Mapping" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "Module" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "RateLimitService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "TCPMapping" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "TLSContext" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "TracingService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "ConsulResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "KubernetesEndpointResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "KubernetesServiceResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "Host" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: adding kubernetes watch for "LogService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: found 0 "ConsulResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "ConsulResolver" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 5 "Mapping" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "Mapping" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "Module" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "Module" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "TracingService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "TracingService" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 12 "service" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "service" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "ingresses" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "ingresses" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 1 "AuthService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "AuthService" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 1 "RateLimitService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "RateLimitService" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "KubernetesEndpointResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "KubernetesEndpointResolver" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "KubernetesServiceResolver" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "KubernetesServiceResolver" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "Host" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "Host" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "TCPMapping" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "TCPMapping" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "TLSContext" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "TLSContext" to 1 receivers
- 2019/12/12 19:31:17 kubebootstrap: found 0 "LogService" in namespace "*"
- 2019/12/12 19:31:17 kubebootstrap: sent "LogService" to 1 receivers
- 2019/12/12 19:31:17 aggregator: watch hook stderr: 2019-12-12 19:31:17 watch-hook INFO: YAML: using C dumper
- 2019/12/12 19:31:17 aggregator: watch hook stderr: 2019-12-12 19:31:17 watch-hook INFO: IR: watching OSS
- 2019/12/12 19:31:17 aggregator: watch hook stderr:
- 2019/12/12 19:31:17 aggregator: found 0 kubernetes watches
- 2019/12/12 19:31:17 aggregator: found 0 consul watches
- 2019/12/12 19:31:17 aggregator: bootstrapped!
- 2019/12/12 19:31:17 kubewatchman: processing 0 kubernetes watch specs
- 2019/12/12 19:31:17 consulwatchman: processing 0 consul watches
- 2019/12/12 19:31:17 notify: python /ambassador/post_update.py --watt http://localhost:8002/snapshots/1
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TThreadPoolExecutor-0_2] INFO: Update requested: watt, http://localhost:8002/snapshots/1
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: copying configuration: watt, http://localhost:8002/snapshots/1 to /ambassador/snapshots/snapshot-tmp.yaml
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: YAML: using C dumper
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: IR: starting OSS
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: V2Listener: Using log_format 'ACCESS [%START_TIME%] "%REQ(:METHOD)% %REQ(X-ENVOY-ORIGINAL-PATH?:PATH)% %PROTOCOL%" %RESPONSE_CODE% %RESPONSE_FLAGS% %BYTES_RECEIVED% %BYTES_SENT% %DURATION% %RESP(X-ENVOY-UPSTREAM-SERVICE-TIME)% "%REQ(X-FORWARDED-FOR)%" "%REQ(USER-AGENT)%" "%REQ(X-REQUEST-ID)%" "%REQ(:AUTHORITY)%" "%UPSTREAM_HOST%"'
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: V2Listener: SNI filter chains
- []
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: V2L: no filter chains, need cleartext
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: V2Listener: SNI filter chains
- [
- {
- "ctx_hosts": [
- "*"
- ],
- "ctx_name": "-cleartext-",
- "filter_chain_match": {},
- "route_count": 8
- }
- ]
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: V2L: leaving TLS inspector disabled
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: -global-: NOTICE: A future Ambassador version will change the GRPC protocol version for AuthServices and RateLimitServices. See the CHANGELOG for details.
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: successfully validated the resulting envoy configuration, continuing...
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: saving Envoy configuration for snapshot 1
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: running 'kill -HUP 1'
- KubeStatus MASTER 306: mark_live Mapping/ambassador-devportal-api.default
- KubeStatus MASTER 306: mark_live Mapping/ambassador-devportal-demo.default
- KubeStatus MASTER 306: mark_live Mapping/ambassador-devportal.default
- KubeStatus MASTER 306: mark_live Mapping/httpbin.default
- KubeStatus MASTER 306: mark_live Mapping/productpage.default
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: K8s status update: Mapping ambassador-devportal-api.default, {"state": "Running"}...
- KubeStatus MASTER 306: Mapping/ambassador-devportal-api.default needs {"state": "Running"}
- 2019-12-12 19:31:17 AMBASSADOR INFO launching worker process 'envoy': 'envoy' '-c' '/ambassador/bootstrap-ads.json'
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: K8s status update: Mapping ambassador-devportal-demo.default, {"state": "Running"}...
- KubeStatus MASTER 306: Mapping/ambassador-devportal-demo.default needs {"state": "Running"}
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: K8s status update: Mapping ambassador-devportal.default, {"state": "Running"}...
- KubeStatus MASTER 306: Mapping/ambassador-devportal.default needs {"state": "Running"}
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: K8s status update: Mapping httpbin.default, {"state": "Running"}...
- KubeStatus MASTER 306: Mapping/httpbin.default needs {"state": "Running"}
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: K8s status update: Mapping productpage.default, {"state": "Running"}...
- 2019-12-12 19:31:17 AMBASSADOR INFO envoy is PID 367
- KubeStatus MASTER 306: Mapping/productpage.default needs {"state": "Running"}
- KubeStatus UPDATE 364: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=ambassador-devportal-api', '-n', 'default', '-u', '/dev/fd/0']
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: configuration updated from snapshot 1
- 2019-12-12 19:31:17 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: starting Envoy status updater
- KubeStatus UPDATE 365: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=ambassador-devportal-demo', '-n', 'default', '-u', '/dev/fd/0']
- 2019-12-12 19:31:17 AMBASSADOR INFO KICK: started Envoy as PID 367
- KubeStatus UPDATE 366: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=ambassador-devportal', '-n', 'default', '-u', '/dev/fd/0']
- KubeStatus UPDATE 370: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=productpage', '-n', 'default', '-u', '/dev/fd/0']
- KubeStatus UPDATE 368: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=httpbin', '-n', 'default', '-u', '/dev/fd/0']
- time="2019-12-12T19:31:17Z" level=info msg="Loaded file /ambassador/envoy/envoy.json"
- [2019-12-12 19:31:17.940][367][info][main] [source/server/server.cc:249] initializing epoch 0 (hot restart version=11.104)
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:251] statically linked extensions:
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:253] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log,envoy.tcp_grpc_access_log
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:256] filters.http: envoy.buffer,envoy.cors,envoy.csrf,envoy.ext_authz,envoy.fault,envoy.filters.http.adaptive_concurrency,envoy.filters.http.dynamic_forward_proxy,envoy.filters.http.grpc_http1_reverse_bridge,envoy.filters.http.grpc_stats,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.original_src,envoy.filters.http.rbac,envoy.filters.http.tap,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:259] filters.listener: envoy.listener.http_inspector,envoy.listener.original_dst,envoy.listener.original_src,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:262] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.dubbo_proxy,envoy.filters.network.mysql_proxy,envoy.filters.network.rbac,envoy.filters.network.sni_cluster,envoy.filters.network.thrift_proxy,envoy.filters.network.zookeeper_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:264] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:266] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.tracers.datadog,envoy.tracers.opencensus,envoy.tracers.xray,envoy.zipkin
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:269] transport_sockets.downstream: envoy.transport_sockets.alts,envoy.transport_sockets.raw_buffer,envoy.transport_sockets.tap,envoy.transport_sockets.tls,raw_buffer,tls
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:272] transport_sockets.upstream: envoy.transport_sockets.alts,envoy.transport_sockets.raw_buffer,envoy.transport_sockets.tap,envoy.transport_sockets.tls,raw_buffer,tls
- [2019-12-12 19:31:17.941][367][info][main] [source/server/server.cc:278] buffer implementation: new
- [2019-12-12 19:31:17.943][367][info][main] [source/server/server.cc:344] admin address: 127.0.0.1:8001
- [2019-12-12 19:31:17.943][367][info][main] [source/server/server.cc:458] runtime: layers:
- - name: base
- static_layer:
- {}
- - name: admin
- admin_layer:
- {}
- [2019-12-12 19:31:17.943][367][info][config] [source/server/configuration_impl.cc:62] loading 0 static secret(s)
- [2019-12-12 19:31:17.943][367][info][config] [source/server/configuration_impl.cc:68] loading 1 cluster(s)
- [2019-12-12 19:31:17.944][367][info][upstream] [source/common/upstream/cluster_manager_impl.cc:157] cm init: initializing cds
- [2019-12-12 19:31:17.944][367][info][config] [source/server/configuration_impl.cc:72] loading 0 listener(s)
- [2019-12-12 19:31:17.944][367][info][config] [source/server/configuration_impl.cc:97] loading tracing configuration
- [2019-12-12 19:31:17.944][367][info][config] [source/server/configuration_impl.cc:117] loading stats sink configuration
- [2019-12-12 19:31:17.944][367][info][main] [source/server/server.cc:549] starting main dispatch loop
- [2019-12-12 19:31:17.946][367][info][upstream] [source/common/upstream/cds_api_impl.cc:67] cds: add 0 cluster(s), remove 1 cluster(s)
- [2019-12-12 19:31:17.946][367][info][upstream] [source/common/upstream/cluster_manager_impl.cc:161] cm init: all clusters initialized
- [2019-12-12 19:31:17.946][367][info][main] [source/server/server.cc:528] all clusters initialized. initializing init manager
- [2019-12-12 19:31:17.946][367][info][config] [source/server/listener_manager_impl.cc:578] all dependencies initialized. starting workers
- time="2019-12-12T19:31:17Z" level=info msg="Pushing snapshot v1"
- [2019-12-12 19:31:17.947][367][info][upstream] [source/common/upstream/cds_api_impl.cc:67] cds: add 5 cluster(s), remove 1 cluster(s)
- [2019-12-12 19:31:17.948][367][info][upstream] [source/common/upstream/cds_api_impl.cc:83] cds: add/update cluster 'cluster_127_0_0_1_8877'
- [2019-12-12 19:31:17.948][367][info][upstream] [source/common/upstream/cds_api_impl.cc:83] cds: add/update cluster 'cluster_extauth_127_0_0_1_8500'
- [2019-12-12 19:31:17.949][367][info][upstream] [source/common/upstream/cds_api_impl.cc:83] cds: add/update cluster 'cluster_httpbin_org'
- [2019-12-12 19:31:17.949][367][info][upstream] [source/common/upstream/cds_api_impl.cc:83] cds: add/update cluster 'cluster_productpage_9080'
- [2019-12-12 19:31:17.950][367][info][upstream] [source/common/upstream/cds_api_impl.cc:83] cds: add/update cluster 'cluster_127_0_0_1_8500'
- [2019-12-12 19:31:17.950][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.api.v2.listener.Filter.config' from file listener.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.952][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.network.http_connection_manager.v2.HttpFilter.config' from file http_connection_manager.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.952][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.network.http_connection_manager.v2.HttpFilter.config' from file http_connection_manager.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.952][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.accesslog.v2.AccessLog.config' from file accesslog.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.953][367][warning][config] [source/common/config/grpc_mux_subscription_impl.cc:82] gRPC config for type.googleapis.com/envoy.api.v2.Listener rejected: Error adding/updating listener(s) ambassador-listener-8080: route: unknown cluster 'cluster_productpage_9080'
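The rejection above ("unknown cluster 'cluster_productpage_9080'") is an xDS ordering artifact: the listener referencing the cluster arrived before the CDS push that created it, and the retried LDS update a few lines later succeeds. The cluster name suggests it was generated from a Mapping pointing at a `productpage` service on port 9080. A hedged sketch of such a Mapping (the `apiVersion`, `prefix`, and service port here are assumptions, not taken from this log):

```yaml
# Hypothetical Mapping that would produce an Envoy cluster named
# 'cluster_productpage_9080'; prefix and apiVersion are guesses.
apiVersion: getambassador.io/v2
kind: Mapping
metadata:
  name: productpage
  namespace: default
spec:
  prefix: /productpage/
  service: productpage:9080
```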
- [2019-12-12 19:31:17.953][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.api.v2.listener.Filter.config' from file listener.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.955][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.network.http_connection_manager.v2.HttpFilter.config' from file http_connection_manager.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.955][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.network.http_connection_manager.v2.HttpFilter.config' from file http_connection_manager.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.955][367][warning][misc] [source/common/protobuf/utility.cc:282] Using deprecated option 'envoy.config.filter.accesslog.v2.AccessLog.config' from file accesslog.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/intro/deprecated for details.
- [2019-12-12 19:31:17.956][367][info][upstream] [source/server/lds_api.cc:63] lds: add/update listener 'ambassador-listener-8080'
- Updating httpbin.default
- Updating productpage.default
- Updating ambassador-devportal-api.default
- Updating ambassador-devportal-demo.default
- Updating ambassador-devportal.default
- KubeStatus DONE 306: result ambassador-devportal-api.default: update OK
- KubeStatus DONE 306: result productpage.default: update OK
- KubeStatus DONE 306: result httpbin.default: update OK
- KubeStatus DONE 306: result ambassador-devportal-demo.default: update OK
- KubeStatus DONE 306: result ambassador-devportal.default: update OK
- 2019-12-12 19:31:18 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: Scout reports {"latest_version": "0.86.1", "application": "ambassador", "cached": false, "timestamp": 1576179078.525621}
- 2019-12-12 19:31:18 diagd aes-1.0.0-ea1-dirty [P306TAmbassadorEventWatcher] INFO: Scout notices: [{"level": "INFO", "message": "Upgrade available! to Ambassador version 0.86.1"}]
- ACCESS [2019-12-12T19:31:28.900Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "efc12b6f-2b75-4d46-9332-733198026b47" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:40.424Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "3a99ec88-8229-4bbb-9722-0621a7248e7a" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:41.774Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "c667fb7a-6abc-48fb-a185-7417a5c06d85" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:42.324Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "b0d15636-87aa-4744-a352-98a48ae9d38a" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:42.727Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "baf31fe7-dd1c-4b2d-a7ce-bb1c07623145" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:43.043Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "ea3e10da-6e66-4d30-9fac-23f0b97726c5" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:43.316Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "77a8636b-ddaa-4273-904b-0e648e9e9847" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:44.796Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "0a6a4ca9-0e9e-4a3a-aa10-0f392cc9bc83" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:45.141Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "17ace30c-0d01-4f8f-9eb1-b82b3dec6bb7" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:45.421Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "16e20637-da97-47b2-b4c4-bf097fcbc80c" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:47.737Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "562a955e-331e-4983-83fa-316c2b9d7369" "localhost:30080" "-"
- ACCESS [2019-12-12T19:31:48.178Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" "d295f7f8-4ad8-4148-bf51-45c700223700" "localhost:30080" "-"
- ACCESS [2019-12-12T19:32:57.730Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.13" "curl/7.66.0" "940f3883-4920-4024-9a6b-dfc7736835e1" "localhost:8080" "-"
- ACCESS [2019-12-12T19:34:59.442Z] "GET /docs/ HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.13" "curl/7.66.0" "79c30bce-f81b-467e-98eb-fce8bba661f5" "localhost:8080" "-"
- ACCESS [2019-12-12T19:38:47.960Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.13" "curl/7.66.0" "c69b4bc8-32aa-497a-9534-b51aaa7e8fbd" "localhost:8080" "-"
- ACCESS [2019-12-12T19:39:57.922Z] "GET /httpbin/ip HTTP/1.1" 504 UAEX 0 0 0 - "10.42.0.13" "curl/7.66.0" "343710ff-4c66-45f2-9c13-fe9a9283790d" "localhost:8080" "-"
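Every one of the ACCESS lines above carries status 504 with the `UAEX` response flag, which in Envoy's access-log vocabulary means the request was denied by (or failed at) the external authorization service — here, most likely the AuthService call to `cluster_extauth_127_0_0_1_8500` timing out. A minimal sketch of pulling those two fields out of one of these lines, using the `log_format` shown earlier in this log (the regex is an assumption about where the fields sit, not an Envoy API):

```python
import re

# One access-log line copied verbatim from the dump above.
line = ('ACCESS [2019-12-12T19:31:28.900Z] "GET /httpbin/ip HTTP/1.1" '
        '504 UAEX 0 0 0 - "10.42.0.1" "curl/7.58.0" '
        '"efc12b6f-2b75-4d46-9332-733198026b47" "localhost:30080" "-"')

# Per the configured log_format, %RESPONSE_CODE% %RESPONSE_FLAGS%
# immediately follow the quoted request line.
m = re.search(r'" (\d{3}) (\S+) ', line)
status, flags = int(m.group(1)), m.group(2)
print(status, flags)  # 504 UAEX
```

Seeing `UAEX` rather than a plain upstream timeout flag (`UT`) is the hint that the failing hop is the ext_authz filter, not the `httpbin` backend itself.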
- [2019-12-12 19:46:17.946][367][info][main] [source/server/drain_manager_impl.cc:63] shutting down parent after drain