- docker-compose up logstash
- Creating logstash ... done
- Attaching to logstash
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.elasticsearch.hosts' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.enabled' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.pipeline.id' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.elasticsearch.ssl.certificate_authority' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.elasticsearch.password' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.logstash.poll_interval' from environment.
- logstash | 2019/11/26 18:16:41 Setting 'xpack.management.elasticsearch.username' from environment.
- logstash | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
- logstash | WARNING: An illegal reflective access operation has occurred
- logstash | WARNING: Illegal reflective access by com.headius.backport9.modules.Modules (file:/usr/share/logstash/logstash-core/lib/jars/jruby-complete-9.2.8.0.jar) to field java.io.FileDescriptor.fd
- logstash | WARNING: Please consider reporting this to the maintainers of com.headius.backport9.modules.Modules
- logstash | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
- logstash | WARNING: All illegal access operations will be denied in a future release
- logstash | Thread.exclusive is deprecated, use Thread::Mutex
- logstash | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
- logstash | [2019-11-26T18:16:52,579][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
- logstash | [2019-11-26T18:16:52,591][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
- logstash | [2019-11-26T18:16:52,616][INFO ][logstash.configmanagement.bootstrapcheck] Using Elasticsearch as config store {:pipeline_id=>["test_cpm", "test", "beats"], :poll_interval=>"10000000000ns"}
- logstash | [2019-11-26T18:16:53,070][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_user:xxxxxx@192.168.178.100:9200/]}}
- logstash | [2019-11-26T18:16:53,353][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://logstash_user:xxxxxx@192.168.178.100:9200/"}
- logstash | [2019-11-26T18:16:53,390][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>7}
- logstash | [2019-11-26T18:16:53,393][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
- logstash | [2019-11-26T18:16:53,507][INFO ][logstash.configmanagement.elasticsearchsource] Configuration Management License OK
- logstash | [2019-11-26T18:16:53,719][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.2"}
- logstash | [2019-11-26T18:16:53,739][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"e7770a5e-c859-4382-8603-5b1a4568a6f0", :path=>"/usr/share/logstash/data/uuid"}
- logstash | [2019-11-26T18:16:54,189][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
- logstash | [2019-11-26T18:16:54,338][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
- logstash | [2019-11-26T18:16:54,363][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://elasticsearch:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch: Name or service not known"}
- logstash | [2019-11-26T18:16:54,384][WARN ][logstash.licensechecker.licensereader] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch {:url=>http://elasticsearch:9200/, :error_message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
- logstash | [2019-11-26T18:16:54,387][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch"}
- logstash | [2019-11-26T18:16:54,402][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
- logstash | [2019-11-26T18:16:54,535][INFO ][logstash.configmanagement.elasticsearchsource] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_user:xxxxxx@192.168.178.100:9200/]}}
- logstash | [2019-11-26T18:16:54,572][WARN ][logstash.configmanagement.elasticsearchsource] Restored connection to ES instance {:url=>"https://logstash_user:xxxxxx@192.168.178.100:9200/"}
- logstash | [2019-11-26T18:16:54,590][INFO ][logstash.configmanagement.elasticsearchsource] ES Output version determined {:es_version=>7}
- logstash | [2019-11-26T18:16:54,590][WARN ][logstash.configmanagement.elasticsearchsource] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
- logstash | [2019-11-26T18:16:55,665][INFO ][org.reflections.Reflections] Reflections took 25 ms to scan 1 urls, producing 20 keys and 40 values
- logstash | [2019-11-26T18:16:55,953][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_user:xxxxxx@192.168.178.100:9200/]}}
- logstash | [2019-11-26T18:16:55,985][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_user:xxxxxx@192.168.178.100:9200/"}
- logstash | [2019-11-26T18:16:55,994][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
- logstash | [2019-11-26T18:16:55,994][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
- logstash | [2019-11-26T18:16:56,060][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://192.168.178.100:9200"]}
- logstash | [2019-11-26T18:16:56,114][INFO ][logstash.outputs.elasticsearch] Using default mapping template
- logstash | [2019-11-26T18:16:56,155][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
- logstash | [2019-11-26T18:16:56,160][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"test", "pipeline.workers"=>1, "pipeline.batch.size"=>10, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>10, :thread=>"#<Thread:0x1bbccaf8 run>"}
- logstash | [2019-11-26T18:16:56,183][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
- logstash | [2019-11-26T18:16:56,199][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"test"}
- logstash | [2019-11-26T18:16:56,260][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:test], :non_running_pipelines=>[]}
- logstash | [2019-11-26T18:16:56,489][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
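
The "Setting '...' from environment" lines at the top are printed by the Logstash Docker image's entrypoint, which copies environment variables into logstash.yml before Logstash starts, so this container is apparently configured entirely from compose-level environment settings. A minimal docker-compose sketch that could produce a startup like the one above is below; the variable names follow the image's usual uppercase-with-underscores convention, and the password and CA path are placeholders, not values taken from this paste:

  version: "3"
  services:
    logstash:
      image: docker.elastic.co/logstash/logstash:7.4.2
      environment:
        XPACK_MANAGEMENT_ENABLED: "true"
        XPACK_MANAGEMENT_PIPELINE_ID: '["test_cpm", "test", "beats"]'
        XPACK_MANAGEMENT_LOGSTASH_POLL_INTERVAL: "10s"
        XPACK_MANAGEMENT_ELASTICSEARCH_HOSTS: "https://192.168.178.100:9200"
        XPACK_MANAGEMENT_ELASTICSEARCH_USERNAME: "logstash_user"
        XPACK_MANAGEMENT_ELASTICSEARCH_PASSWORD: "changeme"                                           # placeholder
        XPACK_MANAGEMENT_ELASTICSEARCH_SSL_CERTIFICATE_AUTHORITY: "/usr/share/logstash/config/ca.crt"  # placeholder path

The WARN/ERROR block around 18:16:54 is unrelated to pipeline management: the license checker and internal monitoring fall back to the elasticsearch defaults in the image's bundled logstash.yml, which in the stock 7.x images point monitoring at http://elasticsearch:9200, a hostname that does not resolve on this compose network. Either disable monitoring or point it explicitly at the reachable cluster, for example with something along these lines in logstash.yml or the matching XPACK_MONITORING_* environment variables (values again placeholders):

  xpack.monitoring.enabled: true
  xpack.monitoring.elasticsearch.hosts: ["https://192.168.178.100:9200"]
  xpack.monitoring.elasticsearch.username: "logstash_user"
  xpack.monitoring.elasticsearch.password: "changeme"                                           # placeholder
  xpack.monitoring.elasticsearch.ssl.certificate_authority: "/usr/share/logstash/config/ca.crt"  # placeholder path

With that sorted out, the rest of the log looks healthy: central management is licensed, and of the three requested pipeline ids only "test" starts, which typically just means nothing is stored yet under "test_cpm" or "beats" in central pipeline management.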