2021-07-23 12:43:50,595 WARN [main] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionsService.getExtensions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionsService.getExtensionVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionsService.getExtensionVersion(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionsService.getExtensionVersionLinks(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionsService.getExtension(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.users.UserService.getUsers(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.users.UserService.getUser(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.KerberosDescriptorService.getKerberosDescriptors(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.KerberosDescriptorService.getKerberosDescriptor(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostService.getHosts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostService.getHost(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.InstanceService.getInstances(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.InstanceService.getInstance(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewService.getViews(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewService.getView(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.FeedService.getFeeds(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.FeedService.getFeed(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStacks(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStack(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackLevelConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackConfigurationDependencies(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackVersion(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getServiceComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getServiceComponent(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackVersionLinks(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackLevelConfiguration(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackService(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackArtifacts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackArtifact(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceArtifacts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceThemes(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceTheme(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceQuickLinksConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceQuickLinksConfiguration(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackServiceArtifact(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getStackConfiguration(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getServiceComponentDependencies(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.StacksService.getServiceComponentDependency(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServiceComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServiceHostComponent(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServiceHostComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServiceComponent(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootServiceComponentHosts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootService(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootHosts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RootServiceService.getRootHost(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ClusterService.getClusters(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ClusterService.getCluster(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ClusterService.getClusterArtifacts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ClusterService.getClusterArtifact(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewVersionService.getVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewVersionService.getVersion(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.TargetClusterService.getTargetClusters(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.TargetClusterService.getTargetCluster(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ActionService.getActionDefinitions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ActionService.getActionDefinition(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.users.ActiveWidgetLayoutService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RequestService.getRequests(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RequestService.getRequest(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.SettingService.getSettings(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.SettingService.getSetting(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionLinksService.getExtensionLinks(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ExtensionLinksService.getExtensionLink(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewInstanceService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String) throws org.apache.ambari.server.security.authorization.AuthorizationException, should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewInstanceService.getService(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String,java.lang.String) throws org.apache.ambari.server.security.authorization.AuthorizationException, should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.BlueprintService.getBlueprints(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.BlueprintService.getBlueprint(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
2021-07-23 12:43:50,639 INFO [main] AmbariServer:581 - ********* Started Server **********
2021-07-23 12:43:50,640 INFO [main] AmbariServer:584 - Starting View Directory Watcher
2021-07-23 12:43:50,644 INFO [main] ActionManager:73 - Starting scheduler thread
2021-07-23 12:43:50,644 INFO [main] ServerActionExecutor:164 - Starting Server Action Executor thread...
2021-07-23 12:43:50,645 INFO [main] ServerActionExecutor:191 - Server Action Executor thread started.
2021-07-23 12:43:50,645 INFO [main] AmbariServer:589 - ********* Started ActionManager **********
2021-07-23 12:43:50,645 INFO [main] ExecutionScheduleManager:212 - Starting scheduler
2021-07-23 12:43:50,683 INFO [main] StdSchedulerFactory:1036 - Using ConnectionProvider class 'org.quartz.utils.C3p0PoolingConnectionProvider' for data source 'myDS'
2021-07-23 12:43:50,705 INFO [MLog-Init-Reporter] MLog:212 - MLog clients using slf4j logging.
2021-07-23 12:43:50,777 INFO [main] C3P0Registry:212 - Initializing c3p0-0.9.5.4 [built 23-March-2019 23:00:48 -0700; debug? true; trace: 10]
2021-07-23 12:43:50,827 INFO [main] StdSchedulerFactory:1220 - Using default implementation for ThreadExecutor
2021-07-23 12:43:50,843 INFO [main] SchedulerSignalerImpl:61 - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2021-07-23 12:43:50,843 INFO [main] QuartzScheduler:229 - Quartz Scheduler v.2.3.2 created.
2021-07-23 12:43:50,844 INFO [main] JobStoreTX:675 - Using thread monitor-based data access locking (synchronization).
2021-07-23 12:43:50,845 INFO [main] JobStoreTX:59 - JobStoreTX initialized.
2021-07-23 12:43:50,845 INFO [main] QuartzScheduler:294 - Scheduler meta-data: Quartz Scheduler (v2.3.2) 'ExecutionScheduler' with instanceId 'NON_CLUSTERED'
Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 5 threads.
Using job-store 'org.quartz.impl.jdbcjobstore.JobStoreTX' - which supports persistence. and is not clustered.

2021-07-23 12:43:50,845 INFO [main] StdSchedulerFactory:1374 - Quartz scheduler 'ExecutionScheduler' initialized from an externally provided properties instance.
2021-07-23 12:43:50,846 INFO [main] StdSchedulerFactory:1378 - Quartz scheduler version: 2.3.2
2021-07-23 12:43:50,846 INFO [main] QuartzScheduler:2293 - JobFactory set to: org.apache.ambari.server.state.scheduler.GuiceJobFactory@19203ff3
2021-07-23 12:43:50,846 INFO [main] AmbariServer:592 - ********* Started Scheduled Request Manager **********
2021-07-23 12:43:50,851 INFO [main] MetricsRetrievalService:229 - Initializing the Metrics Retrieval Service with core=8, max=16, workerQueue=160, threadPriority=5
2021-07-23 12:43:50,851 INFO [main] MetricsRetrievalService:234 - Metrics Retrieval Service request TTL cache is enabled and set to 5 seconds
2021-07-23 12:43:50,851 INFO [RetryUpgradeActionService STARTING] RetryUpgradeActionService:122 - Will not start service RetryUpgradeActionService used to auto-retry failed actions during Stack Upgrade since since the property stack.upgrade.auto.retry.timeout.mins is either invalid/missing or set to 0
2021-07-23 12:43:50,851 INFO [main] AmbariServer:595 - ********* Started Services **********
2021-07-23 12:43:50,852 INFO [main] MetricsServiceImpl:51 - ********* Initializing AmbariServer Metrics Service **********
2021-07-23 12:43:50,887 INFO [AmbariServerAlertService STARTING] AmbariServerAlertService:258 - Scheduled server alert ambari_server_agent_heartbeat to run every 2 minutes
2021-07-23 12:43:50,890 INFO [AmbariServerAlertService STARTING] AmbariServerAlertService:258 - Scheduled server alert ambari_server_performance to run every 5 minutes
2021-07-23 12:43:50,900 INFO [AmbariServerAlertService STARTING] AmbariServerAlertService:258 - Scheduled server alert ambari_server_component_version to run every 5 minutes
2021-07-23 12:43:50,904 INFO [AmbariServerAlertService STARTING] AmbariServerAlertService:258 - Scheduled server alert ambari_server_stale_alerts to run every 5 minutes
2021-07-23 12:43:51,012 INFO [main] MetricsServiceImpl:84 - ********* Configuring Metric Sink **********
2021-07-23 12:43:51,031 INFO [main] AmbariMetricSinkImpl:187 - Hostname used for ambari server metrics : np-dev1-hdp315-namenode-01.DOMAIN.COM
2021-07-23 12:43:51,035 INFO [main] AmbariMetricSinkImpl:207 - Metric Sink initialized with collectorHosts : [datanodeFQDN.DOMAIN.COM]
2021-07-23 12:43:51,035 INFO [main] MetricsServiceImpl:91 - ********* Configuring Metric Sources **********
2021-07-23 12:43:51,048 INFO [main] JvmMetricsSource:63 - Initialized JVM Metrics source...
2021-07-23 12:43:51,048 INFO [main] JvmMetricsSource:80 - Started JVM Metrics source...
2021-07-23 12:43:51,049 INFO [main] StompEventsMetricsSource:61 - Starting stomp events source...
2021-07-23 12:43:54,690 INFO [ambari-client-thread-194] AnnotationSizeOfFilter:53 - Using regular expression provided through VM argument net.sf.ehcache.pool.sizeof.ignore.pattern for IgnoreSizeOf annotation : ^.*cache\..*IgnoreSizeOf$
2021-07-23 12:43:54,695 INFO [ambari-client-thread-194] AgentLoader:88 - Located valid 'tools.jar' at '/usr/jdk64/jdk1.8.0_112/jre/../lib/tools.jar'
2021-07-23 12:43:54,705 INFO [ambari-client-thread-194] JvmInformation:446 - Detected JVM data model settings of: 64-Bit HotSpot JVM with Compressed OOPs and Concurrent Mark-and-Sweep GC
2021-07-23 12:43:54,942 INFO [ambari-client-thread-194] AgentLoader:198 - Extracted agent jar to temporary file /tmp/ehcache-sizeof-agent2404571265536762794.jar
2021-07-23 12:43:54,942 INFO [ambari-client-thread-194] AgentLoader:138 - Trying to load agent @ /tmp/ehcache-sizeof-agent2404571265536762794.jar
2021-07-23 12:43:54,947 INFO [ambari-client-thread-194] DefaultSizeOfEngine:111 - using Agent sizeof engine
2021-07-23 12:43:54,959 INFO [ambari-client-thread-194] TimelineMetricsCacheSizeOfEngine:70 - Creating custom sizeof engine for TimelineMetrics.
2021-07-23 12:43:54,975 INFO [ambari-client-thread-194] TimelineMetricCacheProvider:84 - Creating Metrics Cache with timeouts => ttl = 3600, idle = 1800
2021-07-23 12:43:55,029 INFO [ambari-client-thread-194] TimelineMetricCacheProvider:95 - Registering metrics cache with provider: name = timelineMetricCache, guid: np-dev1-hdp315-namenode-01/10.120.8.123-eb471f64-1bf6-4fdd-be61-3e8f022190c2
2021-07-23 12:43:55,111 INFO [ambari-client-thread-194] MetricsCollectorHAManager:63 - Adding collector host : datanodeFQDN.DOMAIN.COM to cluster : hdpcluster
2021-07-23 12:43:55,115 INFO [ambari-client-thread-194] MetricsCollectorHAClusterState:81 - Refreshing collector host, current collector host : null
2021-07-23 12:43:55,117 INFO [ambari-client-thread-194] MetricsCollectorHAClusterState:102 - After refresh, new collector host : datanodeFQDN.DOMAIN.COM
2021-07-23 12:43:55,641 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/hostcomponents and id = sub-0
2021-07-23 12:43:55,665 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/alerts and id = sub-1
2021-07-23 12:43:55,665 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/ui_topologies and id = sub-2
2021-07-23 12:43:55,666 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/configs and id = sub-3
2021-07-23 12:43:55,666 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/services and id = sub-4
2021-07-23 12:43:55,667 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/hosts and id = sub-5
2021-07-23 12:43:55,667 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/alert_definitions and id = sub-6
2021-07-23 12:43:55,668 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/alert_group and id = sub-7
2021-07-23 12:43:55,668 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/upgrade and id = sub-8
2021-07-23 12:43:55,730 INFO [ambari-client-thread-196] TopologyManager:1036 - TopologyManager.replayRequests: Entering
2021-07-23 12:43:55,730 INFO [ambari-client-thread-196] TopologyManager:1090 - TopologyManager.replayRequests: Exit
2021-07-23 12:43:55,772 WARN [ambari-client-thread-219] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getService(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getArtifacts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getArtifact(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 12:43:55,772 WARN [ambari-client-thread-219] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getServices(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getService(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getArtifacts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceService.getArtifact(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 12:43:55,773 INFO [ambari-client-thread-249] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/requests and id = sub-9
2021-07-23 12:43:56,001 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ComponentService.getComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ComponentService.getComponent(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 12:43:56,001 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ComponentService.getComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ComponentService.getComponent(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 12:43:56,019 INFO [ambari-client-thread-249] AMSPropertyProvider:626 - METRICS_COLLECTOR host is not live. Skip populating resources with metrics, next message will be logged after 1000 attempts.
2021-07-23 12:43:56,871 INFO [agent-register-processor-0] HeartBeatHandler:321 - agentOsType = centos7
2021-07-23 12:43:56,873 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  163. 2021-07-23 12:43:56,877 INFO [agent-register-processor-0] HostImpl:343 - Received host registration, host=[hostname=datanodeFQDN,fqdn=datanodeFQDN.DOMAIN.COM,domain=DOMAIN.COM,architecture=x86_64,processorcount=4,physicalprocessorcount=4,osname=centos,osversion=7.9.2009,osfamily=redhat,memory=16266536,uptime_hours=16,mounts=(available=17333772,mountpoint=/,used=32412424,percent=66%,size=49746196,device=/dev/mapper/centos-root,type=xfs)]
  164. , registrationTime=1627058636870, agentVersion=2.7.5.0
  165. 2021-07-23 12:43:56,877 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  166. 2021-07-23 12:43:56,877 INFO [agent-register-processor-0] TopologyManager:665 - TopologyManager.onHostRegistered: Entering
  167. 2021-07-23 12:43:56,877 INFO [agent-register-processor-0] TopologyManager:667 - TopologyManager.onHostRegistered: host = datanodeFQDN.DOMAIN.COM is already associated with the cluster or is currently being processed
  168. 2021-07-23 12:43:56,890 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  169. 2021-07-23 12:43:57,086 INFO [clientInboundChannel-9] Configuration:3197 - Ambari properties config file changed.
  170. 2021-07-23 12:43:57,086 INFO [clientInboundChannel-9] Configuration:3226 - Ambari properties config file changed.
  171. 2021-07-23 12:43:57,673 INFO [ambari-client-thread-219] AMSReportPropertyProvider:150 - METRICS_COLLECTOR host is not live. Skip populating resources with metrics, next message will be logged after 1000 attempts.
  172. 2021-07-23 12:44:00,078 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
  173. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.getConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
  174. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.getConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  175. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.createConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  176. 2021-07-23 12:44:00,078 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
  177. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.getConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
  178. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.getConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  179. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ConfigurationService.createConfigurations(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  180. 2021-07-23 12:44:00,080 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
  181. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceConfigVersionService.getServiceConfigVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
  182. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceConfigVersionService.getServiceConfigVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  183. 2021-07-23 12:44:00,080 WARN [ambari-client-thread-195] Errors:173 - The following warnings have been detected with resource and/or provider classes:
  184. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceConfigVersionService.getServiceConfigVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
  185. WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.ServiceConfigVersionService.getServiceConfigVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), with URI template, "", is treated as a resource method
  186. 2021-07-23 12:44:07,996 INFO [agent-register-processor-1] HeartBeatHandler:321 - agentOsType = centos7
  187. 2021-07-23 12:44:07,998 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  188. 2021-07-23 12:44:07,999 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  189. 2021-07-23 12:44:08,001 INFO [agent-register-processor-1] HostImpl:343 - Received host registration, host=[hostname=datanodeFQDN,fqdn=datanodeFQDN.DOMAIN.COM,domain=DOMAIN.COM,architecture=x86_64,processorcount=4,physicalprocessorcount=4,osname=centos,osversion=7.9.2009,osfamily=redhat,memory=16266536,uptime_hours=16,mounts=(available=17333772,mountpoint=/,used=32412424,percent=66%,size=49746196,device=/dev/mapper/centos-root,type=xfs)]
  190. , registrationTime=1627058647996, agentVersion=2.7.5.0
  191. 2021-07-23 12:44:08,001 INFO [agent-register-processor-1] TopologyManager:665 - TopologyManager.onHostRegistered: Entering
  192. 2021-07-23 12:44:08,001 INFO [agent-register-processor-1] TopologyManager:667 - TopologyManager.onHostRegistered: host = datanodeFQDN.DOMAIN.COM is already associated with the cluster or is currently being processed
  193. 2021-07-23 12:44:08,003 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  194. 2021-07-23 12:44:08,152 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component RESOURCEMANAGER of service YARN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  195. 2021-07-23 12:44:08,153 WARN [agent-report-processor-0] HeartbeatProcessor:679 - Received a live status update for a non-initialized service, clusterId=2, serviceName=KERBEROS
  196. 2021-07-23 12:44:08,155 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ZEPPELIN_MASTER of service ZEPPELIN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  197. 2021-07-23 12:44:08,156 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component MYSQL_SERVER of service HIVE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  198. 2021-07-23 12:44:08,158 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  199. 2021-07-23 12:44:08,161 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component METRICS_GRAFANA of service AMBARI_METRICS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  200. 2021-07-23 12:44:08,163 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component NAMENODE of service HDFS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  201. 2021-07-23 12:44:08,164 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component METRICS_MONITOR of service AMBARI_METRICS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  202. 2021-07-23 12:44:08,167 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component NODEMANAGER of service YARN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  203. 2021-07-23 12:44:08,168 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component YARN_REGISTRY_DNS of service YARN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  204. 2021-07-23 12:44:08,169 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HISTORYSERVER of service MAPREDUCE2 of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  205. 2021-07-23 12:44:08,171 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component SPARK2_JOBHISTORYSERVER of service SPARK2 of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  206. 2021-07-23 12:44:08,172 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component INFRA_SOLR of service AMBARI_INFRA_SOLR of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  207. 2021-07-23 12:44:08,173 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component APP_TIMELINE_SERVER of service YARN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  208. 2021-07-23 12:44:08,175 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HBASE_MASTER of service HBASE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  209. 2021-07-23 12:44:08,177 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ACTIVITY_ANALYZER of service SMARTSENSE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  210. 2021-07-23 12:44:08,178 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component SECONDARY_NAMENODE of service HDFS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  211. 2021-07-23 12:44:08,179 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HIVE_SERVER of service HIVE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  212. 2021-07-23 12:44:08,180 WARN [agent-report-processor-0] HeartbeatProcessor:679 - Received a live status update for a non-initialized service, clusterId=2, serviceName=KERBEROS
  213. 2021-07-23 12:44:08,181 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HBASE_REGIONSERVER of service HBASE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  214. 2021-07-23 12:44:08,182 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component METRICS_COLLECTOR of service AMBARI_METRICS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  215. 2021-07-23 12:44:08,183 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HST_AGENT of service SMARTSENSE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  216. 2021-07-23 12:44:08,185 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HST_SERVER of service SMARTSENSE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  217. 2021-07-23 12:44:08,186 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component KAFKA_BROKER of service KAFKA of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  218. 2021-07-23 12:44:08,187 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ATLAS_SERVER of service ATLAS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  219. 2021-07-23 12:44:08,189 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ZOOKEEPER_SERVER of service ZOOKEEPER of cluster 2 has changed from UNKNOWN to STARTED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  220. 2021-07-23 12:44:08,191 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component HIVE_METASTORE of service HIVE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  221. 2021-07-23 12:44:08,192 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component DATANODE of service HDFS of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  222. 2021-07-23 12:44:08,193 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ACTIVITY_EXPLORER of service SMARTSENSE of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  223. 2021-07-23 12:44:08,195 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component TIMELINE_READER of service YARN of cluster 2 has changed from UNKNOWN to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
  224. 2021-07-23 12:44:08,197 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
  225. 2021-07-23 12:44:10,994 ERROR [alert-event-bus-1] AmbariJpaLocalTxnInterceptor:180 - [DETAILED ERROR] Rollback reason:
  226. Local Exception Stack:
  227. Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
  228. Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  229. Error Code: 0
  230. Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
  231. bind => [4 parameters bound]
  232. at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
  233. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
  234. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:900)
  235. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
  236. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
  237. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
  238. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
  239. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
  240. at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
  241. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
  242. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
  243. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
  244. at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
  245. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
  246. at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
  247. at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
  248. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
  249. at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
  250. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
  251. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
  252. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
  253. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
  254. at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
  255. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
  256. at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
  257. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
  258. at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
  259. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
  260. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  261. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  262. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  263. at java.lang.reflect.Method.invoke(Method.java:498)
  264. at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
  265. at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
  266. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  267. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  268. at java.lang.Thread.run(Thread.java:745)
  269. Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  270. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
  271. at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
  272. at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
  273. at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
  274. at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
  275. at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
  276. at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
  277. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
  278. ... 34 more
  279. 2021-07-23 12:44:10,995 ERROR [alert-event-bus-1] AmbariJpaLocalTxnInterceptor:188 - [DETAILED ERROR] Internal exception (1) :
  280. org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  281. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
  282. at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
  283. at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
  284. at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
  285. at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
  286. at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
  287. at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
  288. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
  289. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
  290. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
  291. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
  292. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
  293. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
  294. at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
  295. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
  296. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
  297. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
  298. at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
  299. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
  300. at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
  301. at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
  302. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
  303. at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
  304. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
  305. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
  306. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
  307. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
  308. at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
  309. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
  310. at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
  311. at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
  312. at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
  313. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
  314. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  315. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  316. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  317. at java.lang.reflect.Method.invoke(Method.java:498)
  318. at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
  319. at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
  320. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  321. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  322. at java.lang.Thread.run(Thread.java:745)
  323. 2021-07-23 12:44:10,996 ERROR [alert-event-bus-1] default:232 - Exception thrown by subscriber method onAlertEvent(org.apache.ambari.server.events.AlertReceivedEvent) on subscriber org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173@6d229b1c when dispatching event: AlertReceivedEvent{cluserId=0, alerts=[{clusterId=2, state=CRITICAL, name=hive_server_process, service=HIVE, component=HIVE_SERVER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
  324. File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
  325. ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
  326. File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
  327. timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
  328. File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
  329. self.env.run()
  330. File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
  331. self.run_action(resource, action)
  332. File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
  333. provider_action()
  334. File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
  335. returns=self.resource.returns)
  336. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
  337. result = function(command, **kwargs)
  338. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
  339. tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  340. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
  341. result = _call(command, **kwargs_copy)
  342. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
  343. raise ExecutionFailed(err_msg, code, out, err)
  344. ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']
  345.  
  346. Did you install PySpark via a package manager such as pip or Conda? If so,
  347. PySpark was not found in your Python environment. It is possible your
  348. Python environment does not properly bind with your package manager.
  349.  
  350. Please check your default 'python' and if you set PYSPARK_PYTHON and/or
  351. PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
  352. PySpark, for example, 'python -c 'import pyspark'.
  353.  
  354. If you cannot import, you can install by using the Python executable directly,
  355. for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
  356. explicitly set the Python executable, that has PySpark installed, to
  357. PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
  358. 'PYSPARK_PYTHON=python3 pyspark'.
  359.  
  360. Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  361. 21/07/23 12:43:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  362. 21/07/23 12:43:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  363. 21/07/23 12:43:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  364. 21/07/23 12:43:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  365. 21/07/23 12:43:38 INFO HiveConnection: Transport Used for JDBC connection: binary
  366. Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
  367. 21/07/23 12:43:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  368. 21/07/23 12:43:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  369. 21/07/23 12:43:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  370. 21/07/23 12:43:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  371. 21/07/23 12:43:38 INFO HiveConnection: Transport Used for JDBC connection: binary
  372. No current connection
  373. 21/07/23 12:43:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  374. 21/07/23 12:43:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  375. 21/07/23 12:43:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  376. 21/07/23 12:43:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  377. 21/07/23 12:43:38 INFO HiveConnection: Transport Used for JDBC connection: binary
  378. Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
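The repeated `java.net.ConnectException: Connection refused` above means the host resolved fine but nothing was listening on HiveServer2's binary port (10000) at the time of the check. A minimal sketch (Python; the FQDN is the placeholder from the log, not a real host) that distinguishes "port closed" from "host unreachable/filtered" before digging into HiveServer2 logs:

```python
import socket

def probe(host, port, timeout=5.0):
    """Classify a TCP endpoint: 'open', 'refused', or 'unreachable'.
    'refused' matches the Beeline error above: the host is up,
    but no process is bound to the port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except ConnectionRefusedError:
        return "refused"
    except OSError:
        # DNS failure, timeout, or a firewall silently dropping packets
        return "unreachable"
```

`probe("datanodeFQDN.DOMAIN.COM", 10000)` returning `"refused"` reproduces the log's symptom; the usual causes are HiveServer2 not yet started (consistent with the service-start sequence later in this log) or bound to a different port/transportMode.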
  379.  
  380. )'}]}
  381. javax.persistence.RollbackException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
  382. Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  383. Error Code: 0
  384. Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
  385. bind => [4 parameters bound]
  386. at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:159)
  387. at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
  388. at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
  389. at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
  390. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  391. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  392. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  393. at java.lang.reflect.Method.invoke(Method.java:498)
  394. at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
  395. at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
  396. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  397. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  398. at java.lang.Thread.run(Thread.java:745)
  399. Caused by: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
  400. Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  401. Error Code: 0
  402. Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
  403. bind => [4 parameters bound]
  404. at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
  405. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
  406. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:900)
  407. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
  408. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
  409. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
  410. at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
  411. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
  412. at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
  413. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
  414. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
  415. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
  416. at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
  417. at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
  418. at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
  419. ... 12 more
  420. Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  421. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
  422. at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
  423. at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
  424. at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
  425. at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
  426. at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
  427. at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
  428. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
  429. ... 24 more
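The RollbackException above is PostgreSQL rejecting a NUL byte (0x00) in one of the bound parameters: UTF-8 `text`/`varchar` columns in Postgres cannot store 0x00, so the alert text being written to `alert_current.latest_text` (here, captured Beeline output) has to be sanitized before the UPDATE. A minimal sketch of that sanitization, assuming plain Python strings for illustration (Ambari itself does the equivalent in Java; the parameter values below are hypothetical stand-ins for the log's `[4 parameters bound]`):

```python
def sanitize_pg_text(value):
    """Strip NUL bytes, which PostgreSQL text columns reject with:
    ERROR: invalid byte sequence for encoding "UTF8": 0x00."""
    if isinstance(value, str):
        return value.replace("\x00", "")
    return value

# Hypothetical bind parameters mirroring the failing statement:
# UPDATE alert_current SET latest_timestamp = ?, latest_text = ?,
#        occurrences = ? WHERE (alert_id = ?)
params = [1627038230595, "Connection refused\x00", 3, 42]
clean = [sanitize_pg_text(p) for p in params]
```

With the NUL stripped, the same UPDATE commits normally; only string parameters are touched, so timestamps and counters pass through unchanged.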
  430. 2021-07-23 12:44:45,665 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=KAFKA, request=clusterName=hdpcluster, serviceName=KAFKA, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  431. 2021-07-23 12:44:45,667 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=ZOOKEEPER, request=clusterName=hdpcluster, serviceName=ZOOKEEPER, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  432. 2021-07-23 12:44:45,668 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=AMBARI_INFRA_SOLR, request=clusterName=hdpcluster, serviceName=AMBARI_INFRA_SOLR, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  433. 2021-07-23 12:44:45,669 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=HIVE, request=clusterName=hdpcluster, serviceName=HIVE, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  434. 2021-07-23 12:44:45,669 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=ZEPPELIN, request=clusterName=hdpcluster, serviceName=ZEPPELIN, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  435. 2021-07-23 12:44:45,670 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=ATLAS, request=clusterName=hdpcluster, serviceName=ATLAS, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  436. 2021-07-23 12:44:45,670 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=HBASE, request=clusterName=hdpcluster, serviceName=HBASE, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  437. 2021-07-23 12:44:45,671 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=AMBARI_METRICS, request=clusterName=hdpcluster, serviceName=AMBARI_METRICS, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  438. 2021-07-23 12:44:45,672 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=HDFS, request=clusterName=hdpcluster, serviceName=HDFS, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  439. 2021-07-23 12:44:45,672 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=SPARK2, request=clusterName=hdpcluster, serviceName=SPARK2, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  440. 2021-07-23 12:44:45,673 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=MAPREDUCE2, request=clusterName=hdpcluster, serviceName=MAPREDUCE2, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  441. 2021-07-23 12:44:45,673 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=TEZ, request=clusterName=hdpcluster, serviceName=TEZ, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  442. 2021-07-23 12:44:45,674 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=YARN, request=clusterName=hdpcluster, serviceName=YARN, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  443. 2021-07-23 12:44:45,674 INFO [ambari-client-thread-199] ServiceResourceProvider:646 - Received a updateService request, clusterName=hdpcluster, serviceName=SMARTSENSE, request=clusterName=hdpcluster, serviceName=SMARTSENSE, desiredState=STARTED, credentialStoreEnabled=null, credentialStoreSupported=null
  444. 2021-07-23 12:44:45,678 INFO [ambari-client-thread-199] AmbariManagementControllerImpl:2363 - Client hosts for reinstall : 9
  445. 2021-07-23 12:44:45,898 INFO [ambari-client-thread-199] RoleGraph:175 - Detecting cycle graphs
  446. 2021-07-23 12:44:45,900 INFO [ambari-client-thread-199] RoleGraph:176 - Graph:
  447. (ACTIVITY_ANALYZER, START, 14)
  448. (ACTIVITY_EXPLORER, START, 14)
  449. (APP_TIMELINE_SERVER, START, 12)
  450. (ATLAS_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  451. (ATLAS_SERVER, START, 15)
  452. (DATANODE, START, 10) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12)
  453. (HBASE_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  454. (HBASE_MASTER, START, 12) --> (ATLAS_SERVER, START, 15) --> (HBASE_REGIONSERVER, START, 13)
  455. (HBASE_REGIONSERVER, START, 13) --> (ATLAS_SERVER, START, 15)
  456. (HDFS_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  457. (HISTORYSERVER, START, 12)
  458. (HIVE_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  459. (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (SPARK2_JOBHISTORYSERVER, START, 15)
  460. (HIVE_SERVER, START, 16)
  461. (HST_AGENT, START, 10)
  462. (HST_SERVER, START, 9) --> (HST_AGENT, START, 10)
  463. (INFRA_SOLR, START, 9) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (ZEPPELIN_MASTER, START, 11)
  464. (INFRA_SOLR_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  465. (KAFKA_BROKER, START, 11) --> (ATLAS_SERVER, START, 15)
  466. (MAPREDUCE2_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  467. (METRICS_COLLECTOR, START, 13) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (METRICS_GRAFANA, START, 14)
  468. (METRICS_GRAFANA, START, 14)
  469. (METRICS_MONITOR, START, 9)
  470. (MYSQL_SERVER, START, 9) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (SPARK2_JOBHISTORYSERVER, START, 15)
  471. (NAMENODE, START, 10) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (ZEPPELIN_MASTER, START, 11)
  472. (NODEMANAGER, START, 13) --> (HIVE_SERVER, START, 16)
  473. (RESOURCEMANAGER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (NODEMANAGER, START, 13) --> (SPARK2_JOBHISTORYSERVER, START, 15)
  474. (SECONDARY_NAMENODE, START, 11) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14)
  475. (SPARK2_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  476. (SPARK2_JOBHISTORYSERVER, START, 15)
  477. (TEZ_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  478. (TIMELINE_READER, START, 12)
  479. (YARN_CLIENT, INSTALL, 0) --> (ACTIVITY_ANALYZER, START, 14) --> (ACTIVITY_EXPLORER, START, 14) --> (APP_TIMELINE_SERVER, START, 12) --> (ATLAS_SERVER, START, 15) --> (DATANODE, START, 10) --> (HBASE_MASTER, START, 12) --> (HBASE_REGIONSERVER, START, 13) --> (HISTORYSERVER, START, 12) --> (HIVE_METASTORE, START, 14) --> (HIVE_SERVER, START, 16) --> (HST_AGENT, START, 10) --> (HST_SERVER, START, 9) --> (INFRA_SOLR, START, 9) --> (KAFKA_BROKER, START, 11) --> (METRICS_COLLECTOR, START, 13) --> (METRICS_GRAFANA, START, 14) --> (METRICS_MONITOR, START, 9) --> (MYSQL_SERVER, START, 9) --> (NAMENODE, START, 10) --> (NODEMANAGER, START, 13) --> (RESOURCEMANAGER, START, 12) --> (SECONDARY_NAMENODE, START, 11) --> (SPARK2_JOBHISTORYSERVER, START, 15) --> (TIMELINE_READER, START, 12) --> (YARN_REGISTRY_DNS, START, 9) --> (ZEPPELIN_MASTER, START, 11)
  480. (YARN_REGISTRY_DNS, START, 9)
  481. (ZEPPELIN_MASTER, START, 11)
  482.  
  483. 2021-07-23 12:44:46,161 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role ATLAS_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 352
  484. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HBASE_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 353
  485. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HDFS_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 354
  486. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HIVE_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 355
  487. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role INFRA_SOLR_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 356
  488. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role MAPREDUCE2_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 357
  489. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role SPARK2_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 358
  490. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role TEZ_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 359
  491. 2021-07-23 12:44:46,162 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role YARN_CLIENT, roleCommand INSTALL, and command ID 23-0, task ID 360
  492. 2021-07-23 12:44:46,224 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 0
  493. 2021-07-23 12:44:46,248 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 0
  494. 2021-07-23 12:44:47,984 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 1 total, 0 closed abnormally (0 connect failure, 0 send limit, 0 transport error)], stompSubProtocol[processed CONNECT(1)-CONNECTED(1)-DISCONNECT(0)], stompBrokerRelay[null], inboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 48], outboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 24], sockJsScheduler[pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
  495. 2021-07-23 12:44:48,117 WARN [ambari-client-thread-219] TaskResourceProvider:271 - Unable to parse task structured output: /var/lib/ambari-agent/data/structured-out-352.json
  496. 2021-07-23 12:44:48,314 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 2 total, 0 closed abnormally (0 connect failure, 0 send limit, 1 transport error)], stompSubProtocol[processed CONNECT(2)-CONNECTED(2)-DISCONNECT(1)], stompBrokerRelay[null], inboundChannel[pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 126], outboundChannel[pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 28], sockJsScheduler[pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 0]
  497. 2021-07-23 12:45:01,079 INFO [pool-32-thread-1] AmbariMetricSinkImpl:291 - No live collector to send metrics to. Metrics to be sent will be discarded. This message will be skipped for the next 20 times.
  498. 2021-07-23 12:45:23,644 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=INFRA_SOLR, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  499. 2021-07-23 12:45:23,645 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=YARN_REGISTRY_DNS, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  500. 2021-07-23 12:45:23,645 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=MYSQL_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  501. 2021-07-23 12:45:23,645 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_MONITOR, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  502. 2021-07-23 12:45:23,645 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HST_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  503. 2021-07-23 12:45:23,652 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HST_SERVER, roleCommand START, and command ID 23-1, task ID 361
  504. 2021-07-23 12:45:23,652 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role INFRA_SOLR, roleCommand START, and command ID 23-1, task ID 362
  505. 2021-07-23 12:45:23,652 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role METRICS_MONITOR, roleCommand START, and command ID 23-1, task ID 363
  506. 2021-07-23 12:45:23,652 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role MYSQL_SERVER, roleCommand START, and command ID 23-1, task ID 364
  507. 2021-07-23 12:45:23,652 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role YARN_REGISTRY_DNS, roleCommand START, and command ID 23-1, task ID 365
  508. 2021-07-23 12:45:23,653 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 1
  509. 2021-07-23 12:45:23,654 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 1
  510. 2021-07-23 12:45:45,337 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HST_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
  511. 2021-07-23 12:45:50,878 INFO [Thread-21] AbstractPoolBackedDataSource:212 - Initializing c3p0 pool... com.mchange.v2.c3p0.ComboPooledDataSource [ acquireIncrement -> 3, acquireRetryAttempts -> 30, acquireRetryDelay -> 1000, autoCommitOnClose -> false, automaticTestTable -> null, breakAfterAcquireFailure -> false, checkoutTimeout -> 0, connectionCustomizerClassName -> null, connectionTesterClassName -> com.mchange.v2.c3p0.impl.DefaultConnectionTester, contextClassLoaderSource -> caller, dataSourceName -> 2wkjnfai1mw4fnvtsxf44|466fd19b, debugUnreturnedConnectionStackTraces -> false, description -> null, driverClass -> org.postgresql.Driver, extensions -> {}, factoryClassLocation -> null, forceIgnoreUnresolvedTransactions -> false, forceSynchronousCheckins -> false, forceUseNamedDriverClass -> false, identityToken -> 2wkjnfai1mw4fnvtsxf44|466fd19b, idleConnectionTestPeriod -> 50, initialPoolSize -> 3, jdbcUrl -> jdbc:postgresql://localhost/ambari, maxAdministrativeTaskTime -> 0, maxConnectionAge -> 0, maxIdleTime -> 0, maxIdleTimeExcessConnections -> 0, maxPoolSize -> 5, maxStatements -> 0, maxStatementsPerConnection -> 120, minPoolSize -> 1, numHelperThreads -> 3, preferredTestQuery -> select 0, privilegeSpawnedThreads -> false, properties -> {user=******, password=******}, propertyCycle -> 0, statementCacheNumDeferredCloseThreads -> 0, testConnectionOnCheckin -> true, testConnectionOnCheckout -> false, unreturnedConnectionTimeout -> 0, userOverrides -> {}, usesTraditionalReflectiveProxies -> false ]
  512. 2021-07-23 12:45:50,958 INFO [Thread-21] JobStoreTX:866 - Freed 0 triggers from 'acquired' / 'blocked' state.
  513. 2021-07-23 12:45:50,967 INFO [Thread-21] JobStoreTX:876 - Recovering 0 jobs that were in-progress at the time of the last shut-down.
  514. 2021-07-23 12:45:50,968 INFO [Thread-21] JobStoreTX:889 - Recovery complete.
2021-07-23 12:45:50,968 INFO [Thread-21] JobStoreTX:896 - Removed 0 'complete' triggers.
2021-07-23 12:45:50,969 INFO [Thread-21] JobStoreTX:901 - Removed 0 stale fired job entries.
2021-07-23 12:45:50,971 INFO [Thread-21] QuartzScheduler:547 - Scheduler ExecutionScheduler_$_NON_CLUSTERED started.
2021-07-23 12:45:54,331 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=INFRA_SOLR, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:45:58,325 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_MONITOR, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:46:06,573 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=MYSQL_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:46:11,518 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=YARN_REGISTRY_DNS, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:46:12,082 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=DATANODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:46:12,083 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=NAMENODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:46:12,083 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HST_AGENT, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:46:12,087 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role DATANODE, roleCommand START, and command ID 23-2, task ID 366
2021-07-23 12:46:12,087 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HST_AGENT, roleCommand START, and command ID 23-2, task ID 367
2021-07-23 12:46:12,087 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role NAMENODE, roleCommand START, and command ID 23-2, task ID 368
2021-07-23 12:46:12,088 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 2
2021-07-23 12:46:12,093 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 2
2021-07-23 12:46:16,146 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=DATANODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:46:21,327 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HST_AGENT, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:46:34,516 INFO [ambari-client-thread-198] NamedTasksSubscriptions:72 - Task subscription was added for sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, taskId = 368, id = sub-10
2021-07-23 12:46:34,516 INFO [ambari-client-thread-198] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5, destination = /events/tasks/368 and id = sub-10
2021-07-23 12:46:34,518 WARN [ambari-client-thread-249] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.TaskService.getComponents(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.TaskService.getTask(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
2021-07-23 12:46:41,054 ERROR [alert-event-bus-2] AmbariJpaLocalTxnInterceptor:180 - [DETAILED ERROR] Rollback reason:
Local Exception Stack:
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
	bind => [4 parameters bound]
	at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:900)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
	at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
	at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
	at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
	at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
	at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
	at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
	at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
	at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
	at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
	at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
	... 34 more
2021-07-23 12:46:41,055 ERROR [alert-event-bus-2] AmbariJpaLocalTxnInterceptor:188 - [DETAILED ERROR] Internal exception (1) :
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
	at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
	at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
	at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
	at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
	at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
	at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
	at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
	at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
	at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
	at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
	at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
2021-07-23 12:46:41,056 ERROR [alert-event-bus-2] default:232 - Exception thrown by subscriber method onAlertEvent(org.apache.ambari.server.events.AlertReceivedEvent) on subscriber org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173@6d229b1c when dispatching event: AlertReceivedEvent{cluserId=0, alerts=[{clusterId=2, state=OK, name=namenode_webui, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HTTP 200 response in 0.000s'}, {clusterId=2, state=OK, name=upgrade_finalized_state, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HDFS cluster is not in the upgrade state'}, {clusterId=2, state=CRITICAL, name=hive_server_process, service=HIVE, component=HIVE_SERVER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
    ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
    timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:46:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:46:38 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:46:38 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:46:38 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)'}, {clusterId=2, state=OK, name=datanode_process, service=HDFS, component=DATANODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='TCP OK - 0.000s response on port 50010'}, {clusterId=2, state=OK, name=infra_solr, service=AMBARI_INFRA_SOLR, component=INFRA_SOLR, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HTTP 200 response in 0.000s'}, {clusterId=2, state=OK, name=YARN_REGISTRY_DNS_PROCESS, service=YARN, component=YARN_REGISTRY_DNS, host=datanodeFQDN.DOMAIN.COM, instance=null, text='TCP OK - 0.000s response on port 54'}, {clusterId=2, state=OK, name=datanode_webui, service=HDFS, component=DATANODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HTTP 200 response in 0.000s'}, {clusterId=2, state=OK, name=namenode_last_checkpoint, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Last Checkpoint: [5 hours, 37 minutes, 3177 transactions]'}, {clusterId=2, state=OK, name=datanode_health_summary, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='All 1 DataNode(s) are healthy'}, {clusterId=2, state=WARNING, name=ambari_agent_disk_usage, service=AMBARI, component=AMBARI_AGENT, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Capacity Used: [66.95%, 34.1 GB], Capacity Total: [50.9 GB], path=/usr/hdp'}, {clusterId=2, state=OK, name=ams_metrics_monitor_process, service=AMBARI_METRICS, component=METRICS_MONITOR, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Ambari Monitor is running on datanodeFQDN.DOMAIN.COM'}, {clusterId=2, state=OK, name=namenode_directory_status, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Directories are healthy'}]}
javax.persistence.RollbackException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
	bind => [4 parameters bound]
	at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:159)
	at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
	at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
	at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
	at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
	bind => [4 parameters bound]
	at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:900)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:964)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:633)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:149)
	at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
	at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
	at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
	at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
	at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
	... 12 more
Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
	at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:132)
	at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:892)
	... 24 more
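Note on the rollback above: the UPDATE on alert_current fails because the hive_server_process alert's latest_text (which embeds the raw beeline/PySpark output quoted in the event) contains a NUL byte (0x00), and PostgreSQL rejects NUL inside UTF8 text values, so the whole transaction rolls back and the alert is never persisted. A minimal sketch of the usual workaround, stripping NUL bytes before handing text to the database (sanitize_for_postgres is a hypothetical helper for illustration, not Ambari's actual code):

```python
def sanitize_for_postgres(text):
    """Strip NUL (0x00) bytes, which PostgreSQL rejects in UTF8 text columns.

    Passing the result as the latest_text bind parameter avoids the
    'invalid byte sequence for encoding "UTF8": 0x00' rollback seen above.
    """
    if text is None:
        return None
    return text.replace("\x00", "")


# Example: alert text polluted with a NUL byte, as in the failed UPDATE
raw_alert_text = "Connection failed on host datanodeFQDN.DOMAIN.COM:10000\x00"
clean_text = sanitize_for_postgres(raw_alert_text)
assert "\x00" not in clean_text
```

This only addresses the persistence failure; the underlying CRITICAL hive_server_process alert (Connection refused on port 10000, plus beeline resolving into a Spark environment without a valid SPARK_HOME) still needs to be fixed on the host itself.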
  745. 2021-07-23 12:46:43,739 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  746. 2021-07-23 12:46:53,752 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  747. 2021-07-23 12:47:08,769 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  748. 2021-07-23 12:47:18,772 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  749. 2021-07-23 12:47:28,785 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  750. 2021-07-23 12:47:30,253 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=NAMENODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
  751. 2021-07-23 12:47:30,256 INFO [agent-report-processor-0] NamedTasksSubscriptions:101 - Task subscription was removed for sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5 and taskId = 368
  752. 2021-07-23 12:47:30,257 INFO [agent-report-processor-0] TaskStatusListener:199 - NamedTaskUpdateEvent with id 368 will be send
  753. 2021-07-23 12:47:30,316 INFO [ambari-client-thread-195] NamedTasksSubscribeListener:60 - API unsubscribe was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5 and id = sub-10
  754. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HISTORYSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  755. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=SECONDARY_NAMENODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  756. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HBASE_MASTER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  757. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=APP_TIMELINE_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  758. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=RESOURCEMANAGER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  759. 2021-07-23 12:47:30,608 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=TIMELINE_READER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  760. 2021-07-23 12:47:30,609 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ZEPPELIN_MASTER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  761. 2021-07-23 12:47:30,609 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=KAFKA_BROKER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
  762. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role APP_TIMELINE_SERVER, roleCommand START, and command ID 23-3, task ID 369
  763. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HBASE_MASTER, roleCommand START, and command ID 23-3, task ID 370
  764. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HISTORYSERVER, roleCommand START, and command ID 23-3, task ID 371
  765. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role KAFKA_BROKER, roleCommand START, and command ID 23-3, task ID 372
  766. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role RESOURCEMANAGER, roleCommand START, and command ID 23-3, task ID 373
  767. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role SECONDARY_NAMENODE, roleCommand START, and command ID 23-3, task ID 374
  768. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role TIMELINE_READER, roleCommand START, and command ID 23-3, task ID 375
  769. 2021-07-23 12:47:30,618 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role ZEPPELIN_MASTER, roleCommand START, and command ID 23-3, task ID 376
2021-07-23 12:47:30,753 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 3
2021-07-23 12:47:30,755 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 3
2021-07-23 12:47:36,426 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=APP_TIMELINE_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:47:40,258 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HBASE_MASTER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:48:04,788 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HISTORYSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:48:06,905 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=KAFKA_BROKER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:48:11,903 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=RESOURCEMANAGER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:48:16,497 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=SECONDARY_NAMENODE, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:48:38,419 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=TIMELINE_READER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:49:14,853 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ZEPPELIN_MASTER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:49:15,382 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HBASE_REGIONSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:49:15,382 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=NODEMANAGER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:49:15,382 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HIVE_METASTORE, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:49:15,382 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_COLLECTOR, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:49:15,387 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HBASE_REGIONSERVER, roleCommand START, and command ID 23-4, task ID 377
2021-07-23 12:49:15,387 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HIVE_METASTORE, roleCommand START, and command ID 23-4, task ID 378
2021-07-23 12:49:15,387 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role METRICS_COLLECTOR, roleCommand START, and command ID 23-4, task ID 379
2021-07-23 12:49:15,387 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role NODEMANAGER, roleCommand START, and command ID 23-4, task ID 380
2021-07-23 12:49:15,458 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 4
2021-07-23 12:49:15,459 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 4
2021-07-23 12:49:18,355 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HBASE_REGIONSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:49:36,552 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HIVE_METASTORE, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:49:41,127 ERROR [alert-event-bus-2] AmbariJpaLocalTxnInterceptor:180 - [DETAILED ERROR] Rollback reason:
Local Exception Stack:
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
bind => [4 parameters bound]
at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:926)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:179)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
at org.postgresql.jdbc.BatchResultHandler.handleError(BatchResultHandler.java:148)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2179)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:479)
at org.postgresql.jdbc.PgStatement.executeBatch(PgStatement.java:835)
at org.postgresql.jdbc.PgPreparedStatement.executeBatch(PgPreparedStatement.java:1556)
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.executeBatch(DatabasePlatform.java:2336)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:922)
... 32 more
Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
... 37 more
2021-07-23 12:49:41,130 ERROR [alert-event-bus-2] AmbariJpaLocalTxnInterceptor:188 - [DETAILED ERROR] Internal exception (1) :
java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
at org.postgresql.jdbc.BatchResultHandler.handleError(BatchResultHandler.java:148)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2179)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:479)
at org.postgresql.jdbc.PgStatement.executeBatch(PgStatement.java:835)
at org.postgresql.jdbc.PgPreparedStatement.executeBatch(PgPreparedStatement.java:1556)
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.executeBatch(DatabasePlatform.java:2336)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:922)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:179)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
... 37 more
2021-07-23 12:49:41,131 ERROR [alert-event-bus-2] AmbariJpaLocalTxnInterceptor:188 - [DETAILED ERROR] Internal exception (2) :
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:479)
at org.postgresql.jdbc.PgStatement.executeBatch(PgStatement.java:835)
at org.postgresql.jdbc.PgPreparedStatement.executeBatch(PgPreparedStatement.java:1556)
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.executeBatch(DatabasePlatform.java:2336)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:922)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:179)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.saveEntities(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.CGLIB$onAlertEvent$0(<generated>)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173$$FastClassByGuice$$3f418344.invoke(<generated>)
at com.google.inject.internal.cglib.proxy.$MethodProxy.invokeSuper(MethodProxy.java:228)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:76)
at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
at com.google.inject.internal.InterceptorStackCallback$InterceptedMethodInvocation.proceed(InterceptorStackCallback.java:77)
at com.google.inject.internal.InterceptorStackCallback.intercept(InterceptorStackCallback.java:55)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173.onAlertEvent(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2021-07-23 12:49:41,132 ERROR [alert-event-bus-2] default:232 - Exception thrown by subscriber method onAlertEvent(org.apache.ambari.server.events.AlertReceivedEvent) on subscriber org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener$$EnhancerByGuice$$c6d5f173@6d229b1c when dispatching event: AlertReceivedEvent{cluserId=0, alerts=[{clusterId=2, state=OK, name=namenode_hdfs_blocks_health, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Total Blocks:[555], Missing Blocks:[0]'}, {clusterId=2, state=OK, name=hbase_regionserver_process, service=HBASE, component=HBASE_REGIONSERVER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='TCP OK - 0.000s response on port 16030'}, {clusterId=2, state=OK, name=datanode_storage, service=HDFS, component=DATANODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Remaining Capacity:[15363826532], Total Capacity:[66% Used, 44572591616]'}, {clusterId=2, state=OK, name=namenode_hdfs_capacity_utilization, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Capacity Used:[9%, 1475305626], Capacity Remaining:[15363748708]'}, {clusterId=2, state=OK, name=namenode_rpc_latency, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Average Queue Time:[3.0], Average Processing Time:[0.0]'}, {clusterId=2, state=OK, name=yarn_resourcemanager_webui, service=YARN, component=RESOURCEMANAGER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HTTP 200 response in 0.000s'}, {clusterId=2, state=OK, name=namenode_hdfs_pending_deletion_blocks, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Pending Deletion Blocks:[6]'}, {clusterId=2, state=OK, name=yarn_timeline_reader_webui, service=YARN, component=TIMELINE_READER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='HTTP 200 response in 0.002s'}, {clusterId=2, state=OK, name=datanode_heap_usage, service=HDFS, component=DATANODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Used Heap:[9%, 85.51226 MB], Max Heap: 1004.0 MB'}, {clusterId=2, state=CRITICAL, name=hive_server_process, service=HIVE, component=HIVE_SERVER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)'}, {clusterId=2, state=OK, name=namenode_last_checkpoint, service=HDFS, component=NAMENODE, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Last Checkpoint: [0 hours, 0 minutes, 39 transactions]'}, {clusterId=2, state=OK, name=zeppelin_server_status, service=ZEPPELIN, component=ZEPPELIN_MASTER, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Successful connection to Zeppelin'}, {clusterId=2, state=WARNING, name=ambari_agent_disk_usage, service=AMBARI, component=AMBARI_AGENT, host=datanodeFQDN.DOMAIN.COM, instance=null, text='Capacity Used: [68.52%, 34.9 GB], Capacity Total: [50.9 GB], path=/usr/hdp'}]}
javax.persistence.RollbackException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
  1393. self.env.run()
  1394. File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
  1395. self.run_action(resource, action)
  1396. File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
  1397. provider_action()
  1398. File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
  1399. returns=self.resource.returns)
  1400. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
  1401. result = function(command, **kwargs)
  1402. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
  1403. tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  1404. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
  1405. result = _call(command, **kwargs_copy)
  1406. File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
  1407. raise ExecutionFailed(err_msg, code, out, err)
  1408. ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']
  1409.  
  1410. Did you install PySpark via a package manager such as pip or Conda? If so,
  1411. PySpark was not found in your Python environment. It is possible your
  1412. Python environment does not properly bind with your package manager.
  1413.  
  1414. Please check your default 'python' and if you set PYSPARK_PYTHON and/or
  1415. PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
  1416. PySpark, for example, 'python -c 'import pyspark'.
  1417.  
  1418. If you cannot import, you can install by using the Python executable directly,
  1419. for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
  1420. explicitly set the Python executable, that has PySpark installed, to
  1421. PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
  1422. 'PYSPARK_PYTHON=python3 pyspark'.
  1423.  
  1424. Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1425. 21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  1426. 21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  1427. 21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1428. 21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1429. 21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
  1430. Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
  1431. 21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  1432. 21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  1433. 21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1434. 21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1435. 21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
  1436. No current connection
  1437. 21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
  1438. 21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
  1439. 21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1440. 21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
  1441. 21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
  1442. Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
  1443.  
  1444. )', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
bind => [4 parameters bound]
at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:159)
at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:153)
at org.apache.ambari.server.events.listeners.alerts.AlertReceivedListener.onAlertEvent(AlertReceivedListener.java:388)
at org.apache.ambari.server.orm.AmbariLocalSessionInterceptor.invoke(AmbariLocalSessionInterceptor.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.common.eventbus.Subscriber.invokeSubscriberMethod(Subscriber.java:87)
at com.google.common.eventbus.Subscriber$1.run(Subscriber.java:72)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
Error Code: 0
Call: UPDATE alert_current SET latest_timestamp = ?, latest_text = ?, occurrences = ? WHERE (alert_id = ?)
bind => [4 parameters bound]
at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1620)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:926)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatch(ParameterizedSQLBatchWritingMechanism.java:179)
at org.eclipse.persistence.internal.databaseaccess.ParameterizedSQLBatchWritingMechanism.executeBatchedStatements(ParameterizedSQLBatchWritingMechanism.java:134)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.writesCompleted(DatabaseAccessor.java:1845)
at org.eclipse.persistence.internal.sessions.AbstractSession.writesCompleted(AbstractSession.java:4300)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.writesCompleted(UnitOfWorkImpl.java:5592)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.acquireWriteLocks(UnitOfWorkImpl.java:1646)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitTransactionAfterWriteChanges(UnitOfWorkImpl.java:1614)
at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:285)
at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:1169)
at org.eclipse.persistence.internal.jpa.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:134)
... 12 more
Caused by: java.sql.BatchUpdateException: Batch entry 6 UPDATE alert_current SET latest_timestamp = 1627058980541, latest_text = 'Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of ''! (beeline -u ''jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary'' -n hive -e '';'' 2>&1 | awk ''{print}'' | grep -vz -i -e ''Connected to:'' -e ''Transaction isolation:'' -e ''inactive HS2 instance; use service discovery'')'' returned 1. Could not find valid SPARK_HOME while searching [''/home'', ''/usr/local/bin'']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default ''python'' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, ''python -c ''import pyspark''.

If you cannot import, you can install by using the Python executable directly,
for example, ''python -m pip install pyspark [--user]''. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
''PYSPARK_PYTHON=python3 pyspark''.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Connection failed on host datanodeFQDN.DOMAIN.COM:10000 (Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/alerts/alert_hive_thrift_port.py", line 213, in execute
ldap_password=ldap_password, pam_username=pam_username, pam_password=pam_password)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/hive_check.py", line 95, in check_thrift_port_sasl
timeout_kill_strategy=TerminateStrategy.KILL_PROCESS_TREE,
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
returns=self.resource.returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '! (beeline -u 'jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary' -n hive -e ';' 2>&1 | awk '{print}' | grep -vz -i -e 'Connected to:' -e 'Transaction isolation:' -e 'inactive HS2 instance; use service discovery')' returned 1. Could not find valid SPARK_HOME while searching ['/home', '/usr/local/bin']

Did you install PySpark via a package manager such as pip or Conda? If so,
PySpark was not found in your Python environment. It is possible your
Python environment does not properly bind with your package manager.

Please check your default 'python' and if you set PYSPARK_PYTHON and/or
PYSPARK_DRIVER_PYTHON environment variables, and see if you can import
PySpark, for example, 'python -c 'import pyspark'.

If you cannot import, you can install by using the Python executable directly,
for example, 'python -m pip install pyspark [--user]'. Otherwise, you can also
explicitly set the Python executable, that has PySpark installed, to
PYSPARK_PYTHON or PYSPARK_DRIVER_PYTHON environment variables, for example,
'PYSPARK_PYTHON=python3 pyspark'.

Connecting to jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
No current connection
21/07/23 12:49:40 INFO Utils: Supplied authorities: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO Utils: Resolved authority: datanodeFQDN.DOMAIN.COM:10000
21/07/23 12:49:40 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary
21/07/23 12:49:40 INFO HiveConnection: Transport Used for JDBC connection: binary
Error: Could not open client transport with JDBC Uri: jdbc:hive2://datanodeFQDN.DOMAIN.COM:10000/;transportMode=binary: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)

)', occurrences = 2 WHERE (alert_id = 84) was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
  1707. at org.postgresql.jdbc.BatchResultHandler.handleError(BatchResultHandler.java:148)
  1708. at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2179)
  1709. at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:479)
  1710. at org.postgresql.jdbc.PgStatement.executeBatch(PgStatement.java:835)
  1711. at org.postgresql.jdbc.PgPreparedStatement.executeBatch(PgPreparedStatement.java:1556)
  1712. at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.executeBatch(DatabasePlatform.java:2336)
  1713. at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeJDK12BatchStatement(DatabaseAccessor.java:922)
  1714. ... 22 more
  1715. Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
  1716. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2433)
  1717. at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2178)
  1718. ... 27 more
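The batch abort above is PostgreSQL refusing a NUL byte: text values in a UTF8-encoded Postgres database may not contain 0x00, so an alert body that captured raw process output with an embedded NUL fails the whole UPDATE batch. The usual remedy is to sanitise strings before they reach the database; a minimal sketch of that sanitisation (illustrative only, not Ambari's actual persistence code):

```python
def strip_nul(text: str) -> str:
    """Remove NUL (0x00) characters, which PostgreSQL text columns reject."""
    return text.replace("\x00", "")

# An alert body with an embedded NUL, as in the aborted batch above.
clean = strip_nul("beeline output\x00truncated")
print(clean)  # -> "beeline outputtruncated"
```

Applied to the alert text before the `UPDATE ... WHERE (alert_id = 84)` statement, this prevents the `invalid byte sequence for encoding "UTF8"` failure.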
2021-07-23 12:50:02,776 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_COLLECTOR, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:50:02,845 ERROR [ambari-client-thread-249] MetricsRequestHelper:112 - Error getting timeline metrics : Connection refused (Connection refused)
2021-07-23 12:50:02,846 ERROR [ambari-client-thread-249] MetricsRequestHelper:119 - Cannot connect to collector: SocketTimeoutException for datanodeFQDN.DOMAIN.COM
2021-07-23 12:50:07,813 ERROR [ambari-client-thread-219] MetricsRequestHelper:112 - Error getting timeline metrics : Connection refused (Connection refused)
2021-07-23 12:50:07,814 ERROR [ambari-client-thread-219] MetricsRequestHelper:119 - Cannot connect to collector: SocketTimeoutException for datanodeFQDN.DOMAIN.COM
2021-07-23 12:50:07,982 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=NODEMANAGER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:50:08,736 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ATLAS_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,736 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HIVE_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,736 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_GRAFANA, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,736 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ACTIVITY_ANALYZER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,737 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ACTIVITY_EXPLORER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,737 INFO [ambari-action-scheduler] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=SPARK2_JOBHISTORYSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=INSTALLED, currentState=STARTING
2021-07-23 12:50:08,744 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role ACTIVITY_ANALYZER, roleCommand START, and command ID 23-5, task ID 381
2021-07-23 12:50:08,744 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role ACTIVITY_EXPLORER, roleCommand START, and command ID 23-5, task ID 382
2021-07-23 12:50:08,745 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role ATLAS_SERVER, roleCommand START, and command ID 23-5, task ID 383
2021-07-23 12:50:08,745 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role HIVE_SERVER, roleCommand START, and command ID 23-5, task ID 384
2021-07-23 12:50:08,745 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role METRICS_GRAFANA, roleCommand START, and command ID 23-5, task ID 385
2021-07-23 12:50:08,745 INFO [ambari-action-scheduler] AgentCommandsPublisher:124 - AgentCommandsPublisher.sendCommands: sending ExecutionCommand for host datanodeFQDN.DOMAIN.COM, role SPARK2_JOBHISTORYSERVER, roleCommand START, and command ID 23-5, task ID 386
2021-07-23 12:50:08,901 INFO [agent-message-monitor-0] MessageEmitter:218 - Schedule execution command emitting, retry: 0, messageId: 5
2021-07-23 12:50:08,902 WARN [agent-message-retry-0] MessageEmitter:255 - Reschedule execution command emitting, retry: 1, messageId: 5
2021-07-23 12:50:09,040 ERROR [ambari-client-thread-249] MetricsRequestHelper:112 - Error getting timeline metrics : Connection refused (Connection refused)
2021-07-23 12:50:09,041 ERROR [ambari-client-thread-249] MetricsRequestHelper:119 - Cannot connect to collector: SocketTimeoutException for datanodeFQDN.DOMAIN.COM
2021-07-23 12:50:13,809 ERROR [ambari-client-thread-249] MetricsRequestHelper:112 - Error getting timeline metrics : Connection refused (Connection refused)
2021-07-23 12:50:13,809 ERROR [ambari-client-thread-249] MetricsRequestHelper:119 - Cannot connect to collector: SocketTimeoutException for datanodeFQDN.DOMAIN.COM
2021-07-23 12:50:15,247 ERROR [ambari-client-thread-219] MetricsRequestHelper:112 - Error getting timeline metrics : Connection refused (Connection refused)
2021-07-23 12:50:15,248 ERROR [ambari-client-thread-219] MetricsRequestHelper:119 - Cannot connect to collector: SocketTimeoutException for datanodeFQDN.DOMAIN.COM
2021-07-23 12:50:44,007 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ACTIVITY_ANALYZER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:50:48,452 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ACTIVITY_EXPLORER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:50:51,231 INFO [pool-32-thread-1] MetricSinkWriteShardHostnameHashingStrategy:42 - Calculated collector shard datanodeFQDN.DOMAIN.COM based on hostname: np-dev1-hdp315-namenode-01.DOMAIN.COM
2021-07-23 12:51:17,665 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=ATLAS_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:52:04,146 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=HIVE_SERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:52:09,208 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=METRICS_GRAFANA, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:52:30,363 INFO [agent-report-processor-0] ServiceComponentHostImpl:1054 - Host role transitioned to a new state, serviceComponentName=SPARK2_JOBHISTORYSERVER, hostName=datanodeFQDN.DOMAIN.COM, oldState=STARTING, currentState=STARTED
2021-07-23 12:52:30,372 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
2021-07-23 12:52:54,600 WARN [ambari-client-thread-198] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostKerberosIdentityService.getKerberosIdentities(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostKerberosIdentityService.getKerberosIdentity(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 12:52:54,600 WARN [ambari-client-thread-198] Errors:173 - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostKerberosIdentityService.getKerberosIdentities(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity.
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostKerberosIdentityService.getKerberosIdentity(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity.
2021-07-23 13:14:47,984 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 1 total, 0 closed abnormally (0 connect failure, 0 send limit, 0 transport error)], stompSubProtocol[processed CONNECT(1)-CONNECTED(1)-DISCONNECT(0)], stompBrokerRelay[null], inboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 594], outboundChannel[pool size = 6, active threads = 0, queued tasks = 0, completed tasks = 311], sockJsScheduler[pool size = 2, active threads = 1, queued tasks = 0, completed tasks = 1]
2021-07-23 13:14:48,314 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 2 total, 0 closed abnormally (0 connect failure, 0 send limit, 1 transport error)], stompSubProtocol[processed CONNECT(2)-CONNECTED(2)-DISCONNECT(1)], stompBrokerRelay[null], inboundChannel[pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 1440], outboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 471], sockJsScheduler[pool size = 2, active threads = 1, queued tasks = 0, completed tasks = 1]
2021-07-23 13:35:35,580 INFO [agent-report-processor-0] HeartbeatProcessor:647 - State of service component ACTIVITY_ANALYZER of service SMARTSENSE of cluster 2 has changed from STARTED to INSTALLED at host datanodeFQDN.DOMAIN.COM according to STATUS_COMMAND report
2021-07-23 13:35:35,581 INFO [pool-2-thread-1] StackAdvisorHelper:245 - Clear stack advisor caches, host: datanodeFQDN.DOMAIN.COM
2021-07-23 13:44:47,984 INFO [MessageBroker-2] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 1 total, 0 closed abnormally (0 connect failure, 0 send limit, 0 transport error)], stompSubProtocol[processed CONNECT(1)-CONNECTED(1)-DISCONNECT(0)], stompBrokerRelay[null], inboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 1134], outboundChannel[pool size = 6, active threads = 0, queued tasks = 0, completed tasks = 493], sockJsScheduler[pool size = 3, active threads = 1, queued tasks = 0, completed tasks = 2]
2021-07-23 13:44:48,314 INFO [MessageBroker-2] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 2 total, 0 closed abnormally (0 connect failure, 0 send limit, 1 transport error)], stompSubProtocol[processed CONNECT(2)-CONNECTED(2)-DISCONNECT(1)], stompBrokerRelay[null], inboundChannel[pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 2163], outboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 712], sockJsScheduler[pool size = 3, active threads = 1, queued tasks = 0, completed tasks = 2]
2021-07-23 13:52:46,786 INFO [ambari-client-thread-196] NamedTasksSubscriptions:117 - Task subscriptions were removed for sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5
2021-07-23 13:52:46,787 INFO [ambari-client-thread-196] NamedTasksSubscribeListener:72 - API disconnect was arrived with sessionId = b9c81720-870c-8417-eb63-f529da1bc9c5
2021-07-23 13:52:48,440 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/hostcomponents and id = sub-0
2021-07-23 13:52:48,445 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/alerts and id = sub-1
2021-07-23 13:52:48,445 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/ui_topologies and id = sub-2
2021-07-23 13:52:48,446 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/configs and id = sub-3
2021-07-23 13:52:48,446 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/services and id = sub-4
2021-07-23 13:52:48,446 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/hosts and id = sub-5
2021-07-23 13:52:48,447 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/alert_definitions and id = sub-6
2021-07-23 13:52:48,447 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/alert_group and id = sub-7
2021-07-23 13:52:48,447 INFO [ambari-client-thread-199] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/upgrade and id = sub-8
2021-07-23 13:52:48,509 INFO [ambari-client-thread-1716] NamedTasksSubscribeListener:47 - API subscribe was arrived with sessionId = e85df779-9445-4ffa-6ae8-c948df6efdce, destination = /events/requests and id = sub-9
2021-07-23 13:57:22,613 INFO [pool-32-thread-1] MetricSinkWriteShardHostnameHashingStrategy:42 - Calculated collector shard datanodeFQDN.DOMAIN.COM based on hostname: np-dev1-hdp315-namenode-01.DOMAIN.COM
2021-07-23 14:14:47,984 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 2 total, 0 closed abnormally (0 connect failure, 0 send limit, 0 transport error)], stompSubProtocol[processed CONNECT(2)-CONNECTED(2)-DISCONNECT(0)], stompBrokerRelay[null], inboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 1707], outboundChannel[pool size = 6, active threads = 0, queued tasks = 0, completed tasks = 673], sockJsScheduler[pool size = 4, active threads = 1, queued tasks = 0, completed tasks = 3]
2021-07-23 14:14:48,314 INFO [MessageBroker-1] WebSocketMessageBrokerStats:124 - WebSocketSession[1 current WS(1)-HttpStream(0)-HttpPoll(0), 2 total, 0 closed abnormally (0 connect failure, 0 send limit, 1 transport error)], stompSubProtocol[processed CONNECT(2)-CONNECTED(2)-DISCONNECT(1)], stompBrokerRelay[null], inboundChannel[pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 2883], outboundChannel[pool size = 8, active threads = 0, queued tasks = 0, completed tasks = 952], sockJsScheduler[pool size = 4, active threads = 1, queued tasks = 0, completed tasks = 3]