a guest
Jun 13th, 2017
unit-spark-0: 13:57:32 INFO unit.spark/0.install Info: Applying configuration version '1497355051'
unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Prefetching apt resources for package
unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n''
unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-master'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-master
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Master/Package[spark-master]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-master
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-datanucleus'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-datanucleus
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Datanucleus/Package[spark-datanucleus]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-datanucleus
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-worker'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-worker
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Worker/Package[spark-worker]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-worker
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-history-server'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-history-server
unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::History_server/Package[spark-history-server]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server' returned 100: Reading package lists...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-history-server
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-core'
unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core'
unit-spark-0: 13:57:34 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core' returned 100: Reading package lists...
unit-spark-0: 13:57:34 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:34 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:34 INFO unit.spark/0.install E: Unable to locate package spark-core
unit-spark-0: 13:57:34 INFO unit.spark/0.install Error: /Stage[main]/Spark::Common/Package[spark-core]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core' returned 100: Reading package lists...
unit-spark-0: 13:57:34 INFO unit.spark/0.install Building dependency tree...
unit-spark-0: 13:57:34 INFO unit.spark/0.install Reading state information...
unit-spark-0: 13:57:34 INFO unit.spark/0.install E: Unable to locate package spark-core
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Package[spark-python]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Package[spark-python]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Package[spark-external]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Package[spark-external]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-defaults.conf]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-defaults.conf]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/log4j.properties]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/log4j.properties]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' jdk'
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Main/Package[jdk]/ensure: current_value purged, should be present (noop)
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: /Package[jdk]: The container Class[Main] will propagate my refresh event
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Class[Main]: Would have triggered 'refresh' from 1 events
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Class[Main]: The container Stage[main] will propagate my refresh event
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-env.sh]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-env.sh]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Worker/Service[spark-worker]: Dependency Package[spark-worker] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Worker/Service[spark-worker]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Worker/Service[spark-worker]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Master/Service[spark-master]: Dependency Package[spark-master] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Master/Service[spark-master]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Master/Service[spark-master]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::History_server/Service[spark-history-server]: Dependency Package[spark-history-server] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::History_server/Service[spark-history-server]: Dependency Package[spark-core] has failures: true
unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::History_server/Service[spark-history-server]: Skipping because of failed dependencies
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Stage[main]: Would have triggered 'refresh' from 1 events
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Finishing transaction 41273900
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Storing state
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Stored state in 0.02 seconds
unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Finished catalog run in 1.33 seconds
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Executing '/etc/puppet/etckeeper-commit-post'
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Using settings: adding file resource 'rrddir': 'File[/var/lib/puppet/rrd]{:path=>"/var/lib/puppet/rrd", :mode=>"750", :owner=>"puppet", :group=>"puppet", :ensure=>:directory, :loglevel=>:debug, :links=>:follow, :backup=>false}'
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Finishing transaction 42307220
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Received report to process from juju-d7e021-2.lxd
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Evicting cache entry for environment 'production'
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Caching environment 'production' (ttl = 0 sec)
unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Processing report from juju-d7e021-2.lxd with processor Puppet::Reports::Store
unit-spark-0: 13:57:34 INFO unit.spark/0.install Traceback (most recent call last):
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/var/lib/juju/agents/unit-spark-0/charm/hooks/install", line 19, in <module>
unit-spark-0: 13:57:34 INFO unit.spark/0.install     main()
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/charms/reactive/__init__.py", line 78, in main
unit-spark-0: 13:57:34 INFO unit.spark/0.install     bus.dispatch()
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 423, in dispatch
unit-spark-0: 13:57:34 INFO unit.spark/0.install     _invoke(other_handlers)
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 406, in _invoke
unit-spark-0: 13:57:34 INFO unit.spark/0.install     handler.invoke()
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 280, in invoke
unit-spark-0: 13:57:34 INFO unit.spark/0.install     self._action(*args)
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/var/lib/juju/agents/unit-spark-0/charm/reactive/spark.py", line 172, in reinstall_spark
unit-spark-0: 13:57:34 INFO unit.spark/0.install     install_spark_standalone(zks, peers)
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/var/lib/juju/agents/unit-spark-0/charm/reactive/spark.py", line 78, in install_spark_standalone
unit-spark-0: 13:57:34 INFO unit.spark/0.install     spark.configure(hosts, zks, peers)
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "lib/charms/layer/bigtop_spark.py", line 233, in configure
unit-spark-0: 13:57:34 INFO unit.spark/0.install     bigtop.trigger_puppet()
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "lib/charms/layer/apache_bigtop_base.py", line 611, in trigger_puppet
unit-spark-0: 13:57:34 INFO unit.spark/0.install     java_home()),
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/jujubigdata/utils.py", line 195, in re_edit_in_place
unit-spark-0: 13:57:34 INFO unit.spark/0.install     with Path(filename).in_place(encoding=encoding) as (reader, writer):
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/lib/python3.5/contextlib.py", line 59, in __enter__
unit-spark-0: 13:57:34 INFO unit.spark/0.install     return next(self.gen)
unit-spark-0: 13:57:34 INFO unit.spark/0.install   File "/usr/local/lib/python3.5/dist-packages/path.py", line 1455, in in_place
unit-spark-0: 13:57:34 INFO unit.spark/0.install     os.rename(self, backup_fn)
unit-spark-0: 13:57:34 INFO unit.spark/0.install FileNotFoundError: [Errno 2] No such file or directory: Path('/etc/default/bigtop-utils') -> Path('/etc/default/bigtop-utils.bak')
unit-spark-0: 13:57:34 ERROR juju.worker.uniter.operation hook "install" failed: exit status 1