- unit-spark-0: 13:57:32 INFO unit.spark/0.install Info: Applying configuration version '1497355051'
- unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Prefetching apt resources for package
- unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n''
- unit-spark-0: 13:57:32 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-master'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-master
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Master/Package[spark-master]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-master' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-master
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-datanucleus'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-datanucleus
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Datanucleus/Package[spark-datanucleus]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-datanucleus' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-datanucleus
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-worker'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-worker
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::Worker/Package[spark-worker]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-worker' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-worker
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-history-server'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-history-server
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Error: /Stage[main]/Spark::History_server/Package[spark-history-server]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-history-server' returned 100: Reading package lists...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:33 INFO unit.spark/0.install E: Unable to locate package spark-history-server
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' spark-core'
- unit-spark-0: 13:57:33 INFO unit.spark/0.install Debug: Executing '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core'
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Error: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core' returned 100: Reading package lists...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install E: Unable to locate package spark-core
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Error: /Stage[main]/Spark::Common/Package[spark-core]/ensure: change from purged to latest failed: Could not update: Execution of '/usr/bin/apt-get -q -y -o DPkg::Options::=--force-confold install spark-core' returned 100: Reading package lists...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Building dependency tree...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Reading state information...
- unit-spark-0: 13:57:34 INFO unit.spark/0.install E: Unable to locate package spark-core
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Package[spark-python]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Package[spark-python]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Package[spark-external]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Package[spark-external]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-defaults.conf]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-defaults.conf]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/log4j.properties]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/log4j.properties]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Executing '/usr/bin/dpkg-query -W --showformat '${Status} ${Package} ${Version}\n' jdk'
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Main/Package[jdk]/ensure: current_value purged, should be present (noop)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: /Package[jdk]: The container Class[Main] will propagate my refresh event
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Class[Main]: Would have triggered 'refresh' from 1 events
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Class[Main]: The container Stage[main] will propagate my refresh event
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-env.sh]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Common/File[/etc/spark/conf/spark-env.sh]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Worker/Service[spark-worker]: Dependency Package[spark-worker] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Worker/Service[spark-worker]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Worker/Service[spark-worker]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Master/Service[spark-master]: Dependency Package[spark-master] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::Master/Service[spark-master]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::Master/Service[spark-master]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::History_server/Service[spark-history-server]: Dependency Package[spark-history-server] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: /Stage[main]/Spark::History_server/Service[spark-history-server]: Dependency Package[spark-core] has failures: true
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Warning: /Stage[main]/Spark::History_server/Service[spark-history-server]: Skipping because of failed dependencies
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Stage[main]: Would have triggered 'refresh' from 1 events
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Finishing transaction 41273900
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Storing state
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Stored state in 0.02 seconds
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Notice: Finished catalog run in 1.33 seconds
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Executing '/etc/puppet/etckeeper-commit-post'
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Using settings: adding file resource 'rrddir': 'File[/var/lib/puppet/rrd]{:path=>"/var/lib/puppet/rrd", :mode=>"750", :owner=>"puppet", :group=>"puppet", :ensure=>:directory, :loglevel=>:debug, :links=>:follow, :backup=>false}'
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Finishing transaction 42307220
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Received report to process from juju-d7e021-2.lxd
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Evicting cache entry for environment 'production'
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Caching environment 'production' (ttl = 0 sec)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Debug: Processing report from juju-d7e021-2.lxd with processor Puppet::Reports::Store
- unit-spark-0: 13:57:34 INFO unit.spark/0.install Traceback (most recent call last):
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/var/lib/juju/agents/unit-spark-0/charm/hooks/install", line 19, in <module>
- unit-spark-0: 13:57:34 INFO unit.spark/0.install main()
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/charms/reactive/__init__.py", line 78, in main
- unit-spark-0: 13:57:34 INFO unit.spark/0.install bus.dispatch()
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 423, in dispatch
- unit-spark-0: 13:57:34 INFO unit.spark/0.install _invoke(other_handlers)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 406, in _invoke
- unit-spark-0: 13:57:34 INFO unit.spark/0.install handler.invoke()
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/charms/reactive/bus.py", line 280, in invoke
- unit-spark-0: 13:57:34 INFO unit.spark/0.install self._action(*args)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/var/lib/juju/agents/unit-spark-0/charm/reactive/spark.py", line 172, in reinstall_spark
- unit-spark-0: 13:57:34 INFO unit.spark/0.install install_spark_standalone(zks, peers)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/var/lib/juju/agents/unit-spark-0/charm/reactive/spark.py", line 78, in install_spark_standalone
- unit-spark-0: 13:57:34 INFO unit.spark/0.install spark.configure(hosts, zks, peers)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "lib/charms/layer/bigtop_spark.py", line 233, in configure
- unit-spark-0: 13:57:34 INFO unit.spark/0.install bigtop.trigger_puppet()
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "lib/charms/layer/apache_bigtop_base.py", line 611, in trigger_puppet
- unit-spark-0: 13:57:34 INFO unit.spark/0.install java_home()),
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/jujubigdata/utils.py", line 195, in re_edit_in_place
- unit-spark-0: 13:57:34 INFO unit.spark/0.install with Path(filename).in_place(encoding=encoding) as (reader, writer):
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/lib/python3.5/contextlib.py", line 59, in __enter__
- unit-spark-0: 13:57:34 INFO unit.spark/0.install return next(self.gen)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install File "/usr/local/lib/python3.5/dist-packages/path.py", line 1455, in in_place
- unit-spark-0: 13:57:34 INFO unit.spark/0.install os.rename(self, backup_fn)
- unit-spark-0: 13:57:34 INFO unit.spark/0.install FileNotFoundError: [Errno 2] No such file or directory: Path('/etc/default/bigtop-utils') -> Path('/etc/default/bigtop-utils.bak')
- unit-spark-0: 13:57:34 ERROR juju.worker.uniter.operation hook "install" failed: exit status 1
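The final traceback shows why the hook dies after the package failures: `path.py`'s `in_place()` begins by renaming the target file to a `.bak` backup via `os.rename`, so when `/etc/default/bigtop-utils` was never created (the `spark-*`/bigtop packages failed to install), the rename raises `FileNotFoundError` (errno 2). A minimal stdlib-only reproduction of that failure mode, using a hypothetical path in a temp directory rather than the real `/etc/default/bigtop-utils`:

```python
import os
import tempfile

# path.py's in_place() starts by renaming the file being edited to a
# backup; if the file was never created, that rename fails immediately.
missing = os.path.join(tempfile.mkdtemp(), "bigtop-utils")  # never created

try:
    os.rename(missing, missing + ".bak")
except FileNotFoundError as e:
    print(e.errno)  # ENOENT
```

This is why the fix is upstream of the traceback: the edit of `/etc/default/bigtop-utils` can only succeed once the apt repository providing the `spark-*` and bigtop packages is actually reachable, so the "Unable to locate package" errors earlier in the log are the root cause.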