a guest
Apr 22nd, 2016
[stack@undercloud my_templates]$ heat deployment-output-show b1686e39-41c3-4cfc-880e-f78a139907c0 --all
{
"deploy_stdout": "\u001b[mNotice: Compiled catalog for overcloud-controller-0.localdomain in environment production in 21.64 seconds\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[public_vip]/Pcmk_resource[ip-192.168.122.100]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Package_manifest[/var/lib/tripleo/installed-packages/overcloud_controller_pacemaker2]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[storage_mgmt_vip]/Pcmk_resource[ip-172.17.4.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/File[/etc/sysconfig/clustercheck]/ensure: defined content as '{md5}dd12c2289a9ea2911ce6c8f48dc5c71d'\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Service[haproxy]/Pacemaker::Resource::Systemd[haproxy]/Pcmk_resource[haproxy]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/osd_pool_default_pgp_num]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[osd/osd_journal_size]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.openstack]/File[/etc/ceph/ceph.client.openstack.keyring]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/osd_pool_default_min_size]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/auth_service_required]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/mon_initial_members]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/fsid]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[control_vip]/Pcmk_resource[ip-172.16.0.42]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[storage_vip]/Pcmk_resource[ip-172.17.3.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/cluster_network]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Ceph/Ceph_config[global/auth_supported]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/auth_cluster_required]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.admin]/File[/etc/ceph/ceph.client.admin.keyring]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[internal_api_vip]/Pcmk_resource[ip-172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/mon_host]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.openstack]/Exec[ceph-key-client.openstack]/returns: + ceph-authtool /etc/ceph/ceph.client.openstack.keyring --name client.openstack --add-key AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== --cap mon 'allow r' --cap osd 'allow class-read object_prefix rbd_children, allow rwx pool=volumes, allow rwx pool=vms, allow rwx pool=images'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.openstack]/Exec[ceph-key-client.openstack]/returns: added entity client.openstack auth auth(auid = 18446744073709551615 key=AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== with 0 caps)\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.openstack]/Exec[ceph-key-client.openstack]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.bootstrap-osd]/File[/var/lib/ceph/bootstrap-osd/ceph.keyring]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.bootstrap-osd]/Exec[ceph-key-client.bootstrap-osd]/returns: + ceph-authtool /var/lib/ceph/bootstrap-osd/ceph.keyring --name client.bootstrap-osd --add-key AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== --cap mon 'allow profile bootstrap-osd'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.bootstrap-osd]/Exec[ceph-key-client.bootstrap-osd]/returns: added entity client.bootstrap-osd auth auth(auid = 18446744073709551615 key=AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== with 
0 caps)\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.bootstrap-osd]/Exec[ceph-key-client.bootstrap-osd]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/content: content changed '{md5}9ff8cc688dd9f0dfc45e5afd25c427a7' to '{md5}b432e3530685b2b53034e4dc1be5193e'\u001b[0m\n\u001b[mNotice: /Stage[main]/Xinetd/File[/etc/xinetd.conf]/mode: mode changed '0600' to '0644'\u001b[0m\n\u001b[mNotice: /File[/etc/xinetd.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ocf[galera]/Pcmk_resource[galera]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Exec[galera-ready]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Mysql_database[glance]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Mysql_database[heat]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_%]/Mysql_user[heat@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_%]/Mysql_grant[heat@%/heat.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_172.17.1.15]/Mysql_user[heat@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_172.17.1.15]/Mysql_grant[heat@172.17.1.15/heat.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_172.17.1.10]/Mysql_user[heat@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_%]/Mysql_user[glance@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_%]/Mysql_grant[glance@%/glance.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_172.17.1.15]/Mysql_user[glance@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Mysql_database[keystone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_172.17.1.15]/Mysql_user[keystone@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_172.17.1.15]/Mysql_grant[keystone@172.17.1.15/keystone.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Mysql_database[cinder]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_172.17.1.15]/Mysql_user[cinder@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_172.17.1.15]/Mysql_grant[cinder@172.17.1.15/cinder.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_%]/Mysql_user[cinder@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_172.17.1.10]/Mysql_user[glance@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_172.17.1.10]/Mysql_grant[glance@172.17.1.10/glance.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Mysql_database[ovs_neutron]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_172.17.1.10]/Mysql_user[neutron@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_172.17.1.15]/Mysql_user[neutron@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Mysql_database[nova]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_172.17.1.15]/Mysql_user[nova@172.17.1.15]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_%]/Mysql_user[nova@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_%]/Mysql_grant[nova@%/nova.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_172.17.1.15]/Mysql_grant[nova@172.17.1.15/nova.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_172.17.1.10]/Mysql_grant[neutron@172.17.1.10/ovs_neutron.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/auth_client_required]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_172.17.1.10]/Mysql_user[nova@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Nova::Db::Mysql/Openstacklib::Db::Mysql[nova]/Openstacklib::Db::Mysql::Host_access[nova_172.17.1.10]/Mysql_grant[nova@172.17.1.10/nova.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_%]/Mysql_user[neutron@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_%]/Mysql_grant[neutron@%/ovs_neutron.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/osd_pool_default_size]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_%]/Mysql_user[keystone@%]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_%]/Mysql_grant[keystone@%/keystone.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.admin]/Exec[ceph-key-client.admin]/returns: + ceph-authtool /etc/ceph/ceph.client.admin.keyring --name client.admin --add-key AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.admin]/Exec[ceph-key-client.admin]/returns: added entity client.admin auth auth(auid = 18446744073709551615 key=AQAuUBpXAAAAABAAtwJPcZwic/6S0X8tTOPPZA== with 0 caps)\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Keys/Ceph::Key[client.admin]/Exec[ceph-key-client.admin]/returns: executed successfully\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Heat::Db::Mysql/Openstacklib::Db::Mysql[heat]/Openstacklib::Db::Mysql::Host_access[heat_172.17.1.10]/Mysql_grant[heat@172.17.1.10/heat.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Resource_defaults/Pcmk_resource_default[resource-stickiness]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/var/lib/swift]/group: group changed 'root' to 'swift'\u001b[0m\n\u001b[mNotice: /File[/var/lib/swift]/seluser: seluser changed 'unconfined_u' to 'system_u'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/etc/swift]/owner: owner changed 'root' to 'swift'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/etc/swift]/group: group changed 'root' to 'swift'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/etc/swift]/mode: mode changed '0755' to '2770'\u001b[0m\n\u001b[mNotice: /File[/etc/swift]/seluser: seluser changed 'unconfined_u' to 'system_u'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/etc/swift/swift.conf]/owner: owner changed 'root' to 'swift'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/etc/swift/swift.conf]/mode: mode changed '0640' to '0660'\u001b[0m\n\u001b[mNotice: /File[/etc/swift/swift.conf]/seluser: seluser changed 'unconfined_u' to 'system_u'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/Swift_config[swift-hash/swift_hash_path_suffix]/value: value changed '%SWIFT_HASH_PATH_SUFFIX%' to '7F3y3WKHQKnBnHpeReUZRD2s2'\u001b[0m\n\u001b[mNotice: /Stage[main]/Swift/File[/var/run/swift]/group: group changed 'root' to 'swift'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/osd_pool_default_pg_num]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/ms_bind_ipv6]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ip[redis_vip]/Pcmk_resource[ip-172.17.1.11]/ensure: created\u001b[0m\n\u001b[mNotice: /File[/etc/xinetd.d]/seluser: seluser changed 'unconfined_u' to 'system_u'\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Main/Xinetd::Service[galera-monitor]/File[/etc/xinetd.d/galera-monitor]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Xinetd/Service[xinetd]: Triggered 'refresh' from 5 events\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_172.17.1.10]/Mysql_user[keystone@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Keystone::Db::Mysql/Openstacklib::Db::Mysql[keystone]/Openstacklib::Db::Mysql::Host_access[keystone_172.17.1.10]/Mysql_grant[keystone@172.17.1.10/keystone.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Glance::Db::Mysql/Openstacklib::Db::Mysql[glance]/Openstacklib::Db::Mysql::Host_access[glance_172.17.1.15]/Mysql_grant[glance@172.17.1.15/glance.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_%]/Mysql_grant[cinder@%/cinder.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_172.17.1.10]/Mysql_user[cinder@172.17.1.10]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Cinder::Db::Mysql/Openstacklib::Db::Mysql[cinder]/Openstacklib::Db::Mysql::Host_access[cinder_172.17.1.10]/Mysql_grant[cinder@172.17.1.10/cinder.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ocf[redis]/Pcmk_resource[redis]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph/Ceph_config[global/public_network]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/File[/tmp/ceph-mon-keyring-overcloud-controller-0]/ensure: defined content as '{md5}38a4a033b5c7890ad27ab5e025c9c0d9'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: ++ ceph-mon --id overcloud-controller-0 
--show-config-value mon_data\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: + mon_data=/var/lib/ceph/mon/ceph-overcloud-controller-0\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: + '[' '!' -d /var/lib/ceph/mon/ceph-overcloud-controller-0 ']'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: + mkdir -p /var/lib/ceph/mon/ceph-overcloud-controller-0\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: + ceph-mon --mkfs --id overcloud-controller-0 --keyring /tmp/ceph-mon-keyring-overcloud-controller-0\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: ceph-mon: renaming mon.noname-a 172.17.3.17:6789/0 to mon.overcloud-controller-0\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: ceph-mon: set fsid to a74b7b96-08a6-11e6-90e9-525400d62813\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: ceph-mon: created monfs at /var/lib/ceph/mon/ceph-overcloud-controller-0 for mon.overcloud-controller-0\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: + touch /var/lib/ceph/mon/ceph-overcloud-controller-0/done /var/lib/ceph/mon/ceph-overcloud-controller-0/sysvinit /var/lib/ceph/mon/ceph-overcloud-controller-0/keyring\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-mkfs-overcloud-controller-0]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[ceph-mon-ceph.client.admin.keyring-overcloud-controller-0]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Service[ceph-mon-overcloud-controller-0]/ensure: ensure changed 'stopped' to 'running'\u001b[0m\n\u001b[mNotice: /Stage[main]/Ceph::Profile::Mon/Ceph::Mon[overcloud-controller-0]/Exec[rm-keyring-overcloud-controller-0]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Service[mongod]/Pacemaker::Resource::Systemd[mongod]/Pcmk_resource[mongod]/ensure: created\u001b[0m\n\u001b[mNotice: Failed to connect to mongodb within timeout window of 60 seconds; giving up.\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Mongodb_replset[tripleo]: Dependency Mongodb_conn_validator[172.17.1.13:27017] has failures: true\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Ocf[rabbitmq]/Pcmk_resource[rabbitmq]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Resource::Service[memcached]/Pacemaker::Resource::Systemd[memcached]/Pcmk_resource[memcached]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Neutron::Db::Mysql/Openstacklib::Db::Mysql[neutron]/Openstacklib::Db::Mysql::Host_access[ovs_neutron_172.17.1.15]/Mysql_grant[neutron@172.17.1.15/ovs_neutron.*]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[enable-not-start-tripleo_cluster]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[Set password for hacluster user on tripleo_cluster]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Exec[auth-successful-across-all-nodes]/returns: executed 
successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[internal_api_vip-then-haproxy]/Exec[Creating order constraint internal_api_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: Pacemaker has reported quorum achieved\u001b[0m\n\u001b[mNotice: /Stage[main]/Pacemaker::Corosync/Notify[pacemaker settled]/message: defined 'message' as 'Pacemaker has reported quorum achieved'\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Colocation[public_vip-with-haproxy]/Pcmk_constraint[colo-ip-192.168.122.100-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[control_vip-then-haproxy]/Exec[Creating order constraint control_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[storage_mgmt_vip-then-haproxy]/Exec[Creating order constraint storage_mgmt_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Colocation[control_vip-with-haproxy]/Pcmk_constraint[colo-ip-172.16.0.42-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Colocation[redis_vip-with-haproxy]/Pcmk_constraint[colo-ip-172.17.1.11-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Colocation[storage_mgmt_vip-with-haproxy]/Pcmk_constraint[colo-ip-172.17.4.10-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[public_vip-then-haproxy]/Exec[Creating order constraint public_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Colocation[storage_vip-with-haproxy]/Pcmk_constraint[colo-ip-172.17.3.10-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: 
/Stage[main]/Main/Pacemaker::Constraint::Colocation[internal_api_vip-with-haproxy]/Pcmk_constraint[colo-ip-172.17.1.10-haproxy-clone]/ensure: created\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[redis_vip-then-haproxy]/Exec[Creating order constraint redis_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: /Stage[main]/Main/Pacemaker::Constraint::Base[storage_vip-then-haproxy]/Exec[Creating order constraint storage_vip-then-haproxy]/returns: executed successfully\u001b[0m\n\u001b[mNotice: Finished catalog run in 588.99 seconds\u001b[0m\n",
"deploy_stderr": "Device \"br_ex\" does not exist.\nDevice \"br_isolated\" does not exist.\nDevice \"ovs_system\" does not exist.\n\u001b[1;31mError: Could not prefetch mysql_user provider 'mysql': Execution of '/usr/bin/mysql -NBe SELECT CONCAT(User, '@',Host) AS User FROM mysql.user' returned 1: ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)\u001b[0m\n\u001b[1;31mError: Could not prefetch mysql_database provider 'mysql': Execution of '/usr/bin/mysql -NBe show databases' returned 1: ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)\u001b[0m\n\u001b[1;31mError: Unable to connect to mongodb server! (172.17.1.13:27017)\u001b[0m\n\u001b[1;31mError: /Stage[main]/Main/Mongodb_conn_validator[172.17.1.13:27017]/ensure: change from absent to present failed: Unable to connect to mongodb server! (172.17.1.13:27017)\u001b[0m\n\u001b[1;31mWarning: /Stage[main]/Main/Mongodb_replset[tripleo]: Skipping because of failed dependencies\u001b[0m\n",
"deploy_status_code": 6
}

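The deploy_stderr above is easiest to scan once only the Error lines are pulled out of the escaped blob. A minimal, hypothetical sketch (the file name and inlined sample line are illustrative, not taken from this run):

```shell
# Hypothetical sketch: save the deployment output to a file, then show only
# the lines that start with "Error:". Sample content is inlined here so the
# filter itself is reproducible.
printf 'Notice: Finished catalog run in 588.99 seconds\nError: Unable to connect to mongodb server! (172.17.1.13:27017)\n' > /tmp/deploy.log
grep '^Error:' /tmp/deploy.log
# → Error: Unable to connect to mongodb server! (172.17.1.13:27017)
```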
[stack@undercloud my_templates]$ nova list
+--------------------------------------+-------------------------+--------+------------+-------------+----------------------+
| ID                                   | Name                    | Status | Task State | Power State | Networks             |
+--------------------------------------+-------------------------+--------+------------+-------------+----------------------+
| acd72e45-f734-4ed0-ae2f-f2d8c6b2a328 | overcloud-cephstorage-0 | ACTIVE | -          | Running     | ctlplane=172.16.0.45 |
| 2317993b-a9db-4898-ab91-377775a0f300 | overcloud-cephstorage-1 | ACTIVE | -          | Running     | ctlplane=172.16.0.50 |
| 6f301ed7-2608-4d9b-96d1-e78b4190dce4 | overcloud-cephstorage-2 | ACTIVE | -          | Running     | ctlplane=172.16.0.44 |
| 61774948-9979-4d7e-9cd8-4f7abf1f8c95 | overcloud-compute-0     | ACTIVE | -          | Running     | ctlplane=172.16.0.48 |
| 80c45960-5861-4d21-9710-65065784c766 | overcloud-controller-0  | ACTIVE | -          | Running     | ctlplane=172.16.0.51 |
| ba9ccc4a-9787-4fde-831c-a06c523f3c9e | overcloud-controller-1  | ACTIVE | -          | Running     | ctlplane=172.16.0.47 |
| df4bb0e6-4ace-4c4a-b6a5-81b3f3935cbf | overcloud-controller-2  | ACTIVE | -          | Running     | ctlplane=172.16.0.46 |
+--------------------------------------+-------------------------+--------+------------+-------------+----------------------+
[stack@undercloud my_templates]$ ssh heat-admin@172.16.0.51
The authenticity of host '172.16.0.51 (172.16.0.51)' can't be established.
ECDSA key fingerprint is f1:d4:b7:26:16:c9:b1:1d:0e:68:dd:6c:77:51:36:c5.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '172.16.0.51' (ECDSA) to the list of known hosts.
[heat-admin@overcloud-controller-0 ~]$ sudo ovs-vsctl show
c0286a25-c19d-406c-b156-ca21574c8894
    Bridge br-ex
        Port "eth2"
            Interface "eth2"
        Port br-ex
            Interface br-ex
                type: internal
    Bridge br-isolated
        Port "eth1"
            Interface "eth1"
        Port "vlan101"
            tag: 101
            Interface "vlan101"
                type: internal
        Port "vlan201"
            tag: 201
            Interface "vlan201"
                type: internal
        Port "vlan401"
            tag: 401
            Interface "vlan401"
                type: internal
        Port br-isolated
            Interface br-isolated
                type: internal
        Port "vlan301"
            tag: 301
            Interface "vlan301"
                type: internal
    ovs_version: "2.4.0"
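Note that deploy_stderr complains about devices "br_ex" and "br_isolated" (underscores), while ovs-vsctl reports the bridges as br-ex and br-isolated (hyphens); the names differ only in that one character, which may simply reflect how the failing check rendered them. A trivial sketch of the mapping:

```shell
# Translate the underscore-style device names from deploy_stderr into the
# hyphenated bridge names that ovs-vsctl actually shows.
echo 'br_ex br_isolated' | tr '_' '-'
# → br-ex br-isolated
```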