Vault hook failed to start

Dear All,

I am getting the error hook failed: "start" on the vault unit. Here is an excerpt of juju status:

neutron-api/0* active idle 1/lxd/2 192.168.0.241 9696/tcp Unit is ready
neutron-api-mysql-router/0* active idle 192.168.0.241 Unit is ready
neutron-api-plugin-ovn/0* active idle 192.168.0.241 Unit is ready
nova-cloud-controller/0* active idle 3/lxd/1 192.168.0.244 8774/tcp,8775/tcp Unit is ready
ncc-mysql-router/0* active idle 192.168.0.244 Unit is ready
nova-compute/0* active idle 1 192.168.0.6 Unit is ready
ovn-chassis/2 active idle 192.168.0.6 Unit is ready
nova-compute/1 active idle 2 192.168.0.7 Unit is ready
ovn-chassis/1 active idle 192.168.0.7 Unit is ready
nova-compute/2 active idle 3 192.168.0.8 Unit is ready
ovn-chassis/0* active idle 192.168.0.8 Unit is ready
openstack-dashboard/0* active idle 2/lxd/3 192.168.0.246 80/tcp,443/tcp Unit is ready
dashboard-mysql-router/0* active idle 192.168.0.246 Unit is ready
ovn-central/0* active idle 0/lxd/1 192.168.0.238 6641/tcp,6642/tcp Unit is ready (northd: active)
ovn-central/1 active idle 1/lxd/1 192.168.0.239 6641/tcp,6642/tcp Unit is ready (leader: ovnnb_db, ovnsb_db)
ovn-central/2 active idle 2/lxd/1 192.168.0.240 6641/tcp,6642/tcp Unit is ready
placement/0* active idle 3/lxd/2 192.168.0.245 8778/tcp Unit is ready
placement-mysql-router/0* active idle 192.168.0.245 Unit is ready
rabbitmq-server/0* active idle 2/lxd/2 192.168.0.243 5672/tcp Unit is ready
vault/0* error idle 3/lxd/0 192.168.0.237 8200/tcp hook failed: "start"
vault-mysql-router/0* active idle 192.168.0.237 Unit is ready

I even restarted the services and tried resuming the unit with

juju run-action vault/0 --wait resume

but it did not solve the issue.

Please help me rectify this error.

Please check the logs of the vault unit:

juju debug-log --replay --no-tail --include vault/0
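
If the replayed log is long, one way to narrow it down (just a convenience, assuming a POSIX shell on the client machine) is to filter for the traceback:

# Show the traceback plus some context after it:
juju debug-log --replay --no-tail --include vault/0 | grep -A 40 "Traceback"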

unit-vault-0: 18:21:00 ERROR unit.vault/0.juju-log Traceback (most recent call last):
File "/var/lib/juju/agents/unit-vault-0/charm/actions/resume", line 231, in main
charms.reactive.main()
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/__init__.py", line 74, in main
bus.dispatch(restricted=restricted_mode)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 390, in dispatch
_invoke(other_handlers)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 359, in _invoke
handler.invoke()
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 181, in invoke
self._action(*args)
File "/var/lib/juju/agents/unit-vault-0/charm/reactive/vault_handlers.py", line 1003, in create_certs
bundle = vault_pki.generate_certificate(cert_type,
File "/var/lib/juju/agents/unit-vault-0/charm/lib/charm/vault_pki.py", line 104, in generate_certificate
client = vault.get_local_client()
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 339, in wrapped_f
return self(f, *args, **kw)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 430, in __call__
do = self.iter(retry_state=retry_state)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 367, in iter
return fut.result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
return self.__get_result()
File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
raise self._exception
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 433, in __call__
result = fn(*args, **kwargs)
File "/var/lib/juju/agents/unit-vault-0/charm/lib/charm/vault.py", line 254, in get_local_client
client.auth_approle(app_role_id)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/v1/__init__.py", line 2072, in auth_approle
return self.auth('/v1/auth/{0}/login'.format(mount_point), json=params, use_token=use_token)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/v1/__init__.py", line 1726, in auth
return self._adapter.auth(
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 159, in auth
response = self.post(url, **kwargs).json()
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 103, in post
return self.request('post', url, **kwargs)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 233, in request
utils.raise_for_error(response.status_code, text, errors=errors)
File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/utils.py", line 43, in raise_for_error
raise exceptions.VaultDown(message, errors=errors)
hvac.exceptions.VaultDown: Vault is sealed

unit-vault-0: 18:21:00 INFO unit.vault/0.juju-log DEPRECATION WARNING: Function action_fail is being removed : moved to function_fail()
unit-vault-0: 18:21:00 INFO juju.worker.uniter awaiting error resolution for "start" hook

This is the issue. However, it is not surfaced in juju status, which only shows the hook error hook failed: "start". Let me investigate.


hvac.exceptions.VaultDown: Vault is sealed

unit-vault-0: 19:28:36 WARNING unit.vault/0.start Traceback (most recent call last):
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/charm/hooks/start", line 22, in <module>
unit-vault-0: 19:28:36 WARNING unit.vault/0.start main()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/__init__.py", line 74, in main
unit-vault-0: 19:28:36 WARNING unit.vault/0.start bus.dispatch(restricted=restricted_mode)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 390, in dispatch
unit-vault-0: 19:28:36 WARNING unit.vault/0.start _invoke(other_handlers)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 359, in _invoke
unit-vault-0: 19:28:36 WARNING unit.vault/0.start handler.invoke()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/charms/reactive/bus.py", line 181, in invoke
unit-vault-0: 19:28:36 WARNING unit.vault/0.start self._action(*args)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/charm/reactive/vault_handlers.py", line 1003, in create_certs
unit-vault-0: 19:28:36 WARNING unit.vault/0.start bundle = vault_pki.generate_certificate(cert_type,
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/charm/lib/charm/vault_pki.py", line 104, in generate_certificate
unit-vault-0: 19:28:36 WARNING unit.vault/0.start client = vault.get_local_client()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 339, in wrapped_f
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return self(f, *args, **kw)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 430, in __call__
unit-vault-0: 19:28:36 WARNING unit.vault/0.start do = self.iter(retry_state=retry_state)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 367, in iter
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return fut.result()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return self.__get_result()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
unit-vault-0: 19:28:36 WARNING unit.vault/0.start raise self._exception
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/tenacity/__init__.py", line 433, in __call__
unit-vault-0: 19:28:36 WARNING unit.vault/0.start result = fn(*args, **kwargs)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/charm/lib/charm/vault.py", line 254, in get_local_client
unit-vault-0: 19:28:36 WARNING unit.vault/0.start client.auth_approle(app_role_id)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/v1/__init__.py", line 2072, in auth_approle
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return self.auth('/v1/auth/{0}/login'.format(mount_point), json=params, use_token=use_token)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/v1/__init__.py", line 1726, in auth
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return self._adapter.auth(
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 159, in auth
unit-vault-0: 19:28:36 WARNING unit.vault/0.start response = self.post(url, **kwargs).json()
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 103, in post
unit-vault-0: 19:28:36 WARNING unit.vault/0.start return self.request('post', url, **kwargs)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/adapters.py", line 233, in request
unit-vault-0: 19:28:36 WARNING unit.vault/0.start utils.raise_for_error(response.status_code, text, errors=errors)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start File "/var/lib/juju/agents/unit-vault-0/.venv/lib/python3.8/site-packages/hvac/utils.py", line 43, in raise_for_error
unit-vault-0: 19:28:36 WARNING unit.vault/0.start raise exceptions.VaultDown(message, errors=errors)
unit-vault-0: 19:28:36 WARNING unit.vault/0.start hvac.exceptions.VaultDown: Vault is sealed
unit-vault-0: 19:28:37 ERROR juju.worker.uniter.operation hook "start" (via explicit, bespoke hook script) failed: exit status 1
unit-vault-0: 19:28:37 INFO juju.worker.uniter awaiting error resolution for "start" hook
unit-vault-0: 19:31:57 INFO juju.worker.uniter awaiting error resolution for "start" hook

The stack trace in the log you provided is from the resume action, but the issue surfaced in juju status is a failed start hook. It would be great if you could capture that one as well.

Can you also give the full output of juju status --relations vault?


Sir,

FYI

root@maassrv:~# juju status --relations vault
Model Controller Cloud/Region Version SLA Timestamp
openstack juju-controller maassrv/default 2.9.14 unsupported 20:14:01+05:30

App Version Status Scale Charm Store Channel Rev OS Message
vault 1.5.9 blocked 1 vault charmhub stable 50 ubuntu Unit is sealed
vault-mysql-router 8.0.26 active 1 mysql-router charmhub stable 11 ubuntu Unit is ready

Unit Workload Agent Machine Public address Ports Message
vault/0* blocked executing 3/lxd/0 192.168.0.237 8200/tcp (start) Unit is sealed
vault-mysql-router/0* active idle 192.168.0.237 Unit is ready

Machine State DNS Inst id Series AZ Message
3 started 192.168.0.8 HP-04 focal default Deployed
3/lxd/0 started 192.168.0.237 juju-e45ffe-3-lxd-0 focal default Container started

Relation provider Requirer Interface Type Message
mysql-innodb-cluster:db-router vault-mysql-router:db-router mysql-router regular
vault-mysql-router:shared-db vault:shared-db mysql-shared subordinate
vault:certificates cinder:certificates tls-certificates regular
vault:certificates glance:certificates tls-certificates regular
vault:certificates keystone:certificates tls-certificates regular
vault:certificates mysql-innodb-cluster:certificates tls-certificates regular
vault:certificates neutron-api-plugin-ovn:certificates tls-certificates regular
vault:certificates neutron-api:certificates tls-certificates regular
vault:certificates nova-cloud-controller:certificates tls-certificates regular
vault:certificates openstack-dashboard:certificates tls-certificates regular
vault:certificates ovn-central:certificates tls-certificates regular
vault:certificates ovn-chassis:certificates tls-certificates regular
vault:certificates placement:certificates tls-certificates regular
vault:cluster vault:cluster vault-ha peer

root@maassrv:~#

root@maassrv:~# juju status --relations vault
Model Controller Cloud/Region Version SLA Timestamp
openstack juju-controller maassrv/default 2.9.14 unsupported 20:15:34+05:30

App Version Status Scale Charm Store Channel Rev OS Message
vault 1.5.9 error 1 vault charmhub stable 50 ubuntu hook failed: “start”
vault-mysql-router 8.0.26 active 1 mysql-router charmhub stable 11 ubuntu Unit is ready

Unit Workload Agent Machine Public address Ports Message
vault/0* error idle 3/lxd/0 192.168.0.237 8200/tcp hook failed: “start”
vault-mysql-router/0* active idle 192.168.0.237 Unit is ready

Machine State DNS Inst id Series AZ Message
3 started 192.168.0.8 HP-04 focal default Deployed
3/lxd/0 started 192.168.0.237 juju-e45ffe-3-lxd-0 focal default Container started

Relation provider Requirer Interface Type Message
mysql-innodb-cluster:db-router vault-mysql-router:db-router mysql-router regular
vault-mysql-router:shared-db vault:shared-db mysql-shared subordinate
vault:certificates cinder:certificates tls-certificates regular
vault:certificates glance:certificates tls-certificates regular
vault:certificates keystone:certificates tls-certificates regular
vault:certificates mysql-innodb-cluster:certificates tls-certificates regular
vault:certificates neutron-api-plugin-ovn:certificates tls-certificates regular
vault:certificates neutron-api:certificates tls-certificates regular
vault:certificates nova-cloud-controller:certificates tls-certificates regular
vault:certificates openstack-dashboard:certificates tls-certificates regular
vault:certificates ovn-central:certificates tls-certificates regular
vault:certificates ovn-chassis:certificates tls-certificates regular
vault:certificates placement:certificates tls-certificates regular
vault:cluster vault:cluster vault-ha peer

root@maassrv:~#

Please stand by for an update in the next couple of hours. Thanks for your patience.


Hi @prajwalvinu, I suspect you are using the config option totally-unsecure-auto-unlock, otherwise the start hook, which is failing in your case, wouldn’t be “surprised” by a sealed vault, and would just “wait” for you to go through the manual procedure for unsealing. Can you please show us the output of this in order to validate my assumption?

juju config vault totally-unsecure-auto-unlock

I think you did everything right, but this option is known to have some rough edges, and when it goes wrong it's hard to get out of this state. If you can, I would recommend tearing down your deployment and trying again without this option, going through the secure procedure for unsealing the vault instead. I believe this is a brand new deployment, correct? Otherwise it wouldn't be running the start hook.

If you still want to try rescuing your current deployment, I would personally try the following things, but I can’t promise this will work:

# Get the unit out of the error state:
juju resolve --no-retry vault/0

# At this point the vault is still sealed and I believe we have no
# choice but to go through the manual procedure in order to
# unseal it. First let's disable the automatic unsealing:
juju config vault totally-unsecure-auto-unlock=false

# The procedure is best described at https://charmhub.io/vault
# and will look like:
snap install vault
export VAULT_ADDR="http://<unit-ip>:8200"
vault operator init -key-shares=5 -key-threshold=3
vault operator unseal <key-1>
vault operator unseal <key-2>
vault operator unseal <key-3>
export VAULT_TOKEN=<token>
vault token create -ttl=10m
juju run-action --wait vault/0 authorize-charm token=<token>
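
As a quick sanity check (not part of the documented procedure, just a suggestion), you could confirm the seal state from the same shell, assuming VAULT_ADDR is still exported:

# "Sealed" should report false once the key threshold has been reached:
vault status

# The unit should then leave the error/blocked state:
juju status vault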

I hope this helps.


Hi sir @aurelien-lourot

Thank you very much, it is active now:

root@maassrv:~# juju status vault
Model Controller Cloud/Region Version SLA Timestamp
openstack juju-controller maassrv/default 2.9.14 unsupported 10:53:59+05:30

App Version Status Scale Charm Store Channel Rev OS Message
vault 1.5.9 active 1 vault charmhub stable 50 ubuntu Unit is ready (active: true, mlock: disabled)
vault-mysql-router 8.0.26 active 1 mysql-router charmhub stable 11 ubuntu Unit is ready

Unit Workload Agent Machine Public address Ports Message
vault/0* active idle 3/lxd/0 192.168.0.237 8200/tcp Unit is ready (active: true, mlock: disabled)
vault-mysql-router/0* active idle 192.168.0.237 Unit is ready

Machine State DNS Inst id Series AZ Message
3 started 192.168.0.8 HP-04 focal default Deployed
3/lxd/0 started 192.168.0.237 juju-e45ffe-3-lxd-0 focal default Container started

Sir @aurelien-lourot

When the server gets restarted, the vault status becomes blocked and shows "Unit is sealed". How do I bring it back to the active state?
vault/0* blocked idle 3/lxd/0 192.168.0.237 8200/tcp Unit is sealed

Hi @prajwalvinu, this is by security design: when you restart the vault (or the server it’s running on), it gets sealed and requires manual intervention to unseal it. See our unsealing procedure and also the upstream documentation about this topic.
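
For reference, a minimal sketch of what that manual intervention looks like after a reboot, assuming you still have the unseal keys from the original vault operator init (do not run init again; it only happens once per vault):

export VAULT_ADDR="http://<unit-ip>:8200"
# Any 3 of the 5 key shares (the configured threshold) will do:
vault operator unseal <key-1>
vault operator unseal <key-2>
vault operator unseal <key-3>
# Re-running authorize-charm should not normally be needed; the charm
# should return to active on its next hook once the vault is unsealed.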


Sir @aurelien-lourot

Thank you for the support, sir.

The mysql-innodb-cluster is also showing that the cluster is inaccessible, and I tried to recover it using these commands:

juju run-action --wait mysql-innodb-cluster/0 resume and reboot-complete-outage

but it is still showing the blocked state:
App Version Status Scale Charm Store Channel Rev OS Message
mysql-innodb-cluster 8.0.26 blocked 3 mysql-innodb-cluster charmhub stable 11 ubuntu Cluster is inaccessible from this instance. Please check logs for details.

Unit Workload Agent Machine Public address Ports Message
mysql-innodb-cluster/0* blocked idle 0/lxd/0 192.168.0.9 Cluster is inaccessible from this instance. Please check logs for details.
mysql-innodb-cluster/1 blocked idle 1/lxd/0 192.168.0.235 Cluster is inaccessible from this instance. Please check logs for details.
mysql-innodb-cluster/2 blocked idle 2/lxd/0 192.168.0.236 Cluster is inaccessible from this instance. Please check logs for details.
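
In case it helps: I believe the full name of that action on the mysql-innodb-cluster charm is reboot-cluster-from-complete-outage. A sketch of a recovery attempt after a complete outage, assuming mysql-innodb-cluster/0 holds the most recent transactions (run it on one unit only):

# Rebuild the cluster from the unit with the newest data:
juju run-action --wait mysql-innodb-cluster/0 reboot-cluster-from-complete-outage

# Then check what the charm reports for the cluster:
juju run-action --wait mysql-innodb-cluster/0 cluster-status
juju status mysql-innodb-cluster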