Maybe it’s not obvious, but k8s workloads deployed by Juju can participate in cross model relations. This is useful when you have a workload that is best run on a traditional vm cloud or bare metal, but which you want to use alongside a k8s workload.
Here’s a quick intro showing how to spin up a working example or two. We’ll deploy using an LXD controller and microk8s.
First, bootstrap a controller and deploy a vm workload (to the default model). Then offer that workload.
juju bootstrap lxd
juju deploy mariadb
juju offer mariadb:db
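If you want to confirm the offer exists before moving on, you can list the offers in the current model:
juju offers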
Now let’s register microk8s and deploy mediawiki.
microk8s.config | juju add-k8s k8scmr
juju add-model test k8scmr
juju deploy cs:~juju/mediawiki-k8s
The mediawiki app will be blocked until it is related to a database. Let’s relate it to the offer running in the LXD model.
juju relate mediawiki-k8s default.mariadb
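To watch the cross model relation come up, you can check juju status in the test model (the --relations flag adds a relations section to the tabular output):
juju status --relations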
We can also relate purely k8s workloads across models. Let’s first deploy gitlab alongside mediawiki in the same model.
juju deploy cs:~juju/gitlab-k8s
Now add a new k8s model, deploy mariadb, and make an offer.
juju add-model k8sdb k8scmr
juju deploy cs:~juju/mariadb-k8s
juju offer mariadb-k8s:server mariadb
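It can take a little while for the mariadb-k8s pod to come up, so it’s worth keeping an eye on juju status in this model until the unit settles:
juju status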
Switch back to the model running gitlab and relate to the k8s mariadb offer.
juju switch test
juju relate gitlab-k8s k8sdb.mariadb
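As an aside, instead of relating directly to the offer URL, you could first consume the offer under a local alias and then relate to that alias instead (the k8s-mariadb name here is just illustrative):
juju consume k8sdb.mariadb k8s-mariadb
juju relate gitlab-k8s k8s-mariadb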