Charmed Apache Spark Solution Tutorial
The Charmed Apache Spark solution delivers Apache Spark utility client applications that allow for simple and seamless usage of Apache Spark on Kubernetes.
This tutorial takes you through the journey of setting up the environment necessary to run Charmed Apache Spark and teaches you how to run Spark jobs both from interactive shells and as batch submissions. You’ll also learn how to use Charmed Apache Spark with streaming workloads and how to enable monitoring of the Apache Spark cluster.
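As a rough preview, a Spark batch job is just a small application handed to the cluster for execution. The sketch below is a minimal PySpark program that estimates π; it is only an illustration of the kind of workload covered later, not the exact job used in this tutorial, and the application name is a placeholder.

```python
# pi.py - a minimal PySpark batch application (illustrative only).
from operator import add
from random import random

from pyspark.sql import SparkSession

if __name__ == "__main__":
    # The application name "pi-example" is an arbitrary placeholder.
    spark = SparkSession.builder.appName("pi-example").getOrCreate()

    samples = 100_000

    def inside(_: int) -> int:
        # Sample a random point in the unit square and check whether
        # it falls inside the quarter circle of radius 1.
        x, y = random(), random()
        return 1 if x * x + y * y <= 1 else 0

    # Distribute the sampling across the cluster and sum the hits.
    count = (
        spark.sparkContext.parallelize(range(samples), 10)
        .map(inside)
        .reduce(add)
    )

    print(f"Pi is roughly {4.0 * count / samples}")
    spark.stop()
```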
This tutorial can be divided into the following sections:
- Setting up the environment for the tutorial
- Interacting with Apache Spark using interactive shells
- Submitting jobs with Spark Submit
- Streaming workload with Charmed Apache Spark
- Monitoring the Apache Spark cluster
- Wrapping Up
While this tutorial intends to guide and teach you along the way, it will be most beneficial if you are already familiar with:
- Basic terminal commands.
- Kubernetes commands and concepts (e.g. general usage of `kubectl`)
Let’s proceed to set up the environment needed for this tutorial in the next section.