DEV Community

Michael Levan


Combining Kubeflow With A Dedicated Stack: Enter deployKF

In a world where there are thousands of tools in the cloud-native realm, we need methods to make our jobs more efficient. Software stacks have helped us out with this a ton. For example, LAMP (Linux, Apache, MySQL, PHP) made deploying web applications on Linux far more efficient. Not because you had to use that specific stack, but because it gave you a great starting point.

Much like those other cloud-native stacks, engineers need the same stack methodology for running AI and ML workloads on Kubernetes.

That’s where deployKF comes into play.

In this blog post, you’ll learn about what deployKF is and how to get it running in your Kubernetes cluster.

💡 At the time of writing this, I believe that it’s still more beneficial to use Kubeflow directly instead of deployKF. deployKF comes with a lot of assumptions and prerequisites, along with a particular application stack. Nonetheless, it’s still good to know about in the world of Kubernetes and ML/AI.

Prerequisites

To follow along with this blog post in a hands-on fashion, you’ll need a Kubernetes cluster running the following:

  • 4 CPUs
  • 16 GB of RAM
  • 64 GB of storage
  • No ARM-based Worker Nodes

If you aren’t going to follow along with the installation of deployKF, you don’t need to deploy the cluster.
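A quick way to check whether your nodes meet these requirements is to ask kubectl for each node's capacity and architecture. This is a generic check, not specific to deployKF:

```shell
# Show each node's CPU count, memory capacity, and CPU architecture.
# Any node reporting "arm64" would violate the no-ARM requirement above.
kubectl get nodes -o custom-columns=\
NAME:.metadata.name,\
CPU:.status.capacity.cpu,\
MEMORY:.status.capacity.memory,\
ARCH:.status.nodeInfo.architecture
```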

💡 Please note that at the time of writing this, the steps in this blog post do not work on AKS due to a policy issue.

What Is deployKF?

As with many complex implementations in the realm of Kubernetes, it’s always good to have a starting point: some list or collection of tools that makes your life easier when you install them, preferably at the same time.

When deploying AI and ML workloads to perform modeling on your cluster, you can install several tools separately to get the job done. Software like PyTorch and TensorFlow works on Kubernetes just fine when installed individually. However, what if there was a way to install all the necessary AI and ML tooling on your cluster as one stack?

deployKF helps with this.

Much like Kubeflow, deployKF gives you a collection of resources to deploy on your Kubernetes cluster so that you have everything you need to begin building models. The biggest conceptual difference is that deployKF combines Kubeflow with a few other tools to make managing the cluster from an AI and ML perspective a bit easier.

Below are the dependencies that you’ll see deployed with deployKF.


Source: https://www.deploykf.org/guides/cluster-dependencies/

All of the cluster dependencies are installed during the installation of deployKF.


Source: https://www.deploykf.org/guides/cluster-dependencies/

The gist is that deployKF combines Kubeflow with a couple of other tools that you may need into one installation/stack.

Installing deployKF

Now that you have gone through the “why” behind deployKF, let’s learn how to install it.

As with most pre-defined stacks, there are pre-defined tools that you must use to get the deployment up and running. One tool that you must use both to install and to run deployKF is ArgoCD, a popular GitOps tool.

💡 Remember - all stacks that are pre-built for you require you to use specific tools. If you don’t like the “lock-in” piece, deployKF may not be for you along with just about any other pre-defined software stack.

  1. Clone the deploykf repo. This contains the ArgoCD installation script along with the sample-values.yaml file you’ll use later.
git clone -b main https://github.com/deployKF/deployKF.git ./deploykf
  2. Modify permissions to make the script executable.
chmod +x ./deploykf/argocd-plugin/install_argocd.sh
  3. Run the installation script.
bash ./deploykf/argocd-plugin/install_argocd.sh
  4. Create a deploykf.yaml file for the deployKF deployment with the following code.

Please note that this is the default Application object/resource installation. It’s using the default values that are available from deployKF. For example, if you look at the array under value_files, you’ll see that it’s pointing to the default configurations within the deployKF repo you cloned in step 1.

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: deploykf-app-of-apps
  namespace: argocd
  labels:
    app.kubernetes.io/name: deploykf-app-of-apps
    app.kubernetes.io/part-of: deploykf
spec:
  project: "default"
  source:
    repoURL: "https://github.com/deployKF/deployKF.git"
    targetRevision: "v0.1.4"
    path: "."
    plugin:
      name: "deploykf"
      parameters:
        - name: "source_version"
          string: "0.1.4"
        - name: "values_files"
          array:
            - "./sample-values.yaml"
  destination:
    server: "https://kubernetes.default.svc"
    namespace: "argocd"

If you want to change the values in the sample-values.yaml file, open the deployKF repo you cloned in step 1 and you’ll see the file in the root of the repository.

If you want to override the values within the Application object/resource YAML, you can do so via the sample values override link here.

If you aren’t going to change any of the defaults, don’t worry about going to the link where the sample values are.
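If you do want overrides, a common pattern is to put them in a small custom values file and add it to the values_files array in the Application above, after the defaults. The keys below are illustrative; verify them against sample-values.yaml in the cloned repo before using.

```yaml
# custom-values.yaml (hypothetical overrides; confirm key names
# against sample-values.yaml in the deployKF repo)
deploykf_core:
  deploykf_auth:
    dex:
      static_passwords:
        - email: "admin@example.com"
          password:
            value: "my-new-password"
```

You would then append "./custom-values.yaml" to the values_files array so it is layered on top of "./sample-values.yaml".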

💡 deployKF is heavily dependent on ArgoCD for deployment.

  5. From the same directory where you saved the deploykf.yaml file, apply it.
kubectl apply -f deploykf.yaml
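Once applied, ArgoCD generates a set of child Applications from this app-of-apps. You can list them with kubectl to see their sync status (the exact Application names will vary by deployKF version):

```shell
# List the ArgoCD Applications generated by the app-of-apps;
# add --watch to follow status changes as they sync.
kubectl get applications -n argocd
```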

Sync ArgoCD

Because deployKF is dependent on ArgoCD, you may need to sync the app or at least see the status of the deployment.

  1. Download the sync script.
curl -fL -o "sync_argocd_apps.sh" "https://raw.githubusercontent.com/deployKF/deployKF/main/scripts/sync_argocd_apps.sh"
  2. Change the sync script’s permissions so you can run it locally.
chmod +x ./sync_argocd_apps.sh
  3. Run the sync script.

💡 Make sure you’re on the latest version of the ArgoCD CLI before running the following script.

bash ./sync_argocd_apps.sh

After about 15-20 minutes, the sync should complete and you should see running resources in the deploykf-istio-gateway Namespace, which you’ll need for the next steps.

💡 You may have to run the script twice if you see a sync issue, or go into the ArgoCD dashboard and run the sync manually. My assumption is that it’s a race condition: some of the apps need to be running before the others, and while the script is supposed to handle the ordering, the Pods may not be operational by the time the script moves on to the next workload.
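As an alternative to re-running the script, you can trigger a sync of the parent Application yourself with the ArgoCD CLI. This assumes your argocd CLI session is already authenticated against the cluster’s ArgoCD instance:

```shell
# Manually sync the parent app-of-apps; a stuck child Application
# can be synced the same way by passing its name instead.
argocd app sync deploykf-app-of-apps
```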


Accessing deployKF

deployKF is now installed, and it’s time to access the dashboard. To do so, port-forward the central dashboard Service.


kubectl port-forward svc/central-dashboard -n deploykf-dashboard 8085:80
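Note that port-forwarding the central-dashboard Service directly skips the Istio gateway. The deployKF docs instead port-forward the gateway Service so that traffic flows through the login page; a sketch, assuming the default Service and Namespace names:

```shell
# Forward the Istio gateway Service (ports per the deployKF docs).
# You may also need hosts-file entries for the gateway domain
# configured in your values before the browser flow works.
kubectl port-forward svc/deploykf-gateway -n deploykf-istio-gateway 8080:http 8443:https
```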

Navigate to http://localhost:8085 in your browser and you should see the deployKF dashboard.


Closing Thoughts

As ML tooling for Kubernetes stands right now, you have two primary options that are Kubernetes centric:

  1. Kubeflow
  2. deployKF

Both are great, but they leave a lot to be desired from an installation and configuration perspective; neither has a particularly easy installation method. That makes sense given the complex nature of AI and ML software, but the hope is that this changes in the future.

deployKF is a great solution if you want a ready-to-use application stack out of the box without you having to do much configuration. Just keep in mind that you will be bound to certain third-party solutions like ArgoCD and Istio to use it.

Overall, I would still recommend using Kubeflow directly.

Top comments (3)

Nutricare

You know what? F

Yaseen

Can a deployed MLOps model process a POST request from an application (microservice or any), handle it, and send the response back to the application using a deployKF endpoint? Can communication work this way?

Nutricare

How did you get the cluster and pod CPU utilization?