Jesse Williams for KitOps


KitOps: The Bridge Between AI/ML Models and DevOps

Yesterday, Brad Micklea, Jozu CEO and KitOps maintainer, was a guest on the Partially Redacted podcast hosted by Sean Falconer. The 45-minute conversation (which you can listen to here) covered a lot of ground. Specifically, they discussed the current state of the KitOps project, where the project is headed, and some of our early ideas for productizing and releasing Jozu, which builds on top of KitOps.

In this post I want to dive a bit deeper into a few of these topics.

We built KitOps because deploying AI/ML models into production environments is nuanced and challenging, requiring tooling and infrastructure specific to the demands of AI operations. However, there's a balance here. Organizations don't want a completely separate and parallel toolchain for developing, deploying, and maintaining AI/ML models. They've perfected security and compliance guardrails in their existing DevOps tools. Unfortunately, those tools weren't built to handle the huge files and dynamic (and decaying) nature of AI projects. This wasn't an issue a few years ago; however, as AI and machine learning continue to evolve and adoption continues to grow, we see the need for tools that bridge the AI world of data science and experiments with the production world of software engineering and DevOps. KitOps' ModelKits and CLI were specifically designed to meet these needs by providing an efficient, scalable, and user-friendly way to package, share, and deploy AI models. It's a step in the right direction, but it's still just one step; we still have a lot more ground to cover.

Tooling and Infrastructure Challenges for AI/ML

Deploying AI/ML models involves more than just writing and testing the model code. It requires a stack that can support the model throughout its lifecycle (training, packaging, validating, deploying, operating, and retraining). As a model flows through this process, it's handed back and forth between multiple teams several times. ModelKits and the Kit CLI make this easier by generating an OCI-compatible package that works with your existing tools and eliminates the need to repackage your models into multiple vendor-specific formats.
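To make this concrete, here is a minimal sketch of a Kitfile, the YAML manifest the Kit CLI reads when packaging a ModelKit. The field names follow the Kitfile schema as we understand it today, but the project names, paths, and values below are illustrative placeholders, not a canonical template:

```yaml
# Kitfile: declares everything that belongs in the ModelKit
manifestVersion: "1.0"

package:
  name: sentiment-model          # hypothetical project name
  description: Example sentiment classifier packaged as a ModelKit

model:
  name: sentiment-classifier
  path: ./models/model.onnx      # serialized model weights

datasets:
  - name: training-data
    path: ./data/train.csv       # dataset versioned alongside the model

code:
  - path: ./src                  # training and inference code
```

From a directory containing a Kitfile, the package is built and shared with the Kit CLI, along the lines of `kit pack . -t registry.example.com/my-org/sentiment-model:v1` followed by `kit push registry.example.com/my-org/sentiment-model:v1` (the registry reference here is a placeholder). Because the result is an OCI artifact, it can live in the same container registries and pipelines your DevOps tooling already trusts.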

The benefits of this are clear. By packaging all necessary components, including data, configuration, and dependencies, ModelKits ensure that deployments are consistent and reproducible across all environments. It also makes it easier to collaborate on models. Additionally, every aspect of the model, from code to datasets, is version-controlled. Once a ModelKit is created, it becomes immutable and tamper-proof, ensuring that no unauthorized changes can affect the deployment.

What’s Jozu and what role does it play in all of this?

One thing we haven’t really spoken about is Jozu. Those of you who have been following our story closely know that the concept for Kit came out of first-hand experience trying to deploy models while building a startup called Jozu (V1). It’s a long story, but ultimately Jozu (V1) pivoted to focus on launching and productizing Kit (from this point on, we will talk about Jozu V2).

Today, Jozu is the company that allows us to spend our days building Kit.

Looking forward, we have big plans for Jozu. Since launching KitOps, we've received incredible feedback from our design partners; much of it has made its way into the KitOps roadmap, and some of it falls into the enterprise use-case bucket. Jozu will ultimately focus on building the infrastructure that enterprises need to adopt KitOps at scale and use it with the processes they've worked hard to perfect.

The next significant step for Jozu is the launch of a public hub for sharing and accessing ModelKits, which will be free to use and store your ModelKits in. This hub will facilitate community collaboration, allowing developers and companies to share their solutions and improvements and fostering innovation across the industry, without anyone having to worry about which tools the other is using.

For enterprises, we plan to offer private repositories where they can store their proprietary ModelKits, as well as custom installations that let companies run Jozu's infrastructure behind their own firewall, ensuring they can maintain the highest levels of data privacy and security.

In addition to this, we’re exploring other ways to add value to enterprises around provenance, security, monitoring, and platform integrations. We will keep you informed as these capabilities take shape.

For now, we’re excited to see the ongoing support for KitOps and are grateful to everyone who takes the time to give us feedback on the product. Believe it or not, we discuss and document all of our conversations with KitOps users, which leads to a lot of new tickets. If you’re interested in following along with our journey, we encourage you to join us on our Discord server and get involved in the KitOps project.


