LAB · INTERMEDIATE

Deploy AI Gateway on Kubernetes

Deploy Ollama on Kubernetes and build an AI gateway with routing and caching.

75 minutes
ai-infrastructure/ai-gateway

Lab Overview

This hands-on lab teaches you to deploy Ollama on Kubernetes and build an AI gateway in front of it with cost-based routing and response caching.

You'll learn to:

  • Deploy Ollama on Minikube with resource limits (see the deployment sketch after this list)
  • Build a gateway service with cost-based routing rules
  • Implement response caching with TTL (routing and caching are sketched together below)
  • Deploy the gateway with Kubernetes Service and Ingress
  • Test routing between local Ollama and cloud providers
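
The lab walks you through its own manifests step by step; purely to illustrate what "resource limits" means for the Ollama Deployment, here is a minimal sketch using the official kubernetes Python client. The image tag, namespace, and CPU/memory values are illustrative assumptions, not the lab's actual settings.

```python
from kubernetes import client, config

# Assumes your kubectl context already points at the Minikube cluster for this lab.
config.load_kube_config()

# Ollama listens on port 11434; the resource figures below are placeholders.
container = client.V1Container(
    name="ollama",
    image="ollama/ollama:latest",
    ports=[client.V1ContainerPort(container_port=11434)],
    resources=client.V1ResourceRequirements(
        requests={"cpu": "2", "memory": "4Gi"},
        limits={"cpu": "4", "memory": "8Gi"},
    ),
)

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="ollama", labels={"app": "ollama"}),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "ollama"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "ollama"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Create the Deployment in the default namespace.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```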
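
Similarly, the gateway's two core ideas, cost-based routing and TTL caching, can be sketched in a few lines of standard-library Python. The URLs, the prompt-length threshold, and the call_backend hook are placeholder assumptions; the lab builds a fuller gateway service around the same pattern.

```python
import hashlib
import time

# Placeholder endpoints -- the lab supplies the real Service names and provider URLs.
OLLAMA_URL = "http://ollama.default.svc.cluster.local:11434"
CLOUD_URL = "https://cloud-llm.example.com/v1"

CACHE_TTL_SECONDS = 300
_cache = {}  # cache key -> (expiry timestamp, cached response)


def _cache_key(model, prompt):
    return hashlib.sha256(f"{model}:{prompt}".encode()).hexdigest()


def choose_backend(prompt):
    """Toy cost rule: short prompts stay on the free local Ollama,
    long prompts go to a paid cloud provider with a larger context window."""
    return OLLAMA_URL if len(prompt) < 2000 else CLOUD_URL


def handle_request(model, prompt, call_backend):
    """call_backend(base_url, model, prompt) -> str is supplied by the caller,
    e.g. a small HTTP client; it is not defined here."""
    key = _cache_key(model, prompt)
    entry = _cache.get(key)
    if entry and entry[0] > time.time():  # fresh entry: serve from cache
        return entry[1]
    response = call_backend(choose_backend(prompt), model, prompt)
    _cache[key] = (time.time() + CACHE_TTL_SECONDS, response)
    return response
```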

Prerequisites

  • kubernetes-basics
  • ollama-local-llm
  • multi-provider-integration

Technologies Covered

kubernetes, ollama, ai-gateway, caching, routing, minikube

Part of a Course

This lab is part of the LLM Integration and API Patterns course.

View All Courses

Choose your plan

Simple, Transparent Pricing

One subscription, everything included

Monthly Plan

Access all content

$99/month

Quarterly Plan

Save 16% with quarterly billing

$249/quarter

Everything Included in Your Subscription

Content & Learning

  • Access to all courses and bootcamps
  • Video lessons with closed captions
  • Interactive quizzes and assessments
  • Course completion certificates

Hands-On Labs

  • Browser-based cloud labs
  • Pre-configured VMs ready to use
  • Playgrounds for experiments
  • Multi-VM realistic scenarios

AWS Integration

  • Managed AWS Account included
  • Pre-configured environments
  • Real-world cloud scenarios

Support & Community

  • Priority support
  • Active community forum

No Setup Required

  • Everything runs in your browser
  • No software installation needed
  • Automatic environment provisioning
  • Works on any device

Ready to Get Started?

Start this hands-on lab and build real-world Platform Engineering skills

Get Access Now