Why DevOps Teams Need to Rethink Automation in the Age of AI

Too much DevOps automation today is still bound by human bottlenecks: tickets, tribal knowledge, and a fragile web of YAML files. Now, with AI entering the mix, things are changing fast. But are we ready?

To dig into these questions, we launched the "AI x DevOps" podcast! In our premiere episode, Rohit chatted with Vincent De Smet, a seasoned DevOps engineer from Handshakes.

Vincent's journey spans Docker, Kubernetes, Terraform (since version 0.7!), and AWS, and he brought that rich history to the discussion. His recent work on TerraConstructs, a project that aims to simplify Terraform with AWS CDK-style higher-level constructs, even involves experimenting with LLMs for automation.

Here's what you'll learn from the episode, followed by the key takeaways:

What you will learn:

  • Learn about Vincent De Smet's DevOps background across Docker, Kubernetes, Terraform, and AWS, and how he integrates AI into his daily work.

  • Gain insight into Terraform's evolution and challenges with modules versus higher-level constructs.

  • Understand why AWS CDK sees strong adoption among product engineers: it lets them work in familiar programming languages.

  • Learn about the TerraConstructs project, which bridges AWS CDK concepts with Terraform CDK and could potentially be automated by AI.

  • Explore contrasting approaches to LLM code generation like "vibe coding" vs. intentional, reviewed generation.

  • Learn techniques to improve AI-generated code quality, such as providing source code and documentation context.

  • Understand challenges in training LLMs on niche languages like HCL due to limited public production codebases.

Key Takeaways

  • Replacing ClickOps with ChatOps through LLMs could lead to unmanaged infrastructure if companies don't have proper protocols and guardrails in place for LLM usage.

  • AI/LLM capabilities are changing rapidly, requiring frequent re-evaluation of approaches.

  • Providing relevant documentation and context significantly improves LLM code generation quality.

  • Over-dependence on LLMs risks the loss of core engineering skills as complexity is abstracted away.

  • Combining deterministic checks (static analysis) with non-deterministic LLM analysis builds trust in AI outputs.

  • AI may enable developers to take more ownership of infrastructure, potentially reshaping platform engineering roles.

  • "ChatOps" via LLMs carries real dangers; governed infrastructure use needs organisational protocols and orchestrators.

  • Security implications are critical: risks like data exposure call for internal models or strict usage rules.

  • Potential future uses include AI for deep research and building specialised tools (MCPs) for frameworks like CDK.

LLMs as Junior Engineers

Vincent shared a compelling perspective: we should view Large Language Models (LLMs) as junior engineers on our team. Like a junior engineer just discovering AWS CDK, an LLM can handle repetitive tasks and generate workable code, but it needs proper supervision and review. This framing sets realistic expectations while acknowledging AI's genuine utility.

The IaC Generation Debate: DSLs vs. Programming Languages

One of our most interesting discussions centered on infrastructure-as-code generation. Vincent made a compelling case that rather than training LLMs to write HashiCorp Configuration Language (HCL) or YAML (domain-specific languages with limited training data), we should have LLMs write in general-purpose programming languages using SDKs like AWS CDK or Terraform CDK; a minimal sketch follows the list below.

Why? Because:

  • LLMs have been trained on vastly more TypeScript/Python/JavaScript than Terraform HCL
  • Higher-level libraries provide better guardrails and deterministic behaviour
  • These libraries abstract cloud resources in a more intuitive way
  • SDKs often include validation logic that a raw Terraform file wouldn't
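
To make the contrast concrete, here's a minimal Terraform CDK (CDKTF) sketch in TypeScript. It assumes the prebuilt `@cdktf/provider-aws` package, and import paths can vary across provider versions:

```typescript
import { Construct } from "constructs";
import { App, TerraformStack } from "cdktf";
// Prebuilt AWS provider bindings (import paths vary by provider version).
import { AwsProvider } from "@cdktf/provider-aws/lib/provider";
import { S3Bucket } from "@cdktf/provider-aws/lib/s3-bucket";

class StorageStack extends TerraformStack {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    new AwsProvider(this, "aws", { region: "ap-southeast-1" });

    // Typos and wrong types fail at compile time here; in raw HCL the
    // same mistakes surface only at `terraform plan` or `apply`.
    new S3Bucket(this, "artifacts", {
      bucket: "my-team-artifacts",
      tags: { ManagedBy: "cdktf" },
    });
  }
}

const app = new App();
new StorageStack(app, "storage");
app.synth();
```

Running `cdktf synth` turns this into ordinary Terraform JSON, so the familiar plan/apply review workflow still applies; the LLM just writes in a language it has seen far more of.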

The Future of Platform Engineering

Will AI fundamentally transform platform engineering or just accelerate existing practices? We explored how AI might reshape the platform engineer's role into:

  1. Context providers for LLMs - Creating and maintaining internal knowledge bases that LLMs can access
  2. Guardrail builders - Implementing security and compliance checks for AI-generated infrastructure
  3. Quality gate enforcers - Creating processes that validate AI outputs before they reach production

Vincent noted that organisations need clear protocols for AI usage rather than ignoring the inevitable adoption by engineers seeking shortcuts.
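
As an illustration of the guardrail and quality-gate roles above, here's a minimal Node.js/TypeScript sketch. The LLM client is deliberately left as a caller-supplied function, since the episode doesn't prescribe a specific model or provider:

```typescript
import { execFileSync } from "node:child_process";

// Caller supplies whatever LLM client the organisation has approved.
type LlmReview = (prompt: string) => Promise<string>;

async function qualityGate(workingDir: string, reviewWithLlm: LlmReview) {
  // Deterministic gates first: execFileSync throws on a non-zero exit,
  // so invalid configuration never reaches the LLM step.
  execFileSync("terraform", ["validate"], { cwd: workingDir });
  execFileSync("terraform", ["plan", "-out=tfplan"], { cwd: workingDir });

  // Review the rendered plan, i.e. what will actually change.
  const plan = execFileSync("terraform", ["show", "-json", "tfplan"], {
    cwd: workingDir,
  }).toString();

  // Non-deterministic gate: the LLM's findings are advisory input for
  // the human reviewer, not an approval.
  return reviewWithLlm(
    "Review this Terraform plan JSON for deletions, public exposure, " +
      `or IAM privilege escalation:\n${plan}`
  );
}
```

The ordering is the point: cheap, deterministic checks filter out mechanical errors so the non-deterministic review only has to reason about intent and risk.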

Pitfalls and Challenges

We didn't shy away from discussing the risks:

  • Ownership dilution - When code is AI-generated, who really understands and owns it?
  • Skill erosion - The "No LLM Fridays" concept Vincent mentioned highlights concerns about over-dependence.
  • Security implications - From data leakage to potential "rogue agents" making unexpected changes.
  • Hallucinations in critical systems - When an LLM's confident mistake could break production.

Looking Forward

The conversation concluded with exciting possibilities for AI-augmented DevOps tools that could make infrastructure creation more accessible to developers. Vincent is exploring jsii manifest integration with LLMs to streamline AWS CDK usage.
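
If "jsii manifest" is unfamiliar: jsii is the interop layer underneath AWS CDK, and every jsii package ships a machine-readable assembly describing its full API. Whether or not this is exactly what that integration will look like, here's a rough TypeScript sketch of the idea, assuming an uncompressed `.jsii` assembly at the package root:

```typescript
import { readFileSync } from "node:fs";

// The jsii assembly is a JSON manifest of every exported type, member,
// and doc string. (Recent aws-cdk-lib releases gzip the assembly, which
// would need an extra decompression step.)
const assembly = JSON.parse(
  readFileSync("node_modules/aws-cdk-lib/.jsii", "utf8")
);

// Extract the documented API surface for one construct and hand it to
// an LLM as grounding context instead of letting it guess property names.
const fqn = "aws-cdk-lib.aws_s3.Bucket";
const type = assembly.types?.[fqn];
if (type) {
  console.log(type.docs?.summary);
  for (const prop of type.properties ?? []) {
    console.log(`- ${prop.name}: ${prop.docs?.summary ?? "(no docs)"}`);
  }
}
```

Fed into the prompt, this kind of grounding attacks the hallucination problem at its source: the model no longer has to guess what a construct's API looks like.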

Final Thoughts

This discussion emphasised that we're in a rapidly evolving landscape: what works today might be obsolete in months. The most valuable approach seems to be a balanced integration where AI handles the tedious parts while humans maintain understanding, oversight, and decision-making authority.

For DevOps practitioners exploring AI integration, the key takeaway is clear: AI tools should complement your expertise, not replace it. Start with small, controlled experiments, implement proper governance, and always maintain the ability to work without AI assistance when needed.

🎙️Tune in Now

Catch the full episode on Spotify, Apple Podcasts, or wherever you get your podcasts.

Curious how AI-powered orchestration actually works in practice?

See how teams are using Facets to bring guardrails, context, and automation into their DevOps workflows. 👉 Book a Demo and explore what AI-ready infrastructure really looks like.