Vibe Coding: What Every VFX Artist Needs to Know About Working With AI

I recently sat in on a guest talk at SCAD with Russel Matsuo from Google, and it genuinely shifted how I think about AI in my VFX workflow. Not in a "it's going to replace us" way — more like finally understanding a tool you've been misusing this whole time.

Here's what stuck with me.

AI Won't Make You a Better Artist. But It Will Make You a Faster One.

Matsuo put it simply: you can't do better design with a computer, but you can speed up your work enormously. For VFX, that hits differently. Nobody's arguing that AI has an eye for comp, or instinctively knows when a light rig feels off, or understands why a simulation looks "wrong" even when the physics are technically correct. That judgment is yours — built from hours in Nuke, Houdini, Maya, whatever your stack is.

What AI can do is collapse the distance between your idea and a working solution. That's not nothing. That's actually huge.

It's Not "Understanding" Anything — And That Matters

Here's the thing that reframed everything for me. When you prompt an AI, it's not reading your request the way a supervisor would. It's running inference: layers of probability calculations, 400-plus billion of them, firing 50-plus times per second. It's an extraordinarily sophisticated pattern-matching engine.

It doesn't understand your pipeline. It doesn't know what renderer you're on, what your deadline looks like, or why that expression node is causing the whole tree to break. It's making a very educated guess based on the signals you give it.

This is why hallucinations happen — when the model doesn't have enough context, it fills the gap with something that sounds right but isn't. If you've ever had AI confidently give you a Python snippet that almost works, you know exactly what this feels like.

The Context Window Is the Most Important Thing Nobody Talks About

AI forgets. After a certain point in a conversation, earlier context drops off — your pipeline specs, your constraints, your specific version requirements. Gone.

For VFX work this is especially painful, because our pipelines are specific. A solution that works in Houdini 19.5 might not work in 20. A Python script written for one studio's pipeline structure might be useless in another.

The fix: actively re-feed context as you work. Treat it like briefing a new freelancer every few messages. Restate your software versions, your constraints, your goals. Don't assume it remembers the setup from ten prompts ago.
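One way to make the re-briefing habit stick is to keep your standing context in one place and prepend it to every prompt you send. This is a minimal sketch of that idea; the environment details, paths, and the `brief` helper are all hypothetical stand-ins for whatever your actual setup looks like.

```python
# Minimal sketch of "re-briefing" an AI assistant on every prompt.
# All environment details and paths below are illustrative placeholders --
# swap in your real software versions, pipeline layout, and constraints.

CONTEXT = """\
Environment: Houdini 19.5, Python 3.9, Windows 10
Pipeline: shots live under a per-show directory tree
Constraint: scripts must not modify files outside the current shot directory
Goal: batch-update ROP output paths across all shots in a sequence
"""

def brief(prompt: str, context: str = CONTEXT) -> str:
    """Prepend the standing context to a prompt, so the model never has to
    remember details that may have dropped out of its context window."""
    return f"{context}\n---\n{prompt}"

message = brief("Why does my ROP node lookup return None?")
```

The point isn't the specific helper; it's that the context block lives in one editable place, so restating your versions and constraints costs nothing per message.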

My Take: AI as a Senior TD, Not a Replacement

Here's where I land on all of this, and it's shaped how I use these tools every day.

I don't see AI as a replacement for technical knowledge. I see it as a high-velocity accelerator for learning. In VFX, there's always something you don't know yet — a new pipeline tool, an unfamiliar expression language, a rendering workflow you've never touched. Traditionally, getting up to speed meant hours hunting through documentation, forum posts, and outdated tutorials.

Now I treat AI like a Senior Technical Director sitting next to me — one who can take me from "concept" to "execution" immediately. I still need to know enough to ask the right questions, recognize when something's off, and push back when the answer doesn't make sense. The technical foundation is still mine to build. But the gap between understanding something and being able to use it just got a lot smaller.

That's the shift. Not replacement — acceleration.

Five Things That Actually Make AI Work Better in a VFX Context

Matsuo shared practical tips that translate directly to production work:

1. Structure your research first, then use it as context. Before you start asking for solutions, do a deep dive on how the problem should be approached — whether that's a rigging system, a pipeline script, or a compositing workflow. Feed that structure in as context. You're giving AI a map before asking it to navigate your specific terrain.

2. Send errors directly back to it. Don't paraphrase a broken node or a failed script. Copy the exact error message and paste it. Precision in equals precision out.

3. Make it explain itself. Ask AI to add comments and print statements until it genuinely understands what it's generating. For anything touching your pipeline, you want to see the logic — not just the output. This catches drift early, before it becomes a production problem.

4. Reinforce your rules, but also teach the why. If you're working within specific pipeline constraints, don't just repeat them — explain the reasoning. AI responds better to context than to commands.

5. Be positive. This one sounds strange, but Matsuo was serious about it. The framing and tone of your prompts actually affects how the AI approaches problem-solving — what it thinks it can accomplish, how persistently it tries. Encouraging prompts genuinely get different results than terse ones.
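Tip 2 is easy to act on programmatically. When a script fails in a DCC's Python console, `traceback.format_exc()` from the standard library captures the exact error text, so you can paste it back verbatim instead of paraphrasing. A minimal sketch, with a deliberately failing placeholder function standing in for a real pipeline call:

```python
import traceback

def run_and_capture(fn, *args, **kwargs):
    """Run a pipeline function; on failure, return the exact traceback
    text to paste back to the AI instead of a paraphrase."""
    try:
        return fn(*args, **kwargs), None
    except Exception:
        return None, traceback.format_exc()

# Hypothetical failing pipeline call, purely for illustration:
def load_shot(path):
    raise FileNotFoundError(f"No shot data at {path}")

result, error = run_and_capture(load_shot, "/proj/demo/sh010")
if error:
    print(error)  # paste this text, unedited, into your next prompt
```

The traceback carries the exception type, the file and line number, and the exact message, which is precisely the signal the model needs and precisely what paraphrasing tends to lose.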

The Bottom Line

VFX is already one of the most technically demanding creative fields out there. The artists who are going to get the most out of AI aren't the ones handing it the wheel — they're the ones who understand how it actually works, and use that to move faster without losing control.

The creative eye, the technical judgment, the pipeline intuition — that's still yours. AI just means you spend less time stuck, and more time making.
