What Production Teams Should Know About AI Clauses in 2026
If you've been on Instagram this week, you've probably seen a post from @aphotoeditor circulating through the photography community. It recommends that photographers add a specific clause to their contracts prohibiting clients from using delivered work for AI training, generation, or manipulation — in perpetuity. Nearly 4,000 likes and over 1,400 saves. That's not engagement noise. That's working photographers bookmarking language they plan to use.
The clause itself isn't new. The Artists Management Association has been circulating protective language for months, including a short version — "Deliverables not approved for AI use and/or AI training" — and a more detailed variant that restricts machine learning applications, metadata harvesting, and biometric identification technologies. What's changed is the velocity. This language is no longer a suggestion from industry lawyers. It's becoming a default expectation from the creative professionals production teams hire every week.
This matters to anyone who books photographers, DPs, editors, or post-production vendors. Here's what you need to know.
What These Clauses Typically Cover
AI contract clauses in the photography and production space generally address three areas:
Training restrictions. The core provision: delivered assets — images, video, audio, and associated metadata — cannot be used to train, fine-tune, or improve machine learning or AI models. This applies to the client, the client's partners, and any downstream licensees. The Artists Management Association's recommended language puts it plainly: "Unless explicitly authorized, licensee may not use the asset(s) including any caption information, keywords, or other metadata associated with content for any machine learning and/or artificial intelligence purposes."
Usage limitations on AI manipulation. Beyond training, many clauses restrict using AI tools to alter, extend, or generate derivative works from the delivered assets. This covers everything from AI-powered facial retouching to generative fill to synthetic scene extension. If a photographer delivers a portrait and the client runs it through an AI upscaler or background replacement tool, that may now violate the contract.
Perpetuity and scope. These restrictions typically apply for the duration of the license — and in some cases, in perpetuity. They also tend to cover the full chain of custody: the hiring client, any agency or platform that receives the assets, and any third party that accesses them through sublicensing.
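To make the three areas above concrete, here's a minimal sketch of how a production team might model these terms internally when tracking what each delivered asset permits. The field names and structure are my own illustration, not standard contract data or AMA language; the safe defaults mirror the restrictive posture these clauses take.

```python
from dataclasses import dataclass

@dataclass
class LicenseTerms:
    """Illustrative (not standard) record of AI-related license terms."""
    ai_training_allowed: bool = False      # training restrictions
    ai_manipulation_allowed: bool = False  # AI retouch, generative fill, etc.
    perpetual: bool = True                 # restrictions survive the license term
    covers_sublicensees: bool = True       # full chain of custody

def permitted(terms: LicenseTerms, use: str, is_sublicensee: bool = False) -> bool:
    """Check a proposed use ('train' or 'manipulate') against the terms."""
    if is_sublicensee and not terms.covers_sublicensees:
        return True  # restrictions don't flow downstream (rare under these clauses)
    if use == "train":
        return terms.ai_training_allowed
    if use == "manipulate":
        return terms.ai_manipulation_allowed
    raise ValueError(f"unknown use: {use}")

# Default terms reflect the restrictive clause: nothing AI-related is allowed,
# for the client and sublicensees alike.
terms = LicenseTerms()
print(permitted(terms, "train"))                        # False
print(permitted(terms, "manipulate"))                   # False
print(permitted(terms, "train", is_sublicensee=True))   # False
```

The point of the sketch: under these clauses the permissive path is the exception you must be able to prove, not the default you can assume.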
The Broader Industry Context
This isn't happening in isolation. Several parallel developments are shaping the landscape.
The Paris Charter on AI and Journalism, backed by Reporters Without Borders and chaired by Nobel laureate Maria Ressa, established ethical guidelines for AI use in newsrooms. The WGA's 2023 agreement introduced disclosure rules for AI-generated writing. SAG-AFTRA's Digital Replica Guidelines now require consent and compensation for synthetic performances.
In December 2025, over 500 freelance photographers signed a collective letter opposing the Wall Street Journal's new freelance contract — citing concerns that Dow Jones's deals with OpenAI and Meta, combined with a work-for-hire structure, could permit AI licensing of their photographs without consent or compensation.
The California AI Transparency Act (SB 942), effective January 2026, adds regulatory weight on the platform side. The direction is clear. The specifics are still being negotiated.
Why This Matters to Production Teams
You don't have to be a photographer to be affected by these clauses. If you're a producer, line producer, or production manager booking creative talent, these contract terms flow through your deliverables chain.
Post-production workflows. If your editor uses AI-assisted tools for color correction, noise reduction, or background extension on assets delivered under an AI-restrictive contract, that workflow may now be a contract violation. The line between "traditional editing tool" and "AI-powered tool" is blurred — and these clauses don't always distinguish between them.
Client deliverables. If you're delivering final assets to a brand client who feeds them into an AI content pipeline, the photographer's contract may prohibit it. The liability question — who's responsible when a downstream party violates the clause — is one most production agreements don't yet address.
Sublicensing and archival. If your contract with a photographer doesn't explicitly address AI rights, the default may not be what you think. Archiving assets in a system that is later used for AI training could put you in breach long after delivery — the violation occurs when the training happens, not when the asset was filed away.
Crew expectations. Beyond photographers, DPs, editors, and other creative crew are watching closely. Expect AI clauses in a wider range of production agreements over the next 12–18 months.
What to Do Now
This isn't about taking a side on whether AI training on creative work is appropriate. It's about knowing what's in your contracts and understanding what you're agreeing to — on both sides of the table.
Review your existing contracts. Check whether your standard agreements with photographers, videographers, and post-production vendors address AI usage. If they don't, that ambiguity is a risk for both parties.
Talk to your legal counsel. AI clause language is evolving fast. Boilerplate from 2024 may not cover the scenarios that matter in 2026. Get specific advice for your production's structure and deliverables chain.
Understand what your post-production tools are doing. If your editing suite, DAM system, or delivery platform uses AI features — even passively — know what that means under the contracts you've signed.
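One way to act on this advice is to track restrictions alongside the assets themselves and check them before anything enters an AI-enabled tool. The sketch below assumes a hypothetical `*.license.json` sidecar file next to each delivered asset — the sidecar format is my own invention, standing in for whatever your DAM or delivery workflow can actually attach — and treats any asset without known terms as restricted, which is the safe default under these clauses.

```python
import json
from pathlib import Path

def audit_for_ai_use(asset_dir: str) -> list[str]:
    """Return names of assets that must NOT enter an AI-enabled pipeline.

    Assumes each delivered asset may have a hypothetical sidecar manifest,
    '<filename>.license.json', containing an 'ai_use_allowed' flag.
    Assets with no sidecar are treated as restricted (safe default).
    """
    flagged = []
    for asset in sorted(Path(asset_dir).iterdir()):
        if asset.name.endswith(".license.json"):
            continue  # skip the sidecars themselves
        sidecar = asset.parent / (asset.name + ".license.json")
        if not sidecar.exists():
            flagged.append(asset.name)  # unknown terms: assume restricted
            continue
        terms = json.loads(sidecar.read_text())
        if not terms.get("ai_use_allowed", False):
            flagged.append(asset.name)
    return flagged
```

In practice the same check could run as a pre-flight step before an upscaler, generative-fill pass, or AI-indexed archive ingest — the specific tooling varies, but the gate belongs before the asset reaches the tool, not after.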
Have the conversation with your clients. If you're delivering assets to a brand or agency, clarify how those assets will be used downstream. An AI restriction in your photographer's contract doesn't disappear because the asset changed hands.
The contract language is catching up to the technology. Production teams that understand both will be better positioned — regardless of where the industry lands.