Terms of Service — Pre-Production Begins
Category: Film, Sci-Fi, Psychological Horror, Production Diary, AI & Robotics
Idea Behind the Project
My fascination with humanoid robotics — and gynoids in particular — didn’t appear overnight. It’s been quietly accumulating for years, long before I coined MetroForm, and even longer before Terms of Service had a name.
One of the earliest and most formative influences was Ghost in the Shell 2: Innocence. Not just for its visuals, but for its tone: the cold elegance, the philosophical weight, the way synthetic bodies were treated not as novelties, but as mirrors. Objects designed for intimacy, obedience, and projection — carrying more meaning than their creators ever intended.
Later came the slow-burn discomfort of Black Mirror. Not its spectacle, but its cruelty-by-design. Systems that don’t malfunction — they work exactly as intended. The horror doesn’t come from broken code, but from the way the tech is used.
At some point, those influences condensed into a set of questions:
- What happens when intimacy is productized?
- When affection has a retention curve?
- When consent, attachment, and memory are configurable features?
Terms of Service exists to explore that space — not through one long narrative, but through short, self-contained case files. Incidents. Reports. Internal reviews. Stories told after something has already gone wrong.
Shaping the First Season
The first season of Terms of Service is structured as 10 short episodes, each a few minutes long. This wasn’t a creative limitation — it was a practical one. I’m a one-man army on this project. Writing, directing, editing, compositing, sound design, VFX, motion graphics, concept art — all of it. That reality forces discipline.
From the very beginning, every script had to answer two questions:
- What is the strongest way to tell this story?
- What is realistically achievable without breaking immersion?
Some stories demand movement, pursuit, chaos. Others become stronger when nothing moves at all. As a result, the season deliberately shifts format between episodes: from VO-driven narration, through complex POVs/HUDs, to security cameras and classic "found footage". The variety isn’t stylistic indulgence — it’s camouflage. Each format helps sell the illusion that these are found materials, not authored scenes.
Unified Look
One of the earliest challenges in pre-production was visual cohesion. Pure AI-generated footage often looks too clean — movements are too smooth, lighting too ideal, faces too perfect. Ironically, that polish immediately breaks the illusion of realism. In the real world, images degrade. They compress, stutter, desync, and fail in small, ugly ways. Realism, it turns out, requires damage.
To counter this, the visual approach deliberately mixes sources and treatments. AI-generated material is degraded until it feels recorded rather than rendered, while real-world footage is constrained and processed to sit comfortably alongside synthetic images. Motion graphics and HUD elements aren’t there to impress — they exist to obscure seams, add latency, and introduce uncertainty. A pristine image feels suspicious. A compromised one feels like evidence.
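The "degrade until it feels recorded" idea can be reduced to a couple of mechanical operations: crush the tonal range so compression banding appears, add grain, and throw away frames to fake stutter. As a rough illustration only (the names and parameters below are mine, not from any actual pipeline), a minimal sketch on a flat list of 0-255 luma values might look like this:

```python
import random

def degrade_frame(pixels, levels=16, noise=8, seed=0):
    """Cheapen a single frame: quantize tones (compression banding)
    and add grain. `pixels` is a flat list of 0-255 luma values;
    everything here is illustrative, not a real production tool."""
    rng = random.Random(seed)
    step = 256 // levels
    out = []
    for p in pixels:
        q = (p // step) * step              # banding from aggressive quantization
        n = q + rng.randint(-noise, noise)  # sensor / codec grain
        out.append(max(0, min(255, n)))     # clamp back to legal range
    return out

def drop_frames(frames, keep_every=2):
    """Simulate stutter and desync by discarding frames."""
    return frames[::keep_every]
```

In practice this kind of damage would be done in a compositing tool or with re-encoding passes rather than per-pixel Python, but the principle is the same: the artifacts are added deliberately, and tuned, rather than left to chance.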
Breaking Down the Episodes
At this stage, pre-production revolves around breaking down each episode’s script with a very practical lens: what truly needs to be shown, and what works better when left incomplete. Some moments gain strength through movement and escalation, but many become far more unsettling when reduced to fragments — a fixed angle, an obstructed view, or a system interface calmly logging something that clearly shouldn’t be happening.
Scenes that would traditionally require complex choreography or multiple performers are often reimagined as partial records: a locked camera that never cuts away, a reflection caught in glass, an audio log playing over a static frame, or a still image held just long enough to become uncomfortable. The goal isn’t minimalism for its own sake, but selective realism — showing only what would plausibly exist as surviving footage, long after the event itself is over.
Building a Believable Near Future
Alongside script breakdowns, I’m deep into concept development for characters, locations, and props across the entire season, mixing traditional "old school" techniques (Photoshop and, lo and behold, actual pens and markers! 🤯) with 3D and AI.
Mechanical designs — vehicles, drones, equipment, interfaces — are handled in 3D wherever possible. Not for spectacle, but for consistency. When something appears multiple times across episodes, it needs to feel like it belongs to the same manufacturer, the same design language, the same procurement catalog.
With over 15 years of experience as a concept artist and art director (samples can be found here), my focus isn’t on predicting the far future. It’s on designing a near one — ten to twenty years ahead — where everything feels slightly too familiar.
So, here we go...
This Lab series will continue documenting the project milestone by milestone. Terms of Service isn’t just a story about artificial companions. It’s a record of how easily systems designed to “help” can slide into quiet exploitation, when nobody involved technically breaks the rules.
More soon.
— Adam