
When we use GenAI – pragmatism not prescription

Let’s cut to the chase. We don’t want to reheat old arguments about whether Generative AI should be used as a creative tool. You’ve read a tonne of stuff on this and so have we. Most studios are onside, on a sliding scale. Like all industries, there’s anxiety about using the tech in our creative work. There are some outliers who sit in the ‘never touch the stuff’ camp and some who have gone ‘all-in’. For most studios, and that includes us, flags have been put in the middle ground.

Like most agencies over the last few years, we’ve been playing with, exploring and learning the tech to see what value we can get from it, and how it helps us work better and smarter.

We’re using AI as a low-flying creative partner. It’s a tool. That’s it. It’s not a replacement for our human creativity, craft, empathy, imagination or experience. Used well, it can support efficiency, exploration and iteration, but it doesn’t replace decision-making, accountability or creative intent. Those remain firmly human, and being transparent about that helps build trust and sets clear expectations on both sides.

We’ve done the hard yards. AI will never know more than we do about the brands we work with, how we make them impossible to ignore, and how we build them through connection, culture and community. Client relationships can’t be automated either; that’s down to us. Ultimately, it’s about knowing when to use GenAI, and making sure clients are aware of, and excited by, the potential.

A great leveller

For some projects, we use AI to help our clients level up. We create the type of visual assets that used to mean deep pockets and hours of design time. We also use it to test ideas, storyboard, iterate, push boundaries and create work that would otherwise be impossible. It helps us offer clients faster turnaround times and lower costs.

How much time do we save? We’re heading into ‘a piece of string’ territory here. It depends on the task at hand. Some things AI nails straight away – other times it can be a case of ‘prompt and pray’ even with the most explicit and carefully crafted prompt.

Two new tools that we’re most excited about trying right now are Obello and Fossa Tether. These offer creatives more control over GenAI. Obello is a brand-focused content tool, and Tether works with After Effects to give motion designers control over motion behaviours, movement and timing.

For image generation we love the ones built for creatives like Flora and Weavy. These are node-based tools that help with the creative workflow and integrate most AI models. (More on other tools in our AI kit later).

First, and before we get onto a couple of project examples from our work, let’s take a deep breath.

A reality check

Everything here comes with a huge dose of pragmatism about when to use AI, capabilities right now, and the guardrails that need to be in place. This isn’t a cure-all prescription. None of us have all the answers. Design teams, clients and our future creatives are anxious. Angela recently gave a talk to a roomful of undergraduates at Salford University. They’re as nervous as you are about jobs and the risks from poor-quality creative.

During the Q&A session AI inevitably came up. Students shared real concerns that their jobs could disappear before they’ve even graduated. They were also interested in how clients responded to AI. Some of our clients are fully on board with AI as part of the creative process, while others are more cautious and still finding their comfort level.

Ultimately, it’s important for us to have open discussions about AI with them. Being clear about where and how we intend to use it, what role it plays in the process, and where human judgement remains essential. And if a client would rather not use it, we respect that and move on.

What do brands think?

Brands are nervous about how creative and media agencies are using AI on their behalf. The World Federation of Advertisers reports that an eye-watering 80% of brand owners have concerns about legal, ethical and reputational risks. And yet usage is still rising.

GenAI can help you create authentic, unforgettable visual identities. Brand identity is the bit you can control – visuals, messaging, values etc. Identity is the image you want to project. Then you lose control. Your actual image is how customers perceive the brand after interacting with that identity. Low-quality creative rattles trust and influences perception. Reputations and brand equity take a hit. Our digital lives are being flooded with low-quality content. How many bizarre cat videos have you seen? Customers circle around brands and content creators they trust.

AI-generated creative needs human oversight and editing. We know what good work looks like. The quality of the prompts and brand and audience insights you use are everything. You’ll need to do your homework on who owns what and get clear on fair use (there’s a good primer from It’s Nice That here).

One thing’s for sure, someone has to set the direction for AI – the Creative Director or lead. The person who makes the decision, understands cultural nuances and applies judgement, taste and intent.

Now we’ve got that out of the way, here are a couple of examples of when we use it.

Creating distinctive brand assets

Investing Insiders

Investing Insiders is an affiliate website that compares investing platforms for everyday investors. They recently changed how they rated platforms: alongside overall ratings (Best ISA / Best SIPP etc), they started rating them for each investor type. For example, the best platform for ‘Cost Cutters’ is one with low fees, while ‘Portfolio Pros’ get the one with pro tools. The review site offers a more personalised experience based on investor personas. We brought those personas to life by creating 12 different characters using GenAI. See how they’re used here.

They are used in both marketing and website user journeys to guide specific investor types to the perfect platform for them:

User journey on the Investing Insiders website.

Difference is quick to replicate. Distinction gives brands an edge. Distinctive brand assets like characters, logos, shapes, patterns, colours, sounds and mascots are identifiers that make a brand unforgettable and the first that comes to mind when people are ready to buy. In other words, the brand ends up taking up more head space in customers’ minds than competitors (the scientific name for this is building brand salience and enhancing mental availability).

But, there’s some tough medicine here. Ipsos and Jones Knowles Ritchie pored over 5,000 brand assets and found that just 15% were truly distinctive in the eyes of consumers. The researchers estimated that $4.7 trillion is spent every year on marketing and 85% is being spent on brand assets that get ignored. Ouch. Nobody wants to spend time and money on work that goes unnoticed.

Company Shop

Company Shop is an ethical surplus supermarket that offers membership to customers on means-tested benefits. It also offers membership to key workers and registered charities. They have 13 stores across the UK. Businesses like this that are built to bring about the biggest change often have the smallest budgets. We used AI to 3D-rotate the brand’s fruit and veg assets before bringing them into After Effects to create scenes and animate further.

For video, Kling has been our go-to for a while now. It’s great value.

Storyboarding animations

AI can be great for quickly iterating on storyboards using rough / hand-drawn styles before illustrating characters.

We illustrated a character style for the animation, fed this into Flora, and instructed it to create black-and-white draft scenes for the storyboard using the existing characters, so we could communicate the visual narrative to the client before illustrating the final scenes.

Flora AI image-generation workflow: feeding in original characters to generate rough storyboard scenes.

Generating photos

It’s no surprise that we often use AI to generate high-res imagery. It’s frequently more efficient than searching stock libraries, especially when client needs are highly specific or conceptual. All the usual caveats on cohesiveness, readability, creativity, representation and diversity apply. Again, the quality of the text prompt and insights you put in determines the quality of the image you get out. Having a solid understanding of scene framing, camera choices, lenses, and shooting techniques – alongside colour theory, storytelling and crucially the brand – all feed into stronger prompts.

Knowing when something should feel wide and cinematic, intimate and close, or slightly imperfect and human comes from real visual experience. When you understand how images are actually made, you can be far more intentional about the outcome, whatever tools you’re using. As with all things, the tools you use depend on what you’re trying to achieve. As AI quickly evolves, the right model for the job is constantly shifting. For us? Right now we’re loving Seedream for character consistency.

Other tools in our workflow

Currently we’re in a three-way LLM relationship with Claude, Manus and ChatGPT. We’d love to commit to just one, but each has distinct strengths and weaknesses, and this relentless AI arms race means today’s winner could be tomorrow’s runner-up, so monogamy feels impractical. Instead of choosing, we’re embracing our commitment issues.

We’ve also been experimenting in Replit, creating a new in-house product that we may monetise if it works well, and our Start Your Project form was vibe-coded in Lovable.

A final thought…

As we said at the top, GenAI is a low-flying creative partner for us. On some projects, it cuts out the guesswork and helps us come up with ideas, iterate and create high-quality visuals that used to come with a hefty budget and a big time commitment. It helps us push our creativity and experiment. We wanted to share just a few examples of when it works for us. It’s not replacing our craft.

How can we help?

If you’re reading this and curious about how GenAI could work for you, let’s schedule some time together. Connect with us here.