
Workforce Engineering from the other side of the table

I’ve spent the last ten years building CRM strategies, loyalty programmes and engagement campaigns for brands like Halfords, Prezzo and the Volkswagen Group. In that time, I’ve learned a lot about how to measure what customers do, how to predict what they’ll do next, and how to put the right message in front of the right person at the right time.

What I’ve never been able to measure, with anything close to that precision, is how my own team operates.

I recently read Will Hackett’s piece on Workforce Engineering and had a genuine moment of recognition. Not because I’d heard the term before — I hadn’t — but because he was describing a problem I’ve been living with for my entire career. The only difference is he was writing from the CTO’s chair. I’m writing from the marketing one.

The problems are identical. The blind spots are identical. And the gap between how we measure our output and how we measure our effort is just as wide on this side of the building.

We can measure everything except ourselves

In marketing, we are obsessed with metrics. Open rates. Click-through rates. Conversion rates. Revenue per email. Lifetime value. We build personalisation models that score customers on future value and category affinity. We run Next Best Action engines that decide in real time which message to send, on which channel, at which moment.

We have more data on what our customers had for breakfast than we do on how our own team spent last Tuesday.

If someone asks me what a campaign produced, I can answer within the hour. If someone asks me what that campaign cost to produce — the team hours, the creative handoffs, the stakeholder reviews, the three rounds of re-scoping that happened because a priority shifted mid-sprint — I’m guessing. Everyone is guessing. And in most organisations, nobody even asks the question.

At Utility Warehouse, we use Asana to try to forecast the team’s capacity. In theory, we estimate how long projects will take and plan the week accordingly. In practice, people end up logging time after the work is done, not before. And forward planning always makes it look like we have capacity — because the unplanned requests, the stakeholder fire drills, the “quick thing” that takes three days, none of it appears in the plan until it’s already consumed someone’s week.

We’re not unusual. Every marketing team I’ve worked in or alongside has this exact problem. The tools exist to measure what we produce. Nothing measures how we produce it.

AI changed the game and nobody noticed

Here’s a thing that happened to me recently. I started using Gemini to write campaign copy. Not all of it — I still write the strategic briefs and the high-stakes pieces myself — but for the volume work, the variants, the multi-audience segmentation copy, it’s become indispensable. I feed it the brand guidelines, the audience context, the tone of voice framework, and it produces copy that’s good enough to ship, fast enough that I can turn around requests that used to take days.

In practice, what this means is I’ve removed a bottleneck. Our creative team is at max capacity — they’re working on higher-profile briefs, brand campaigns, work that genuinely needs their expertise. Previously, my additional copy variants and campaign iterations would sit in a queue behind that work, waiting for capacity that didn’t exist. Now I can get those campaigns over the line myself, with AI supporting the copy, keeping brand tone of voice intact while I focus on the strategy and targeting.

Nobody in my organisation has tracked this shift. No one has measured it. The creative team isn’t doing less work — they’re doing the right work, because I’m no longer competing for their time on things I can now handle myself. I’m producing more campaigns, faster, with more targeted copy. But this transformation is completely invisible at an organisational level.

If you asked my finance team how much we spend on AI marketing tools, I doubt they’d know. I certainly don’t — I have zero visibility on the budget. And that’s the strange part: I’m deploying AI as a genuine productivity multiplier and I couldn’t tell you whether we’re under or over budget on it.

Hackett makes the point that AI spend is a form of labour. I felt that. AI isn’t a software subscription sitting quietly in the IT budget. It’s doing work. It’s replacing handoffs. It’s changing who does what. But because we’re not measuring it that way, the transformation is happening in the dark.

It’s not all success, either. I ran a test using AI to generate copy for a multi-audience campaign — different segments, different value propositions, different calls to action. I gave it extensive context: brand guidelines, audience profiles, the works. It completely missed the mark. The copy wasn’t relevant, the tone was wrong, the intended action didn’t land. We scrapped it and wrote it from scratch. AI works brilliantly for some things. For others, it produces a convincing output that actually makes your campaign worse. The problem is knowing which is which, and right now, the only way to find out is to ship it and measure the damage.

The Halfords lesson

The biggest loyalty programme I ever launched was the Halfords Motoring Club. It was a major piece of work — personalised journeys, advanced segmentation, tier structures, the full architecture. It ended up driving 25% new customer acquisition and doubling click-through rates, and premium members visited 32% more frequently than non-members. By any output metric, it was a success.

But inside the team, the picture was different. The complexity of the project was massive, and the further we got into it, the more we realised how much we’d underestimated the build. Certain tasks that should have been handled by senior people ended up on juniors’ desks, because that’s who was available. There was stress. There was overwork. Mid-project, we should have rescoped — but we didn’t have the visibility to see it clearly enough to make that call with confidence.

If I’d had a system that showed me, in real time, where every hour of effort was going across that launch — the actual allocation, not the planned allocation — I would have made different decisions. I’d have moved people around. I’d have scaled back certain workstreams. I’d have caught the pressure building before it became a problem.

We delivered a brilliant programme. We also burned people out delivering it. Those two things shouldn’t have to coexist, and with the right visibility, they wouldn’t.

Agency billing proved the model works

Before Utility Warehouse, I spent four years at Plinc, a CRM agency. At Plinc, we billed clients down to 15-minute increments. Every task, every email, every meeting — tracked, coded, attributed to a client and a project.

It was painful. It was probably error-prone in places — it’s nearly impossible for people to accurately track everything they do at 15-minute resolution. But it gave you something extraordinarily valuable: you could see when an account was overburning. You could connect effort to outcomes. You could have a conversation with a client grounded in data, not feelings, about what their programme actually cost to run and whether it was worth it.

When I moved in-house, that visibility disappeared overnight. The logic was reasonable: in-house employees are 100% dedicated to one company, so why would you need to track their time? But the question isn’t about utilisation — it’s about allocation. Where is effort going? Is it going to the right things? Could we get a better result if we redistributed it?

At UW, I have approximate answers to those questions. At Plinc, I had precise ones. Neither extreme is ideal — agency-style 15-minute timesheets are overhead-heavy and demoralising, but the in-house approach of “everyone’s busy, trust the process” is a visibility black hole.

What I actually want is something in between. Not a timesheet. A system that shows, at the project level, where effort is going, what it’s producing, and whether the allocation makes sense. Not to micromanage anyone — to make better decisions about scope, staffing and priority.

The dashboard I wish I had

If someone put a dashboard in front of me tomorrow showing exactly where every hour of my team’s time went, what it cost, and what it produced, here’s what I’d do.

First, I’d look at the low-impact projects that are absorbing disproportionate time. Not to kill them — some low-impact work is still important — but to assess whether we could rescope them. Scale them down. Make them less of a drain on resource so they still get delivered, but don’t crowd out higher-impact work.

Then I’d look at the high-impact projects that are under-resourced. The campaigns we know will move the needle but that keep getting deprioritised because something “urgent” lands from a stakeholder. Priorities shift constantly in marketing. When a high-importance request arrives mid-sprint, low-importance campaigns get deprioritised to make room. That’s fine in theory. But without visibility into the cascade effect, you can’t see that the same three campaigns have been deprioritised four weeks in a row, or that one team member has been context-switching between six workstreams while another has capacity.

The data would let me do what I already try to do — balance impact and importance — but with evidence instead of intuition.

Data is sexy on both sides

Here’s what I’ve learned from a decade in data-driven marketing: data can make you creative. It doesn’t replace creativity — it fuels it. When you know that a specific customer segment responds to a particular kind of message at a particular time on a particular channel, that’s not a constraint on creativity. It’s a springboard.

I believe the same thing applies internally. Data about how your team operates isn’t surveillance. It’s the same kind of insight you’d use to optimise a campaign, applied to your own machine. Where is effort going? What’s producing results? Where is someone stretched too thin?

The best marketing teams I’ve worked in weren’t just good at making campaigns. They were good at understanding how they made campaigns — the process, the handoffs, the bottlenecks. They treated it as a system. They iterated on the system, not just the output.

That’s not a million miles from what Hackett describes as Workforce Engineering: treating the workforce as a system that can be instrumented, measured and improved. I’ve been doing a version of this my entire career. I just never had a name for it.

Efficiency matters. But it’s a balance — between efficiency and doing what’s right for the people you’re marketing to, and for the team doing the marketing. I always want to be proud of the work I’m delivering out the door. Visibility into how the team operates doesn’t threaten that. It protects it. Because the alternative is burnout, misallocation and campaigns you shipped because you had to, not because they were ready.

The name for the thing

When I read Will Hackett’s post on Workforce Engineering, I didn’t learn something new. I recognised something I already knew but hadn’t been able to articulate. Every marketing leader I know has felt this gap — between the precision of our customer metrics and the total absence of equivalent data about our own operations.

I don’t know if marketing teams will adopt the term. The language feels engineering-native, and marketing people tend to speak a different dialect. But the discipline — measuring how effort connects to outcomes, making allocation decisions from data instead of gut feel, treating the team as a system worth optimising — that’s universal. Every function in every organisation has this problem. Marketing might just be the one where the irony is sharpest: we built an entire industry around understanding customer behaviour, and we still don’t understand our own.


George Barda is Partner Engagement Marketing Manager at Utility Warehouse and a two-time DMA Award winner. Previously he directed CRM strategy at Plinc and Merkle for brands including Halfords, Prezzo, and the Volkswagen Group. Connect on LinkedIn.