March 03, 2026

AI That Actually Pays: Five Takeaways from London Live

AI is everywhere. Profitable AI is not.

That tension set the tone for this edition of London Live. A deliberate step away from model releases and product demos, towards a blunter question: where is AI actually making money, and what survives once the hype moves on?

The evening opened with a fireside conversation between Adrian Love (General Partner, Love Ventures) and Dan Cobley (ex-Google, Founder of ClearScore and Salary Finance), followed by a founder panel hosted by Marcus Love (General Partner, Love Ventures) with Natasha Jones (Metris Energy), Ghali Bennani Laafiret (Ralio), and Pac O'Shea (Round).

Here are the five things that stayed with us.

1. Define "good" before you go live. Not after.

Dan's ClearScore story was a good one.

The day FCA approval came through in 2015, the team went live on national TV. Same day. But before any of that, before spending a pound on advertising, they'd already agreed what the numbers had to look like. Clear ranges: dead, fixable, working. If it landed in the wrong bracket, they'd stop. No fudging it afterwards.

That kind of pre-commitment is rarer than it should be, especially now, when "we're experimenting with AI" has become a way of avoiding accountability entirely.

Pick the metric. Set the threshold. Decide in advance what failure looks like. Otherwise you're not running a test, you're running a story.

2. The real prize isn't automation. It's the whole workflow.

There's a version of AI adoption that looks impressive and doesn't change much. And there's a version that restructures how a business actually operates.

Natasha Jones, CEO of Metris Energy, was precise about which one she's building. The metric isn't "faults detected by AI." It's faults fully resolved: detection, context, maintenance scheduling, final report, end to end, without a human in the loop. That's a different thing entirely.

Most companies can shave 10-20% off a role. The ones changing their economics are driving 80-90% of a workflow. That's where pricing power shifts. That's where the contract conversations change.

3. It's not a tooling problem. It's a people problem.

Selling AI into real organisations is slower than the pitch decks suggest, and the reason isn't usually technical.

First you're dealing with trust: can this system actually make calls that matter? Then, once you've cleared that, you hit something people are less comfortable saying out loud: if this works, what happens to my team?

The founders on the panel were honest about it. The easiest entry points tend to be growth moments (an acquisition, a new service line) where AI enables scale before headcount pressure becomes the conversation. Or workflows that are genuinely painful and that nobody's properly solved: out-of-hours cover, for instance.

Slow adoption isn't scepticism about the technology. It's people being clear-eyed about what it means.

4. The moat isn't the model.

If everyone's building on the same foundation models, where does differentiation actually come from?

The panel had a clear answer: data, embedding, and execution.

Proprietary data compounds. If your system is getting smarter from millions of domain-specific data points every day, an off-the-shelf model can't catch up by next quarter. Deep workflow integration raises switching costs from inconvenient to structural. And two teams with identical tools will still produce very different results, because judgment, speed and customer proximity still matter.

Dan's filter for founders: are they excited when models improve, or nervous? If the business only works because GPT-4 is bad at something, that's not a moat. It's a countdown.

5. Fast revenue and durable revenue are not the same thing.

AI has produced some genuinely astonishing growth curves. But Dan was direct: fast-growing, low-commitment revenue and multi-year embedded enterprise contracts are different assets. One can unwind as fast as it built.

The founders are already navigating this in how they price: usage-based models, hybrid SaaS bundles, workflow pricing rather than cost-savings arbitrage. And underneath it all, the economics are shifting in ways worth watching. Lower build costs don't mean lower operating costs. Token spend, inference, model costs: these become real line items. The maths evolves.

Revenue that survives switching is worth more than revenue that grows fast. That's the distinction that matters.

The bottom line.

The hype will pass. The operational reality won't.

What was clear across the whole evening: AI is delivering genuine value, but only where it's embedded deeply, built on real data, and tied to outcomes the business actually cares about.

The next few years won't go to whoever talks about AI best. They'll go to whoever figures out how to run it.

If you weren’t able to join us this time, we hope to see you at the next edition of London Live in September 2026.