Note 051 · April 2026 · 9 min

The prompt is the product, until it isn't

For the first generation of LLM applications, the differentiator was the prompt. A clever instruction template was the entire moat. This was true for about eighteen months. It is no longer true, and most teams who built their product on prompt cleverness are now discovering it.

What replaced it is harder to name and harder to copy: the surrounding apparatus. The retrieval pipeline that gives the model documents it didn't have. The memory system that gives it continuity between sessions. The correction loop that gives it the ability to learn from its mistakes within the lifetime of a deployment. The interface that makes the answer auditable. The refusal layer that knows when not to answer at all.
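
Each of those pieces is concrete enough to sketch. Here is a minimal, hypothetical sketch of the apparatus around a prompt: the template is one short string, while retrieval, memory, the correction loop, and the refusal layer do the real work. Every name in it (Apparatus, call_model, the keyword-overlap retrieval) is an illustrative stand-in, not a real library or anyone's production design; a real system would back these with a vector store, a database, and a feedback queue rather than in-memory dicts and lists.

```python
from dataclasses import dataclass, field


def call_model(prompt: str) -> str:
    # Placeholder for the actual LLM API call; echoes the prompt so the
    # example runs without network access.
    return f"[model answer grounded in]\n{prompt}"


@dataclass
class Apparatus:
    # The prompt: the cheapest, most copyable part of the stack.
    template: str = "Answer using the context below.\n{context}\n\nQ: {question}"
    # Retrieval corpus: documents the model didn't have.
    corpus: dict[str, str] = field(default_factory=dict)
    # Memory: continuity between sessions.
    history: list[str] = field(default_factory=list)
    # Correction loop: human-verified fixes accumulated during deployment.
    corrections: dict[str, str] = field(default_factory=dict)

    def retrieve(self, question: str) -> str:
        # Crudest possible retrieval: keyword overlap. Stands in for a
        # vector search over an indexed corpus.
        hits = [text for key, text in self.corpus.items() if key in question.lower()]
        return "\n".join(hits)

    def refuse(self, context: str) -> bool:
        # Refusal layer: knows when not to answer at all. Here, refuse
        # whenever retrieval found nothing to ground the answer on.
        return not context.strip()

    def answer(self, question: str) -> str:
        # The correction loop takes precedence: a known fix beats a
        # fresh model call.
        if question in self.corrections:
            return self.corrections[question]
        context = self.retrieve(question)
        if self.refuse(context):
            return "I don't have grounded material for that."
        prompt = self.template.format(context=context, question=question)
        self.history.append(question)  # memory: carried into future sessions
        return call_model(prompt)


if __name__ == "__main__":
    tool = Apparatus(corpus={"billing": "Invoices go out on the 1st of each month."})
    print(tool.answer("How does billing work?"))      # grounded answer
    print(tool.answer("What is our CEO's salary?"))   # refusal: nothing retrieved
```

Notice where the leverage sits: swapping the template string changes almost nothing, while swapping out retrieve, corrections, or refuse changes what the tool can actually do.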

None of these are the prompt. The prompt is now the cheapest, most copyable, least defensible part of an AI tool. The work has moved to everything around it. Teams who haven't recognised this are still polishing their instruction templates while their competitors are building memory.

The lesson for our practice: when an AI tool stops getting better, it's almost never the prompt. It's the apparatus the prompt is sitting inside.