Matt Burgess

low brow entry to high brow topics

Service Design in the Machinic Age: Can we use the machine to do better reconnaissance?

Service design is hard. I’ve worked with brilliant practitioners who spend their days tracking down disparate parts of an enterprise just to pull a legible picture of the system together. It is a role fraught with peril, often forced into a ‘compliance’ mindset - tasked with making what the system should do look plausible, rather than being empowered to surface what the material reality of the situation actually necessitates.

The Service Blueprint is our most worthy attempt to bridge this. It is intended to connect the Front Stage (the customer journey) with the Backstage (the technical stack). But because it is so effortful to generate, it is rarely built or kept current, and the meaning-gap between these two silos remains a chasm.

Why Blueprints are Hard

A Service Blueprint is not a ‘drawing.’ It is a high-fidelity reconciliation of a thousand contradictions. To build one, you must manually track the Materiality of the System - tracing a customer ‘click’ through five APIs and a legacy database that hasn’t been documented since 2012. Without this, the designer is denied the literacy required to actually form the material they are designing.

[Image: a tired-looking child with disheveled hair leaning on a surface, captioned “EXHAUSTING!!!!!”]

Tethering Intent to Materiality

Product ways of working and enterprise agility are theoretically positioned to fix some of this incrementally, but progress is glacial, often because there simply isn’t the situational awareness to ‘simultaneously hold twenty-one moving parts in one’s head’. A Service Blueprint provides the necessary ‘technical reconnaissance’ - the ‘spatial memory’ that anchors the Why to the How.

In most enterprises, a customer journey map - the molecular expression of a thin slice of interactions - usually has a startlingly empty Backstage row. Conversely, the molar expression - the technical architecture - is located deep down away from the customer (its reason-for-being). Systems allegedly exist to serve the customer, but the reality is often a process of sedimentation; layers of new tech built on old workarounds, bleaching away original purpose.

This leaves an ontological void where the connection to the customer should be.


I got to thinking: what if we used the machine to weave these disparate ontologies into a shared plane for the first time?

I ran an experiment using Gemini as a sort of Machinic Reconciler to catalyse the Customer Journey and the Technical Stack to individuate together. By scaffolding the tech stack against the customer journey, we synthesise a Metastable Assemblage: a functional and expressive whole where front and back stage are structurally coupled while maintaining the autonomous expertise of their respective silos.


The Experiment

I used the AI in two distinct roles to perform this ontological ‘cat’s-cradling’:

  • AI as Legend Generator: Machines cannot feel the structural logic of a visual customer journey map. To address this, I prompted the AI to translate the visual layout into a Machine-Readable Legend - a sort of Rosetta Stone. By discretising the visual map (the nodes, the flows, the emotional peaks) into a structured text pattern, we created Digital Hypomnemata. This isn’t just a description; it’s a technical memory aid that allows the machine to hold the service logic in its context window so it can be manipulated alongside me.

  • AI as Reconciler: Using this legend, the AI acted as the Intercessor between two disparate data sets. I gave it the raw application data and asked it to map them against the customer journey using the Legend.

It performed a bi-directional synthesis: it identified which molecular customer touchpoints (e.g., “User enters credit card”) were dependent on molar system applications (e.g., “Legacy SOAP service latency”). It didn’t just list them; it exposed the reciprocal visibility - revealing how the tech stack enables or constrains the flow of the experience.
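To make the Legend and the reconciliation step concrete, here is a minimal sketch of what such a machine-readable structure might look like. Everything in it - the stage names, touchpoint IDs, and system identifiers - is hypothetical, invented for illustration; the actual legend was derived by the model from a specific journey map.

```python
# A hypothetical machine-readable legend: the visual journey map
# discretised into stages, touchpoints, and emotional signals.
legend = {
    "stages": ["Discover", "Purchase", "Support"],
    "touchpoints": [
        {"id": "T1", "stage": "Purchase",
         "label": "User enters credit card", "emotion": "anxious"},
    ],
}

# Raw application data, as a flat inventory of backstage systems.
applications = [
    {"id": "APP-07", "name": "Legacy SOAP payment service",
     "touchpoints": ["T1"], "constraint": "high latency"},
]

def reconcile(legend, applications):
    """Join each touchpoint to the systems that support it,
    filling the Backstage row that journey maps usually leave empty."""
    blueprint = []
    for tp in legend["touchpoints"]:
        backstage = [a for a in applications if tp["id"] in a["touchpoints"]]
        blueprint.append({
            "stage": tp["stage"],
            "touchpoint": tp["label"],
            "backstage": [a["name"] for a in backstage],
            "constraints": [a["constraint"] for a in backstage],
        })
    return blueprint

rows = reconcile(legend, applications)
```

The join itself is trivial; the hard, valuable work the AI performed was producing the legend - discretising a visual artifact into IDs that both datasets could agree on.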


The Result: Sufficient Fidelity

The output was a functional, first-iteration Service Blueprint that did more than just ‘look right.’ By traditional design standards, it wasn’t ‘high-fidelity,’ and by architectural standards, it wasn’t yet a ‘system of record’. But it was just what was needed: a sufficiently familiar landscape that mapped the adjacent-possible for both sides. It had tied each discrete customer interaction to the precise rows of application data and system dependencies required to support it. It was a beginning - a way for the work of collaboration and negotiation to progress without the usual “meaning-gap” that forces different disciplines into a position of defensive compromise.

Beyond mere scaffolding, the AI acted as a diagnostic layer, surfacing friction points that had been obscured. This allowed the team to move away from binary ‘can/can’t’ arguments and into a triage of imperatives, weighing customer needs against the actual material cost of architectural change. And because the output was just an HTML file, it moved from a machine-readable concept to a collaborative artifact in seconds.
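As a rough sketch of the ‘just an HTML file’ step: once the reconciliation exists as structured rows (the row shape below is my assumption, not the actual output), turning it into a shareable artifact takes only a few lines.

```python
from html import escape

def blueprint_to_html(rows):
    """Render reconciled blueprint rows as a simple HTML table.
    The row dictionaries are an assumed shape, for illustration."""
    header = "<tr><th>Stage</th><th>Touchpoint</th><th>Backstage systems</th></tr>"
    body = "".join(
        "<tr><td>{}</td><td>{}</td><td>{}</td></tr>".format(
            escape(r["stage"]),
            escape(r["touchpoint"]),
            escape(", ".join(r["backstage"])))
        for r in rows)
    return "<table>{}{}</table>".format(header, body)

rows = [{"stage": "Purchase",
         "touchpoint": "User enters credit card",
         "backstage": ["Legacy SOAP payment service"]}]
html = blueprint_to_html(rows)
```

The point is less the rendering than the portability: a flat file anyone can open keeps the blueprint a collaborative artifact rather than a tool-locked deliverable.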

Because the Why and the How were structurally coupled, one could move beyond the friction of translation and into the flow of Noetic Agency. The blueprint became a shared, material reality that allowed disparate teams to deliberate and begin to form the service, rather than just documenting a series of disconnected workarounds.

This shift is not about AI ‘solving’ the problem of complexity; it is about the human-machine assemblage unlocking a new kind of situational awareness. By providing a necessary artifact for exteriorised memory, the machine handles the sheer scale of the technical stack, freeing the whole team to focus on the high-value work of negotiation and design. It turns the service into material that can be tuned - functionally and expressively - in real-time.

From Artifact to Catalyst

The value here isn’t just speed; it’s the iterative surfacing of what sociotechnologists call a Boundary Object (Star & Griesemer, 1989). As Jabe Bloom emphasises, these objects allow teams to cooperate effectively without the exhausting overhead of forced consensus or learning each other’s specialised languages. By projecting the tech stack and the customer journey onto the same plane, we create a common catalyst - a shared map that allows silos to remain experts in their own fields while providing a clear, material ground for collective action.


A Note of Caution

I was satisfied with this as a piece of sociotechnical scaffolding. It worked. It reconciled. It brought the silos into the same room. It was a good day. But as the blueprint emerged, so did a sense of dissatisfaction.

In a Delandian framing, the “molar” side of the map - the applications and hard architectural constraints - possesses a genuine capacity to affect and be affected. It has weight; you can change a line of code and the world moves.

In a coupled assemblage, the molecular side - the fluid, intensive human signals - should exist in a reciprocal relationship with that structure, acting as the catalyst for architectural change. However, this felt different. I realised that the PDF customer journey maps we reconciled are just a simulated molecular. The raw, vital signal has been processed, bleached, and categorised until it is no longer an intensive force, but a static representation of one.

Even with a high-fidelity machinic bridge, we would still be transporting a simulation, a human signal fragmented into bits, across the gap.

Beyond Reconciliation

If we move beyond just ‘making things match,’ how do we project human signals into the system so they retain their original weight? In a truly coupled assemblage, why is the customer’s voice, or inferred intent, relegated to a passive data point rather than the primary force it should be - one that the technical architecture is reciprocally obliged to respond to?

I know engineering teams that are desperate for this context. They want to move away from the ‘pointless production’ of features that don’t land, toward a system where the customer signal is high-quality enough to actually architect against.

If we don’t solve for the quality of that signal, are we really forming and delivering services? Or are we just building high-fidelity conduits for low-fidelity ghosts?

The scaffolding is in place, but it’s time to talk about the state of the signals we’re sending across it. This realisation is leading to the next stage of enquiry: Transduction.


Follow the Enquiry

I’m documenting this experiment in human-machine collaboration - not as an attempt to automate design, but as an evolution in how we think and remember together.

If you’re interested in how we might use these tools as a practice of care - ensuring that our shared technical reality amplifies human agency rather than automating it away - follow me on Micro.blog or here on LinkedIn.