Leadership·4 min read·2 September 2024

How to Measure the ROI of Service Design (Without Losing Your Mind)

Design leaders are under pressure to prove their value in financial terms. Here is a practical framework.

Ask ten design leaders how they measure the return on their team's work and you will get ten different answers, most of them uncomfortable. Design ROI is genuinely hard to measure, and acknowledging that difficulty is not a sign of weakness. It is the honest starting point for building a measurement approach that actually holds up under scrutiny. The goal is not to make the numbers tell a story they cannot honestly support. The goal is to identify the causal connections between design activity and business outcomes that are real, measurable, and credible to the people who make investment decisions.

Why Design ROI Is Genuinely Hard to Measure

Service design rarely operates in isolation. An improvement in a digital service journey happens alongside changes to operational processes, staff training, technology infrastructure, and market conditions. Attributing a specific business outcome to design work alone, rather than to the broader programme it was part of, is a genuine methodological challenge. Any design leader who claims to have solved this problem cleanly is probably either working in a very controlled environment or overstating their case.

The honest answer is that design ROI measurement involves estimation, attribution judgements, and confidence intervals rather than precise accounting. This is true of most business investments, including training, marketing, and organisational development. The key is to be transparent about your methodology, conservative in your claims, and consistent in your measurement approach across projects so that patterns become visible over time.

The Three Categories of Service Design Value

  • Cost reduction: design work that reduces the volume or cost of service delivery. This includes reducing failure demand (contacts generated by service failure), eliminating rework caused by unclear processes or ambiguous guidance, and shifting volume from high-cost channels such as phone and in-person to lower-cost digital self-service.
  • Revenue protection: design work that reduces churn, increases completion rates on commercial journeys, or reduces the likelihood of regulatory penalties. A user who abandons an application journey is revenue not earned. A complaint that escalates to an ombudsman is a direct financial cost as well as a reputational one.
  • Risk mitigation: design work that reduces the probability of costly failures, regulatory breaches, or reputational damage. This category is the hardest to quantify but often the largest in terms of expected value, particularly in regulated industries.

Specific Metrics to Track

For each category of value, there are specific metrics that provide meaningful signal about design impact. The key is to establish baseline measurements before a design intervention and track the same metrics consistently afterwards, over a period of at least three to six months to account for seasonal variation and implementation lag.

  • NPS change: track at the service level, not the brand level. A five-point improvement in NPS for a specific service journey is more meaningful than a one-point improvement in overall brand NPS.
  • Cost-to-serve reduction: measure the average cost of handling one transaction or enquiry before and after the intervention. Include all channels.
  • Error rate and rework rate: track the volume of transactions that require correction or re-processing. Reductions here translate directly to staff time savings.
  • Complaint volume and resolution cost: track complaints related to the specific service area being redesigned. Reduction in complaint volume and in average resolution cost are both credible impact measures.
  • Digital completion rate: for transactional digital services, the proportion of users who complete the intended journey without abandoning or switching to a higher-cost channel.
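The discipline above is simple in principle: record each metric before the intervention, record the same metric afterwards, and report the difference. As a minimal sketch, the comparison might look like this in Python. All metric names and figures here are illustrative assumptions, not data from any real programme.

```python
# Hypothetical baseline and six-month post-intervention measurements for one
# service journey. Every figure below is invented for illustration.
baseline = {
    "nps": 12,                       # service-level NPS, not brand-level
    "cost_to_serve_gbp": 8.40,       # average cost per transaction, all channels
    "rework_rate": 0.11,             # share of transactions needing correction
    "digital_completion_rate": 0.62, # journeys completed without channel switch
}
post_six_months = {
    "nps": 19,
    "cost_to_serve_gbp": 6.90,
    "rework_rate": 0.07,
    "digital_completion_rate": 0.74,
}

def metric_deltas(before: dict, after: dict) -> dict:
    """Absolute change per metric; positive means the number went up."""
    return {name: round(after[name] - before[name], 4) for name in before}

print(metric_deltas(baseline, post_six_months))
```

The point of keeping the metric set identical before and after is that the deltas, not the raw numbers, are what the attribution discussion works from.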

Isolating Design's Contribution

You do not need to claim all the value. You need to credibly claim some of it.

The most defensible approach to attribution is to be explicit about what design work was responsible for and what it was not. If a service improvement programme involved a new technology platform, process changes, and service design work simultaneously, design should not claim 100 percent of the resulting improvement. Instead, work with programme sponsors to agree a reasonable attribution model before the work starts. This might be a simple split based on relative investment, a contribution model based on which element addressed which specific failure mode, or a controlled experiment design if the service allows for A/B testing.

In most cases, even a conservative attribution of 20 to 30 percent of measured savings to design work produces a compelling ROI calculation, because the cost of design work as a proportion of total programme cost is typically much lower than 20 to 30 percent. A design team that costs 15 percent of programme budget and can credibly claim 25 percent of measured savings has made a strong case, without needing to overclaim.
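The arithmetic behind that claim is worth making explicit. Under hypothetical figures (the savings, cost, and attribution share below are assumptions chosen to mirror the percentages in the paragraph above), the calculation is:

```python
# Conservative attribution arithmetic. All figures are hypothetical.
programme_savings_annual = 1_200_000  # measured annual savings, whole programme
design_cost = 150_000                 # total cost of the design work
attribution_share = 0.25              # share agreed with sponsors up front

attributed_savings = programme_savings_annual * attribution_share
roi_ratio = attributed_savings / design_cost

print(f"Attributed savings: £{attributed_savings:,.0f}")  # £300,000
print(f"ROI ratio: {roi_ratio:.1f}x")                     # 2.0x
```

Even at a 25 percent attribution, the design investment returns twice its cost in the first year, which is the shape of case the paragraph above describes.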

A Simple One-Page ROI Reporting Structure

  • Project summary: one paragraph describing the service design intervention and its scope.
  • Investment: total cost of design work, including internal team time and external fees.
  • Baseline metrics: the pre-intervention measurements for each tracked metric.
  • Post-intervention metrics: the same measurements at three months and six months after implementation.
  • Attribution rationale: a brief explanation of how design's contribution was estimated.
  • Financial impact: the calculated value of the attributed improvement, expressed as an annual figure.
  • ROI ratio: annual financial impact divided by investment.
  • Non-financial outcomes: a brief note on user experience improvements that are real but not included in the financial calculation.
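One advantage of treating the report as structured data rather than a free-form slide is that the ROI ratio is always derived from the other fields instead of being typed in by hand. A minimal sketch of that idea, with illustrative field names and invented figures:

```python
# Sketch of the one-page report as a derived structure. All field names and
# figures are hypothetical assumptions, not a prescribed template.

def build_roi_report(summary, investment, baseline, post_3m, post_6m,
                     annual_programme_savings, attribution_share,
                     attribution_rationale, non_financial_outcomes):
    """Assemble the one-page report; financial impact and ROI are computed."""
    annual_impact = annual_programme_savings * attribution_share
    return {
        "project_summary": summary,
        "investment": investment,
        "baseline_metrics": baseline,
        "post_intervention_metrics": {"3_months": post_3m, "6_months": post_6m},
        "attribution_rationale": attribution_rationale,
        "financial_impact_annual": annual_impact,
        "roi_ratio": annual_impact / investment,
        "non_financial_outcomes": non_financial_outcomes,
    }

report = build_roi_report(
    summary="Redesign of a claims-enquiry journey (hypothetical example).",
    investment=150_000,
    baseline={"complaint_volume_monthly": 420},
    post_3m={"complaint_volume_monthly": 350},
    post_6m={"complaint_volume_monthly": 290},
    annual_programme_savings=1_200_000,
    attribution_share=0.25,  # agreed with sponsors before the work started
    attribution_rationale="25% split agreed before the programme began.",
    non_financial_outcomes="Clearer guidance; reduced user frustration.",
)
print(f"ROI ratio: {report['roi_ratio']:.1f}x")  # ROI ratio: 2.0x
```

Computing the ratio from the attributed impact and the recorded investment keeps the one-page report internally consistent across projects, which is what makes patterns visible over time.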

Measurement is not the enemy of creative, ambitious service design. Done well, it is the tool that earns the trust and investment that makes ambitious work possible. Teams that measure consistently and honestly build credibility over time. That credibility is worth considerably more than any single impressive ROI figure.


Blueprint Base | Strategic Service Design & Product Strategy