Jeroen van Meerendonk

Posted on • Originally published at jeroen.wtf

8 success metrics for design systems

The concept of a design system is much more established than it was five or ten years ago. Yet we still sometimes need to sell it in order to allocate time and resources to work on it, especially in larger organisations.

A great way to prove the value of a design system, or to ensure it's doing its part, is to be able to measure the impact of having and using it. I will point you to a few metrics that you can check, along with some hints to help you evaluate them. Some of them are quantitative, but others will require gathering feedback from team members and other stakeholders. Keep in mind that they are just examples and may not apply to your specific case, but they're a good place to start.

Adoption

The level of involvement, usage and awareness of updates can be the main metric of success, since most of the other ones depend on it. An unused design system can be wasted effort and lead to deprecation, losing a tremendously valuable opportunity. Communication is key to finding and removing blockers.

  • How many people or teams are using the design system?
  • What are the pain points stopping the adoption? Complexity? Lack of documentation?
  • Do people often depend on modifications to the design system in order to use it, or does it cover their needs?
  • Are there some specific components that are not being used? Why? Should they be modified or removed?
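A simple way to get hard numbers on the questions above is to scan your codebase for design system imports. The sketch below is a minimal example, assuming a hypothetical `@acme/design-system` package imported with named imports in `.tsx` files; adjust the package name, pattern and file extensions to match your setup.

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical import shape, e.g.:
#   import { Button, Card } from "@acme/design-system";
IMPORT_RE = re.compile(
    r'import\s*\{([^}]*)\}\s*from\s*["\']@acme/design-system["\']'
)

def count_component_usage(src_dir):
    """Tally how often each design system component is imported
    across a codebase. Returns a Counter of component name -> count."""
    usage = Counter()
    for path in Path(src_dir).rglob("*.tsx"):
        for match in IMPORT_RE.finditer(path.read_text(encoding="utf-8")):
            for name in match.group(1).split(","):
                name = name.strip()
                if name:
                    usage[name] += 1
    return usage
```

Components that never show up in the tally are your candidates for the "not being used, why?" conversation; tracking the tally over time shows whether adoption is growing.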

Maintenance

Some companies have a specific team maintaining the design system. In others, it's a group effort spread across the organisation, so allocating time to keep it in good shape is trickier. It's important to understand the maintenance cost so we can include it when we do the math of how much time it saves us.

  • What is the impact of updating the design system? How often are there breaking changes?
  • How much time do developers need to invest when updating design-system-related code (not new feature implementations, only the design system itself)?
  • Is the codebase far behind the design changes, or does it keep up to date?
  • How many single-use components are there? Are they being unified and replaced?
  • How many lines of old code are being removed by implementing the design system?
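That last question can be answered from version control. As a rough sketch, you could pipe `git log --numstat` output (each line is `added<TAB>removed<TAB>path`) through something like the function below; the `src/legacy/` prefix is a placeholder for wherever your pre-design-system code lives.

```python
def removed_lines(numstat_output, path_prefix="src/legacy/"):
    """Count lines removed under a given path prefix from
    `git log --numstat` output. Binary files show "-" for their
    counts and are skipped."""
    total = 0
    for line in numstat_output.splitlines():
        parts = line.split("\t")
        if len(parts) == 3 and parts[2].startswith(path_prefix):
            added, removed, _path = parts
            if removed.isdigit():
                total += int(removed)
    return total
```

Run it over the commits that replaced old components and you get a concrete "lines of legacy code retired" number to put next to the maintenance cost.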

Design handoff to developers

Communication between the design and development teams can be a source of frustration for everyone, and eventually impact their performance and overall happiness in the workplace. Designs that bounce back because of implementation problems or unclear guidelines increase the overall time to market (TTM) metric. The smoother the handoff, the better.

  • How many times does a design bounce back to the design team?
  • Do the developers often need to clarify things about the design implementation?
  • How easy is it for developers to identify the reusable parts of the design system in a new design?
  • How often do new components need to be created when implementing new designs?

Design and development time

How much time is invested from the conceptualization of a new feature to the moment it's released to production? Shorter times help to estimate and plan roadmaps and allow people to work on more things. This is an interesting metric to keep product managers happy, since they will be able to measure the product-market fit (and reap the benefits) of new features sooner.
Good, battle-tested components also ensure fewer bugs and surprises that would add hours to the task.

  • How much of the new implementation is based on existing elements of the design system? How much of it required modifications or new additions?
  • How was the release process? Did something break, or did new bugs sneak into production?
  • How fast are pull requests approved and merged?
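The pull request question is easy to quantify if you export opened/merged timestamps from your Git host's API (field names vary by provider). A minimal sketch of the calculation, assuming ISO 8601 timestamp pairs:

```python
from datetime import datetime
from statistics import median

def median_hours_to_merge(prs):
    """Median hours from PR opened to PR merged.

    `prs` is an iterable of (opened_at, merged_at) ISO 8601
    timestamp string pairs."""
    durations = []
    for opened, merged in prs:
        delta = datetime.fromisoformat(merged) - datetime.fromisoformat(opened)
        durations.append(delta.total_seconds() / 3600)
    return median(durations)
```

Comparing this number for PRs that only compose existing components against PRs that introduce new ones makes the time saved by the design system visible.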

Onboarding of new hires

How much time it takes new people to get up to speed and contribute, whether on the design team or the development team, is a metric we want to keep small. A new team member should be able to start contributing with confidence as soon as possible. It's good for the business and it's empowering for the newcomer.
Also, the onboarding process is a great way to find weak points and improvement opportunities for the design system and its documentation.

  • Is it easy to get familiar with the elements and parts of the design system or its implementation?
  • How much time do other team members need to invest to guide new hires and clarify things for them?

Ownership

Who is in charge of deciding and implementing modifications to the single source of truth is important for keeping consistency and preventing the design system from getting unnecessarily bloated. It's crucial to keep it as lean as possible, since it gets more complicated to maintain as it grows. Decisions can be made democratically, or someone can be designated to have the last word.
Sometimes the source of truth will be in Figma and handled by the design team; other times it will be written in code and maintained by the developers.

  • Are people and teams blocked because of the current ownership or the lack of it when trying to work on the design system?
  • How is the feedback gathered from different stakeholders, so they can feel part of it? Is it an easy process or do they often feel ignored?
  • When decisions are made, is there general consensus?

Performance

How the design system affects your Lighthouse score, other tests, and your server infrastructure. Better performance means faster, snappier interactions and happier customers, but it also reduces infrastructure costs and surprises for the business.

  • How are the different performance metrics (FCP, LCP, TBT, etc) affected when using the design system components? Can you see an improvement after a specific release or over a specific amount of time?
  • How many libraries are you using and how is it impacting the overall size and performance of your product? It can be your JS bundle or the size of your iOS or Android apps.
  • How are server costs being affected?
  • When running tests, how many times do they fail because of design system components or implementations? How many of them are fixed as a result?
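To see whether a release improved things, a before/after comparison of your lab metrics is usually enough. A minimal sketch, assuming you have two dicts of metric name to value (e.g. milliseconds from two Lighthouse runs); for time-based metrics like FCP, LCP and TBT, lower is better, so a negative delta is an improvement:

```python
def metric_deltas(before, after):
    """Relative change per metric between two measurement runs.

    Returns e.g. {"LCP": -0.2} meaning LCP dropped 20%. Metrics
    missing from either run are skipped."""
    return {
        name: (after[name] - before[name]) / before[name]
        for name in before
        if name in after
    }
```

Tracking these deltas per release lets you attribute an improvement (or regression) to a specific design system change rather than guessing.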

Customer happiness and brand reputation

A lot of things can affect this metric, but you can try to identify the impact related to the design system: how it affects other product metrics like engagement, conversion rate or customer churn. Measuring before and after a design system change can shed some light on how it acts on them.

  • How is the customer success rate affected when using the design system? Is the UX improved and familiarity increased?
  • How often do they need to contact support?
  • Is the visual language consistent and helping the customers to achieve what they are trying to do?
  • Is your product more accessible because of the design system? Sometimes we neglect accessibility in order to save time. Having it baked into the design system allows us to give it the importance it deserves.
  • If your product has different themes or variants, is it making it easier to handle and maintain?

What other things do you measure to understand how your design system is performing? Feel free to leave them in the comments.

