Measuring Successful Documentation

Amara Graham - Mar 18 '21 - Dev Community

"Yes, but how do you measure success?"

This is easily the most common phrase thrown around, not just in Developer Relations or Developer Experience, but in this agile world we live in today. The fact is, you can be really busy and, unfortunately, not deliver any real, measurable business value. This is typically a waste of your skills and a problem for your employer. No one wins.

Let's think about Developer Experience, which for the purpose of this post is roughly all things enabling developers. If the documentation exists, is well written, and is used by developers, isn't that enough to say it's successful? Maybe it doesn't even have to be well written; it just needs to exist. If it's used by developers, isn't that successful?

Depending on your goals, sure, that may be successful. Having a single, unique, non-employee pageview may meet your criteria for success. Someone looked at it! Someone is enabled! We've done it! Success, in this case, is really just proving it is live.

This is enablement at its most basic, superficial sense. You have documentation and someone outside of your company can see it. But I highly doubt the person who asked this question is going to find this an acceptable answer. And long term, this shouldn't be an acceptable answer for you either!

What's my goal?

As I implied above, my goal with documentation is squarely in enablement.

Your first step is to admit to pretty much anyone and everyone, including yourself, that this is hard. You will need to iterate based on your community, your company's goals, and maybe even your industry.

The better you know your community, the better you can determine what is measurable. Copy-paste, cookie-cutter metrics will not help you as much as they will hurt you.

What do I measure?

I want my documentation to enable my developer community, existing enterprise customers, and my internal coworkers to get the information they need efficiently and delightfully.

This means I'm going to focus my success metrics around these topics and areas:

External

  • Vanity metrics (pageviews, bounce, landing pages)
  • Engagement metrics (star ratings, comments)
  • Zero search results, search term keywords (comment if you want to see my thoughts on this in a future post)
  • Percent of content updated per release, per quarter
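The last bullet is easy to compute once each page records when it was last touched. Here's a minimal sketch, assuming each doc page carries front matter noting the release in which it was last updated; the page paths and release labels are made up for illustration:

```python
# Sketch of the "percent of content updated per release" metric.
# Assumes each page records the release in which it was last touched;
# the paths and release labels below are hypothetical.
pages = [
    {"path": "install.md",   "last_updated_release": "1.2"},
    {"path": "uninstall.md", "last_updated_release": "1.0"},
    {"path": "glossary.md",  "last_updated_release": "1.2"},
    {"path": "tutorial.md",  "last_updated_release": "1.1"},
]

def percent_updated(pages, release):
    """Share of the corpus touched in the given release."""
    updated = sum(1 for p in pages if p["last_updated_release"] == release)
    return 100 * updated / len(pages)

print(f"{percent_updated(pages, '1.2'):.0f}% of pages updated in 1.2")  # 50%
```

Tracked per release or per quarter, this gives you the baseline-and-target view discussed later in the post.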

Internal

  • Partnering with tech support to decrease the number of 1-touch, "how do I" questions
  • Partnering with consulting, customer success, field ops, etc. to incorporate UX-type feedback into docs experience (improve findability, searchability)
  • Partnering with product management on success metrics for product/project
  • Ability to execute doc-initiatives (gardening, re-platforming, etc.)

Data with no context is dangerous!

All of this comes with a big asterisk. Simply producing this data without context could lead to some thrash.

Vanity metrics

Vanity metrics are a great gut check. Is your documentation live? Are your top-performing pages maintaining their standing?

Depending on the kind of documentation, these vanity metrics may look a little different from those for marketing pages. For example, glossary terms will probably have a low "time on page" and a high bounce rate. Conceptual pieces may have a higher "time on page" and perform similarly to landing page content. Both are functioning as expected.

At a previous company, our most consistent "lowest-performing page" was our uninstall instructions. That page was also functioning as expected: the uninstall experience was so intuitive (and so rarely necessary) that the article existed mostly for due diligence, answering the question "can I uninstall this software easily?" before anyone even had to install it.

Some other things to look at - what are the exit pages? What are the entrance pages? Are these what you expect? Adjustments to these lists can be great indicators of a successful documentation initiative.
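Entrance and exit pages fall out of any session-level analytics export. As a hedged sketch, assuming you can get each visitor's ordered list of pageviews (the log format and paths here are invented), the counts are a couple of lines:

```python
# Sketch: deriving entrance and exit pages from a raw session log.
# Each session is the ordered list of doc pages one visitor viewed;
# the paths are hypothetical.
from collections import Counter

sessions = [
    ["/docs/", "/docs/install", "/docs/tutorial"],
    ["/docs/glossary/term-x"],
    ["/docs/install", "/docs/uninstall"],
]

entrances = Counter(s[0] for s in sessions)   # first page of each session
exits = Counter(s[-1] for s in sessions)      # last page of each session

print("Top entrance pages:", entrances.most_common(2))
print("Top exit pages:", exits.most_common(2))
```

Comparing these lists before and after a documentation initiative is one concrete way to see whether the initiative moved anything.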

Engagement metrics

Silent developers are either happy developers or developers so confused they can't articulate what they need to get unstuck. This is where it comes back to knowing your community.

Certain developers and developer personas are known for quietly struggling through your documentation and others are going to make a big scene about it across social media.

Your developers may be a chatty group, willing to leave comments, questions, and concerns on your docs, forums, or with their field ops engagement.

Unfortunately, comment spaces on docs can turn into noise. To me, any comment shows engagement even if the engagement is "this is broken". They found the documentation article, found a way to engage, and potentially provided an opportunity to review the content for clarity, at a minimum.

Of course, more informative comments will inevitably give us more to go on, but that's for another blog.

Internal partners

You'll notice I wasn't super clear about measurements in my breakdown of internal partners. This was intentional.

You can share certain success metrics, for instance lowering the number of 1-touch, "how do I" support requests. These tickets create noise, but they also highlight pretty clear patterns of opportunity. By partnering with tech support, you can spot those patterns (hopefully in the form of reports), create or modify documentation accordingly, and monitor the number of 1-touch tickets that come in on that topic, which should theoretically go down.
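That monitoring loop can be as simple as diffing per-topic ticket counts across a doc change. A minimal sketch, assuming support can export 1-touch tickets tagged with a topic (the topics and counts below are hypothetical):

```python
# Sketch of the tech-support partnership loop: compare 1-touch
# "how do I" ticket counts per topic before and after a doc update.
# Assumes tickets arrive tagged by topic; all values are hypothetical.
from collections import Counter

before = Counter({"auth-setup": 42, "webhook-config": 17, "billing": 5})
after  = Counter({"auth-setup": 12, "webhook-config": 15, "billing": 6})

for topic in before:
    delta = after[topic] - before[topic]
    print(f"{topic}: {before[topic]} -> {after[topic]} ({delta:+d})")
```

A big drop on the topic you documented (and roughly flat counts elsewhere) is the signal you're looking for; a flat line suggests the doc isn't being found or isn't answering the question.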

Now it's super important to partner here because, depending on how support is measured, your work to successfully enable developers with better, clearer docs (that reduce 1-touch tickets) may impact not just the number of tickets they get, but the rate at which they close those tickets. 1-touch tickets are fast to close but offer low business value. Removing them frees up cycles the technical support reps can spend on higher-touch, higher-complexity, higher-business-value tickets.

Similarly, partnering with other externally facing teams to identify gaps in potential documentation is crucial to determine if there are any existing assets or material that can be reused or if not having this information publicly available is intentional. Perhaps this information is gated by an enterprise license or specific contract. Don't leak your special sauce recipe!

All in all, accidentally screwing an internal partner in the name of your initiative's success does not win friends.

Success is a healthy team

What is often lost in metrics that work for other teams or projects is that, just like in engineering, documentation has technical debt. Documentation corpus gardening in the form of reviewing, updating, archiving, and other honing activities is critical for the health of your docs. Whether your documentation is maintained by your product engineers or a dedicated team of technical writers, you need to factor in some time to burn down this doc tech debt.

Successful documentation doesn't mean the entire corpus is touched every release. With older, larger software platforms that simply isn't scalable. Measuring an initial baseline and setting a target from there is one way to see if your team is able to not just document features, but also tend to the gardening activities too.

I look at this as having an external and an internal component. Externally, you'll see how many pages are updated for a new version or release, maybe in a PR, in release notes, or by the number of "updated" badges on your documentation. Internally, if you are maintaining a backlog, how many of those documentation-based backlog items were worked? Or how many were intended to be worked but were set aside in favor of new feature documentation instead?
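The internal half of that question is a straightforward burn-down ratio. A sketch, assuming you can export backlog items flagged as gardening work, with planned/worked status (the items below are invented):

```python
# Sketch of the internal gardening metric: of the doc-debt items
# planned for this release, how many were actually worked?
# The backlog items are hypothetical.
backlog = [
    {"item": "archive v1 guides", "planned": True,  "worked": True},
    {"item": "fix broken links",  "planned": True,  "worked": False},
    {"item": "restructure FAQ",   "planned": True,  "worked": True},
    {"item": "new API tutorial",  "planned": False, "worked": True},
]

planned = [i for i in backlog if i["planned"]]
worked = [i for i in planned if i["worked"]]
print(f"{len(worked)}/{len(planned)} planned gardening items completed")
```

Tracking this ratio release over release shows whether gardening is consistently losing out to new feature work, which is exactly the burnout signal described below.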

An undocumented or under-documented feature is only as good as a missing feature, and some documentation maintainers really internalize this. Balancing documenting new features and taking care of your documentation garden is critical to the health of your overall documentation corpus and team. Your documentation isn't successful if you are churning through burned-out maintainers.

Wrapping this up

There is more to this, but this is a great mindset to start with. My biggest caution is leaning too hard into the vanity metrics. Pageviews are good, but they tell your SEO story more than anything else. Great for awareness, possibly not as good for enablement, but it ultimately depends on your community, your company, and your goals.

I'm interested to hear from folks if this resonates. Given that documentation at Camunda is a joint effort between product engineering (writers) and Developer Experience (strategy), the success metrics are tied to the goal of enablement.

How do you measure success for documentation?


Big shoutout to Ali for this tweet that inspired my responses. Highly recommend reading through the responses there too.

Cover photo by patricia serna on Unsplash
