
February 10, 2021

How listening to developers improved our API docs

Alex Hoffer

As an API company, we pride ourselves on creating an outstanding developer experience. That includes providing amazing docs. 

Docs have always been an area where we've invested a lot of effort. We've heard from both Plaid developers and other API companies that they value the thoroughness, detail, and clarity of Plaid's API documentation. But in 2020, our docs site was starting to show its age. The presentation worked well enough back in 2013, when we had only a handful of API endpoints, no mobile or international support, and few enough customers that ticket workflows were never an issue. By last year, things had changed dramatically.

Our documentation lived on a single page that took 20-plus seconds to load. As a result of a website migration, the docs were built from JSX files full of HTML tables that nobody wanted to edit. Plenty of technical documentation—especially for beta products—existed only as PDFs that required a painstaking update process every time new content was added or a mistake was fixed.

We also had a separate Help Center, containing information that we knew was helpful but didn't fit anywhere in the existing docs structure. Its most popular article had one of the lowest "helpfulness" scores, but without any way to capture qualitative feedback from visitors, we didn't know how to improve it.

All of this needed to change. We decided to build a new documentation platform that reflected our dedication to providing an excellent docs experience. We established a few requirements up front: the new docs had to be:

  • Maintainable. Engineers were reluctant to edit our current docs due to the painful experience. If we wanted to keep docs up to date, we’d need to make the process easy.

  • Scalable. The structure of our old docs had struggled to keep up with the amount of content and the new features we shipped; the new structure needed room to grow.

  • Discoverable. The docs would have to be easy to use and make discovering content easy, too.

  • Comprehensive. No matter how great a company’s support desk, waiting for someone to answer your question still slows down development in a major way. The documentation on the site would need to be as self-serve as possible.

What we learned about gathering feedback

Pre-beta and demo internally—widely, early, and often

We updated the entire company as frequently as possible while working on the new docs. We started putting out builds for internal dogfooding as early as we could, sending them to a company-wide mailing list. We received substantial feedback, including over 100 individual comments, notes, and suggestions. The builds also gave folks across the company a way to stay in the loop as the content evolved.

Collect in-app feedback with an "escape hatch"

Once we’d fixed all the critical issues from the internal dogfooding process, we launched a public beta linked from the existing docs. We put feedback buttons in the new docs as well as the old ones. All of the feedback ended up on an internal dashboard, which we committed to addressing in full.

Below: The in-app feedback flow

Once the feedback was sufficiently positive, we felt ready to make the new docs the default experience. To do so safely, we left a link in place for users to go back to the old docs. Before being redirected to the old URL, however, they had to explain why they weren't satisfied with the new ones. This was a small annoyance, but since users only had to go through the flow once to get the old URL, we felt it was a worthwhile tradeoff: it gave us the information we needed to fix the site for those users, especially since we knew we'd eventually have to stop maintaining the old version. And while a few visitors did just mash the keyboard to get to the next screen, many provided great feedback. This "escape hatch" also allowed us to sleep soundly, knowing that if the new user interface wasn't working, people could go back to the old one.
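As a simplified sketch, the gate could look something like the component below, assuming a React docs site like ours (the component name, endpoint, and old-docs URL are illustrative, not our actual implementation):

```tsx
import { FormEvent, useState } from 'react';

// Illustrative only: the endpoint and URL are placeholders, not Plaid's real ones.
const OLD_DOCS_URL = 'https://example.com/old-docs';

export function LegacyDocsEscapeHatch() {
  const [reason, setReason] = useState('');

  const handleSubmit = async (event: FormEvent<HTMLFormElement>) => {
    event.preventDefault();
    // Record why the new docs didn't work for this visitor, then let them through.
    await fetch('/docs-feedback', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ reason, source: 'escape-hatch' }),
    });
    window.location.assign(OLD_DOCS_URL);
  };

  return (
    <form onSubmit={handleSubmit}>
      <label htmlFor="reason">
        Before you go: what couldn't you do in the new docs?
      </label>
      <textarea
        id="reason"
        required
        value={reason}
        onChange={(event) => setReason(event.target.value)}
      />
      <button type="submit">Take me to the old docs</button>
    </form>
  );
}
```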

And of course, we tracked metrics around all of this. By tracking clicks on the feedback buttons, as well as the button to go back to the old docs site, we were able to derive a KPI. Tracking this “helpfulness” metric allowed us to keep tabs on responses to the site and understand whether our changes were moving user sentiment in the right direction.
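We won't share the exact formula here, but as a rough illustration, a helpfulness score can be derived from those click counts along these lines (the event names and weighting below are assumptions, not our actual KPI definition):

```ts
// Illustrative only: one plausible way to turn feedback clicks into a score,
// not Plaid's actual helpfulness KPI.
interface FeedbackCounts {
  helpfulClicks: number;     // "this page was helpful"
  notHelpfulClicks: number;  // "this page wasn't helpful"
  escapeHatchClicks: number; // clicks on the link back to the old docs
}

function helpfulnessScore(counts: FeedbackCounts): number | null {
  // Treat escape-hatch clicks as negative signal alongside explicit downvotes.
  const negative = counts.notHelpfulClicks + counts.escapeHatchClicks;
  const total = counts.helpfulClicks + negative;
  return total === 0 ? null : counts.helpfulClicks / total;
}
```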

Get feedback from people with problems

To learn how to improve the site, we carried out various kinds of opt-in user research—like sending betas to friendly customers, new hires, and other people who were glad to give feedback. These sources yielded some great input that we incorporated into our flows and information architecture.

But much of our best feedback came from a different source: people who might never have opted into a beta. These were folks actively trying to solve real problems. Developers with stable implementations don't refer back to the docs very often and might not have much to say when given a survey; a developer who's actively using the docs to solve a problem most definitely does -- and they won't worry about sparing your feelings, either, especially if they have an anonymous feedback channel. Likewise, a support engineer who wakes up in the middle of the night stressed about an influx of tickets caused by bad documentation is motivated to scour a docs site for problems like no new hire, however helpful, ever could be.

And that's why the feedback we got from the in-app feedback feature and the "escape hatch" turned out to be so valuable. People who opt into a beta experience and then give feedback are helpful folks who like trying new things and taking their own time to help you out by improving your product -- which means they aren't necessarily representative of everyone who uses it. By funneling everyone into the new experience and then letting them opt out, we got more eyes on the new product and feedback from a more diverse audience. It also encouraged people who might not have proactively volunteered their feedback to make a comment in order to access the older docs.

We also extended this approach of getting input from people with real problems to how we solicited feedback internally. In addition to sending out internal betas of the new docs and then waiting to hear back, we scheduled interviews with employees directly affected by them: people in Support, Sales, and Product. What pieces of documentation had they written themselves to supplement gaps in the official docs? What concepts that were already documented did they keep having to re-explain? These answers guided the content development of the site.

What we learned about user experience

Users don't see things the way you do

An unexpected finding from internal dogfooding was that UX problems don't always present as UX problems. “I like the docs,” read one piece of feedback from the Sales Engineering team, “but we should really have detailed troubleshooting content.”

Since the project had involved creating and publishing troubleshooting guides for every API error code, we were left scratching our heads. What was wrong with our troubleshooting content? Was it too vague? Missing an important step?

We asked the sales engineer to describe the problem to us. “We list the error codes,” he pointed out, “but there's no troubleshooting guide.”

This turned out to be a content discoverability issue disguised as a missing-content issue. The left nav bar looked too similar from page to page, so some visitors didn't recognize it as a contextual navigation tool showing different content for each page. Adding subtle cues, such as bolding and shading the user's current location, turned out to be enough for users to notice the nav changing as they moved around the site.
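As a simplified sketch, the change amounted to styling the current page's entry so the nav reads as "you are here" (the component and data shapes here are illustrative, not our actual nav code):

```tsx
// Illustrative sidebar nav: bold and shade the entry for the current page so
// visitors can tell the nav is contextual and changes from page to page.
interface NavLink {
  title: string;
  href: string;
}

export function SidebarNav({ links, currentPath }: { links: NavLink[]; currentPath: string }) {
  return (
    <nav>
      {links.map((link) => {
        const isCurrent = link.href === currentPath;
        return (
          <a
            key={link.href}
            href={link.href}
            aria-current={isCurrent ? 'page' : undefined}
            style={{
              display: 'block',
              fontWeight: isCurrent ? 'bold' : 'normal',
              background: isCurrent ? '#eef2f5' : 'transparent',
            }}
          >
            {link.title}
          </a>
        );
      })}
    </nav>
  );
}
```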


A similar surprise came when the site launched to GA and became the primary documentation experience. Immediately, developers started using the feedback widget to submit feedback about missing content.

It turns out that the culprit was a change to our representation of API schemas. To make the schemas easier to read, we put them in expandable tables. However, these tables were collapsed by default, meaning the content wasn't discoverable via command-F. 

According to a factoid on the Internet, attributed to Google researcher Dan Russell, only 10% of Internet users know how to use command-F to find text on a page. Whether or not that’s true, 100% of developers use it. And because the old Plaid docs were one long scrolling page, we’d unwittingly trained developers to use command-F to find anything and everything in our documentation. Then we broke it.

Fortunately, the resolution was quick. Once we understood the cause of the problem, a one-line code change set the default table display to expanded rather than collapsed, and developers could once more command-F their way through the docs. 
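In a React component like the ones our docs are built from, that kind of fix is roughly a change to the initial state of the expandable row. The component below is a simplified stand-in for our actual schema table:

```tsx
import { useState } from 'react';

// Simplified stand-in for an expandable schema-field row.
export function SchemaField({ name, description }: { name: string; description: string }) {
  // The one-line fix: start expanded so the description is rendered into the
  // DOM and findable with command-F, rather than hidden until clicked.
  const [expanded, setExpanded] = useState(true); // previously useState(false)

  return (
    <div>
      <button onClick={() => setExpanded((open) => !open)}>{name}</button>
      {expanded && <p>{description}</p>}
    </div>
  );
}
```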

Users don't use your product the way you think

This turned out to be just the beginning of a long series of improvements to search, as search discoverability became a recurring theme in user feedback. Users exploring the docs in an open-ended way started with site navigation and moved to search only if they couldn't find what they were looking for; users trying to answer a specific question often bypassed navigation entirely and went straight to search.

To address this use case, we built a dashboard showing the most popular visitor queries, then tested them to make sure the results made sense. We ended up adding hundreds of entries to the search index so that the names of fields used in API requests and responses would always return relevant information. And it took plenty of manual tweaking to make sure that, for example, different results were returned for similar-looking search terms like “user” and “username”, or that the term “ACH” returned results about ACH-supporting products rather than exclusively about a partner whose name starts with the letters "ACH."  
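Concretely, much of that work looked like mapping API field names and common terms to the pages that should rank for them. The record shape, URLs, and entries below are a sketch rather than our actual search configuration:

```ts
// Sketch of hand-curated search-index records; the shape, URLs, and entries
// are illustrative, not Plaid's real index.
interface SearchRecord {
  term: string;       // what the visitor types
  synonyms: string[]; // related spellings and field names that should match
  url: string;        // the page that should rank first for this term
  snippet: string;    // short preview shown in the results list
}

const curatedRecords: SearchRecord[] = [
  {
    term: 'user',
    synonyms: ['client_user_id'],
    url: '/docs/api/link-token-create/',
    snippet: 'An object describing the end user of your application.',
  },
  {
    term: 'username',
    synonyms: ['test credentials', 'sandbox login'],
    url: '/docs/sandbox/test-credentials/',
    snippet: 'Credentials for testing against Sandbox institutions.',
  },
];
```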

In fact, your users might not even be who you think they are

The other big surprise in the docs was just how many of our users weren't developers but other technical stakeholders, such as product managers evaluating potential solutions for their team, or support engineers trying to help debug a user-facing error in an integration.

Once we had a search feature and had broken our content into multiple pages, we could see what was popular and what wasn't. Some content that we'd almost considered an afterthought, and hadn't even bothered to include in the old docs, turned out to have a huge impact. For example, both qualitative feedback and metrics from the search tool told us that the Postman Collection, which lets you use the API without writing code, was one of the most popular pages on the site. We quickly adapted by linking to it from the Quickstart to divert users who wanted a no-code solution.

What's next for our docs

While we’ve built a lot already, not every piece of feedback has been quick or easy to address—which means there’s more work to be done. Here are a few things we're planning next:

  • Better exposure of relevant information across multiple surfaces. There's a lot more we can do to show relevant docs content to developers, like using the text entered into the support ticket form to search the documentation in case the answer is already on the site, or linking more directly from API error codes to their troubleshooting content.

  • More self-serve docs. We've migrated multiple PDFs and slide decks to the website, including most of our troubleshooting content, the Launch Checklist, the OAuth guide, and documentation for several beta products. However, we haven't fully converted over, meaning some information, especially for beta products, is still distributed manually via PDF, rather than being available on-demand on the site.

  • More live demos and code. We want to add additional demos, playable code snippets, and more complete sample apps to make it faster to explore and harness Plaid's capabilities.

P.S. We're hiring!

If the idea of creating any of the above features excites you, come join us! The Plaid Developer Relations team has several remote-friendly openings.

To learn more, check out our postings on plaid.com/careers.