GitLab Document Site

Usability Study

The GitLab Documentation website is an open-source platform where users can find the necessary information to guide their usage of the product.

This study aims to evaluate the usability of the latest version (15.0) of the GitLab Documentation site and provide recommendations for future development.


My Role - UX Researcher (Team of 4)


Tools Used - Mural, Zoom, UserTesting.com 


About the Project

The purpose of the study was to evaluate the usability of the GitLab Documentation website for new users (less than six months of interaction with the product) based on user feedback. According to research conducted in 2021 by the GitLab Documentation team, new users account for 37% of the site's traffic.

 

A 30-minute usability study was conducted with each of the nine participants over remote screen-sharing technology.  The goal of the study was to answer questions related to the following areas:

  • The usability of the documentation site.

  • The findability of information.

  • The usefulness of certain features (such as the Algolia search, the version picker, and the right-side navigation).

 

Both quantitative and qualitative feedback were recorded to measure performance and, per the client’s request, to collect representative quotes. Participants were encouraged to think aloud while completing the tasks and to share their thoughts about their interaction with the website.

Project Goals

While conducting the usability testing for the GitLab Documentation site, our goal was to answer the following questions:

  1. Does the documentation site satisfy users and meet their expectations?

  2. Do users understand the right-hand navigation and find it useful?

  3. Do users find the search functionality effective?

  4. Can users find the tutorial page?

  5. Do users understand the drop-down version picker and how to use it?

Expert Review

The purpose of the expert review was to conduct a cognitive walkthrough of the site as a typical user with less than six months of experience with GitLab, and to discover any areas of the site that negatively impact the user’s experience and/or conflict with established standards and conventions.


Jakob Nielsen’s 10 Usability Heuristics for User Interface Design were used as the criteria for the evaluation, with an additional focus on the 10th heuristic, help and documentation. Per the client’s request, Alita Joyce’s guidance for providing reactive help served as the criteria for this additional focus. The severity scale was adopted from GitLab’s UX research handbook, with the addition of a positive category. It was important for us to use the same severity scale as GitLab’s internal projects so that our results could be compared directly with prior research.


The expert review was first conducted individually; the individual reviews were then combined into a group expert review. The findings for each heuristic were listed and prioritised by severity rating. The prioritised list helped determine the areas of the site with room for improvement, and gave the team insight into the areas to pay attention to during the usability testing.


What we found

Prepare Screener and Moderator Guide

Prior to conducting the usability test, the team sent out a recruiting screener to identify the desired user group. Participants were recruited from GitLab’s user base and UserTesting.com’s panel. A total of 10 participants were recruited, all of them new to GitLab (1–6 months of use).


The next step was to create a moderator guide consisting of a participant briefing and warm-up questions, followed by scenarios and three corresponding tasks. Each task was followed by post-test questions and difficulty ratings to collect both quantitative and qualitative feedback.

Conduct Usability Testing

Participants took the usability test on a desktop or laptop computer via remote screen-sharing software (Zoom), communicating with the moderator throughout the session. There was no restriction on the operating system or web browser used. However, participants were encouraged to use an incognito window and to have a strong, reliable internet connection so that screen sharing could proceed smoothly without lag.

 

Each user performed the tasks in the same order and each usability session lasted about 30 minutes. All participants were asked to perform three main tasks. Each task took approximately 10 minutes to complete. All participants were compensated for completing the usability session. 


Synthesize Data

After conducting 10* usability sessions, we organised and analysed our notes, extracting insights and representative quotes grouped by task. We then developed recommendations based on the extracted insights and grouped the insights and quotes under the corresponding recommendation.


*One participant’s data was not used in the analysis due to technical difficulties.

Findings and Recommendations

Results & Next Steps

At the end of the project, all the findings and recommendations were communicated to the GitLab team via a presentation. Our clients were very pleased with our findings and the actionable insights, both from the expert review and the usability test.


If the team were to continue this project past the final deliverable, some next steps would include:

  • Additional usability tests with a larger participant pool to develop a more detailed, concrete understanding of new users’ expectations and pain points.

  • Competitive usability testing to compare users’ mental models and provide additional insights to our client.
