Appendix A
Overall Methodology

In the first major study of SaaS sites, we examined 30 different SaaS websites across 90 user test sessions. Partnering with UserTesting.com, we had participants share their thoughts out loud as they completed tasks and answered questions about their experience on these sites. The tasks and results were classified into 4 main categories, corresponding to different elements of the SaaS marketing site:

  • Homepage—how useful and engaging was the main homepage on the site?
  • Pricing page—were users able to find all the information needed to make an informed decision from the plans on the pricing page?
  • Organization—was the site easy to navigate and were users capable of finding relevant information quickly?
  • Branding—how did the site impact the user’s understanding of the product, SaaS company, and brand?

Each user’s experience on each site was recorded in its entirety on video, amounting to over 1,800 minutes of user testing in total. By analyzing the videos and results, we were able to understand what makes for a successful visit to a SaaS site.

Our Participants

In total, 90 participants from the UserTesting panel participated in this study. All completed the study using their desktop computers (any OS, any browser). This meant that every participant saw the same desktop version of the site. Additionally, every participant took the test on the same day, so no changes were made to the sites between tests. Participants were selected according to these criteria:

  • Currently working 30 or more hours per week, with an income in the range of $40k-$150k+
  • Working in Finance, Marketing, Sales, Service, or Technology.

When the test was first started, we asked our participants 2 questions:

  1. Tell us about the company you work for and your role there.
  2. How many employees work at your company?

Though we didn’t use this information to exclude anyone from the study, or to categorize any results, it gave us the ability to dive further into these results and review the data based on role and company size.

Homepage Evaluation Methodology

After answering the initial questions, participants began the test proper by spending a few minutes on the homepage. There they were asked to describe what the company does using only information from the homepage, as well as what the site expected them to do next. This allowed us to learn whether all the information users needed to understand the product and the brand was available from the homepage, and whether calls-to-action were descriptive and obvious. Questions we asked:

  • In your own words, describe what this company does. Spend no more than 2-4 minutes on this activity. Please be as descriptive as possible.
  • Overall this task was: (1 = Very Difficult, 5 = Very Easy) Please explain your answer.
  • What does this company want you to do next from the homepage? Please be as descriptive as possible.
Watch someone describe what Optimizely does
https://www.usertesting.com/videos/KHuAyxaYN-zeG27MCtucGw/clips/735786

After spending some time on the homepage, users went on to other tasks related to pricing. However, toward the end of the test we asked users to return to the homepage and consider it again, this time informed by what they had seen on the rest of the site.

Questions we asked:

  • Any other thoughts or comments on this page? Is there any information missing that you would like to see?
  • (Rating Scale: Very Difficult to Very Easy) How difficult or easy was it for you to understand the information on the homepage? Please explain your answer.
  • (Rating Scale: Not Useful to Very Useful) How useful do you find the information on the homepage? Please explain your answer.
  • (Rating Scale: Strongly Disagree to Strongly Agree) Please rate how strongly you disagree or agree with the following statement: “The homepage is engaging.” Please explain your rating.

These questions allowed us to assess the information on the homepage in relation to the rest of the site. We could see whether information about the product from the other pages might be better used on the homepage, allowing users to understand the core value more quickly and move on to the decision-making part of the process.

Pricing Page Evaluation Methodology

We were particularly interested in how easily users could find pricing plan information and decide which plan suited their needs. We wanted to see what information participants used to make these decisions, how they chose the plan that was best for them and their company, and whether all of this information was readily available within the pricing plans or whether users had to search the rest of the site for it.

Questions we asked:

  • Use the website to find the product, service, or plan that best suits your needs at work. Give 2-3 reasons why you made the selection you did.
  • [Rating] Overall this task was: (1 = Very Difficult, 5 = Very Easy) Please explain your answer.
  • If you didn’t go to the pricing page, please do so now. Which product, service, or plan best suits your needs at work? Give 2-3 reasons why you made the selection you did.
  • Is there any information missing that you would like to see on this page?
  • (Rating) How difficult or easy was it for you to understand the company’s various offerings? Please explain your answer.
  • (Rating Scale: Not Useful to Very Useful) How useful do you find the information on the company’s product offerings? Please explain your answer.
Watch someone browse plans on Box
https://www.usertesting.com/videos/T97UXQb31s6UlIray7qrIg/clips/735836

With these tasks, we could follow users as they searched for the relevant information, made decisions, and talked through their process. We could find out what information users needed to make this type of decision, and what information they still needed that was missing from these pages.

Marketing Site Organization Evaluation Methodology

Throughout the entire test, we were interested in how users navigated the site and how they found the information they needed. Though we could sometimes see this from the specific tasks, we also asked users to spend some time using the site as they wished.

Questions we asked:

  • Spend the next 3-5 minutes using the website any way you’d like. Please remember to think out loud.
  • (Written Response) Now that you have spent some time using this site, describe what this company does in 1 sentence.

This allowed us to see how a user might find their way around the site when unguided, and whether the site had an intuitive structure. From here, we could look at the structure of each site, how it helped or hindered the discovery of product information, and how this related to the way users chose the right plan for them.

Brand Awareness Evaluation

We wanted to gauge user knowledge of the SaaS brand both before they had seen the site and once they were finished with the testing. To do so, each test started on a neutral page—google.com. Two questions were presented to users before they went to the SaaS marketing site.

Questions we asked:

  • Have you heard of the company before? Please explain.
  • (Written response) What three words would you use to describe the company? If you haven’t heard of them before, put “N/A”.

Using this strategy meant we could see whether the site changed our users’ impression of the brand. At the end of the study, users were asked what they understood about the product and how they would pitch the site to their boss. These questions showed whether participants understood the site’s objectives and mission.

Questions we asked:

  • (Rating Scale: Not at all confident to Extremely Confident) How confident are you that your description of what this company does is accurate? Please explain.
  • If you had to convince your boss why your company should purchase this product or service, what would you say to persuade him or her? Please be as descriptive as possible.
  • (Written Response) What 3 words would you use to describe the company?

Finally, users were asked a single question after the test had concluded: 

Question we asked:

  • How likely are you to recommend this site to a friend or colleague? (0 = Not at all likely, 10 = Very Likely)

From the answers to this question, we were able to compute the Net Promoter Score (NPS) for each site. NPS is a metric and a process that helps evaluate customer loyalty. It uses a single question, “How likely is it that you would recommend Company X to a friend or colleague?”, as a gauge of overall customer satisfaction and a predictor of business growth.

We calculated the NPS of each brand from the ratings on our recommendation question, following UserTesting.com’s methodology. With 3 users per site, each promoter was assigned +33, each detractor -33, and each passive 0. Added together, these values produce the Net Promoter Score for the site.

Respondents rated the question on a scale from 0 to 10, with 10 being extremely likely to recommend and 0 being not at all likely to recommend. Each respondent’s score was then placed into one of three categories:

  • A user who scored the brand 9 or 10 was considered a Promoter, someone who is loyal to the brand and will actively recommend it to friends and colleagues.
  • A user who scored the brand 7 or 8 was considered Passive. They like the brand and will recommend it, but usually with caveats and some hesitation.
  • A user who scored the brand 6 or below was considered a Detractor, someone who is likely to spread bad word of mouth about the brand and stop using it.

We then calculated the NPS of a site or brand by subtracting the detractors’ total from the promoters’ total. The score runs from +100 (entirely promoters) to -100 (entirely detractors).
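
To make the arithmetic concrete, the short sketch below computes one site’s NPS from its three 0-10 recommendation ratings using the +33/0/-33 weighting described above. The function name and the sample ratings are our own illustration, not the study’s actual analysis code.

  # Minimal sketch: compute one site's NPS from its three 0-10 recommendation
  # ratings, using the +33 (promoter) / 0 (passive) / -33 (detractor) weighting.
  def nps_for_site(ratings):
      def weight(rating):
          if rating >= 9:   # Promoter: rated 9 or 10
              return 33
          if rating >= 7:   # Passive: rated 7 or 8
              return 0
          return -33        # Detractor: rated 6 or below
      return sum(weight(r) for r in ratings)

  # Hypothetical examples:
  print(nps_for_site([10, 8, 4]))  # one promoter, one passive, one detractor -> 0
  print(nps_for_site([9, 9, 10]))  # all promoters -> 99, i.e. effectively +100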

Watch someone describe Chartbeat after the study
https://www.usertesting.com/videos/vrTSFeesBqrwzIx-xHT0rA/clips/735777

Data Analysis

The data from this study came in two forms: qualitative and quantitative.

Qualitative data

As users were asked to think out loud continually throughout the test, we had subjective reports from each user about the sites. We listened to every recording and also generated transcripts for each user test. Using timing data from UserTesting, we could cut these transcripts down by task to determine exactly what users said at every stage of the test.

From this data we could determine what users thought about each section of the site, and we could also generate word clouds based on the task and the specific section of the SaaS site.
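
As a rough illustration of this step, the sketch below builds a word cloud from the transcript excerpts for a single task. It assumes the transcripts have already been cut down by task, and it uses the open-source wordcloud Python package; the package, function name, and sample snippets are our choices for illustration rather than the study’s actual tooling.

  # Illustrative sketch: build a word cloud from what participants said during
  # a single task, using the open-source "wordcloud" package.
  from wordcloud import WordCloud, STOPWORDS

  def task_wordcloud(transcript_snippets, outfile="homepage_task.png"):
      # transcript_snippets: one string per participant, already cut down to the task
      text = " ".join(transcript_snippets)
      cloud = WordCloud(width=800, height=400, stopwords=STOPWORDS,
                        background_color="white").generate(text)
      cloud.to_file(outfile)

  # Hypothetical snippets from the homepage task:
  task_wordcloud([
      "The homepage tells me this is an A/B testing platform for marketers...",
      "I can see the pricing link right away and the call to action is clear...",
  ])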

Quantitative Data

After certain tasks we asked participants to rate the ease of use of the site or the usefulness of the content. These questions were useful in understanding where the pain points were after a core task was completed. Rating scale questions were also helpful when trying to understand how useful and digestible the content on the site was. Overall, rating scale questions provide a way to quantitatively measure a task and compare tasks easily.

  • Overall this task was: (1 = Very Difficult, 5 = Very Easy) Please explain your answer.
  • How difficult or easy was it for you to understand the company’s various offerings? Please explain (1-Very difficult, 5-Very easy)
  • How confident are you that your description of what this company does is accurate? Please explain. (1-Not at all confident, 5-Extremely confident)

We used this data to rank the sites in each of our 4 categories. We also looked at whether these rankings correlated with certain features—videos, CTAs, feature matrices—that the sites had. However, none of the correlations were statistically significant, so we excluded these findings from the results.
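
For readers who want to reproduce this kind of aggregation, the sketch below shows one way the rating-scale data could be averaged into per-site scores, ranked within a category, and checked against a binary site feature. The file names, column names, and the choice of a point-biserial correlation are illustrative assumptions; the study does not specify its exact analysis method.

  # Illustrative sketch: average the 1-5 ratings per site within each category,
  # rank the sites, and test one binary feature against the homepage scores.
  # File names, column names, and the point-biserial test are assumptions.
  import pandas as pd
  from scipy.stats import pointbiserialr

  ratings = pd.read_csv("ratings.csv")          # hypothetical: site, category, rating
  features = pd.read_csv("site_features.csv")   # hypothetical: site, has_video, ...

  # Mean rating per site within each category, then rank sites in each category.
  site_scores = (ratings.groupby(["category", "site"])["rating"]
                        .mean()
                        .reset_index())
  site_scores["rank"] = site_scores.groupby("category")["rating"].rank(ascending=False)

  # Does having a homepage video correlate with homepage scores?
  homepage = site_scores[site_scores["category"] == "homepage"].merge(features, on="site")
  r, p = pointbiserialr(homepage["has_video"], homepage["rating"])
  print(f"point-biserial r = {r:.2f}, p = {p:.3f}")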

UserTesting Methodology

This study was conducted on the UserTesting platform, with help from Asha Vergis (UX Researcher) and Marieke McCloskey (Director of Research) on their Research Team.

UserTesting is a platform that allows you to quickly conduct remote, unmoderated UX research with participants from their panel. Remote unmoderated research means that participants attempt predetermined activities using a design or interface you provide. Each participant decides when and where to complete the study and records the session while thinking out loud.

By partnering with UserTesting we were able to complete 90 sessions in 1 day, because there was no need for a moderator to be present during each session.

Run this study yourself

Run this study yourself to see how your SaaS company website stacks up. Do your customers know what you offer just by glancing at the homepage? What information can you provide on your pricing plan pages to help customers pick the plan that best suits them?

To take the research a step further for your company, there are a few things you should consider:

  • Sample size: 8-10 participants
  • More specific demographics:
      • Who would find your service useful?
      • What position are they in, and what type of field do they work in?
  • Recruit participants who are looking for the product and service you offer

Existing UserTesting clients can launch this study now.
