Week 2 of my minidegree scholarship was all about research.
In this article, I’m going to summarize everything I learned during the week.
Let’s start with the XL research model.
1. The Research XL Model
Step 1: Heuristic Analysis
It’s an experience-based review of your website. You analyze each page against a consistent set of criteria: clarity, relevance, value, and friction (covered in detail in the Heuristic Analysis section below).
Step 2: Technical Analysis
It consists of cross-browser and cross-device testing, speed analysis, and bug fixing.
Step 3: Digital Analysis
- Identifying drop-off points.
- Correlating behaviours with outcomes.
- Measuring everything that needs to be measured to verify the data.
Step 4: Qualitative Testing (Talking directly to your customers)
You can do it with surveys.
Step 5: User Testing
Here, you can ask anyone to use your website, giving them either a broad task or a specific one to complete.
Step 6: Mouse Tracking Analysis
You review heatmaps of your website to understand where people are clicking.
What to do with the data?
Now that you have done your research, rank each problem on a 1–5 scale, where 5 is a severe problem and 1 is a minor issue.
Here are the two most important criteria when giving a score:
1. Ease of implementation — how easy it is to make the change.
2. Opportunity score — how many people you’re going to affect by making a change.
Key insight: you need data that you can act upon.
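The prioritization step above can be folded into a simple scoring pass. A minimal sketch in JavaScript, where the issue names and scores are hypothetical, and severity, ease, and opportunity are each on the 1–5 scale described above:

```javascript
// Hypothetical research findings, each scored 1-5 on the three dimensions.
const issues = [
  { name: "Unclear pricing page",    severity: 5, ease: 2, opportunity: 5 },
  { name: "Broken footer link",      severity: 1, ease: 5, opportunity: 1 },
  { name: "Slow checkout on mobile", severity: 4, ease: 3, opportunity: 4 },
];

// Simple additive model: weigh severity, ease of implementation, and
// opportunity equally. Higher total means fix sooner.
function priorityScore(issue) {
  return issue.severity + issue.ease + issue.opportunity;
}

// Sort descending by score to get a rough priority order.
const ranked = [...issues].sort((a, b) => priorityScore(b) - priorityScore(a));
ranked.forEach(i => console.log(`${priorityScore(i)}  ${i.name}`));
```

A weighted model (say, doubling the opportunity score) works the same way; the point is to turn research findings into an ordered backlog.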
1.1 How to Measure the Effectiveness of Your Testing Program
Use these three metrics:
- The number of tests that you have done.
- The percentage of tests that give you a win.
- The impact of a successful experiment.
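All three metrics fall straight out of an experiment log. A minimal sketch with made-up experiment names and uplift numbers:

```javascript
// Hypothetical experiment log; names and uplift percentages are made up.
const experiments = [
  { name: "Headline test", win: true,  upliftPct: 8.5 },
  { name: "CTA colour",    win: false, upliftPct: 0 },
  { name: "Checkout copy", win: true,  upliftPct: 3.2 },
  { name: "Hero image",    win: false, upliftPct: 0 },
];

const testsRun = experiments.length;             // metric 1: number of tests
const wins = experiments.filter(e => e.win);
const winRate = wins.length / testsRun;          // metric 2: share of winning tests
const avgImpact =                                // metric 3: mean uplift of the winners
  wins.reduce((sum, e) => sum + e.upliftPct, 0) / wins.length;

console.log({ testsRun, winRate, avgImpact });
```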
1.2 Ask Yourself These Questions When Doing Site Walkthroughs:
- Does the site work with every major browser?
- Does the site work with every device?
- What’s the user experience like with every device?
2. Heuristic Analysis
It helps you find the problem areas of your website, so you can later see if the data validates or disproves your findings.
You should always be aware of your biases, such as:
- Confirmation bias — where people favour information that confirms their beliefs.
- Bias blind spot — where you see yourself as less biased than other people.
Key insight: Whatever type of analysis you do, you should always follow a structured approach.
When evaluating a site, you should:
- Assess each page for clarity.
- See if each page is relevant to your users.
- Understand if people see clearly what value they are getting for their money.
- Find points of friction.
- Pay attention to any distractions on your page.
3. Usability Evaluation
Jakob Nielsen, a usability expert, defines usability by five quality components:
- Learnability — how easy it is for users to accomplish a basic task the first time they land on your website.
- Efficiency — how quickly the users can perform different tasks.
- Memorability — how easily the users can reestablish proficiency with your site.
- Errors — the number of mistakes each user makes.
- Satisfaction — how pleasant your design is.
4. Surveys
Here is why you should do surveys:
- Reduce your costs.
- Understand your brand.
- Identify your competitors.
How to design a survey?
1. Use closed-ended questions.
2. Use questions that address the desired information.
3. Use well-written questions that don’t mislead your users in any direction.
Here are some of the most common mistakes in designing surveys:
- Using scales that are not intuitive.
- Mixing questions of behaviour with questions of attitude.
- Asking questions that are not relevant.
- Creating surveys that are too long.
Here is what you should remember when you’re conducting in-house surveys:
- Existing customers are always biased towards your company.
- You should keep your surveys short.
- Use a sample size of at least 100 people.
4.1 Web & Exit Surveys
Here is when you should trigger polls on your website:
- When people spend more than 10 seconds on a page.
- When there is an above-average engagement.
- When people express exit behaviour.
You can ask questions like:
- What’s holding you back?
- Why are you not interested?
Key Insight: These polls have only one goal in mind — to understand where the friction is.
You should always run polls one page at a time, starting with a yes/no question.
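The three trigger rules above can be expressed as one pure function. A minimal sketch, where the field names and the idea of an "engagement score" threshold are assumptions; only the 10-second rule comes straight from the list above:

```javascript
// Decide whether to show a poll, given a snapshot of visitor behaviour.
function shouldTriggerPoll({ secondsOnPage, engagementScore, avgEngagement, exitIntent }) {
  if (exitIntent) return true;             // visitor shows exit behaviour
  if (secondsOnPage > 10) return true;     // spent more than 10 seconds on the page
  return engagementScore > avgEngagement;  // above-average engagement
}

// In a browser, you would feed this from real signals, e.g. a timer for
// time on page and a mouseleave listener near the top of the viewport
// for exit intent.
console.log(shouldTriggerPoll({ secondsOnPage: 12, engagementScore: 0, avgEngagement: 1, exitIntent: false })); // true
```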
5. Live Chat Transcripts
If you use live chat to answer pre-sale questions, you should read transcripts from the last month to understand more about your customers.
6. User Testing
The main benefit of user testing is identifying bottlenecks. Here, you observe how real users interact with your website. Once you have the results, you should change your website design and copy to remove any barriers that stop people from accomplishing their on-site goals.
User testing is different from A/B testing in the following ways:
- A/B testing is done with people who don’t know that they are a part of a test.
- User testing revolves around testing people who are given specific tasks on the website.
Whenever you want to start optimizing your website, you should conduct user tests. Test people who are part of your actual target audience. The minimum sample size is five.
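The "five users" figure comes from Nielsen's problem-discovery model: n testers find roughly 1 − (1 − L)^n of a site's usability problems, with L ≈ 0.31 as the average per-user discovery rate reported by Nielsen and Landauer. A quick check:

```javascript
// Nielsen's problem-discovery model: share of usability problems found by n testers.
const L = 0.31; // average per-user discovery rate (Nielsen & Landauer)
const problemsFound = n => 1 - Math.pow(1 - L, n);

console.log(problemsFound(5).toFixed(2)); // ~0.84, i.e. five users surface most problems
```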
Here are the three ways you can run usability tests:
1) Over-the-shoulder testing
The best way to run usability tests is to conduct them in person. All you need is an empty room, a working computer, and a notepad to write down your observations. That may seem like a lot in a COVID-19 world, but it remains the best way to get valuable insights from users.
2) Unmoderated remote testing
That’s a form of testing done remotely, using dedicated user-testing tools.
3) Moderated remote testing
It’s a form of remote testing where you can clarify the tasks for your users as they go. Here, you aim to learn whether users understand the goal of your website and business.
7. Google Analytics Health Check
Google Analytics is perhaps the most important tool for any business that wants to understand its data.
Here are some of the questions you should ask yourself when you’re running a health check:
- Does it collect what we need?
- Can we trust this data?
- Where are the holes?
- Is there anything that can be fixed?
- Is anything broken?
- What reports should be avoided?
Here’s a complete checklist you can use when running a Google Analytics health check:
1) AdWords setup — ensure this is correctly configured and that PPC data shows in GA.
2) Filtering — filter out office IP, agencies, other 3rd parties, yourself.
3) Default URL set up correctly.
4) Enhanced link attribution turned on.
5) Webmaster Tools linked (vital for SEO and PPC).
6) Demographics and interest reports enabled (DoubleClick .js code).
7) Custom definitions set up if applicable (and using Universal Analytics).
8) Are the views set up optimally?
9) Do you have PLAY (testing) and RAW profiles for all setups?
10) Are you doing country filtering or lumping together?
11) Is there a mismatch between views and intended use?
12) Default page and timezone configured.
13) E-commerce tracking turned on.
14) Site search tracking turned on.
15) Goal configurations — checked thoroughly.
16) Are filters correct for this profile?
17) Basic advanced segments set up and shared.
18) Cross domain tracking — incorrectly configured.
19) More than one outcome using the same page — common!
20) Missing tracking code — on some pages — check.
21) Double tracking of page or event — causing 0% bounce rate.
22) Events not being set as interactive/non-interactive, skewing bounce rate horribly (compare with flow reports.)
23) Meta refresh or redirection, causing params to be lost.
24) Campaign tracking wrongly set up.
25) Missing tracking in emails, newsletters, RSS, social.
26) GA tracking code not all on one version.
27) Funnel or goal steps too broad a match.
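As an illustration of checklist item 22, here is how a non-interaction event looks in gtag.js (Universal Analytics). The dataLayer/gtag stub only exists so the snippet runs outside a browser, and the event names are illustrative; `non_interaction` is the real Universal Analytics parameter:

```javascript
// Stand-in for the gtag bootstrap, so the call below runs outside a browser.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Mark a scroll-depth event as non-interaction so it does not count as
// engagement and reset the page's bounce rate.
gtag('event', 'scroll_depth', {
  event_category: 'engagement',
  event_label: '50%',
  non_interaction: true, // without this flag, the event counts as an interaction
});

console.log(dataLayer.length); // 1
```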
Thank you for taking the time to read this post.