Consumer Ideas: Smoke Tests

Participants

Alex Boase - Product Manager

Vincent Plummer - UX Designer

Ben Messner - Product Analyst

Overview

To determine the next product that the Return Path Consumer Team should build, we carried out an extensive validation plan to de-risk our ideas.

We began with a user-centered approach incorporating brainstorming sessions, smoke tests, user interviews, and comprehensive analysis of all our data points.

Methodology

We started by brainstorming dozens of ideas as a full team. After voting on our favorites, we plotted each idea's complexity against its potential addressable market, as indicated by competing products.

We then built smoke tests for the top 10 ideas, creating a value proposition for each, developing a brand and smoke test website, and marketing them as if they were real products.

Our final step was conducting user interviews with the top performers from our smoke tests.

Instapage:

Instapage is the platform we used to build and host the smoke tests. It allowed us to maximize conversion rates and deliver professional landing pages without developer time. The WYSIWYG editor was intuitive and easy to use, and there were many pre-made templates to choose from. We were able to track metrics easily and capture emails of users who might be willing to speak more with us in the future.

Our goal with these pages was to simulate a desktop installation of a Chrome extension. The test was designed to capture three separate moments of user intent: 1) Landing page CTA, 2) Chrome extension installation, 3) Form submission.
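The three moments of intent form a simple funnel, so stage-to-stage drop-off can be computed directly. A minimal sketch, with purely hypothetical counts (the stage names come from the test design above; the numbers are illustrative, not our actual results):

```python
# Hypothetical funnel counts for one smoke test; the numbers are
# made up for illustration, only the stage names come from the test.
funnel = [
    ("Landing Page CTA", 1000),
    ("Chrome Extension Installation", 220),
    ("Form Submission", 90),
]

def conversion_rates(stages):
    """Return each stage's conversion rate relative to the previous stage."""
    rates = []
    for (_, prev), (name, count) in zip(stages, stages[1:]):
        rates.append((name, count / prev))
    return rates

for name, rate in conversion_rates(funnel):
    print(f"{name}: {rate:.0%}")
```

Comparing rates between adjacent stages, rather than only against the first stage, shows where each smoke test loses the most users.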

Google Analytics:

A unique GA property was created for each principal landing page and integrated with Instapage via their Google Analytics plugin. This allowed us to cross-reference location, demographics, interests, gender, and many other forms of data from the users who visited the landing pages.

Hotjar:

Hotjar lets the team see how visitors actually interact with a web or mobile page. By inserting a small snippet of code into the <head> of the HTML, we could observe how users reacted to each landing page. Hotjar offers heatmaps, recordings, conversion funnels, form analysis, feedback polls, surveys, and user testing recruiting.

 

Advertising the Smoke Tests

We put $7k into marketing all ten of the smoke tests over a two-week period. Half went to social ads on Facebook, and half went to Google AdWords. Ten campaigns were created on each platform, with multiple sets of creative under each campaign.

 
pivot.png

Adjustments & Analysis

Round 1: Quick Targeting Pivot

Worldwide

15-65+ English Speakers

All of our ads ran in English, yet initially the majority of our clicks came from non-English-speaking countries. Since most of those clicks were coming from Android phones via Google AdWords and Facebook interstitial ads, and because we were already suspicious of click-farm activity from those countries, we blocked the top offenders from our advertising.

 
instapage-results.png

Round 2: More Refined Targeting

N. & S. America, UK, Europe, Australia

15-65+ English Speakers

After blocking the countries from which we suspected false traffic, we targeted countries where we had previous marketing success.

To get a broader perspective, we looked at our results from several angles: data only from before our targeting change, data only from after that change, and data over the entire course of the advertising test.

Our quantitative results all showed that Password Watchdog won enough of our categories to justify building that application, but not by a significant margin. We needed to confirm with qualitative data.

 

Round 3: Interpreting the Qualitative Data

Password Watchdog vs. Email to PDF

Since we had a form field on our smoke tests asking why users had searched for and tried to install our application, we were able to separate legitimate interest from click-farm traffic.

We quantified the answers given by tallying up similar results, throwing out the irrelevant ones, and looking at which results matched with the product's value propositions that we advertised.
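The tallying step above can be sketched as a simple keyword match against a product's advertised value propositions. Everything here is hypothetical (the responses, the keyword set, and the one-word noise filter are illustrative assumptions, not our actual coding scheme):

```python
from collections import Counter

# Hypothetical form responses and value-prop keywords for one smoke test.
responses = [
    "I want to save my emails as PDFs",
    "need offline access to email",
    "asdf",                      # irrelevant / click-farm noise
    "archive old work emails",
]
VALUE_PROP_KEYWORDS = {"pdf", "offline", "archive", "save"}

def tally(responses, keywords):
    """Bucket each response as a value-prop match, other interest, or noise."""
    counts = Counter()
    for text in responses:
        words = set(text.lower().split())
        if len(words) < 2:                  # throw out one-word noise
            counts["irrelevant"] += 1
        elif words & keywords:              # overlaps the advertised value props
            counts["matches value prop"] += 1
        else:
            counts["other interest"] += 1
    return counts

print(tally(responses, VALUE_PROP_KEYWORDS))
```

In practice the grouping was done by hand; a rough automated pass like this is only useful for a first cut before human review.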

We had hoped that this analysis would show us that one of the smoke tests had more illegitimate responses than the other, but the results were dead even at 64%. More qualitative research was needed.

However, we did notice that among the responses that directly matched our value propositions, Email to PDF had outperformed Password Watchdog by 7%.

 

Round 4: Additional Ad Campaign

N. & S. America, UK, Europe, Australia

15-65+ English Speakers

Password Watchdog vs. Email to PDF

Because we had conflicting results pointing us in different directions at this point, we decided to run one more ad campaign on our smoke tests. Previously, we had separated each Smoke Test ad campaign, but this campaign pitted Email to PDF against Password Watchdog to compete for budget.

Email to PDF produced a higher click-through rate, double the impressions, a lower cost per click, and more conversions, making it the definitive winner of this test.
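The head-to-head comparison reduces to a few derived metrics. A minimal sketch in which only the formulas reflect the comparison above; the campaign numbers are hypothetical:

```python
# Hypothetical campaign totals; the numbers are illustrative, only the
# metric formulas (CTR, CPC) reflect the comparison described above.
campaigns = {
    "Email to PDF":      {"impressions": 40000, "clicks": 900, "spend": 350.0, "conversions": 120},
    "Password Watchdog": {"impressions": 20000, "clicks": 380, "spend": 350.0, "conversions": 70},
}

def derive(c):
    """Compute derived ad metrics from raw campaign totals."""
    return {
        "ctr": c["clicks"] / c["impressions"],   # click-through rate
        "cpc": c["spend"] / c["clicks"],         # cost per click
        "conversions": c["conversions"],
    }

for name, c in campaigns.items():
    m = derive(c)
    print(f"{name}: CTR {m['ctr']:.2%}, CPC ${m['cpc']:.2f}, {m['conversions']} conversions")
```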

 

User Interviews

Recruiting with Streak & Instapage Responses:

To gather more definitive qualitative data, we turned to our Instapage results to recruit beta testers. We used a CRM tool called Streak to set up user interviews with Instapage leads who had provided an email address via the smoke tests. Each lead received two emails asking if they would be willing to set up a time to discuss their interest in the product, but in the end we did not net any additional interviews through this process.

Lesson learned: quick follow-up is key to getting potential respondents' feedback, as is sending from a legitimate company email address rather than a Gmail address.

Recruiting with Respondent.io

This test included five participants sourced via Respondent.io, each interviewed to test comprehension of the landing page, gauge interest in the product, and determine the pain point that causes someone to seek a solution like ours in the first place.

Video Interviews & Editing / Affinity Mapping

Thirteen 30-45 minute interviews were conducted for Password Watchdog and Email to PDF. Due to the success of Email to PDF, the longer interviews were edited down into 3-7 minute highlight clips to be more digestible for the team and stakeholders. The user statements were then grouped with virtual sticky notes in RealtimeBoard, and we created an affinity map to identify common pain points and similar value propositions.

real-time-board.jpg
participants - usertesting
 

Conclusion

Password Watchdog Interview Results

The goal was to:

  • Validate the top value prop that would benefit the user (the "why" rather than a "feature")

  • Identify the top pain points that caused users to seek out this solution

  • Assess what other solutions respondents are already using.

Results

The average Delight score was 6.5 out of 10.

We found that:

  • The value prop for the current version of Password Watchdog does not fully address the respondents' pain points of relieving anxiety and securing their accounts.

  • While respondents appreciated the additional notification, some expressed desire for it to actually secure their passwords.

  • Respondents were disappointed that it was only a notification of a PW change rather than something that actually protected them.

Why We Decided Against Password Watchdog

  • Lower Delight Score

  • Misalignment of User Needs with Value Props. Summary: "Respondent wants security or hack prevention rather than a notification."

  • Lower alignment among respondent answers compared to Email to PDF

 

Email to PDF Interview Results

The goal of the test was to identify the following:

  • The main anxiety around wanting to get their content out of email itself.

  • What is the role of email in this? Primary? Secondary?

  • Why downloading? What methods are an acceptable means of delivery?

  • Is it more important to be able to do this with one email or a series of emails?

Results

The average Delight score was 9 out of 10. The majority of respondents actually indicated a willingness to pay for this product.

We found respondents indicated strong desire for the following features:

  • Ability to migrate emails from accounts they believe may not stay active (e.g., Hotmail), or from a work or school account

  • Ability to download emails to desktop so they could access them offline

  • Ability to preserve attachments and key documents by saving them off the server

Why We Decided In Favor of Email to PDF

  • Higher Delight Score. Respondents often said "9 or 10," which is extremely high.

  • Full Comprehension of Value Props

  • Strong alignment among respondent answers, which demonstrated a consistent connection to the value props

E2PDF Areas of Focus

Along the way, we discovered two pain points that will help inform our positioning and marketing of the product.

Offline Access (specific emails or series of emails)

  • A desire to access emails to read while traveling, working on an airplane, or in areas with poor internet access like a beach.

  • A desire to download and organize important personal or business related documents like tickets, travel itinerary, reservations, etc.

Migration / Download (large bulk)

  • When a user leaves their school and is forced to pay for their email service.

  • When a person leaves their work and wants to save correspondence or important contacts.

  • Migration is complicated on free platforms with years' worth of emails.

  • Nostalgia-related correspondence and attachments, e.g., emails with deceased relatives or loved ones.

  • Personal attachments like photos or business related documents.

  • Personal posterity

  • Press releases, marketing material, official transcripts, and more

 

Lessons Learned

After extensively testing a high volume of potential product ideas against each other, and performing a deep-dive analysis on the quantitative data (from advertising numbers) and qualitative data (from user interviews), the Consumer team is confident in our decision to build Email to PDF as our next product.

Lessons learned:

  • Be mindful of priming users with use cases so you don't influence results

  • Treat all tests equally so you're comparing apples to apples

    • Don't change methods midway through a test if you've found a better solution to a problem

    • A clever product name can fool users

  • Speed and perception of legitimacy are crucial when recruiting users to interview

  • Obtain both qualitative and quantitative data to validate narratives

Not only did the product validation process prevent us from choosing a high-risk product that might not deliver results, but it also revealed a winning concept that none of us expected to have as much traction as it did. After all our research and validation, we believe Email to PDF will surpass tens of thousands of users within its first year.