Interview with

Peep Laja

Peep Laja is the founder of CXL, CXL Institute, and the soon-to-be-launched CopyTesting.com. Peep has a voice, and he’s not afraid to use it. His Twitter profile proudly proclaims, “I call your bullshit” – and indeed, he does.

So, for a brief time a couple of weeks ago, I put aside the fear of having my bullshit called out and asked Peep a handful of questions. He’s busier than I am, so the answers are brief and to the point. One might say, they’re optimized.

Quick Insight

Copy is the #1 driver of conversions on most sites. People will buy if their motivation is high, and you get that mainly with copy.

Let’s say you walked into a new project that had been going for a few years. No testing had ever been done and the site was converting at around 1%. It’s an ecommerce site. What’s the first thing that you do?

First thing in a CRO engagement – conduct a heuristic analysis (desktop and mobile separately), walk through the site page by page, and evaluate each page against a set of criteria like motivation, clarity, friction, etc.

What is the most common mistake that you see people make with regards to testing?

The two most common issues:

  1. Testing stupid shit that makes no difference, mainly because it’s easy to test that particular thing, or because zero conversion research was done and the test hypothesis is based on an “idea”.
  2. No proper A/B testing statistics education, so they end the test well before the needed sample size is reached, or use statistical significance as a stopping rule (don’t do it; see the sample-size sketch below).
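For context, here is a minimal sketch of the pre-test calculation Peep is alluding to: the standard normal-approximation sample-size formula for comparing two proportions. The baseline conversion rate, detectable lift, alpha, and power below are illustrative assumptions, not figures from the interview.

```python
# Minimal sketch: visitors needed per variant for a two-proportion A/B test,
# using the standard normal-approximation sample-size formula.
# All inputs are illustrative assumptions, not numbers from the interview.
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.80):
    """Visitors needed in EACH variant to detect `relative_lift` over
    `baseline` with a two-sided test at the given alpha and power."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_power = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                 + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Illustrative numbers: a 1% baseline and a 20% relative lift (1.0% -> 1.2%)
# need roughly 42,700 visitors per variant before the test can be called.
print(sample_size_per_variant(0.01, 0.20))
```

The takeaway matches Peep’s point: the required sample size is decided before the test starts, and reaching significance early is not a valid reason to stop.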

Do you view desktop and mobile as fundamentally different challenges for conversion optimization efforts?

For sure. There are real estate differences and use-case differences. You use your phone while in line at Starbucks or waiting at a red light, so the level of scrutiny is way different. You should always run separate tests on mobile and desktop (also because the traffic split between desktop and mobile is usually different).

I’m a huge fan of remote user testing, but it’s typically targeted towards UI and UX improvements. Can you give us some details on Copytesting.com, and what you’re trying to do there?

Copy is the #1 driver of conversions on most sites. People will buy if their motivation is high, and you get that mainly with copy. 

Yet before Copytesting, there were no tools that told you which parts of your copy suck, what’s wrong with them, or how to improve them. Web analytics can’t tell you anything about it, heat maps can’t either, and so on.

Copytesting gives you quant data on what’s good or bad (e.g. your headline is statistically significantly bad), and qualitative data on what seems to be the issue. 

Now you use this data to optimize your copy and make it better. You use it either *before* you run a copy A/B test, or when you don’t have enough volume to run A/B tests at all. The results are also very quick: within 2–4 hours.

You’re known for being very opinionated. I don’t think it’s so much that; I just think you speak very clearly and are able to articulate your viewpoints. Is that a deliberate thing? Does that come from your experience in conversion optimization?

Thank you! I try. I’ve always been a writer (a special literature-focused class in high school, etc.), and writing helps you articulate your thoughts better.

What is your desktop to mobile device usage ratio? I’m going to guess 60% desktop, 40% mobile.

I’m on desktop at work (8 hrs/day), and at home it’s maybe a 50/50 split.

Does that ratio differ when you’re shopping for something online? If so, why?

Shopping is 95% desktop if I need to browse, learn and decide what to buy. I buy on mobile from Amazon when it’s purely transactional – I already know what I want, and it’s just about hitting that “1-click purchase” button.

When browsing, what are your red flags? Are there any particular things that websites consistently do wrong that really irk you?

Stock photos, sliders, jargon, vague bullshit (a lot of B2B sites).
