15 Email Marketing Myths Debunked!


How well do you understand email marketing? You may be believing outdated or unsubstantiated “facts.”

Here’s a look at common misconceptions regarding email, according to the following Email Monks infographic.

All email recipients who opted in are loyal readers. False. “Some of them opted-in through proxy forms for incentives to surpass premium content gateways,” suggests Email Monks.

Moreover, the length of time on a list can increase the possibility of spam complaints. “At 31 months, 43% of a list were more likely to complain,” states Email Monks.

Higher email frequency always means more unsubscribes. Not true! “HubSpot conducted a study and found that if you’re emailing your list four or five times a month, it will actually decrease your unsubscribes,” Email Monks states.

To find out more misconceptions about email marketing, check out the infographic.

Veronica Maria Jarski is the Opinions editor and a senior writer at MarketingProfs.

Twitter: @Veronica_Jarski


An Epic Landing Page Makeover That Debunked 3 Landing Page “Best Practices”


If you’re in the online marketing space, your inbox probably looks like mine: an avalanche of emails with advice on A/B testing, conversion rate optimization, and lead generation, plus tips on how to engage your list with valuable content.

Everyone seems to have best practices figured out, so why reinvent the wheel? Tell me what works — I’ll try it. Show me what failed — I’ll avoid it. Follow those who have gone before… it makes sense, right?

It does make sense! Until it doesn’t.

In this post, you’ll see how three common conversion tips got totally debunked during our testing with my client NueMD, a medical billing software and EHR company. Does it mean these best practices never work? No. But I guarantee it will change the way you think about best practices — and it’ll help prevent you from making the same conversion mistakes on your own landing pages.

Below, check out the initial redesign:


The original NueMD landing page (left), and the re-designed and “conversion optimized” version I created (right).

Oh. My. Stars. #FAIL doesn’t begin to describe the best practices bellyflop we experienced. Do we expect to disprove hypotheses in testing? Of course. Do we expect to iterate? Totes!

Do we expect for best practices to do a TOTAL faceplant? No!

“Best practice” #1: Headlines Must Address Visitors’ Pain

I *know* it seems impossible to debunk this, but before you freak out in the comments, please read on…

Talk to your customers about the problems *they* wanna solve.

This piece of wisdom is a biggie in the realm of optimizing headlines for conversions — and for good reason. In many testing scenarios, it works. You’ve got to lead with your peeps and their problems (not your product!) if you want to connect with your audience.

The result of making this assumption was probably the most surprising piece of data from our testing.

Our original headline read, “Watch a Free Demo of NueMD Medical Billing Software:”


When I step into the mind of the visitor reading this headline, I hear:

“Why?”
“What do I get?”
“What does this demo show?”
“These people don’t understand my challenges at work.”
“Am I going to get a call from a sales person?”
“How much does this cost?”

This headline scares people off, right? It’s talking about the product before the people!

After a lot of research and collaboration with NueMD’s marketing director, I developed a hypothetical “script” of what visitors are thinking when searching “medical billing software”:

“Getting reimbursed by insurance companies is consuming a huge part of my practice’s administrative time. Dealing with the back and forth, denied claims, delayed accounts receivable puts such a strain on our productivity. All we want to do is take care of our patients. This should not be such a huge deal and we’ve got to find a way to simplify.”

From this dialogue, we decided to test an alternative headline that touched on real pain points found in our research (delayed accounts receivable, laborious tasks and software that is difficult to use).

The headline we tested encompassed each of these three elements, and appeared just above a testimonial that reinforced the headline statement:


So how did it perform against our original headline?


Well, this is embarrassing. Our A/B test revealed that our new headline lagged by almost 2%.

Conversion Freak-Out Note #1: If you’re freaking out about the test being called after 13 conversions, that means you: 1) have high-traffic landing pages that reach the baseline of 250 conversions in less than a year, 2) are a conversion expert who thinks low-traffic pages aren’t worth testing, or 3) work with clients that have a really large budget for PPC landing page testing. If you read through to the end of this post, you’ll see how we navigated the low-traffic situation and worked within my client’s PPC budget.

Conversion Freak-Out Note #2: This headline test ran for a full month. Would we have preferred to run until we got a total of 250 conversions or a full year of testing data? Of course. However, if you’re like a lot of PPC advertisers, you: 1) have a budget and 2) can’t force more people to search for your keywords. My client NueMD had a PPC budget to work within and felt confident we could test in a way that produced more leads for their sales department. (See “The Results” at the end of this article for final results.)
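To make these freak-out notes concrete, here’s a back-of-the-envelope sketch of how long a low-traffic page would take to hit 250 conversions per variant. All of the input numbers are illustrative assumptions (roughly 350 visitors a month split across two variants, converting at about 5%), not NueMD’s actual figures:

```python
# Back-of-the-envelope estimate of test duration on a low-traffic page.
# All inputs are illustrative assumptions, not NueMD's actual data.

def months_to_reach(target_conversions, visitors_per_month, conversion_rate):
    """Months needed for one variant to accumulate target_conversions,
    assuming traffic is split evenly across two variants."""
    conversions_per_month = (visitors_per_month / 2) * conversion_rate
    return target_conversions / conversions_per_month

# Assumed: ~350 visitors/month, ~5% conversion rate
print(round(months_to_reach(250, 350, 0.05)))  # about 29 months per variant
```

At well over two years per variant under these assumptions, this is the arithmetic behind deciding to call a test early rather than wait for a textbook sample size.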

Back to the headline. What happened?

We had a solid hypothesis, didn’t we? We addressed the visitor’s problems and that’s a good thing, right? We connected with their pains and dreams and that’s what people want, don’t they?

I think the answers are yes, yes and yes, but our testing disproved several assumptions:

  • We thought visitors needed more information about the benefits and value of the software. Our heatmaps showed they were ready to see a demo.
  • We assumed visitors needed to be “seen and heard” at this phase in the buyer journey, when they actually just wanted to see the product in action.

Testing showed this audience didn’t want to hear about problems within their medical practice. They are most likely in a hands-on role and already know the problems. They’re busy and don’t need to have their in-office problems re-articulated to them.

When they arrive on the landing page, they have a different problem than the actual problem they’re dealing with in their practice. They need the information a demo contains!

And they want a headline that says they’re going to get what they want, which is a demo!

We proved this hypothesis to be true, changing the headline on the new design to read, “Watch a Free Online Demo,” with sub-headline, “And see why 24,000 medical professionals choose NueMD.”


This design contains much the same information as the original, but consistently outperformed it.

Takeaways for Your Landing Pages

In this headline test, we learned the importance of defining the buyer journey and identifying where your prospect is in it. Before you write your next headline, try the following:

  • Test your PPC traffic to see what message resonates. In our case, we could have tested “Get a Free Demo” against “Automate and Get Paid Faster” or “Get Paid Faster.”
  • Develop a hypothetical “script” of what visitors are thinking when they are searching for your product/service. What phase of the buying process are they in? Are they searching for a solution to a burning problem or are they aware of the solution and looking at their options?

Understanding your prospect’s place in the buyer journey is your first step in writing a headline that resonates.

“Best practice” #2: Only High Value Content Gets Leads

You can’t get quality leads without providing high-value content. This is best practice #2. When people provide their email, they often know they’re being put on a list — so you better make it worth their while.

Based on this best-practices advice, we assumed that the “Register below to view a FREE demo of our software!” offer wasn’t high-value compared to the amount of information required to get it.

Landing Page Call to Action

The demo is barricaded behind a form that requires my first/last name, email AND phone number. Notice the emphasis on FREE in the form’s headline… This implies that some people pay for a demo, which some visitors may find odd, or at least confusing.

When I step into the visitor’s mind, I hear a lot of “exit page” dialogue:

“I don’t think so.” /exit page
“I don’t have time.” /exit page
“Just give me the demo.” /exit page
“I don’t want to talk to a salesperson.” /exit page

You get the idea. Logic tells us people don’t provide an email address for something that should be freely available.

Problem is… the data showed I was wrong: an average of 4–6% of visitors consistently fill out this form.

But why?

One (of many potential) hypotheses: All of their competitors have the same call to action. If you want a demo of medical billing software from *anyone* without providing a name and email, you’re up the creek:


All of NueMD’s Google AdWords competitors also used a gated demo.

Of the competitors bidding on “medical billing software,” all of them kept their demos behind a form. In the medical billing software field, requiring contact details for sales staff to follow up is pretty much “table stakes” across the industry.

So how does the competitive landscape change the conversation in our visitors’ minds?

For starters, it can color their expectations. Maybe their inner dialogue sounds more like this:

“Good grief, I can’t get a demo anywhere without entering my email. Whatever.” /fill form
“Oh they require the same as XYZ company. Okay.” /fill form
“I’ve got to see how these guys compare to XYZ company I’m talking to.” /fill form
“Whatever. I have to see this demo to get on with my research.” /fill form
“I’ve heard good things so I guess it’s fine.” /fill form

Takeaways for Your Landing Pages

In this test we learned:

  • You have to test various offers and CTAs on each audience to find out what really constitutes “high friction.”
  • Logic does not prevail. Even if something seems like a common-sense assumption, you have to test your logic/hypotheses.
  • It’s important to put yourself in your prospect’s shoes. Click on your competitors’ ads to find out how their messaging may be affecting the action visitors take on your landing page.

“Best practice” #3: Low-Traffic Pages Aren’t Worth Testing

I can’t NOT talk about this. Low-traffic pages, or pages that take a loooong time to reach 250 conversions or “statistical significance” are common. I’ve noted this already, but it’s a point worth exploring in more detail.

To say I’m going to “debunk” this best practice may be overstating, but the results of this test (summarized in “The Results” section) prove that you can still increase leads into your sales funnel using a low-traffic landing page.

So let’s look at the details. Our testing for these pages ran for a total of 5 months and allowed for full-week, Friday-to-Friday testing. Each week we reviewed results and either let the test keep running or discussed new things to test.

Via AdWords, this page gets around 300–400 unique visitors a month. There could be some AdWords click-through optimization to increase traffic to the pages, but we can’t force more people to search for “medical billing software.” There is a ceiling on the number of people looking for this product, which you may experience in your business as well.

So how do you A/B test pages that won’t reach “statistical significance” (or get 3,000–4,000 conversions, or heck even 100 conversions) until the year 2054?

Test High-Impact Changes

We can’t dramatically increase the number of visits to the page, so the way we handled this question with NueMD was by testing high-impact changes: variants that differ dramatically from the control. Examples from NueMD include:

  • Different messaging/headline: “Watch a Demo” vs. “Automate and Get Paid Faster”
  • Design: Big shift in the typography, layout and visual design

When you don’t have a ton of traffic to send to your A/B tests, big changes have more noticeable and measurable effects. Making dramatic changes per variant helps you get the data you need, even if you can’t send more traffic (or wait until 2054!).
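There’s a statistical reason big changes help: the sample size a test needs shrinks with the square of the effect you’re trying to detect. Here’s a minimal sketch using the standard two-proportion sample-size approximation at 95% confidence and 80% power; the baseline rates and lifts are assumptions for illustration, not NueMD’s numbers:

```python
def sample_size_per_variant(baseline, lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors per variant needed to detect an absolute
    `lift` over `baseline` (two-sided 95% confidence, 80% power)."""
    p_bar = baseline + lift / 2  # average of the two conversion rates
    return ((z_alpha + z_beta) ** 2) * 2 * p_bar * (1 - p_bar) / lift ** 2

subtle = sample_size_per_variant(0.05, 0.01)   # detect 5% -> 6%
drastic = sample_size_per_variant(0.05, 0.05)  # detect 5% -> 10%
print(round(subtle), round(drastic))  # roughly 8150 vs. 435 visitors each
```

On a page seeing a few hundred visitors a month, a subtle tweak would take years to read, while a drastic variant can produce a usable signal in a few months — which is exactly why we tested big swings.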

If a dramatic change seems promising, it may be worth keeping the test running as long as you can. But if a challenger is consistently underperforming, don’t be afraid to pull the plug altogether and go back to your control. It’s really a balancing act: working with your PPC budget, PPC messaging and conversions, landing page messaging and conversions, and doing your best to make wise decisions with the money you have to spend.

The Test Results

Here’s where the rubber meets the road. Whether you’re working on a page that gets gobs of traffic or not, at the end of the day, did you drive more leads into the sales funnel or not? Here are our pre-redesign stats:

Landing Page Stats Before

and our stats after:

Landing Page Stats After

Over a 5-month period, NueMD has consistently seen 1.5% more leads going to their inside sales team as a result of our testing and optimization. That’s an additional 31 leads to their inside sales team in 5 months. Low traffic or not, calling the test early or not, that’s a concrete result we can directly attribute to our optimization efforts.
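If you want to sanity-check a lift like this on your own pages, the standard tool is a two-proportion z-test. The visitor and conversion counts below are purely hypothetical (the post doesn’t publish raw counts), but the sketch shows the mechanics:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.
    Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: control 50/1000 (5.0%) vs. redesign 65/1000 (6.5%)
z, p = two_proportion_z(50, 1000, 65, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = 1.44, p = 0.150
```

Notice that even a 1.5-point lift on a thousand visitors per arm doesn’t clear p < 0.05 — which is exactly the low-traffic bind this section describes, and why we leaned on consistent directional results and budget judgment rather than waiting for certainty.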

Takeaways for Your Landing Page

To recap, you can test and get meaningful data from low-traffic landing pages. Statistical significance won’t play the same kind of role it does on consistently high traffic pages, but you can still gather insights and optimize for more leads.

When you’re testing low-traffic pages, try the following to keep your A/B testing moving along:

  • Make dramatic, high-impact changes for more noticeable and measurable effects.
  • If a challenger looks like it’s completely bombing, you don’t necessarily need to wait until statistical significance eliminates it. Pull the plug, go back to the original, and move on to your next test.

What to Do Next

The beautiful thing about A/B testing is that even failed tests can bring you invaluable, actionable insight.

The important thing is to start testing today.

There were so many awesome nuggets learned from this exercise that I didn’t even get to share in this post, but here’s a recap of the biggies:

  • Conversion “best practices” can be dead wrong. Test assumptions even if they go against common advice.
  • Do some testing around your buyer’s journey, and find out if you’re making correct assumptions about what they want at their stage in that journey. Effective headlines and CTAs are heavily dependent on location in the buyer journey.
  • Check out your competition — they may be doing things that influence your buyer’s clicking habits.
  • If you don’t have a lot of traffic, you can still optimize your pages by testing for high impact changes. And don’t be afraid to pull the plug on an underperforming challenger, even before statistical significance.

One final word: These insights are a great launching pad for your next test, but you should always test them yourself to be sure the same rings true for your particular niche or audience!

Over to you — have your A/B tests debunked any “best practices”?

The post An Epic Landing Page Makeover That Debunked 3 Landing Page “Best Practices” appeared first on The Daily Egg.

