Measuring the effectiveness of display ads

From what I’ve heard, only about 18 percent of web users ever click on a display ad. But studies have clearly shown that display ads affect people’s behavior. Sometimes people see the ad and enter the URL directly. Sometimes it leads them to search on a brand-related term.

For example, a web user might see an ad for a Dell Latitude on the side of the page and then type “Dell Latitude” into his handy little Google search box at the top of his browser. At that point he might click on an organic link or a paid link.

What advertising campaign is going to get credit for any resultant conversion? If you’re measuring by clicks, it won’t be the display ad.

Which leads us into “view-through conversions,” where the display network takes credit for some percentage of conversions based on the fact that the ad was displayed to the user some time before conversion. (The time window can vary.)

But is that fair? Maybe the person was going to convert anyway. Maybe he never even saw your ad.

Another option is re-marketing, where a visitor comes to your site, gets a cookie, then goes out into the world and sees ads designed to draw him back to your site. It’s effective, but how do you measure how effective it is? What are you measuring against what?

It seems to me that the logical way to do this is to use a combination of re-marketing and an A-B split.

IOW, some number of people come to your site and get a cookie. That population is split into two groups. Group A goes off into the world and isn’t exposed to any of your ads. Group B sees your ads. Compare the behavior of Groups A and B. Any difference can be safely attributed to the ads on that network.
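The split described above could be sketched in a few lines of Python. This is a minimal illustration, assuming each visitor already carries a unique cookie ID; hashing that ID makes the assignment deterministic, so a returning visitor always lands in the same group without storing any extra state.

```python
import hashlib

def assign_group(cookie_id: str, holdout_pct: float = 0.5) -> str:
    """Deterministically assign a cookied visitor to a test group.

    Group A (the control) is never shown remarketing ads; Group B is.
    Hashing the cookie ID keeps the assignment stable across visits.
    """
    digest = hashlib.sha256(cookie_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "A" if bucket < holdout_pct else "B"
```

Because the assignment is a pure function of the cookie ID, every ad server in the network can compute it independently and still agree on who belongs to the control group.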

For some odd reason, display ad networks don’t seem to be able to do this, and I’m not sure why. It doesn’t sound technologically difficult, and it would convincingly prove the effect of the ads.

Actionable Analytics

I’m going to be doing a presentation on analytics in a few weeks at the SIPA Mid-Year Marketing Conference.

The presentation has ten exercises to encourage practical steps publishers can take from the data on website usage. Here’s the first one, which has to do with fine-tuning your web pages for your visitors.

Step 1

Find the top 100 search terms that bring traffic to your site and sort them by bounce rate, from lowest to highest.
 
The top terms should describe your site / coverage / expertise. “Yes, these are the people who should be coming to my site.”
 
The terms on the bottom should get progressively further away from your coverage. If you find some of your key terms towards the bottom, you’ve found a problem to solve.

Step 2
 
Group your search terms conceptually and see which group makes up the highest percentage of traffic.
 
Step 3
 
Find the landing page(s) associated with that group of keywords.

Step 4
 
Review those landing pages to see how they can be optimized for people searching on those terms.
 
For example, you might add links to related content on your site. If you have any related products, be sure to add links from the landing page to the product page(s). If you don’t have any related products, consider an affiliate relationship with someone who does.
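Steps 1 through 3 above can be sketched with a few lines of Python. The terms, traffic numbers, URLs, and the crude keyword-matching rule below are all made up for illustration; a real run would start from an export of your analytics data.

```python
# Hypothetical analytics export: (search term, visits, bounces, landing page).
rows = [
    ("green tea benefits", 500, 150, "/tea/green"),
    ("tea health", 300, 120, "/tea"),
    ("buy dishwasher parts", 40, 36, "/parts"),
    ("oolong tea benefits", 120, 30, "/tea/oolong"),
]

# Step 1: sort the terms by bounce rate, lowest first. Off-topic terms
# (like "buy dishwasher parts" on a tea site) should sink to the bottom.
by_bounce = sorted(rows, key=lambda r: r[2] / r[1])

# Step 2: group the terms conceptually (here, a crude substring match)
# and total each group's traffic.
traffic_by_group = {}
for term, visits, _, _ in rows:
    group = "tea" if "tea" in term else "other"
    traffic_by_group[group] = traffic_by_group.get(group, 0) + visits

# Step 3: find the landing pages associated with the biggest group.
top_group = max(traffic_by_group, key=traffic_by_group.get)
landing_pages = {page for term, _, _, page in rows
                 if ("tea" in term) == (top_group == "tea")}
```

In practice the conceptual grouping is the judgment call; the substring test here just stands in for however you'd cluster your own terms.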

Online registration form mistakes

I was just signing up for a conference, and the registration form had a few “usability” problems.

First, it asked me for my zip+4. I don’t know what mine is. I had to get a piece of mail addressed to me to find out.

Why would you do that? How many people actually know their zip+4?

Second, it asked for my fax number. Again, I don’t know it off the top of my head. I have to consult my business card.

Third, there was no indication of which fields were required and which were optional, so I didn’t know if I had to go look up my fax number.

Consumer Reports website disappointed me

I’m in the market for a dishwasher, so I thought I’d go to Consumer Reports and see which ones are the best.

I tried to sign up for their service and I got an error on the registration page. When you sign up you have to “create an account” with a username and password, and passwords on consumerreports.org have to be all lowercase.

This is a stupid mistake. Why limit your customers’ choices when they create a password?

I’ve developed a little algorithm I use to create passwords on websites. It helps me remember a secure password for each site. But sometimes it requires an uppercase letter.

So Consumer Reports lost my sale because of their silly password policy.

The lesson is simple — don’t restrict password options!

Then I wanted to send them a note telling them about this so they could fix it. But I couldn’t find a “contact us” page, or any way to send them an email.

That’s two mistakes.

They may know a lot about dishwashers, but their web team needs some help.

How to use search tools to increase website traffic

Google has a new tool called the “Wonder Wheel.” Yeah, it’s a silly name, but it’s a good tool, and when used in combination with Google Trends and a standard keyword tool it could be a very effective way to increase traffic to your web pages.

First, give it a try.

1. Google a term that’s relevant to your site.
2. Click on the “show options” link at the top of the search results on the left-hand side.
3. Under “standard view” click on “Wonder wheel.” This shows different subcategories of topics under your search term.
4. Drill down into narrower concepts by clicking on one of the subtopics.

Each time you drill down you’re finding a group of related terms and you’re getting a more granular perspective on how the term relates to search.

With the “Wonder Wheel” you can get an idea of the structure of your search terms — e.g., how they might fit into folders in a content network campaign. And you can also get ideas for other terms.

But you don’t know which term gets more traffic. That’s where Google Trends comes in.

So you use the Wonder Wheel to get a sense of how Google organizes all the terms related to your primary search term, and to get ideas for related search terms (you can also use a simple keyword tool for that), then you use Google Trends to find out which terms and phrases get the most traffic.

Then comes the hard part. You need to go through this exercise before you finalize your content.

Let’s say you’re writing a story on the health benefits of green tea and you’re wondering what headline will attract the most search traffic.

First you go to Google Trends to compare a few ideas. Try “tea health, tea healthy, tea benefits” and Trends will chart the relative search volume of each phrase.

Obviously “tea benefits” is the right way to go, so then you google “tea benefits” and try the “Wonder Wheel.” The first set of subcategories looks like this.

rooibos tea benefits
oolong tea benefits
white tea benefits
black tea benefits
green tea benefits
herbal tea benefits
ginger tea benefits
chamomile tea benefits

Obviously you want “green tea benefits,” so you choose that one and get the next set of drill-downs.

green tea metabolism
matcha green tea benefits
green tea skin benefits
lipton green tea benefits
green tea benefits weight loss
green tea diet

Does the story have more to do with general health benefits, the effect on skin or metabolism, or diet and weight loss? Let’s say it’s about weight loss. Now you know you want “green tea benefits” in the title of the story, but do you want “weight loss” or “diet” (assuming either would fit from an editorial perspective)?

Now consult Google Trends, compare “tea benefits diet, tea benefits weight loss” and see that weight loss is the clear winner.

I’m not saying that search results should drive all editorial decisions, but the simple truth of the matter is that if you want people to find your article, you need to use the words they’re looking for, not necessarily the words you like. The headline “Green tea is good for you” isn’t going to get the same traffic as “Green tea’s health benefits include added weight loss,” for the simple reason that the latter contains more of the terms people actually type into search engines. So use the search-friendly version as the title of your page and the headline for your story.

And if you don’t believe me, here’s an interesting article about how the Tribune Company used a method like the one I’ve just outlined to achieve a dramatic increase in search engine traffic.

Will Murdoch lead the way?

Rupert Murdoch is realizing that the advertising model — give away the content and sell ads on the page — simply won’t sustain most major media operations.

See Murdoch vows to charge for all online content

I believe two things about media companies.

  1. There are too many of them, and some will have to fail
  2. They are going to have to quit giving away their content

Media companies have to provide content that’s worth something, and then charge people for access. (If people won’t pay for it, then by definition it’s not worth anything.)

This article highlights a problem media companies face today. In this case, The Washington Post paid a reporter to do research and write a serious piece, which was then largely stolen by another site. To make matters worse, the other site is earning advertising revenue from that page — off the Post’s content!

Google could be the publisher’s white knight.

When Google indexes a page, it checks to see if the content on one page is like the content on another page (and, from what I hear, marks down sites that have too much duplication).

Google could display that fact in the search results by making it obvious that site B is parasitic on site A. Google could invent a rating system based on the amount of content that is … borrowed … and give the URL a “parasite rank.”
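As a rough sketch of how such a “parasite rank” might be computed, here's a toy Python version using word shingles. The function names and the shingle size are my own invention for illustration; this is not anything Google actually does.

```python
def shingles(text: str, k: int = 5) -> set:
    """All k-word shingles (overlapping word runs) of a page's text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def parasite_rank(page: str, source: str) -> float:
    """Fraction of `page`'s shingles that also appear in `source`.

    1.0 means the page is wholly lifted from the source;
    0.0 means no overlapping passages at all.
    """
    s_page, s_source = shingles(page), shingles(source)
    if not s_page:
        return 0.0
    return len(s_page & s_source) / len(s_page)
```

A whitelist of authorized syndication partners would then just be a lookup that suppresses the label before it's shown.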

Of course, in some cases the publisher might want its content on the other page. Many content providers syndicate their content to other sites. In those cases, the publisher and the content partner would want to suppress the “parasite” label. The publisher would simply register with Google as a content provider and list its authorized content partners, who wouldn’t get marked down for borrowing content.

This would reward the people who actually generate content and would penalize the parasites who feed off of it.

The next step would be for publishers to push advertising networks not to place their ads on sites with a high parasite rank.

Overwhelmed by white papers?

I am.

I read a lot of them, but not nearly as many as I’d like to.

There’s lots of good information out there, and I’m sure it would help me in my daily work, but I’m too busy with my daily work to read the things.

I’m sure this is true for lots of other professionals in other organizations, which leads me to this odd thought.

If I were a CEO, I think I’d find a young “idea guy” and hire him to do three things — and only three things.

  1. Read every white paper he can find that’s applicable to my business,
  2. Sit in on every meeting he can, and
  3. Keep a working “best practices” document on every functional area of the business.

I suspect that the increased efficiency would more than pay for his salary.

What’s a reasonable bounce rate?

The last few days I’ve done some reading on bounce rates and spent some time in Google Analytics getting a feel for what’s what, and it seems that the best rule for bounce rates is that you’re doing well if your bounce rate is lower than it was last month.

Some articles will give you some fixed guidelines. For example, How to Fix a Leaky Web Site says under 25% is good, but over 40% is too high.

I don’t buy that because your bounce rate will depend on a whole lot of things that will vary from site to site. A blog has a completely different site architecture than a store that sells dishwasher parts, and different sites get different sorts of traffic.

Here are a few things to look at to get a sense of what’s going on with your site.

Look at your traffic sources and compare the bounce rates for each. (E.g., direct traffic, referring sites, and paid search vs. natural search.) Look at your top landing and top exit pages.

Do you see any patterns? Look at the pages with a low bounce rate and see how they differ from the pages with a high bounce rate. Do they attract different sorts of visitors? Is there a clear call to action, or some obvious next step on one page and not on the other?

Remember that a high bounce rate may simply be a sign of poorly qualified traffic. Your site may rank highly for a word that has several different meanings. (E.g., “cobra” can be a snake or a kind of Mustang.)

I’ve set up some “advanced segments” in Google Analytics that let me track how many people stay for one page view, or two, or three or more. I then run a report on my top search terms and see what percentage of my traffic falls into those three groups. Good pages not only have a low bounce rate, but they have a high “time on site” and a high percentage of people in the “3 or more” category.
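The same one / two / three-or-more segmentation can be reproduced outside of Google Analytics if you have raw session data. This is a hedged sketch with made-up session records; the field names and numbers are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical session log: (session_id, search_term, pages_viewed).
sessions = [
    ("s1", "green tea benefits", 1),
    ("s2", "green tea benefits", 4),
    ("s3", "green tea benefits", 3),
    ("s4", "tea health", 1),
    ("s5", "tea health", 2),
]

def depth_bucket(pages: int) -> str:
    """Classify a session as a bounce, a two-pager, or a deep visit."""
    return "1" if pages == 1 else "2" if pages == 2 else "3+"

# Count sessions per bucket for each search term.
report = defaultdict(Counter)
for _, term, pages in sessions:
    report[term][depth_bucket(pages)] += 1

# Convert counts to each bucket's share of the term's traffic.
shares = {
    term: {bucket: n / sum(counts.values()) for bucket, n in counts.items()}
    for term, counts in report.items()
}
```

Terms whose “3+” share is high are the ones pulling in engaged visitors, which matches the point above about good pages.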

Time spent on site and website goals

I had an interesting chat with a friend of mine from SIPA.

Harry had read a statistic that people are spending less and less time on newspaper sites. (I don’t know if this is what he’d read, but along those lines see Average Time Spent on Top 30 Newspaper Web Sites Declines — More Than Half Fall)

Harry thinks that time on the site is more important than the number of visitors, and I think he’s probably right. We talked about it for a while, and then I came up with the following.

Every website has a purpose, and webmasters ought to have clear goals for their sites. Your average SIPA-member website will probably have goals something like this.

  1. Get a visitor in the first place
  2. Get the visitor to a second page (i.e., not bounce)
  3. Get the visitor to return
  4. Sign up / register in some way (e-mail newsletter, forum, whatever)
  5. Buy something (probably something small)
  6. Buy something else (up- or cross-sell)

Goals 3 through 6 should be cross-referenced to the time the user spends on the site. IOW, of the people who spent more than X minutes on the site, how many returned, signed up or bought something?
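That cross-reference could be computed as simply as this. The visit records below are invented for illustration; “completed_goal” stands in for whichever of goals 3 through 6 you're measuring.

```python
# Hypothetical visit records: (minutes_on_site, completed_goal).
visits = [
    (0.5, False), (1.0, False), (2.0, False), (3.0, True),
    (6.0, True), (8.0, False), (12.0, True), (15.0, True),
]

def goal_rate(visits, min_minutes: float) -> float:
    """Goal-completion rate among visitors who stayed at least min_minutes."""
    qualifying = [done for minutes, done in visits if minutes >= min_minutes]
    return sum(qualifying) / len(qualifying) if qualifying else 0.0
```

If `goal_rate(visits, 5)` comes out well above `goal_rate(visits, 0)`, the data supports the hunch that time on site predicts goal completion.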

If those stats bear out what Harry and I both expect (that people who spend more time on the site are more likely to complete site goals), then a good strategy would be …

  1. Find the pages that people spend the most time on,
  2. Make more of that type of page, and
  3. Make sure those pages are optimized towards your site goals.

What? Pay for a product?

According to Forbes, David Heinemeier Hansson has caused a ruckus by promoting the idea that online customers should actually pay for products.

Imagine that three shoe companies have come to you for some venture capital. They each have a different business model, as follows.

  1. Give away the shoes to get market share, then charge for them later when everybody is hooked on the brand.
  2. Give away shoes, but sell advertising space on the sides.
  3. Charge a fair market price for shoes.

Which company would you invest in?

Why is that so obvious with shoes, but not so obvious with online content and services? Probably because most everything on the Internet has been free for a long time.

We don’t fuss about paying for shoes because people have always had to pay for shoes. But it was a bit of a leap for people to start paying for water, or for the right to fish, because those things used to be free.

The internet culture of “free” is going to have to change (at least in some areas) because the simple truth is that ads don’t pay the bills. This is going to be a difficult transition and a lot of companies are going to go under in the process, but I don’t think there’s any alternative.