I think people writing blogs, or posting on Facebook or Reddit want to help. Their advice on optimising Google Ads comes from a genuine desire to share what's worked for them - as does mine. They don't set out to misinform.
But not all advice is good advice. In fact, following some will harm your campaigns.
I can think of four reasons why good intentions turn into bad advice...
Nobody starts an article on, say, choosing a bidding strategy with a detailed description of the account: how many campaigns, how many keywords, what happened in the past, the goal they're optimising for, and so on.
Instead, they launch into what worked for them.
But what worked in their account - their context - might be wrong for yours.
A tactic that increased conversion rates for a nationwide campaign with 500 000 impressions a month might break a local campaign that only gets 1 000 impressions a month.
A routine that works well in an agency might not be a good fit for an in-house PPC manager.
A campaign structure that supports 10 000 keywords might be too complex for a small account.
You get the picture. Advice is correct in the context it was learned in.
We humans tend to accept statements at face value if they’re made by people with authority. We don't question a doctor the way we'd question some average bloke.
Don't feel bad about this. Even highly educated scientists have been caught out by it.
In 1923, leading American zoologist Theophilus Painter declared that humans had 24 pairs of chromosomes. He had made a mistake, but his influence and standing were so great that his error persisted for the next 30 years. Scientists who counted differently assumed they were the ones who were mistaken and altered their conclusions. Even textbook photos showing 23 pairs of chromosomes were labelled as having 24.
You'll recognise an appeal to authority - and possibly bias - whenever advice leans on who said it rather than on the evidence behind it.
We humans tend to follow the majority, even if we think they’re wrong.
In the 1950s Solomon Asch documented this by experimenting on college students, in what are now known as the Asch conformity experiments.
Groups of eight participated in a simple task. Seven of each group were actors. Number eight was the unsuspecting lab rat, the subject of the experiment.
Here's how it worked.
The actors were introduced to the subject as fellow participants.
Each group was shown a card with a line on it. They were then shown another card with three lines, labeled A, B, and C. One of the lines on the second card was exactly the same length as the first line; of the other two, one was much longer and one much shorter. There was no way anyone could get it wrong.
Each person in the group had to say aloud which line matched the first - A, B or C. The group was seated so that the actors spoke first and the subject spoke last.
They repeated this exercise 18 times with each subject.
On the first two rounds the actors gave the correct answer.
On the third round, all seven actors gave the same wrong answer - and they did so again on 11 of the remaining 15 rounds.
The focus of the experiment was to see how many subjects would change their correct answer to match the group's wrong answer.
About three quarters of the subjects followed the crowd and gave the wrong answer at least once. Only a quarter stuck to their correct (but unpopular) answers in every round.
For every winner in an experiment or split test there must be a loser.
I bet you there are important lessons in articles that never get written.
I could write one right now, based on something that didn't work, called "I moved the best performing search terms into an alpha campaign and the cost per click doubled". But I won't. Nobody writes these articles.
Losers aren’t sexy and they don’t make good traffic-pulling headlines.
Instead we hear only about the winners. The small number of ideas, experiments and changes that lead to an improvement.
The missing losers are just as important, because optimisation is as much about doing less of the wrong things as it is about doing more of the right things.
We all want to apply best practice, but, the truth is that there is no all-embracing best practice.
What’s best practice for one account isn’t always best practice for another account.
There are some generally accepted good practices.
Nobody would argue with you if you said "always split test your ads" or "try to reduce your cost per acquisition". But even these generally accepted good practices have exceptions.
There would be little point spending lots of expensive hours writing 3 great ads for an ad group that only got 5 impressions a month. It’d take years to get enough data to find a CTR winner. It’d take decades to find a conversion rate winner. Those hours could be better spent elsewhere.
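The arithmetic behind that claim is easy to sketch. Assuming a standard two-proportion z-test (my choice of method, not one the article specifies) and a generous scenario where one ad doubles CTR from 5% to 10%, a rough sample-size calculation shows how hopeless 5 impressions a month is:

```python
from statistics import NormalDist

def impressions_per_arm(p1, p2, alpha=0.05, power=0.8):
    """Approximate impressions needed per ad to detect a CTR
    difference of p1 vs p2 with a two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_b = NormalDist().inv_cdf(power)           # power threshold
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

# Hypothetical best case: one ad doubles CTR from 5% to 10%
n = impressions_per_arm(0.05, 0.10)  # ~434 impressions per ad
months = 2 * n / 5                   # 5 impressions/month split between 2 ads
```

Even in this unrealistically clear-cut case the test needs roughly 870 impressions in total, which is around 14 years at 5 impressions a month. A smaller, more typical CTR difference pushes it into centuries.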
The same goes for reducing CPA. On the surface it makes sense, but what if reducing the CPA meant dropping the total number of leads by half? You might cut the ad spend in half, but half the leads might also mean half the sales. Nobody would thank you for that.
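A quick worked example, with made-up numbers, shows how a "better" CPA can still leave you worse off once you count what the leads are worth:

```python
def net_profit(leads, cpa, close_rate=0.20, profit_per_sale=400):
    """Monthly profit left after ad spend.
    close_rate and profit_per_sale are hypothetical illustration values."""
    return leads * close_rate * profit_per_sale - leads * cpa

before = net_profit(leads=100, cpa=50)  # $3,000/month at the higher CPA
after = net_profit(leads=50, cpa=40)    # $2,000/month at the "improved" CPA
```

The CPA dropped by 20%, but because the lead volume halved, monthly profit fell by a third. Optimising one metric in isolation hid the real cost.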
There are no secrets, no magic, no "One simple trick that gets 15% increase in CTR". Anyone who tells you otherwise is probably trying to sell something or doesn’t yet know enough about Google Ads.
But there is one universal principle: Do more of what works and less of what doesn’t.
Why did Google charge me $300 for a $20 click? Every now and again Google would charge an outrageous amount for one click - as much as 15 to 20 times the normal price. Here's how to prevent that.
Deleting fake leads leads to more fake leads. If the only thing you do with fake leads from Google Ads is delete the lead delivery email you risk getting swamped by fake leads. Here's why and what to do to improve lead quality.
Getting a lot of junk leads from Google Ads? You can spot conversion fraud because the leads have genuine email addresses and phone numbers but when you call them they don't remember filling in your form.
A love letter to statistics. This is for you if you’re interested in making data-driven decisions about your Google Ads. Amy Hebdon, founder of Paid Search Magic, wrote a fantastic article about PPC data interpretation that she described as a love letter disguised as a how-to.
Allocating budget. How do you know the best way to split your ad budget between search and display, between Facebook and Google, between one keyword and another?
Does Google rip advertisers off? I'm not saying that everyone who's paid for Google ads has turned a profit. That's definitely not the case. But when people lose money on Google Ads it's likely to be because Google Ads was a poor fit for their business, or because of the way they did Google Ads. I can't see Google risking their future to steal your advertising budget.
Don’t believe every recommendation from Google. The optimisation score is Google’s newest way of telling you how to run your Google Ads account. Read on to see how following it blindly can hurt your business.
Google showed your ads to the wrong people today. Google showed some of your adverts to the wrong people today. You (or someone you hire) must deal with this or your ads are going to start costing more and generating fewer leads. Here's how.
Has Google jumped the gun on mobile? How much commercial activity happens on mobile phones? Is it enough to justify a mobile-first approach to development? A case study of mobile vs desktop AdWords-generated leads across 7 industries.
How do you get more leads from Google Ads when you’ve maxed out impression share? Here's how we solved this problem...
How to get fast traction on a brand new Google Ads campaign. I worked out a process for getting fast results from new AdWords campaigns so I didn't have to worry as much. You're welcome to copy (and improve) my process.
I’m getting too many B2C leads. I want only B2B. I’ve bumped into this problem with almost every client who sells to other businesses (B2B) rather than to consumers (B2C). Here are some options...
I'm scared I'll break my campaign. If you don't feel confident making changes to Google Ads campaigns, this might help...
Improve conversion rates by adding extra calls to action. Need to improve your conversion rate? Here's a tested approach that takes only a few minutes.
Putting a little science into AdWords. In the past I relied on a combination of the change history in AdWords, some rough notes and my memory to keep track of my testing. It worked but it wasn't great. It became more difficult as my business grew. I started to feel like I was losing control. I couldn't answer questions about an account without having to root around in Google Docs, AdWords and email. Read on to see how I fixed this.
Should you advertise on Search Partners? We check the campaigns we manage regularly to see how much it costs to generate an enquiry on Google search versus search partners. Most of the time it costs more or less the same. Sometimes the difference is huge.
Should you show ads to people on their phones? It is a fact that more people use the internet on their phones than they do on computers these days. But are these glued-to-their-phone types going to buy from you?
The hidden trade-offs of automated bidding. I like automated bidding. But there are some trade-offs with automated bidding that are not well documented.
Using the Search Terms Report to find negative keywords. The Search Terms Report is a great place to find negative keywords to add to your AdWords campaign. It shows (some of) the actual words and phrases that triggered your ads.
You don't have enough data to be confident that you're making the right decisions. Here are 3 rules of thumb I use to help me make more of the right decisions when data is scarce.