What 54 Google Ads experiments taught me about lead gen


Over two years, my team ran 54 Google Ads experiments in a lead generation account (excluding e-commerce), testing features such as bidding strategies, keyword match types, and ad copy.

This article takes a deeper dive into these tests, explaining their rationale, the results obtained, and the implications for your own Google Ads accounts.

TL;DR

  • We ran 54 experiments over two years covering a range of features for a lead generation company.
  • Key tests included bidding strategies, match types, and ad copy.
  • Exact match keywords almost always perform better than phrase match keywords.
  • The Maximize Conversions strategy tends to underperform other bidding strategies.
  • Target CPA bidding helped us find the optimal CPA level for our account.

What are Google Ads experiments?

Google Ads experiments allow advertisers to test changes to their campaigns before fully implementing them.

These experiments, conducted at the campaign level, provide a structured framework to easily implement these changes, test their significance, and apply the changes across the campaign if they are effective.
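To make the "test their significance" step concrete, here is a minimal sketch of a two-proportion z-test on conversion rates, the kind of check an experiment framework performs behind the scenes. The function name and all the numbers are hypothetical, not from our account or from Google's own tooling:

```python
# Sketch: a two-proportion z-test comparing conversion rates between a
# control arm and an experiment arm. All figures are hypothetical.
import math

def conv_rate_z_test(conv_a, clicks_a, conv_b, clicks_b):
    """Return the z statistic for the difference in conversion rates."""
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    return (p_b - p_a) / se

z = conv_rate_z_test(conv_a=120, clicks_a=4_000, conv_b=160, clicks_b=4_000)
# |z| > 1.96 corresponds to significance at the 5% level (two-tailed)
print(round(z, 2))  # → 2.43
```

With enough traffic, even a one-percentage-point lift in conversion rate clears the usual 5% significance bar, which is why longer-running experiments can be decided with more confidence.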

For more detailed information, the official Google Ads support pages offer additional guidance.

Test Setup Overview

Over two years, we conducted 54 experiments, testing various levers in Google Ads. Below are some sample experiments.

Key Test Categories

  • Bidding: testing different bid strategies against each other, for example Maximize Conversions versus a target CPA, or a target CPA of $90 versus $120.
  • Ad copy: pinning certain assets, testing new copy variants or landing pages.
  • Keyword matching: testing exact match versus phrase match keywords.

Schedules

There was no set duration for each experiment, but we tried to conduct each experiment for at least 30 days.

However, when the results were clear in less time, we would make a decision sooner. No experiment ran longer than three months, and most ran for more than two.

Assessment

We evaluated all experiments based on conversion rate, conversion volume, and cost per acquisition, balancing these metrics to make a decision. (For ad copy tests, we also used CTR.)
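The balancing act above can be sketched as a simple decision rule. This is an illustrative sketch only; the arm data, thresholds, and the 90%-of-volume guardrail are hypothetical assumptions, not our actual decision criteria:

```python
# Sketch: comparing a control and an experiment arm on the three metrics
# we balanced. All numbers are hypothetical placeholders.

def metrics(clicks, conversions, cost):
    """Return conversion rate, CPA, and volume for one experiment arm."""
    cr = conversions / clicks if clicks else 0.0
    cpa = cost / conversions if conversions else float("inf")
    return {"conv_rate": cr, "cpa": cpa, "conversions": conversions}

control = metrics(clicks=4_000, conversions=120, cost=13_200.0)
experiment = metrics(clicks=3_800, conversions=140, cost=13_100.0)

# Prefer the experiment arm when it wins on CPA and conversion rate
# while conversion volume holds up (here: at least 90% of control).
winner = "experiment" if (
    experiment["cpa"] < control["cpa"]
    and experiment["conv_rate"] > control["conv_rate"]
    and experiment["conversions"] >= 0.9 * control["conversions"]
) else "control"

print(winner)  # → experiment
```

In practice the trade-offs are rarely this clean; a lower CPA with a meaningful drop in volume still forced a judgment call.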

Experiment results

1. Exact match or phrase match

Exact match keywords performed better across all criteria than phrase matches. They had lower CPAs and higher conversion rates and maintained conversion volumes similar to phrase match keywords.

Takeaways

  • Even though exact match clicks were more expensive, exact match keywords almost always had higher conversion rates and a lower CPA than phrase match keywords.
  • While some phrase match keywords can be useful, at the campaign level it was still best for us to serve only exact match keywords.
  • Phrase matching has its place and can be useful for search term discovery and in new campaigns, but if there is enough volume for exact match keywords, the latter should always be preferred.
  • Exact match keywords often perform better than phrase match keywords because Google hides many search terms on phrase match keywords.
  • In many cases, we’ve found that up to 80% of search term impressions and clicks are hidden from keyword impressions and clicks.
  • You can see this in the search terms report; the hidden percentage for exact match is much lower because there are far fewer possible search terms.
  • With phrase match, you don’t actually know most of the search terms you’re bidding on, and you can’t exclude many of them, which further lowers the quality of those keywords.
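The hidden-terms math above is straightforward to reproduce from a search terms report export. The keyword and term values below are made up for illustration; only the calculation itself reflects the point being made:

```python
# Sketch: estimating the share of clicks Google hides from the search
# terms report for one phrase match keyword. Numbers are hypothetical.

keyword_clicks = 1_000          # total clicks recorded on the keyword
reported_terms = {              # clicks visible in the search terms report
    "best crm software": 120,
    "crm software pricing": 60,
    "crm tools for small business": 40,
}

visible = sum(reported_terms.values())
hidden_share = (keyword_clicks - visible) / keyword_clicks

print(f"{hidden_share:.0%} of clicks came from hidden search terms")
# → 78% of clicks came from hidden search terms
```

When that share approaches the 80% figure we observed, most of a phrase match keyword's spend is going to queries you cannot see, and therefore cannot negate.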

2. Maximize conversion bids compared to other bidding strategies

In all experiments where we ran a conversion maximization strategy versus a target CPA strategy, the target CPA strategy always outperformed, generating more leads at the same cost.

When we ran conversion maximization versus manual bidding, conversion maximization was more effective.

Takeaways

  • If you are using manual bidding and have sufficient conversion data, you should switch to automated bidding; target CPA or Maximize Conversions are good options.
  • If you don’t have much conversion data, you should wait or test Maximize Conversions bidding. In general, though, we found that, relative to target CPA, Maximize Conversions let Google set bids that were too high, sometimes excessively so, to be cost-effective.
  • Google’s advice was that bids would stabilize over time. However, we often ran these tests for three months and never saw bids come down.
  • It’s likely that most advertisers, like us, can’t sustain bids that high for that long.

3. Target CPA vs. target CPA at different levels

Essentially, we tested tCPA at different levels (e.g., tCPA 90 vs. 120). About half of our target CPA tests failed and the other half passed.

Takeaways

  • The results showed that a target CPA of around $120 produced the best results.
  • Tests with target CPAs lower than $120 (for example, $110) did not improve performance.
  • The $120 target CPA consistently delivered more conversions at a better cost per conversion.
  • For our account, which relies heavily on leads, tCPA tends to be the best model.
  • We found the optimal target CPA that maximizes leads within our budget. Your testing goal should be to find your own optimal target CPA, which can vary.
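The search for an optimal tCPA can be sketched as a comparison across tested levels. The results table below is entirely hypothetical (in practice these figures come from your Google Ads experiment reports), and the budget constraint is an assumed stand-in for whatever limits your account has:

```python
# Sketch: picking the target CPA level that maximizes leads within a
# monthly budget. All figures are hypothetical.

monthly_budget = 15_000.0

# Hypothetical experiment results at each tested tCPA level:
# actual CPA achieved and conversions delivered at full budget.
results = {
    90:  {"cpa": 98.0,  "conversions": 125},
    110: {"cpa": 112.0, "conversions": 128},
    120: {"cpa": 113.0, "conversions": 132},
}

# Keep only levels whose implied spend fits the budget, then take the
# one with the most conversions (breaking ties on lower CPA).
affordable = {
    t: r for t, r in results.items()
    if r["cpa"] * r["conversions"] <= monthly_budget
}
best = min(
    affordable,
    key=lambda t: (-affordable[t]["conversions"], affordable[t]["cpa"]),
)
print(best)  # → 120
```

Note that the best tCPA is not necessarily the lowest one: a tighter target can throttle volume, so the winner is the level that buys the most leads your budget can absorb.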

Next steps

While our results provide valuable insights, remember that every PPC account and campaign is unique.

Testing these strategies in your specific context will help confirm whether our findings apply to your situation.

Our experiences can serve as a starting point for optimizing your lead generation campaigns. By leveraging our results, you can save time and effort in your own testing process.

Validate our results with your own data and, if the results match, confidently implement these changes in your account to improve performance.

Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under editorial supervision, and contributions are checked for quality and relevance to our readers. The opinions they express are their own.