Ongoing account maintenance is a big job. You have bids and budgets to manage, new features to implement, ads to rotate, and the list goes on. Regardless of which account element you’re working with, there’s one common mistake that’s particularly easy to overlook: optimizing using the same data multiple times.
Tell me if this sounds familiar:
You go into an ad group to optimize your keyword-level CPC bids. You know all about the dangers of optimizing based on too little data, so you set your date range to the previous 30 days of performance. You see that performance has been lagging for this keyword set, so you lower your CPC bids to see if you can get better bang for your buck in a lower ad position.
Now, you’ve also decided that you’re going to optimize your keyword bids every two weeks, so two weeks later you go back into the same ad group, set your date range to the previous 30 days and check out the results.
See the problem? It’s easier to miss than it might seem. You’re optimizing on the same data twice, because only half of the performance you’re looking at actually occurred within the timeframe of your most recent optimizations. In other words, the last two weeks of performance reflect the latest keyword bids, but the first two weeks of performance occurred before you made the adjustments. Assuming you’re looking at aggregate data (as opposed to day-by-day data in your Dimensions report), you won’t be able to separate the new performance data from the old and, thus, won’t be able to optimize effectively.
How big of a deal could this be? Potentially huge. Let’s look at a simple example:
Looking at just one keyword, let’s say based on the last 30 days of performance history, your $2 CPC bid is fetching an average cost to convert of 2X your target CPA. Naturally, you decide to lower that bid to $1 in order to get your CPAs closer to target.
You make the change and let it be for two weeks. You then return to the keyword to check out the results, again setting the date range to the last 30 days. Your CPA has now dropped to 1.5X your target. This is still too high, so you lower your CPC bid again to $0.75, because you really need to hit that target CPA.
Now, if you were to take a look at the data day by day, you might find that you actually were hitting your CPA target at $1 CPCs for the last two weeks. However, since you were also looking at the last two weeks of your $2 CPCs, the aggregate data will still show an average of 1.5X CPA. Translation: your $1 CPC bid is right on the money, but you can’t know that without looking at performance for that bid only, or digging into day-by-day Dimensions reports.
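The arithmetic behind that misleading blended average is easy to sketch. The spend and conversion figures below are hypothetical, chosen only to match the ratios in the example (a $10 target CPA and equal conversion volume in each half of the window):

```python
# Hypothetical numbers showing how a 30-day window blends two bid periods.
target_cpa = 10.00  # assumed target cost per conversion

# First 15 days: the old $2 CPC bid, converting at 2x target CPA
old_conversions = 20
old_cost = old_conversions * (2.0 * target_cpa)  # $400 spent

# Last 15 days: the new $1 CPC bid, converting right at target CPA
new_conversions = 20
new_cost = new_conversions * (1.0 * target_cpa)  # $200 spent

# The aggregate 30-day view mixes both periods together
blended_cpa = (old_cost + new_cost) / (old_conversions + new_conversions)
recent_cpa = new_cost / new_conversions

print(blended_cpa / target_cpa)  # 1.5 -- looks 50% over target
print(recent_cpa / target_cpa)   # 1.0 -- the $1 bid is actually on target
```

The blended view reports 1.5X target even though the current bid is performing perfectly, which is exactly the signal that tricks you into lowering the bid a second time.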
This is a big problem that’s also easy to avoid. Just make sure your optimization schedule and your default date range for making those optimizations are synced up. This makes sense because you should theoretically be optimizing exactly as often as statistically significant amounts of data accumulate. In other words, if you don’t think a two-week date range gives you enough performance data to work with, why would you have a bi-weekly optimization schedule?
The frequency at which you optimize will depend on how much traffic you get. Just remember, you need enough data to accumulate as a result of your most recent change in order to optimize based on your most recent change. If you need to expand your date range in order to reach statistical significance, that’s a good sign you shouldn’t be adjusting that bid quite yet.