How To A/B Test Your Meta Ads Creatives (+ Free Cheat Sheet)

Vertex Marketing Agency
1 Mar 2024 · 20:23

Summary

TL;DR: In this video, Cedric from Vertex Marketing Agency shares a step-by-step guide to A/B testing Facebook ads in Ads Manager. He walks through the agency's process for testing creatives, answers common questions such as how long a test should run and what to do with winners and losers, and emphasizes the importance of generating and refining ad ideas. The video provides insights on ad account setup, the benefits of using broad audiences, and a detailed manual A/B testing method, including campaign structure, budget optimization, and monitoring strategies to identify and scale winning ads.

Takeaways

  • 😀 A/B testing is crucial for generating new ideas and optimizing Facebook ad performance.
  • 📊 The speaker prefers manual A/B testing over Facebook's built-in feature due to flexibility in managing test duration and momentum.
  • 🎯 For effective testing, use a broad audience targeting to ensure a wide reach and easy scalability.
  • 📝 Maintain consistency with the main campaign's objective when setting up a testing campaign to facilitate easy duplication and comparison.
  • 🔢 Utilize ad set budget optimization to control spending on individual tests rather than campaign budget optimization.
  • 📈 Start with a minimum of two and a maximum of six ad variations per ad set to ensure sufficient testing without overwhelming the analysis.
  • 🔑 Test variations should focus on a single variable at a time, such as different hooks or headlines, to identify clear winners.
  • 🚫 Avoid mixing different types of creatives in the same ad set to prevent skewed results.
  • 🕒 Monitor new ads for at least 5 days to gather enough data for an informed decision on ad performance.
  • 🏆 Identify and duplicate the best-performing ads from the testing campaign into the main campaign to capitalize on their success.
  • 🔄 Continuously create new ad sets with fresh creatives to keep the testing process dynamic and iterative.

Q & A

  • What is the primary focus of the video?

    -The primary focus of the video is to demonstrate how to A/B test Facebook ads in Ads Manager, covering the process used by the agency to test creatives and answer common questions about the testing process.

  • What does the speaker suggest is their favorite part of managing Facebook ads for clients?

    -The speaker suggests that their favorite part of managing Facebook ads for clients is the A/B testing aspect, as it involves generating new ideas and testing them to see if they are effective.

  • What is the significance of using a broad audience in the A/B testing process?

    -Using a broad audience in the A/B testing process is significant because it allows for targeting a specific geographical area without narrowing down the audience too much, making it easier to scale and test different creatives effectively.

  • Why does the speaker recommend against using the built-in A/B testing feature in Facebook Ads Manager?

    -The speaker recommends against using the built-in A/B testing feature because it requires setting a start and end date, which can limit the ability to continue running successful tests and can disrupt the momentum of performing ads when moved to a main campaign.

  • What is the recommended number of ad variations per ad set during A/B testing?

    -The recommended number of ad variations per ad set during A/B testing is between two and six, to ensure enough testing without overwhelming the analysis with too many variables.

  • Why is it important to keep the engagement from a performing ad when moving it to the main campaign?

    -Keeping the engagement from a performing ad is important because it retains social proof in the form of likes, comments, and shares, which can boost the ad's performance and provide valuable insights from the testing phase.

  • What is the recommended approach to scaling a successful ad set?

    -The recommended approach to scaling a successful ad set is to gradually increase the budget every two days by about 20% as long as the ad continues to perform well, based on the cost per result or other relevant metrics.

  • What should be done with the ads that are not performing well in the testing ad set?

    -Ads that are not performing well in the testing ad set should be paused after a sufficient testing period and enough data has been gathered to make an informed decision.

  • How long should an ad be monitored before making a decision about its performance?

    -An ad should be monitored for at least 5 days, depending on the sales cycle, to gather enough data to make an informed decision about its performance.

  • What is the main reason for avoiding mixing different types of creatives in the same ad set during A/B testing?

    -Mixing different types of creatives in the same ad set can lead to unequal comparisons and make it difficult to accurately determine which elements of the creative are contributing to its performance.

  • What is the purpose of the provided cheat sheet mentioned in the video?

    -The purpose of the cheat sheet is to provide a quick reference guide for A/B testing creatives, ensuring that viewers follow all the necessary steps and can easily reference the process during their own testing.

Outlines

00:00

😀 Introduction to A/B Testing on Facebook Ads

The speaker, Cedric from Vertex Marketing, introduces the topic of A/B testing Facebook ads within Ads Manager. The video aims to demonstrate the agency's process for testing creatives and to answer common questions such as how long a test should run and how to handle winning and losing ads. The speaker emphasizes the significance of A/B testing in ad management and shares a personal preference for testing, describing it as a key part of the job that involves generating, testing, and scaling ad ideas.

05:03

📈 Setting Up the Ad Account for A/B Testing

The video script explains the typical ad account setup used by the agency, which includes a main campaign for the majority of the budget and broad ad sets for targeting specific countries or cities without additional audience restrictions. The speaker discusses the evolution of targeting strategies, moving from lookalike audiences to broader ones due to advancements in Facebook's ability to understand ad content and customer demographics. The script also covers the benefits of using a broad audience for easier scaling and testing, and provides a walkthrough of the ad account structure, including the creation of testing campaigns and the manual approach to A/B testing as opposed to using Facebook's built-in A/B testing feature.

10:04

🛠 Detailed Steps for Creating a Testing Campaign

The script outlines the process of setting up a new campaign for A/B testing, emphasizing the importance of aligning the campaign objective with the main campaign to facilitate the transfer of successful ads. It discusses ad set budget optimization versus campaign budget optimization, advocating for the former to maintain control over budget allocation among different ad sets. The speaker also explains the selection of a broad audience for testing to ensure a large enough pool for accurate results and to avoid overlap with retargeting audiences in the main campaign. The paragraph concludes with advice on naming the ad set and considerations for the number of variations to include in each ad set.
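
As a quick reference, the structure this section describes can be summarized in a small configuration sketch. This is purely illustrative: the names, labels, and budget values below are hypothetical placeholders, not objects from a real ad account.

```python
# Illustrative sketch of the account structure described above.
# All names and values are hypothetical placeholders.
account_structure = {
    "main_campaign": {
        "objective": "sales",               # must match the testing campaign
        "budget": "majority of total spend",
        "ad_sets": [{"name": "Broad - main", "targeting": "country/city only, no interests"}],
    },
    "testing_campaign": {
        "objective": "sales",               # same objective so winning ads can be duplicated across
        "budget_optimization": "ad set level (ABO), not campaign level (CBO)",
        "ad_sets": [{"name": "Testing - Broad", "ads": "2-6 variations of a single creative type"}],
    },
}

for name, campaign in account_structure.items():
    print(f"{name}: objective={campaign['objective']}")
```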

15:06

🎯 Crafting Ad Variations and Launching the Test

This section of the script focuses on creating multiple ad variations within an ad set, recommending a range of two to six ads to test different hooks, headlines, or video lengths while avoiding mixing different types of creatives. The speaker advises against combining various ad formats in the same test to maintain a fair comparison. The script then moves on to the launch and monitoring phase, suggesting a 5-day monitoring period post-launch to gather sufficient data before making any decisions. It also touches on the importance of understanding the sales cycle and adjusting the monitoring period accordingly.
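
The budget and duration reasoning behind the two-to-six rule can be sanity-checked with simple arithmetic. The sketch below is a rough planning aid rather than anything shown on screen; it just reproduces the kind of estimate the presenter walks through ($100 a day at a $10 cost per purchase for 5 days across 4 ads).

```python
def expected_results_per_ad(daily_budget, cost_per_result, days, num_ads):
    """Rough estimate of how many results each ad variation should collect during a test."""
    if not 2 <= num_ads <= 6:
        raise ValueError("Keep between two and six ad variations per ad set.")
    total_results = (daily_budget / cost_per_result) * days
    return total_results / num_ads

# Numbers from the video: $100/day, $10 per purchase, 5 days, 4 ads -> roughly 12 results per ad.
per_ad = expected_results_per_ad(daily_budget=100, cost_per_result=10, days=5, num_ads=4)
print(f"~{per_ad:.1f} results per ad")
```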

20:07

🏆 Analyzing Results and Scaling Winning Ads

The final paragraph of the script details the process of analyzing the results of the A/B test, identifying the top-performing ads, and deciding whether to incorporate them into the main campaign. It discusses the criteria for determining a winning ad, which includes comparing its performance to the worst-performing ads in the main campaign. The speaker shares strategies for scaling successful ads by gradually increasing the budget and monitoring performance over time. The script concludes with a reminder to continually monitor winning ads and to repeat the A/B testing process with new creatives to maintain an effective ad strategy.
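
The scaling rule mentioned here (roughly 20% more budget every two days while performance holds) compounds quickly. A minimal sketch, assuming a $150/day starting budget and a fixed 20% step, just to show the trajectory:

```python
def scaling_schedule(start_budget, step=0.20, every_days=2, periods=5):
    """Project the daily budget if it is raised by `step` every `every_days` days."""
    budget, schedule = start_budget, []
    for i in range(periods):
        schedule.append((i * every_days, round(budget, 2)))
        budget *= 1 + step
    return schedule

# 150 -> 180 -> 216 -> ~259 -> ~311 over eight days.
for day, budget in scaling_schedule(150):
    print(f"day {day}: ${budget}/day")
# In practice, only apply the next increase if cost per result (or ROAS) is still acceptable.
```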

🎉 Conclusion and Additional Resources

The script concludes with a brief mention of additional resources, specifically a cheat sheet available in the video description, which viewers are encouraged to download for reference during their A/B testing process. The speaker reiterates the importance of the steps covered in the video and bids farewell to the audience.

Keywords

💡A/B Testing

A/B testing, also known as split testing, is a method of comparing two versions of an asset, such as a web page or an ad, to determine which performs better. In the context of this video, A/B testing refers to comparing different Facebook ad creatives and strategies to identify which generates a better response from the audience. The script mentions A/B testing as a crucial part of managing Facebook ads, with the presenter showing the process they use in their agency to test creatives and improve ad performance.

💡Facebook Ads Manager

Facebook Ads Manager is a platform where advertisers can create, manage, and track their ad campaigns. The script discusses using Ads Manager to conduct A/B tests, emphasizing its importance in the process of optimizing ad performance and making data-driven decisions based on the results of these tests.

💡Creatives

In advertising, 'creatives' refers to the visual and textual content of an advertisement, such as images, videos, and copy. The video script focuses on testing different creatives in Facebook ads to determine which resonate more effectively with the audience. The presenter explains how to test variations of creatives like images, videos, and copy to improve ad outcomes.

💡Campaign Objective

A campaign objective defines the goal of an advertising campaign, such as increasing sales or generating leads. The script emphasizes the importance of setting the same campaign objective for both the main campaign and the testing campaign so that the A/B tests are comparable and winning ads can be duplicated into the main campaign.

💡Ad Set

An ad set in Facebook Ads Manager is a group of ads that share the same targeting, budget, and scheduling. The video script describes structuring ad sets for testing purposes, including setting budgets and choosing the right audience to ensure that the A/B tests are conducted effectively.

💡Broad Audience

A broad audience in Facebook advertising refers to targeting a large group of people without narrowing down the audience with specific interests or behaviors. The script mentions using a broad audience for testing to ensure that the ads are reaching a wide range of potential customers and to make the test results more generalizable.

💡Budget Optimization

Budget optimization in Facebook ads determines where the budget is set and how it is allocated: with campaign budget optimization (CBO), Facebook shifts spend toward the ad sets it expects to perform best, while ad set budget optimization (ABO) keeps the budget on each individual ad set. The script recommends ad set budget optimization for the testing campaign, allowing the advertiser to control where the budget is spent and to identify high-performing ads more effectively.

💡Lookalike Audience

A lookalike audience is a group of people who are similar to a source group, such as existing customers. The script mentions that lookalike audiences were commonly used in the past but have become less effective compared to broad audiences, which can now be more efficiently targeted using Facebook's advanced technologies.

💡Testing Campaign

A testing campaign is a specific type of ad campaign designed to experiment with different variables to determine what works best. The script outlines the steps to create a testing campaign in Facebook Ads Manager, including setting up a new campaign, choosing the right objective, and structuring ad sets for effective testing.

💡Scaling

In the context of advertising, scaling refers to increasing the budget or reach of a successful ad to maximize its performance. The script advises viewers on how to scale a winning ad set by gradually increasing the budget when an ad is performing well, thus capitalizing on its success.

💡Winner

A 'winner' in A/B testing is the version of the ad that performs better than the other versions. The script discusses identifying the winning ad set or ad within a testing campaign by monitoring performance metrics such as cost per result and then integrating the winning creative into the main campaign to improve overall ad effectiveness.

Highlights

Introduction to A/B testing Facebook ads using the same process as Vertex Marketing Agency.

Importance of A/B testing in generating and refining creative ideas for ads.

The presenter's preference for testing and its role in managing Facebook ads for clients.

Explanation of using broad audience targeting in Facebook ads for better scalability and testing.

Advantages of broad audience targeting over lookalike audiences in current Facebook ad strategies.

Details on setting up a typical ad account with a main campaign and multiple ad sets.

Preference for manual A/B testing over Facebook's built-in A/B testing feature, for greater flexibility.

Instructions on creating a new campaign for A/B testing with the same objective as the main campaign.

Importance of keeping the campaign objective consistent when duplicating successful ads.

The role of ad set budget optimization in controlling spend and testing different ads effectively.

Guidelines on structuring the testing campaign with appropriate naming and broad audience selection.

Recommendation on the number of variations per ad set, ranging from two to six for effective testing.

Strategies for creating different creative variations to test within the same ad set.

Advice against mixing different types of creatives in the same ad set for accurate testing.

Process of launching and monitoring ads for a minimum of 5 days to gather sufficient data.

Method for identifying the best performing ads and deciding whether to scale, duplicate, or pause them.

Emphasis on continuous monitoring and optimization of winning ads within the main campaign.

Final thoughts on the rinse-and-repeat process of A/B testing to continually refine ad performance.

Transcripts

play00:00

Hey everyone, Cedric from Vertex Marketing Agency, and in this video I'm going to be showing you how to A/B test your Facebook ads in Ads Manager. I'm going to show you the exact same process that we take in our agency to test creatives and answer some of the common questions, like how long should I leave a test running, what do I do with my winners, what do I do with my losers, and a lot more. I thought this video would actually be the perfect time to wear this t-shirt, "I'd rather be A/B testing," because it's true. To be honest, that's probably my favorite part about my job, which is managing Facebook ads for clients: the A/B testing side of things. I think that's also the most important part, because it's really all about generating new ideas with creatives and copy and implementing those ideas. So getting those UGC videos, those images, and just going into Ads Manager and testing them: hey, was it a good idea or was it a bad idea? If it's a bad idea, you learn from it, so you try to put those ideas to the side and just think about the good ones in the future. But if it's a good idea, perfect, you scale that, and then you think, huh, how could I replicate this, or how could I create something new that is similar to this? It's like a loop; you just want to keep repeating that process. But it all starts by knowing how to A/B test and how to really structure everything, and that's exactly what I'm going to be showing you in this video.

play01:14

Okay, so I am now inside my demo ad account, and I first want to talk about the typical ad account setup that we usually implement for clients. We will always have a main campaign, and that main campaign is the one receiving the majority of the budget. Inside that main campaign we'll usually have maybe one to three different ad sets, but 99% of the time we will have a broad ad set. What does broad mean? Broad just means that we're only selecting the specific country, or maybe the states, that we want to target, or if it's local advertising we'll select the city, and we actually don't include anything else in the audience. Instead, we do the targeting with the ad. With the advancement in technology, Facebook can actually understand (a) your copy, but also (b) your image, your video, and the landing page, and really just understand what you're all about, what you do, and who your customers are. And because you have tracking set up and you're sending, let's say, a lead event or a purchase event back to Facebook, it knows exactly who your customers are. Using this information, it can go ahead and essentially create your own lookalike audience, which is what we really used to do back in, I would say, 2019: we would create lookalike audiences. But now those tend not to work as well as something like a broad audience, so that's number one. Another reason why we like the broad audience is that once you can make it work with a broad audience, it's really easy to scale, because you don't have a limit with the frequency. I mean, if you do, it's more of an issue with Facebook, meaning you're spending just way too much and Facebook doesn't have a big enough audience for you, and at that point that's a whole other issue. So that's why we really like the broad audience, and it also makes it really easy to test, but I'll get into that in just a second.

play02:59

I just want to show you what the typical ad account structure will look like. I'm not saying that everyone only gets one campaign and one ad set, but usually the majority of the budget is going to go towards that main campaign. It doesn't mean that we don't have another campaign, maybe a catalog sales campaign or another specific type of campaign, maybe targeting different countries, and we just want to have those separate. What I'm saying is, most of the time there's going to be one campaign in your ad account that is receiving the majority of the budget, and that's what we call the main campaign. But with that being said, let me show you how you would go ahead and structure that testing campaign.

play03:36

Now, creating your testing campaign: there are a few different ways of doing it. I actually like to create just a brand new campaign and do it a little bit more manually. If you talk to a Meta specialist, they would probably refer you to this feature right here, which is A/B testing, but I'm personally not a big fan of that, and I've talked to a lot of other advertisers who usually have the same opinion. The reason I'm not a big fan of the A/B test feature here is that once you create it, you have to set a start and an end date, and what I'm going to show you is: if ever your ad set, or really your test, is performing well, don't pause it. A Meta specialist will usually say, well, just pause it and bring that inside your main campaign, and the ad is going to be running there. But just with experience, and if you have experience running Facebook ads you'll notice this often, your test might actually perform really well in your testing campaign, but once you bring it inside your main campaign it doesn't have the same momentum that it had inside your testing campaign. Which is fine, that will happen, but that's why you ideally want to just keep that test running for as long as it performs well, and that feature just doesn't really allow you to do that. So that's one of the main reasons why I'm not really a big fan of that feature.

play04:45

So I've made this Notion document, and it's kind of like a cheat sheet. The link to this document is going to be in the description of this video; you don't need to give me your email or download anything to get it, just click the link and you'll see it. What we're going to do is follow this, because that's actually what I follow to A/B test our creatives.

play05:02

So the first thing you want to do, step one, is setting up your campaign. You're going to go to Ads Manager and create a new campaign, and the campaign objective needs to be the same objective that you have inside your main campaign. So if you've selected sales in your main campaign, I'm going to select sales here, and that's what I've done; but if you're a service-based company and in your main campaign you've selected leads, you're going to select leads. That's really important, because if you don't select the same objective, when you do find a good test you're going to want to duplicate that ad to bring it inside your main campaign, and if it doesn't have the same objective you won't be able to duplicate it. Someone could make the point: okay, well, just upload and create that ad inside your main campaign. Sure, you can do that, but if you do, that's going to be a different ad ID, and at this point you're losing some of the learnings that that ad has made. But the other thing, and that's probably the main reason for me at least, is that you're losing the engagement. When you're duplicating an ad from an ad set, like your testing ad set, to bring it inside your main campaign, if you select the same campaign objective and you also select the option that says "keep existing engagement," you're going to be able to keep that engagement. And I mean, social proof, right? You're paying for those likes, those comments, and those shares, and when you get (ideally good) comments, it's free social proof and it actually gives your ad a little boost. So that's why I always like to select the same campaign objective. I'm going to select sales here, and I'll usually just select a manual sales campaign and go to the campaign level here.

play06:26

So the first thing I say here is: name your campaign "Testing," so I'm going to go ahead and do that. Then this is going to really depend on your business model: are you in a special ad category (credit, employment, or housing)? If you are, obviously you want to select that; if not, you're going to get your ad account disabled. This here, "use a catalog," I always turn that off. And then "create A/B testing": I do not use that feature, so I leave that off. I always A/B test my ads manually, the way that I'm showing you in this video, and I do not use that feature because with this feature you need to add a start and an end date. So we're not going to do that, and I do not turn this feature on.

play07:06

So that's actually step, or point, number two here: budget optimization. Choose ad set budget optimization. Ad set budget optimization is when the budget sits on the ad set; this is a campaign, right, so we don't want to set the budget here. If I were to turn this on, it would allow me to set the budget on the campaign, and that's fine if you only have one test that you're running. But what I'm going to show you is how to potentially have two or three different ad sets testing different types of ads, and if you have this on, it's just going to spend the money on the ad set that has the lowest cost per result. You might think, oh well, that's fine, I want to spend the money wherever it makes sense. Yes, but if one ad set has been running for longer than another ad set, that one's going to have momentum and probably a better cost per result than the other ad set. And now there's that new test that you want to run to see if it performs better, and every time you're running a new test, that probably means you think those ads will perform better than any current ads that you're running. So you want to at least have a minimum amount of spend that you can dedicate to that ad set, and have control over potentially increasing that budget, decreasing that budget, and basically forcing a specific amount of spend towards that ad set. With campaign budget optimization, this option right here, it's just not really going to allow you to do that in the same way that ad set budget optimization would. So I'm going to turn that off, and now we're ready to go to the ad set level.

play08:27

For the ad set level, I usually will do this: I'll name it "Testing" and I'll say "Broad," and the reason I'm using a broad audience here is that inside my main campaign I'm also using a broad audience. I usually want to look at the results that I'm having inside my main campaign and compare them with the results I'm getting inside my testing campaign. But if in my main campaign I'm using broad, and here I would use, let's say, a lookalike or retargeting audience, that would mean that I'm not looking at the same thing, right? What I'm testing here, and what I'm showing you how to do, is how to A/B test your creatives. If you wanted to A/B test audiences, then what you would want to do is use the same ads that are inside your main campaign, bring those ads inside that testing ad set, and now only change the ad set, so you have one variable. That's what I'm a big fan of: have one variable. So are you testing the audience or are you testing the creatives? Most of the time I am testing the creatives; that's why I'm not spending too much time showing you and talking about different audience variations. I'm going to be talking more about different creative variations, because to me that's what I've found drives the most results in terms of input versus output. I will usually use a broad audience, and if you're not using a broad audience inside your main campaign, make sure that the audience here is broad enough. So don't have a bunch of interest targeting where people need to meet this and this and this and this in order to fall into the audience, because that means the audience size is going to be a little bit too small. Okay, so just keep that in mind.

play09:59

Another reason you want to have a big enough audience is that here I'm using the same audience I'm using inside my main campaign, and if inside your main campaign you're only doing retargeting, most likely what you will see is an audience overlap. You want to try to avoid that as much as you can by using a big enough audience, so that hopefully there's no overlap. Outside of that, you're going to come here, and if inside your main campaign you're sending people to your website, you're going to select that option here; if you're doing website and shop in your main campaign, you're going to select that option. You really want to duplicate what you're doing with your main campaign at the ad set level right here, so you're going to go ahead and select obviously the same pixel, the same conversion event, all the same settings.

play10:35

And this is where you set your budget. In terms of budget, it kind of depends on what your cost per result is. If your cost per result is really low, you can get away with a lower budget, because let's say you're only spending, I don't know, $100 a day in this test and your cost per purchase is $10; well, every day you're going to potentially have about 10 purchases, and if you let that run for about 5 days you're probably going to have 50 purchases. And if you only have four ads, 50 purchases divided across all those ads is going to give you enough purchases to be able to clearly understand and identify a winner. But since it's a demo, I'm just going to set it to, let's say, $150 a day, and that's pretty much it in terms of what I would do in the ad set. Obviously, if you're targeting the United States you would go ahead and change the location here, but this is a demo so I'm just going to leave it at the default, and for me that's Canada. And again, I'm not changing anything else here.

play11:32

So let me go back here; I want to talk about the number of variations. Each ad set should have a minimum of two and a maximum of six ads. The reason for that is just like what I said: if you have too many ads, it's going to be a lot harder for you to actually make a decision on which ad is performing best, because what's probably going to end up happening is that some of them will not get enough spend for you to be able to make an informed decision. And if you have too few ads, then you're just not testing enough at that point. So the sweet spot is between two and six, and if you want to find an even better middle ground for your company, again it depends on your cost per result and your spend. If your cost per result is low and your spend is really high, you can probably be closer to five or six ads that you're testing per ad set; but if it's the other way around, then you probably want to aim for around two or three. But never go below two, and I don't recommend companies go above six unless you're a really, really large company and you're spending a lot on Facebook ads.

play12:33

The second thing here that I'm going to talk about is the different types of variations. What you want to do is maybe just try different hooks; it could be the first 5 seconds of the video, or if it's an image, maybe it's just the headline that you're changing, or maybe it's the copy, so same videos or same images and you're just trying out different copy. I usually like to grab, let's say, a video concept or an image concept and just change a lot of different things about that image or video to create different variations. But what I don't do, and that's what I explain here, is I try to avoid using different types of creatives inside the same ad set. Let me give you an example: I will not have one video and then be testing two different carousels, because at that point they're different types of creatives and they're not really equal. Usually what I've found is that carousels, when they're mixed with videos and images, sometimes suck up all the budget. So what you want to avoid is mixing different types of creatives together. If you're testing images, like I said, test different hooks, test different colors maybe in the image, but test only images, and the concepts should be similar. Same thing with videos: if you're testing, let's say, a specific video, just make different versions. If it's a UGC video, ask the creator: okay, record this content, but for the first part (let's say your hook), let's try this hook, that hook, like three different hooks, so now you have three different versions to test. And just stick with carousels if that's what you want to test with that ad set. Okay, so for step two: create enough, but not too many, creatives. I give you different examples of what to test; it could be a different headline, it could be just a different hook, or it could actually be the same video but one version is longer, let's say 55 seconds, and the other one is cropped to 25 seconds, so you're testing long versus short. Those are the different types of tests you can make, and again, the biggest takeaway here is: don't mix and match.
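
To keep a test to one variable and one creative type, it can help to lay the variations out before touching Ads Manager. The sketch below is hypothetical (the field names and hook texts are made up for illustration), but it encodes the two rules from this part of the video: two to six variations, all of the same creative type.

```python
def build_test_variations(base_concept, creative_type, hooks):
    """Plan 2-6 variations of one concept, all sharing the same creative type."""
    if not 2 <= len(hooks) <= 6:
        raise ValueError("Test between two and six variations per ad set.")
    return [
        {
            "name": f"{base_concept} | {creative_type} | hook {i + 1}",
            "creative_type": creative_type,   # never mix videos, images, and carousels in one test
            "hook": hook,
        }
        for i, hook in enumerate(hooks)
    ]

variations = build_test_variations(
    "UGC testimonial", "video",
    ["Stop scrolling if...", "I was skeptical until...", "3 mistakes to avoid..."],
)
for ad in variations:
    print(ad["name"])
```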

play14:24

So number three is launch and monitor. Everything's good with your ad set and your ads, so you're going to go ahead and click the launch button. What you want to do for the next 5 days is just monitor your ads: look and see, okay, are our purchases or our leads flowing in, but you don't want to make any changes yet. And just please note that the review process, or the launch-and-monitor process, will depend on your sales cycle. If you have a short sales cycle, then regardless, I don't recommend that you go below 5 days; maybe 4 days would be acceptable if you have a really, really short sales cycle, but I definitely wouldn't really recommend it; honestly, 5 days I think is perfect. But if you have a longer sales cycle, then maybe you want to let the ads run for, let's say, two weeks, so it really depends on your company. And if you do have a really long sales cycle, where at the end of the day you're trying to see if those leads are converting into deals because you're a service-based company, then maybe at that point you're not optimizing for the deals, because that would take too long. Maybe you're trying to identify how many, let's say, valid leads this ad set generated, so you're trying to find a metric that is closer to the deal stage but is not really the deal stage, because that just takes too long. And if you have an e-commerce store where your product costs, I don't know, $500, so there's a long sales cycle, then again maybe you want to wait a little bit longer, or find another metric that you could use to identify early success in an ad set or with an ad. But guys, that's the launch-and-monitor stage.
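
The monitoring guidance in this part of the transcript (never judge a test before about five days, and wait longer when the sales cycle is long) can be captured in a tiny helper. The day counts below are an interpretation of the presenter's examples, not exact figures he gives.

```python
def monitoring_window_days(sales_cycle_days):
    """Suggest how long to let a test run before judging it (about 5 days minimum)."""
    if sales_cycle_days <= 2:
        return 5        # short cycle: the default 5-day window
    if sales_cycle_days <= 7:
        return 7        # medium cycle: a little extra time
    return 14           # long cycle (e.g. high-ticket products): closer to two weeks

print(monitoring_window_days(1), monitoring_window_days(10))  # 5 14
```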

play15:56

Identify the winner: all right, that's my favorite part. What you're going to do is go inside Ads Manager. Let's pretend that this one has been launched: we have my testing ad set and my regular broad campaign, and both of them have the same audience. I would just go to the results tab (obviously this is a demo account, so there's no data), but I would look at how many results they're generating and, more importantly, what the cost per result is. If you're in the e-commerce space, maybe you're also looking at things like the return on ad spend, so you're adding that to your view, and you're just looking at the ad set level: which one is performing better?

play16:27

If your main campaign is performing better than your testing ad set, don't give up yet. Now select your main campaign and select your testing ad set, and you want to go to the ad level, and what you're trying to see here is: do any of my ads have enough data? What I mean by enough data is, inside your testing ad set, when you're looking at your results, you want to look at this and say, okay, because of my budget, my cost per result, and what I see in, let's say, a 5-day period, I can actually confidently see that this ad or that ad is performing well, or maybe, who knows, none of them are performing well. But you need to have enough data to be confident that, okay, this was a good test or this wasn't a good test. And if this was a good test, let's say one of the ads inside your testing ad set is actually performing better than any of the ads inside your main campaign, what you're going to do is duplicate that ad and bring it inside your main campaign. If it did not perform well, then what you're going to do, after the fifth day, again if it has been enough time and you're confident about the data, is go ahead and pause that testing ad set. And then let's say inside your testing ad set you find an ad that is performing well, but when you compare it to your main campaign it doesn't match the results of your best ad inside your main campaign; well, then the next question you want to ask yourself is: is it better than the worst ad inside my main campaign? If that's the case, same thing here: you're going to duplicate that ad and bring it inside your main campaign.
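
The decision rule described here (promote a test ad if it at least beats the worst ad in the main campaign, otherwise pause it once there is enough data) can be written down directly. A minimal sketch with hypothetical cost-per-result numbers, where lower is better:

```python
def decide_on_test_ad(test_cpr, main_campaign_cprs, has_enough_data):
    """Decide what to do with a test ad based on cost per result (lower is better)."""
    if not has_enough_data:
        return "keep running - not enough data yet"
    if test_cpr < max(main_campaign_cprs):
        # Duplicate into the main campaign with "keep existing engagement" selected,
        # so the likes, comments, and shares carry over as social proof.
        return "duplicate into the main campaign"
    return "pause the test ad"

print(decide_on_test_ad(8.5, [7.0, 9.2, 11.4], has_enough_data=True))   # duplicate into the main campaign
print(decide_on_test_ad(14.0, [7.0, 9.2, 11.4], has_enough_data=True))  # pause the test ad
```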

play17:55

And now the question that I get all the time is: okay, Cedric, so once I've brought that good ad inside my main campaign, what do I do now? Well, if the ad is performing well inside your testing campaign, leave it there; let that ad set run. And sometimes what I'll even do is scale a testing ad set: for as long as it's performing well, every two days I'll go inside my ad set and increase the budget by, let's say, 20%, and I can just continue scaling, scaling, scaling until I see a drop in the cost per result or, again, ROAS, depending on the metric that you're using. But don't pause something that works, right? Don't try to reinvent the wheel. So if a testing ad set is performing well, leave it running, and what you can do is just monitor it.

play18:40

And that's actually what the last step is right here: keep an eye on the winner. Whenever you do find a winner, just keep an eye on it, monitor it. And if, let's say, you have three ads that are performing really well inside your testing ad set, and after, let's say, 3 weeks one of them stops performing well but the other two are still doing well, then just pause the one that is not doing well and keep running the two that are doing well, again until they don't perform well. It's kind of a rinse-and-repeat process, and that's why I say to use ad set budget optimization and not campaign budget optimization: because I really wish for you that you find an ad set that performs well and that you're going to leave running for, let's say, a month. For that one-month period you don't want to just have one ad set running, because you want to test other stuff. So what you do is rinse and repeat: you would create another ad set, and in that ad set it would just be three or four or five or six different creatives that you're testing, and you're going to do the same process that you just did with the prior ad set. You let it run for enough days, look at your data, is it a good test or is it a bad test? It's really a rinse-and-repeat process.

play19:45

So guys, that is it for this video. Hopefully you now know exactly how to A/B test your creatives, and if you want that cheat sheet, again, it's in the description of this video. I recommend that you take a look at it, maybe even download it, and when you're A/B testing, just reference that sheet and make sure that you're following all the different steps. But guys, that is it for this video. Bye for now.


Related Tags
Facebook Ads, A/B Testing, Marketing Strategy, Creative Optimization, Conversion Boost, Ad Campaign, Budget Management, Performance Monitoring, Social Media Marketing, Digital Advertising