Upload Adobe Analytics Data Sources from Google Sheets with Data Connector Add-on
Summary
TL;DR: In this tutorial, Andre Otto Schalk introduces the Google Sheets add-on for Adobe Analytics, guiding viewers through creating data source templates, uploading offline data, and setting up FTP accounts. The video demonstrates how to modify templates, input data for analytics, and automate data loading schedules, ensuring seamless integration with Adobe Analytics for enhanced data management.
Takeaways
- The video is about using the Google Sheets data connector for Adobe Analytics to upload offline data from Google Sheets to Adobe Analytics.
- The process starts by creating a template in Google Sheets, which contains three areas: friendly names for variables, information about the variables, and the main data entry area.
- Users can customize the template by adding or removing columns, but they must keep the date column as the first column.
- To upload data, users need to set up an FTP account with Adobe Analytics and save the account details in Google Sheets.
- The data upload process involves selecting the data sheet, choosing the FTP account, and clicking the 'Load Now' button to start the upload.
- If there are any errors during the upload, such as incorrect FTP credentials, the status will indicate the issue.
- Users can schedule data uploads to happen automatically at regular intervals, such as daily or hourly.
- The add-on allows for the uploading of future data by scheduling uploads, ensuring data is synchronized with reports over time.
- The video also explains how to handle scenarios where data from various platforms, like ad platforms or email marketing, needs to be manually uploaded to Adobe Analytics.
- The video provides a detailed walkthrough of setting up the template, filling it with data, configuring the FTP account, and automating the data upload process.
Q & A
What is the main purpose of the Google Sheets add-on for Adobe Analytics discussed in the video?
-The main purpose of the add-on is to facilitate the process of uploading offline data directly to Adobe Analytics from Google Sheets, allowing users to work with data sources more efficiently.
How does the template creation process begin in the Google Sheets add-on?
-The template creation begins by clicking on the 'Create a template' link, which appends a new tab to the spreadsheet and switches the sidebar to a list of variables compatible with data sources.
What are the three main areas of the template mentioned in the video?
-The three main areas of the template are: 1) the yellow line for friendly names of variables, 2) the gray line with information about the variables used, and 3) the area starting from line 3 for entering the data to be uploaded to Adobe Analytics.
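The three-row layout described above can be sketched as plain data. This is an illustrative sketch only; the column names, event numbers, and sample values here are hypothetical, not values from the video:

```python
# Sketch of the add-on's template layout (illustrative column names/values).
template = [
    # Row 1 (yellow): friendly names -- free to edit.
    ["Date", "Campaign", "Ad Impressions", "Ad Cost"],
    # Row 2 (gray): variable identifiers used by the add-on -- do not edit.
    ["Date", "Tracking Code", "Event 60", "Event 61"],
    # Rows 3+: the data that will be uploaded to Adobe Analytics.
    ["12/18/2019", "adwords", 1000, 50],
    ["12/18/2019", "email", 0, 0],
]

# The date column must stay first; a simple sanity check:
assert template[1][0] == "Date"
```

Reordering other columns is fine as long as that first-column invariant holds.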
Why should the gray line in the template not be changed?
-The gray line should not be changed because it contains information about the variables used for the data source, and altering it may cause the add-on to stop working properly.
How can users customize the template to fit their specific data needs?
-Users can customize the template by deleting unnecessary columns, such as a transaction ID, and adding new variables like 'number of orders', 'number of units', and 'revenue' using the 'Add to sheet' button.
What is the significance of the date column in the template?
-The date column is significant as it serves as the timestamp for the data being uploaded to Adobe Analytics, and it should always be the first column in the template.
How does the video demonstrate the use of the add-on for uploading upper funnel data from ad platforms?
-The video demonstrates this by updating the template to include tracking codes and metrics like impressions, clicks, and ad cost, and then filling in the spreadsheet with relevant data for both paid search and email marketing.
What is the role of the FTP account in uploading data to Adobe Analytics?
-The FTP account is essential for setting up a connection to Adobe Analytics, allowing the data from the Google Sheets add-on to be uploaded and processed.
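Under the hood, a data-source upload is an FTP transfer of a tab-delimited file. A minimal sketch of that step in Python, assuming Adobe's documented convention that processing begins once a matching empty `.fin` marker file arrives; the host, credentials, and file names below are placeholders, not values from the video:

```python
import io
from ftplib import FTP

def fin_name(filename: str) -> str:
    """Derive the .fin marker name that signals the upload is complete."""
    return filename.rsplit(".", 1)[0] + ".fin"

def upload_data_source(host: str, user: str, password: str,
                       filename: str, tsv_payload: str) -> None:
    """Upload a tab-delimited data-source file, then its empty .fin marker."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.storbinary(f"STOR {filename}", io.BytesIO(tsv_payload.encode("utf-8")))
        ftp.storbinary(f"STOR {fin_name(filename)}", io.BytesIO(b""))

# Example call (placeholder credentials from the Adobe Analytics backend):
# upload_data_source("ftp.example.com", "user", "secret",
#                    "datasource.txt", "Date\tTracking Code\n12/18/2019\tadwords\n")
```

The add-on hides all of this behind the 'Load Now' button; the sketch just shows why valid FTP credentials are a prerequisite.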
Why is it important to save the data source before attempting to load the data?
-Saving the data source before loading is crucial because it ensures that the data is available for processing and uploading to Adobe Analytics; unsaved data will not be processed.
How can users avoid uploading duplicate data to Adobe Analytics?
-Users can avoid uploading duplicate data by selecting the 'Unloaded rows' option when loading data, which ensures that only new or previously unloaded data is uploaded.
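The 'Unloaded rows' behavior can be modeled as a filter on a per-row load timestamp. This is a sketch of the idea, not the add-on's actual code; the video only shows that a timestamp appears in a status column after a row is uploaded:

```python
from datetime import datetime, timezone

def rows_to_send(rows, mode="unloaded"):
    """Return the rows that should be uploaded.

    Each row is a dict with 'data' and 'loaded_at' (None until the row
    has been uploaded once). mode='unloaded' skips rows that already
    carry a timestamp, avoiding duplicate metrics in Adobe Analytics;
    mode='all' re-sends everything.
    """
    if mode == "all":
        return list(rows)
    return [r for r in rows if r["loaded_at"] is None]

def mark_loaded(rows):
    """Stamp rows after a successful upload."""
    now = datetime.now(timezone.utc).isoformat()
    for r in rows:
        r["loaded_at"] = now

rows = [
    {"data": ("12/18/2019", "adwords"), "loaded_at": None},
    {"data": ("12/18/2019", "email"), "loaded_at": "2019-12-19T10:00:00+00:00"},
]
print(len(rows_to_send(rows)))  # prints 1
```

Switching to 'All rows' is the equivalent of `mode="all"`: every line is re-sent and its timestamp refreshed.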
What is the benefit of scheduling data uploads in the add-on?
-Scheduling data uploads allows for automatic processing and uploading of data at specified intervals, which is convenient for handling data with future dates and ensures that data is synchronized with reports without manual intervention.
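The loaded/skipped behavior for future-dated rows can be sketched as a date check applied on each scheduled run. This logic is inferred from the video's description of the feature; the add-on's internals may differ:

```python
from datetime import date

def run_scheduled_load(rows, today=None):
    """Classify rows the way a daily scheduled job would.

    Rows dated today or earlier are 'loaded'; future-dated rows are
    'skipped' and picked up by a later run once their date arrives.
    Each row is (row_date, payload).
    """
    today = today or date.today()
    statuses = []
    for row_date, _payload in rows:
        statuses.append("loaded" if row_date <= today else "skipped")
    return statuses

rows = [
    (date(2019, 12, 20), "email forecast"),
    (date(2020, 1, 1), "email forecast"),
    (date(2020, 1, 2), "email forecast"),
]
print(run_scheduled_load(rows, today=date(2019, 12, 29)))
# ['loaded', 'skipped', 'skipped']
```

Re-running the same rows with a later `today` flips the skipped rows to loaded, which is exactly what the daily schedule automates.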
Outlines
Introduction to Google Sheets Data Connector for Adobe Analytics
Andre Otto Schalk introduces the Google Sheets Data Connector for Adobe Analytics, outlining the process of working with data sources and uploading offline data directly from Google Sheets to Adobe Analytics. The video guides viewers through creating a template, filling it with data, setting up an FTP account, and scheduling data uploads. The template structure is explained, including the areas for friendly names, variable information, and the main data entry area. The video also covers how to modify the template by adding or removing variables and the importance of keeping the date column first.
Filling the Template with Data for Adobe Analytics
This section details the process of filling in the template with actual data. The script explains how to input dates and tracking codes for different marketing channels, such as paid search and email marketing. It demonstrates how to rename variables for clarity and how to manually input or copy-paste data into the spreadsheet. The importance of correct data format and the use of tooltips for variable explanations are highlighted. The video also discusses setting up FTP accounts within the add-on for data uploads, emphasizing the convenience of having multiple spreadsheets accessing the same FTP accounts.
Setting Up and Managing FTP Accounts for Data Uploads
The script explains how to add a new FTP account in the add-on, advising on naming conventions to distinguish between different uses. It covers the necessity of obtaining the correct FTP host, username, and password from the Adobe Analytics backend and the importance of saving the FTP account details for future use. The video also shows how to update or delete FTP accounts and the security measure of hiding the FTP sheet after setup to protect sensitive information.
Loading Data and Handling Errors in Adobe Analytics
This part of the script describes the process of loading data into Adobe Analytics, including selecting the correct data sheet and FTP account. It demonstrates the initial loading process and the potential error messages that may appear, such as incorrect username and password errors. The video shows how to correct these errors by revisiting the FTP account setup and re-saving the credentials. It also explains the status indicators that show whether each line of data was successfully uploaded or skipped.
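The 'incorrect username and password' failure described above maps onto a standard FTP permanent error (530). A sketch of turning that into a status string like the one the add-on displays; the hostname below is deliberately unresolvable and the credentials are placeholders:

```python
from ftplib import FTP, error_perm

def check_ftp_login(host: str, user: str, password: str,
                    timeout: float = 5.0) -> str:
    """Return 'success' or a 'failure: ...' status instead of raising."""
    try:
        with FTP(host, timeout=timeout) as ftp:
            ftp.login(user=user, passwd=password)
        return "success"
    except error_perm as exc:   # e.g. 530 login authentication failed
        return f"failure: {exc}"
    except OSError as exc:      # unreachable host, DNS failure, timeout
        return f"failure: {exc}"

# An unresolvable host fails fast with a DNS error:
print(check_ftp_login("ftp.invalid", "user", "wrong-password"))
```

As in the video, the fix is simply to re-save the FTP account with corrected credentials and load again.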
Options for Data Loading and Scheduling Uploads
The script discusses the options available for loading data, such as choosing to upload only unloaded rows to avoid duplication or all rows for repeated uploads. It also introduces the scheduling feature, which automates the data upload process at set intervals. The video provides a scenario for email marketing where forecasted data with future dates needs to be uploaded automatically, explaining how to set up and activate a daily schedule for data processing and upload.
Automating Future Data Uploads with Scheduling
This section elaborates on the automation of uploading future-dated data sources by setting up a schedule within the add-on. The script shows how to create a new sheet for forecasted data, enter dates, tracking codes, and forecasted metrics, and then set up a schedule for daily processing. It explains how the system will automatically process and upload data as dates become relevant, and how to manage scheduled jobs, including deleting them when no longer needed.
Conclusion and Final Thoughts
Andre Otto Schalk concludes the video by summarizing the key steps for working with data sources in the Google Sheets Data Connector for Adobe Analytics. He emphasizes the importance of using the template effectively, setting up FTP accounts, and utilizing the load and schedule features for seamless data uploads. The video ends with a thank you note for watching and an invitation for further engagement.
Keywords
Google Sheets
Adobe Analytics
Data Connector
Template
FTP Account
Data Source
Offline Data
Variables
Tracking Code
Classification
Scheduling
Highlights
Introduction to the Google Sheets add-on for Adobe Analytics, focusing on working with data sources and uploading offline data.
Creating a template for data entry in the Google Sheets add-on, including the explanation of the three areas within the template.
Instructions on how to modify the template by adding or removing variables to suit specific data needs.
Importance of keeping the date column as the first column when reordering columns in the template.
Demonstration of preparing clickstream data flow and classification for data sources.
Explanation of uploading upper funnel data from ad platforms to Adobe Analytics manually.
Guidance on updating the template to include relevant metrics like impressions, clicks, and costs for different marketing channels.
Process of filling in the spreadsheet with data for paid search and email marketing, including the importance of correct data format.
Setting up an FTP account within the add-on for data upload to Adobe Analytics.
Instructions on how to save and secure FTP account information within the Google Sheets add-on.
Loading data to Adobe Analytics via the add-on and handling potential errors during the process.
Understanding the difference between uploading unloaded rows and all rows to avoid data duplication.
Automating data uploads with scheduling options for future-dated data sources in Adobe Analytics.
Managing scheduled jobs for data uploads, including the ability to delete or modify them as needed.
Summary of the workflow for using data sources in the Google Sheets add-on for Adobe Analytics, emphasizing the importance of the template, FTP setup, and scheduling.
Transcripts
hi this is Andre Otto Schalk and this is
Google Sheets data connector for Adobe
Analytics in this video we'll get to
know how to work with data sources how
to upload offline data to analytics
directly from Google sheets
thanks to this add-on so first we will
create a template then we will fill in
this template with data then we will set
up an FTP account and finally load and
schedule the data to be sent to Adobe
Analytics so first let's create a
template and as you see once you have
clicked on the link create a template a
new tab was appended to the spreadsheet
and also the sidebar has been switched
to a list of variables that are
compatible with data sources so let's
first review what we have in the main
area so the template itself contains
three areas the first area is the yellow
line and this is the line where you can
type in any type of friendly names for
your variables dimensions and metrics
the second area is the line highlighted
with the gray color and this is for your
information so that you will know which
variables you are using for this
particular data source so these lines
shouldn't be changed at any time
otherwise the add-on may not work
properly and the main area is here that
starts from the line number 3 so this
area is where you will need to enter
your data that you want to upload to
Adobe Analytics and on the right hand
side you can see lots of variables all
of them are supported by data sources
and if you want to update this template
you can do this with these variables so
for example let's say that I don't want
to have in my template for one event one
transaction ID I can simply delete these
columns and instead or in addition to
these columns I want maybe to add number
of orders
number of units and also revenue so I
just click on add to sheet button and
these variables will be appended to the
template so this is how you can work
with the template you can always if you
want to you can reorder the columns just
make sure that the date column is the
first one and this is probably
everything what you need to know about
templates so now a few words about the
scenario that I'm going to show you in
this video in the previous videos we
prepared the clickstream data flow and
basically here we uploaded some page
views and one of the metrics or
variables that we used was the tracking
code and here we have Adwords and email
tracking codes that we used for example
then we appended a new tab and this tab
is used for uploading a classification
and as you can see we prepared the
classification for several keys and now
let's consider the scenario when we want
to enrich our data in Adobe Analytics
with upper funnel data from ad platforms
for example not only from ad
platforms but from any other type of
platforms that are not directly
connected to Adobe Analytics in our case
imagine that for some reason you may not
have integration with either DSP or with
Google ads directly and probably for
some reason you can't do this at the
moment but you are happy to regularly
upload this data manually so you know
that maybe every month maybe twice a
month you will export some data from ad
platform you will export some data from
your email provider and then those upper
funnel metrics will be uploaded to Adobe
Analytics as a data source through
these Google sheets so
in other words what we will need we will
need the tracking code we will need
metrics like impressions clicks and ad
cost if this is email marketing this can
be whatever is supported and provided by
the email platforms for example how
many mails were scheduled how many mails
were sent how many were opened etc so I
hope this concept is clear and this is
just an example what you can do with
data sources so this is why I will
update the template I don't need to have
product column I don't need to have
these three columns I will delete them
and I will start with the date and
tracking code so basically date is our
time stamp that we'll be using and for
the tracking code for example paid
search we will be using a few wins so
for example I will take event 60 and 61
and that will be used for number of
impressions and add cost so let's add
them and I will also rename them so that
it will be clear for me what these events
are used for so ad impressions and ad
cost these two will be used for
performance marketing so for email
marketing I will be using some other
events that will be 67-68 for this
particular case I will be using only two
and one of them stands for number of
emails that were sent and the second one
emails opened so how many mails were
viewed and obviously if we were able to
track that but basically the email
platform was able to track that okay so
this is our template
so now let's fill it in so basically the
date similar to how we filled it in for
previous work flows we need to enter the
date so for example this will be 18th of
December 2019 I don't need to specify
here the time but if I want you I can do
this
by the way again if you hover the mouse
cursor over different variables you may
find there some tips or explanation of
the format of what this variable is used
for etcetera so do not forget about it
so enter the date and then I will be
first filling in this spreadsheet for
paid search which is in my case AdWords
then obviously this data has to be
exported from the ad platform then
somehow processed so that you will have
this format but for simplicity for this
particular video I will be typing in
this just to demonstrate how it works so
for example here we had thousand
impressions our ad cost was for example
$50 or 50 something else it depends on
the currency that you set for the report
suite then for the mail marketing
obviously this will be 0 we don't need
this numbers and I will just copy and
paste the same row and now it will be
emailed so just to remind you that email
and adwords are the keys that I used
for tracking code or for tracking
purposes and here for email marketing ad
impressions and ad cost obviously will
have to be set to zero and emails sent
and opened so this is something that I
need to enter and for example this would
be 10,000 and opened
8,500 okay so pretty much the same I
have to do for every day and this is how
it should work so for example maybe here
we had some other numbers and I hope you
got the idea how it may look like and
when you prepare such data source the
next step is to upload this data source
to Adobe Analytics and for this
purpose you need to add an FTP so how to
do this we will need to go back to the
main menu and click on the ftp accounts
link and this will add a new tab that
will be filled in with your current ftp
accounts that you have already saved
under your Google account so basically
it doesn't matter which spreadsheet you
will be working with if you work with
this add-on you will see all the
accounts that you have set up this is
convenient because maybe you have
different spreadsheets that are used to
send the data to the same FTP account so
here let's add a new FTP so I advise you
to always follow some naming conventions
my naming convention is the following
first I want to make sure that the
prefix tells me what this FTP account is
used for because it can be for a
classification can be for data sources
can be for something else so in my case
this is data source and then if you want
you can also add a suffix or maybe
another prefix for the report suite but in
my case I know that this is used only
for one report suite this is why I will
not enter something referring to the
report suite so I will just add that this is
tracking code so then I will need to
enter FTP host username and password
this is something that you will need to
find
or set up on the back end of Adobe
Analytics and fill it in and once you
have added your data source and your FTP
account do not forget to save it it's
essential that you first saved the data
otherwise it will not be available for
you when you load the data so save it
alright so now when the FTP account is
saved you will see that the password is
not shown and this is done on purpose
for security reasons but if you want to
update the name of the FTP account if
you want to update the username it's not
a problem to do that you don't need to
re-enter the password in this case if
you want to update the password you need
to reenter it and then again click Save
so I also want to remind you that if you
want to delete an FTP account you need
to delete the row with that FTP account
and again save the FTP otherwise it will
the changes will not be saved so now I
will re-enter my FTP account and once
again enter the password
and save it once the FTP accounts are saved I
advise you to always click on the hide
FTP sheet so that you will not see it
and this is again a good idea for
security reasons especially if you want
to share this with your colleagues so
that nobody will see the real passwords
alright so once the FTP is set up what
we need to do is to go to load and
schedule so this user interface looks
very similar to how it looks for
clickstream data workflow so first you
need to select the data sheet the data
should be sent from so in our case this
is data source and for the FTP accounts
you will need to select the FTP account
that you set
for this data source all right so now
basically this is it this is the minimum
what you need to set up before you
load the data to Adobe Analytics and
let's try to do this I'll make a small
mistake on purpose so that you will see
some statuses that may appear that you
may face with so let's load the data and
you will see what I'm referring to
once you have clicked on the load Now
button the data started processing and
once the data is processed you will see
the status and the date of the of the
processing and here we have a status
failure meaning that there was a mistake
somewhere or a problem and afterwards
you will see the explanation of what was
wrong so here we can see that the
username and passwords are incorrect so
I made it on made that mistake on
purpose so that you will see what type
of statuses may be and so that you will
understand how this works so now what we
need to do is go back to FTPs and I will
replace the FTP credentials with the
correct ones
save FTP and then hide this alright
so now I will try again to load the data
and let's click on this button so now
you can see that in addition to the
general status that tells you that the
data was successfully loaded and it also
shows you the file name the data was
uploaded to Adobe Analytics in and also
for every line it shows you the status
because sometimes not every line can be
included into the file that will be
sent to Adobe Analytics so here we
know that every line that is in our
spreadsheet was successfully uploaded to
Adobe Analytics and this is how it
works but basically you know we have
uploaded the data successfully now let's
talk about the options that we have here
the first option is unloaded rows or all
rows so if we click again on load now
you will see that nothing will happen
with the data you can see that the
general status tells you that no data
was loaded at that time and that also
shows you when the data was processed
why the data wasn't loaded because all
the line items in the table in your data
were loaded previously and this is used
to avoid duplicated uploads to Adobe
Analytics because if you have uploaded
something why do you need to upload it
again but if you really have such a case
and there are some cases but I'm pretty sure
that in your case maybe you will always
use only unloaded rows but anyway if you
understand that you may want to load the
data more than once you need to switch
to all rows now if I click on the load
Now button you will see that for every
line there will be a new date or
timestamp under the column B so let's do
this and this will mean that every line
will be uploaded
again so we can see that the file was
successfully loaded we can see that the
time has been updated and the status
tells us that all the lines were loaded
now let's talk about scheduling
sometimes you really want your data to
be loaded automatically on schedule and
let's imagine the following scenario for
email marketing we have our forecast
what this forecast tells us is how many
emails we plan to send in the future and
maybe this is our kind of target
operational target that we should send
for example I know hundred thousand
emails for next month
maybe this is your business target like
for example how many conversions should
happen from those emails or how many
leads or how much revenue you should get
from this or that campaign and if you
have such numbers that you plan or set
as targets you will need to upload them
with future dates but the thing is that
when you upload the data sources with
the dates that are in the future these
data sources will not be
processed by Adobe Analytics by design
because only the data sources with the
dates in the past can be processed and
in this case you would need to manually
load almost every day the new data if
you want to have them synchronized with
your reports and this is not convenient
and now I'm going to show you how you
can automate it with this add-on for
this purpose I will create a new sheet we'll go
back to dimensions and metrics and I
will add the variables date tracking
code and I will add one event for email
marketing specifically that will be used
for forecasted
number of emails so this will look like
this so now I will also rename the tab
so that it will be clear what this data
source is and I will enter here email
obviously the email id or tracking code
will be everywhere in this table and
this will be forecasted number of emails
sent and here for example I will enter
some values obviously this is something
that I have to forecast first but this
is just for example so something like
that and as for the dates today is 29th
of December so we will emulate this
like we started from 20th of December
2019 and let's say the next line is in the
future so for example this will be first
of January 2020 and other days so if now
we will try to load this data you will
see that only first line will be
processed because 20th of December is in
the past while the other dates are in
the future and they can't be processed
by data source so let's try to load this
and to the same FTP so I forgot that the
data sheet should now be the data source
for custom URLs all right
so now what you can see that the data
was processed and was uploaded to a
Debian latex but the file included only
one line which is this one that has
status loaded the other lines have
status skipped this means that these
lines were not sent to Adobe Analytics
at the time the spreadsheet was
processed just because the dates are in
the future however if you come back to
the spreadsheet on the 2nd of January
and click load now you will see
that a few other lines will be loaded
successfully so on the 2nd of January
these two lines will be loaded
successfully just because these dates
already either in the past or one of the
dates is today on that particular
day but the other lines will not be
loaded they will still be with the
status skipped now imagine that you may
want to load this manually and in this
case you will need to open this
every day or every week and click the
button it's not convenient and what you
can do instead you can schedule this and
you can schedule this either to be
processed hourly or daily in my case in
this scenario it's enough to set daily
so what this will mean every day the
infrastructure will process this
spreadsheet automatically and obviously
those lines that can be loaded to Adobe
Analytics will be loaded the other
lines will still be in the state of
skipped and they will be processed at
the next iteration in the next job when
this schedule will be activated so now
let's try to activate this schedule
option and once you clicked on this
button you will see that there will be a
list of available scheduled jobs for
data loads so now we can see that this
is the sheet or the tab name telling
us what was scheduled here we can see
the option which means that only the
unloaded rows will be uploaded then we
can see to what FTP the data is going to
be loaded then we can see the schedule
this is daily if you have more than one
job you will see them in the list here
in this area if you don't want to load
this data automatically maybe one day
you understood that all the data is already
loaded so what you can do you can
delete this job click on this red button
and the schedule will be deleted so this
is how it works
so just to summarize when you want to
load data sources you need to use these
three these three links plus FTP
accounts and you can start with a
template then you can update the
template the way you want just make sure
that the first line is something that
you can update the way you want and the
second line shouldn't be changed at any
time the blue columns will appear after
the data is loaded these two columns are
not included in the template so only the
lines with all the columns with the
yellow color are included in your data
load and you can load the data either
manually or automatically with the
schedule and the only thing that you
need also to setup here is the FTP
account the data should be sent to this
is how data sources work here in this
data connector for Adobe Analytics
add-on for Google sheets hope this was
helpful thank you for watching