Upload Adobe Analytics Data Sources from Google Sheets with Data Connector Add-on

Andrey Osadchuk
6 Jan 2020 · 25:24

Summary

TLDR: In this tutorial, Andrey Osadchuk introduces the Google Sheets add-on for Adobe Analytics, guiding viewers through creating data source templates, uploading offline data, and setting up FTP accounts. The video demonstrates how to modify templates, input data for analytics, and automate data loading schedules, ensuring seamless integration with Adobe Analytics for enhanced data management.

Takeaways

  • 😀 The video is about using the Google Sheets data connector for Adobe Analytics to upload offline data from Google Sheets to Adobe Analytics.
  • 😀 The process starts by creating a template in Google Sheets, which contains three areas: friendly names for variables, information about the variables, and the main data entry area.
  • 😀 Users can customize the template by adding or removing columns, but they must keep the date column as the first column.
  • 😀 To upload data, users need to set up an FTP account with Adobe Analytics and save the account details in Google Sheets.
  • 😀 The data upload process involves selecting the data sheet, choosing the FTP account, and clicking the 'Load Now' button to start the upload.
  • 😀 If there are any errors during the upload, such as incorrect FTP credentials, the status will indicate the issue.
  • 😀 Users can schedule data uploads to happen automatically at regular intervals, such as daily or hourly.
  • 😀 The add-on allows for the uploading of future data by scheduling uploads, ensuring data is synchronized with reports over time.
  • 😀 The video also explains how to handle scenarios where data from various platforms, like ad platforms or email marketing, needs to be manually uploaded to Adobe Analytics.
  • 😀 The video provides a detailed walkthrough of setting up the template, filling it with data, configuring the FTP account, and automating the data upload process.
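Behind these steps, the upload itself boils down to serializing the sheet's rows into a tab-delimited text file, the format Adobe Analytics data sources ingest. A minimal sketch of that serialization step (the function and column names are illustrative, not the add-on's actual code):

```python
# Sketch: Google Sheets rows become a tab-delimited data-source file.
# Column names below are illustrative examples, not the add-on's code.

def rows_to_datasource(header, rows):
    """Join a header row and data rows into tab-delimited lines."""
    lines = ["\t".join(header)]
    for row in rows:
        lines.append("\t".join(str(cell) for cell in row))
    return "\n".join(lines) + "\n"

header = ["Date", "Tracking Code", "Event 60", "Event 61"]
rows = [
    ["12/18/2019", "adwords", 1000, 50],
    ["12/18/2019", "email", 0, 0],
]
print(rows_to_datasource(header, rows))
```

The resulting file is what ends up on the Adobe data-sources FTP account described later in the video.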

Q & A

  • What is the main purpose of the Google Sheets add-on for Adobe Analytics discussed in the video?

    -The main purpose of the add-on is to facilitate the process of uploading offline data directly to Adobe Analytics from Google Sheets, allowing users to work with data sources more efficiently.

  • How does the template creation process begin in the Google Sheets add-on?

    -The template creation begins by clicking on the 'Create a template' link, which appends a new tab to the spreadsheet and switches the sidebar to a list of variables compatible with data sources.

  • What are the three main areas of the template mentioned in the video?

    -The three main areas of the template are: 1) the yellow line for friendly names of variables, 2) the gray line for information about the variables used, and 3) the area starting from line number 3 for entering data to be uploaded to Adobe Analytics.

  • Why should the gray line in the template not be changed?

    -The gray line should not be changed because it contains information about the variables used for the data source, and altering it may cause the add-on to not work properly.

  • How can users customize the template to fit their specific data needs?

    -Users can customize the template by deleting unnecessary columns, such as a transaction ID, and adding new variables like 'number of orders', 'number of units', and 'revenue' using the 'Add to sheet' button.

  • What is the significance of the date column in the template?

    -The date column is significant as it serves as the timestamp for the data being uploaded to Adobe Analytics, and it should always be the first column in the template.

  • How does the video demonstrate the use of the add-on for uploading upper funnel data from ad platforms?

    -The video demonstrates this by updating the template to include tracking codes and metrics like impressions, clicks, and ad cost, and then filling in the spreadsheet with relevant data for both paid search and email marketing.

  • What is the role of the FTP account in uploading data to Adobe Analytics?

    -The FTP account is essential for setting up a connection to Adobe Analytics, allowing the data from the Google Sheets add-on to be uploaded and processed.

  • Why is it important to save the data source before attempting to load the data?

    -Saving the data source before loading is crucial because it ensures that the data is available for processing and uploading to Adobe Analytics; unsaved data will not be processed.

  • How can users avoid uploading duplicate data to Adobe Analytics?

    -Users can avoid uploading duplicate data by selecting the 'Unloaded rows' option when loading data, which ensures that only new or previously unloaded data is uploaded.

  • What is the benefit of scheduling data uploads in the add-on?

    -Scheduling data uploads allows for automatic processing and uploading of data at specified intervals, which is convenient for handling data with future dates and ensures that data is synchronized with reports without manual intervention.
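The 'Unloaded rows' behavior from the Q&A above can be sketched as a filter on a per-row upload timestamp (the video notes the add-on writes one into column B after a successful load). This is an illustrative reimplementation of the idea, not the add-on's source:

```python
# Illustrative reimplementation of the 'Unloaded rows' vs 'All rows' choice.
# Each row carries the timestamp written to column B after a successful
# upload; rows without one have never been sent.

def select_rows(rows, mode="unloaded"):
    """rows: list of (data, uploaded_at) pairs. 'unloaded' skips rows that
    already carry an upload timestamp; 'all' re-sends everything."""
    if mode == "all":
        return [data for data, _ in rows]
    return [data for data, uploaded_at in rows if uploaded_at is None]

rows = [
    (["12/18/2019", "adwords", 1000], "2019-12-19 10:00"),  # already uploaded
    (["12/19/2019", "adwords", 1200], None),                # not yet uploaded
]
print(select_rows(rows))              # only the not-yet-uploaded row
print(len(select_rows(rows, "all")))  # both rows
```

The 'All rows' path is the one to use only when a repeated upload is genuinely intended, since Adobe Analytics will count the duplicated metrics again.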

Outlines

00:00

📊 Introduction to Google Sheets Data Connector for Adobe Analytics

Andrey Osadchuk introduces the Google Sheets Data Connector for Adobe Analytics, outlining how to work with data sources and upload offline data directly from Google Sheets to Adobe Analytics. The video guides viewers through creating a template, filling it with data, setting up an FTP account, and scheduling data uploads. The template structure is explained, including the areas for friendly names, variable information, and the main data entry area. The video also covers how to modify the template by adding or removing variables and the importance of keeping the date column first.
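The template structure just described (three areas, date column first) can be pictured, and its one hard rule checked, with a few lines; the variable IDs and values here are hypothetical examples, not the add-on's code:

```python
# Minimal sketch of the three template areas (values are hypothetical).
# Row 1: friendly names (yellow), row 2: variable IDs (gray, do not edit),
# rows 3+: the data to upload.

template = [
    ["Date", "Tracking Code", "Ad Impressions", "Ad Cost"],  # friendly names
    ["date", "trackingcode", "event60", "event61"],          # variable IDs
    ["12/18/2019", "adwords", "1000", "50"],                 # first data row
]

def validate(template):
    """Check the one hard rule from the video: date is the first variable."""
    if template[1][0] != "date":
        raise ValueError("the date column must be the first column")
    return True
```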

05:01

πŸ“ Filling the Template with Data for Adobe Analytics

This section details the process of filling in the template with actual data. The script explains how to input dates and tracking codes for different marketing channels, such as paid search and email marketing. It demonstrates how to rename variables for clarity and how to manually input or copy-paste data into the spreadsheet. The importance of correct data format and the use of tooltips for variable explanations are highlighted. The video also discusses setting up FTP accounts within the add-on for data uploads, emphasizing the convenience of having multiple spreadsheets accessing the same FTP accounts.
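On the data format point: Adobe Analytics data sources expect dates as MM/DD/YYYY, optionally extended to MM/DD/YYYY/HH/MM/SS when a time is supplied. A small helper illustrating that convention (a sketch, not part of the add-on):

```python
from datetime import datetime

def to_datasource_date(dt, with_time=False):
    """Format a timestamp the way Adobe Analytics data sources expect:
    MM/DD/YYYY, optionally extended with /HH/MM/SS when a time is given."""
    fmt = "%m/%d/%Y/%H/%M/%S" if with_time else "%m/%d/%Y"
    return dt.strftime(fmt)

print(to_datasource_date(datetime(2019, 12, 18)))  # 12/18/2019
```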

10:03

πŸ” Setting Up and Managing FTP Accounts for Data Uploads

The script explains how to add a new FTP account in the add-on, advising on naming conventions to distinguish between different uses. It covers the necessity of obtaining the correct FTP host, username, and password from the Adobe Analytics backend and the importance of saving the FTP account details for future use. The video also shows how to update or delete FTP accounts and the security measure of hiding the FTP sheet after setup to protect sensitive information.

15:10

🗓 Loading Data and Handling Errors in Adobe Analytics

This part of the script describes the process of loading data into Adobe Analytics, including selecting the correct data sheet and FTP account. It demonstrates the initial loading process and the potential error messages that may appear, such as incorrect username and password errors. The video shows how to correct these errors by revisiting the FTP account setup and re-saving the credentials. It also explains the status indicators that show whether each line of data was successfully uploaded or skipped.

20:16

🛠 Options for Data Loading and Scheduling Uploads

The script discusses the options available for loading data, such as choosing to upload only unloaded rows to avoid duplication or all rows for repeated uploads. It also introduces the scheduling feature, which automates the data upload process at set intervals. The video provides a scenario for email marketing where forecasted data with future dates needs to be uploaded automatically, explaining how to set up and activate a daily schedule for data processing and upload.

25:17

🔄 Automating Future Data Uploads with Scheduling

This section elaborates on the automation of uploading future-dated data sources by setting up a schedule within the add-on. The script shows how to create a new sheet for forecasted data, enter dates, tracking codes, and forecasted metrics, and then set up a schedule for daily processing. It explains how the system will automatically process and upload data as dates become relevant, and how to manage scheduled jobs, including deleting them when no longer needed.

👋 Conclusion and Final Thoughts

Andrey Osadchuk concludes the video by summarizing the key steps for working with data sources in the Google Sheets Data Connector for Adobe Analytics. He emphasizes the importance of using the template effectively, setting up FTP accounts, and utilizing the load and schedule features for seamless data uploads. The video ends with a thank you note for watching and an invitation for further engagement.

Keywords

💡Google Sheets

Google Sheets is a web-based spreadsheet program offered by Google within the Google Drive service. It allows users to create, edit, and format spreadsheets online. In the video, Google Sheets is used as the platform to create templates and upload offline data to Adobe Analytics through a specific add-on.

💡Adobe Analytics

Adobe Analytics is a web analytics tool that measures and analyzes the impact of digital marketing. It helps businesses understand their customers' behavior and optimize their marketing strategies. The video is focused on how to upload data to Adobe Analytics from Google Sheets.

💡Data Connector

A data connector in the context of the video refers to an add-on that facilitates the connection between Google Sheets and Adobe Analytics, enabling the transfer of data from one platform to another. It's a key component in the process described in the video.

💡Template

In the script, a template refers to a predefined spreadsheet structure that includes specific areas and variables for organizing and entering data. It's used to standardize the format of the data that will be uploaded to Adobe Analytics.

💡FTP Account

FTP stands for File Transfer Protocol, a standard network protocol used for transferring files from one host to another. In the video, setting up an FTP account is a necessary step to enable the automated upload of data from Google Sheets to Adobe Analytics.
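The FTP leg of this workflow can be sketched with Python's standard ftplib. Adobe's data-sources FTP expects the data file followed by an empty `.fin` marker file with the same basename to signal that the transfer is complete. The host, credentials, and function names below are placeholders for illustration:

```python
import io
from ftplib import FTP

def fin_name(filename):
    """Name of the empty .fin marker whose arrival tells Adobe Analytics
    that the data file has finished transferring."""
    return filename.rsplit(".", 1)[0] + ".fin"

def upload_datasource(host, user, password, filename, payload):
    """Upload the tab-delimited data file, then its empty .fin marker."""
    ftp = FTP(host)
    ftp.login(user, password)
    try:
        ftp.storbinary(f"STOR {filename}", io.BytesIO(payload.encode("utf-8")))
        ftp.storbinary(f"STOR {fin_name(filename)}", io.BytesIO(b""))
    finally:
        ftp.quit()

# Placeholder call; host and credentials come from the Adobe Analytics backend:
# upload_datasource("ftp.example.com", "user", "password",
#                   "datasource_20191218.txt", "Date\tTracking Code\n...")
```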

💡Data Source

A data source in Adobe Analytics is a location from which data is collected. The video discusses how to create and upload data sources from Google Sheets to Adobe Analytics, which can include various marketing metrics and dimensions.

💡Offline Data

Offline data refers to data that is collected outside of the main data collection process, such as from external platforms or through manual processes. The video explains how to upload this type of data directly to Adobe Analytics.

💡Variables

Variables in the context of the video are elements within the Google Sheets template that correspond to different data points, such as dimensions and metrics. They are used to customize the template for specific data needs.

💡Tracking Code

A tracking code is a snippet of code used in digital marketing to monitor the performance of marketing campaigns. In the script, tracking codes are used to identify and categorize different data sources, such as AdWords and email campaigns.

💡Classification

In Adobe Analytics, classification refers to the categorization of data into meaningful groups. The video mentions creating a classification tab in Google Sheets to organize data before uploading it to Adobe Analytics.

💡Scheduling

Scheduling in the video refers to the automated process of uploading data to Adobe Analytics at set intervals. It's a feature of the add-on that allows for regular and consistent data updates without manual intervention.
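The behavior that makes scheduling useful here (rows dated in the future are skipped until their date arrives, since Adobe Analytics only processes data-source dates that are not in the future) can be sketched as a simple partition; the names and row layout are illustrative:

```python
from datetime import date

def partition_by_date(rows, today):
    """Split rows into (loadable, skipped): Adobe Analytics only processes
    data-source rows whose date is not in the future."""
    loadable, skipped = [], []
    for row in rows:
        (loadable if row[0] <= today else skipped).append(row)
    return loadable, skipped

rows = [
    [date(2019, 12, 20), "email", 5000],  # in the past: loadable
    [date(2020, 1, 1), "email", 7000],    # future: skipped for now
]
loadable, skipped = partition_by_date(rows, today=date(2019, 12, 29))
```

A scheduled job that reruns this partition daily will pick up each forecast row as its date becomes current, which is exactly what the add-on's schedule automates.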

Highlights

Introduction to the Google Sheets add-on for Adobe Analytics, focusing on working with data sources and uploading offline data.

Creating a template for data entry in the Google Sheets add-on, including the explanation of the three areas within the template.

Instructions on how to modify the template by adding or removing variables to suit specific data needs.

Importance of keeping the date column as the first column when reordering columns in the template.

Demonstration of preparing the clickstream data flow and classification for data sources.

Explanation of uploading upper funnel data from ad platforms to Adobe Analytics manually.

Guidance on updating the template to include relevant metrics like impressions, clicks, and costs for different marketing channels.

Process of filling in the spreadsheet with data for paid search and email marketing, including the importance of correct data format.

Setting up an FTP account within the add-on for data upload to Adobe Analytics.

Instructions on how to save and secure FTP account information within the Google Sheets add-on.

Loading data to Adobe Analytics via the add-on and handling potential errors during the process.

Understanding the difference between uploading unloaded rows and all rows to avoid data duplication.

Automating data uploads with scheduling options for future-dated data sources in Adobe Analytics.

Managing scheduled jobs for data uploads, including the ability to delete or modify them as needed.

Summary of the workflow for using data sources in the Google Sheets add-on for Adobe Analytics, emphasizing the importance of the template, FTP setup, and scheduling.

Transcripts

play00:01

hi this is Andre Otto Schalk and this is

play00:04

Google shoot data connector for Adobe

play00:06

Analytics in this video we'll get to

play00:08

know how to work with data sources how

play00:10

to upload offline data to analytics

play00:12

directly from Google sheets

play00:14

thanks to this add-on so first we will

play00:19

create a template then we will fill in

play00:21

this template with data then we will set

play00:24

up an FTP account and finally load and

play00:27

schedule the data to be sent to Adobe

play00:31

Analytics so first let's create a

play00:33

template and as you see once you have

play00:37

clicked on a link create a template and

play00:40

you tab was appended to the spreadsheet

play00:43

and also the sidebar has been switched

play00:47

to a list of variables that are

play00:49

compatible with data sources so let's

play00:53

first review what we have in the main

play00:55

area so the template itself contains

play00:58

three areas the first area is the yellow

play01:01

line and this is the line where you can

play01:05

type in any type of friendly names for

play01:07

your variables dimensions and metrics

play01:09

the second area is the line highlighted

play01:13

with the gray color and this is for your

play01:16

information so that you will know which

play01:18

wearables you are using for this

play01:20

particular data source so these lines

play01:23

shouldn't be changed at any time

play01:25

otherwise the add-on may not work

play01:27

properly and the main area is here that

play01:31

starts from the line number 3 so this

play01:34

area is where you will need to enter

play01:38

your data that you want to upload to

play01:40

other be analytic and on the right hand

play01:44

side you can see lots of variables all

play01:46

of them are supported by data sources

play01:48

and if you want to update this template

play01:51

you can do this with these variables so

play01:55

for example let's say that I don't want

play01:58

to have in my template for one event one

play02:01

transaction ID I can simply delete these

play02:05

columns and instead or in addition to

play02:08

these columns I want maybe to add number

play02:11

of orders

play02:13

number of units and also revenue so I

play02:18

just click on add to sheet button and

play02:21

these variables will be appended to the

play02:24

template so this is how you can work

play02:28

with the template you can always if you

play02:30

want to you can reorder the columns just

play02:33

make sure that the date column is the

play02:35

first one and this is probably

play02:39

everything what you need to know about

play02:42

templates so now if you worry about the

play02:46

scenario that I'm going to show you in

play02:48

this video in the previous videos we

play02:51

prepare the clickstream data flow and

play02:54

basically here we uploaded some page

play02:57

views and one of the metrics or

play03:02

variables that we used was the tracking

play03:04

code and here we have Adwords and email

play03:08

tracking codes that we used for example

play03:11

then we appended a new tab and this step

play03:16

is used for uploading a classification

play03:18

and as you can see we prepared the

play03:21

classification for several keys and now

play03:24

let's consider the scenario when we want

play03:27

to in reach our data in Adobe electrics

play03:31

with upper funnel data from ad platforms

play03:33

for example not not only for from ad

play03:37

platforms but from any other type of

play03:40

platforms that are not directly

play03:42

connected to the bioethics in our case

play03:45

imagine that for some reason you may not

play03:47

have integration with either DSP or with

play03:52

Google ads directly and probably for

play03:56

some reason you can't do this at the

play03:58

moment but you are happy to regularly

play04:00

upload this data manually so you know

play04:03

that maybe every month maybe twice a

play04:07

month you will export some data from ad

play04:10

platform you will export some data from

play04:12

your email provider and then those upper

play04:15

funnel metrics will be uploaded to other

play04:18

bin oolitic as a data source through

play04:21

these Google sheets so

play04:25

in other words what we will need we will

play04:28

need the tracking code we will need

play04:30

metrics like impressions clicks and at

play04:34

cost if this is email marketing this can

play04:36

be whatever is supported and provided by

play04:39

them email platforms for example how

play04:42

many mails were scheduled how many mails

play04:44

were sent how many were opened etc so I

play04:48

hope this concept is clear and this is

play04:52

just an example what you can do with

play04:54

data sources so this is why I will

play04:57

update the template I don't need to have

play05:00

product column I don't need to have

play05:04

these three columns I will delete them

play05:06

and I will start with the date and

play05:09

tracking code so basically date is our

play05:12

time stamp that we'll be using and for

play05:15

the tracking code for example paid

play05:19

search we will be using a few wins so

play05:24

for example I will take event 60 and 61

play05:29

and that will be used for number of

play05:32

impressions and add cost so let's add

play05:36

them and I will also rename them so that

play05:41

it will be clear for me what this events

play05:43

are used for so ad impressions and ad

play05:51

cost these two will be used for

play05:54

performance marketing so for email

play05:58

marketing I will be using some other

play05:59

events that will be 67-68 for this

play06:06

particular case I will be using only two

play06:09

and one of them stands for number of

play06:14

emails that were sent and the second one

play06:21

emails opened so how many mails were

play06:26

viewed and obviously if we were able to

play06:31

track that but basically the email

play06:34

platform was able to track that okay so

play06:37

this is our template

play06:39

so now let's fill it in so basically the

play06:43

date similar to how we filled it in for

play06:46

previous work flows we need to enter the

play06:52

date so for example this will be 18th of

play06:57

December 2019 I don't need to specify

play07:01

here the time but if I want you I can do

play07:04

this

play07:05

by the way again if you hover the mouse

play07:07

cursor over different variables you may

play07:10

find there some tips or explanation of

play07:12

the format of what this variable is used

play07:15

for etcetera so do not forget about it

play07:18

so enter the date and then I will be

play07:23

first filling in this spreadsheet for

play07:27

paid search which is in my case at worse

play07:30

then obviously this data have to be

play07:34

exported from the add platform then

play07:36

somehow processed so that you will have

play07:39

this format but for simplicity for this

play07:42

particular video I will be typing in

play07:45

this just to demonstrate how it works so

play07:48

for example here we had thousand

play07:51

impressions our ad cost was for example

play07:57

$50 or 50 something else it depends on

play08:00

the currency that you set for the report

play08:02

suite then for the mail marketing

play08:05

obviously this will be 0 we don't need

play08:08

this numbers and I will just copy and

play08:11

paste the same row and now it will be

play08:16

emailed so just to remind you that email

play08:19

and dead words are the keys that I used

play08:22

for tracking code or for tracking

play08:25

purposes and here for email marketing ad

play08:30

impressions and that cost obviously will

play08:31

have to be set to zero and emails sent

play08:35

and opened so this is something that I

play08:38

need to enter and for example this would

play08:42

be 10,000 and opened

play08:49

8,500 okay so pretty much the same I

play08:52

have to do for every day and this is how

play09:00

it should work so for example maybe here

play09:03

we had some other numbers and I hope you

play09:10

got the idea how it may look like and

play09:15

when you prepare such data source the

play09:20

next step is to upload this data source

play09:23

to other the analytics and for this

play09:26

purpose you need to add an FTP so how to

play09:29

do this we will need to go back to the

play09:32

main menu and click on the ftp accounts

play09:35

link and this will add a new tab that

play09:39

will be filled in with your current ftp

play09:42

accounts that you have already saved

play09:45

under your Google account so basically

play09:48

it doesn't matter which spreadsheet you

play09:51

will be working with if you work with

play09:53

this add-on you will see all the

play09:55

accounts that you have set up this is

play09:57

convenient because maybe you have

play09:59

different spreadsheets that are used to

play10:02

send the data to the same FTP account so

play10:07

here let's add a new FTP so I advise you

play10:12

to always follow some naming conventions

play10:15

my name is convention is the following

play10:18

first I want to make sure that the

play10:21

prefix tells me what this FTP account is

play10:24

used for because it can be for a

play10:27

classification can be for data sources

play10:29

can be for something else so in my case

play10:31

this is data source and then if you want

play10:36

you can also add a suffix or maybe

play10:38

another prefix for reports with but in

play10:41

my case I know that this is used only

play10:43

for one reports with this is why I will

play10:45

not enter something referring to reports

play10:48

with so I will just add that this is

play10:50

tracking code so then I will need to

play10:56

enter FTP host username and password

play10:58

this is something that you will need to

play11:00

find

play11:01

or set up on the back end of other be

play11:04

analytic and fill it in and once you

play11:10

have added in you data store and you FTP

play11:13

account do not forget to save it it's

play11:15

essential that you first saved the data

play11:18

otherwise it will not be available for

play11:21

you when you load the data so save it

play11:24

alright so now when the data servers is

play11:28

saved you will see that the password is

play11:31

not shown and this is done on purpose

play11:35

for security reasons but if you want to

play11:37

update the name of the FTP account if

play11:40

you want to update the username it's not

play11:43

a problem to do that you don't need to

play11:45

re-enter the password in this case if

play11:48

you want to update the password you need

play11:49

to reenter it and then again click Save

play11:51

so I also want to remind you that if you

play11:55

want you delete an FTP account you need

play11:59

to delete the row with that FTP account

play12:03

and again save the FTP otherwise it will

play12:08

the changes will not be saved so now I

play12:10

will return my FTP account and toggle

play12:15

again we enter the enter the password

play12:18

and save it once the FTP is are saved I

play12:23

advise you to always click on the hide

play12:26

FTP sheet so that you will not see it

play12:29

and this is again a good idea for

play12:32

security reasons especially if you want

play12:34

to share this with your colleagues so

play12:36

that nobody will see the real passwords

play12:40

alright so once the FTP is set up what

play12:45

we need to do is to go to load and

play12:50

schedule so this user interface looks

play12:54

very similar to how it looks for

play12:56

clickstream data workflow so first you

play13:00

need to select the data sheet the data

play13:03

should be sent from so in our case this

play13:06

is data source and for the FTP accounts

play13:10

you will need to select the FTP account

play13:13

that you set

play13:15

for these data source all right so now

play13:21

basically this is it this is the minimum

play13:25

what you need to set up prior to you

play13:28

load the data to other the analytics and

play13:30

let's try to do this I mean it's a small

play13:35

mistake on purpose so that you will see

play13:38

some statuses that may appear that you

play13:41

may face with so let's load the data and

play13:44

you will see what I'm referring to

play13:47

once you have clicked on the load Now

play13:51

button the data started processing and

play13:55

once the data is processed you will see

play13:58

the status and the date of the of the

play14:02

processing and here we have a status

play14:04

failure meaning that there was a mistake

play14:06

somewhere or a problem and afterwards

play14:10

you will see the explanation of what was

play14:12

wrong so here we can see that the

play14:14

username and passwords are incorrect so

play14:17

I made it on made that mistake on

play14:19

purpose so that you will see what type

play14:21

of statuses may be and so that you will

play14:25

understand how this works so now what we

play14:27

need to do is go back to FTPS and i will

play14:32

replace the FTP credentials with the

play14:37

correct ones

play14:50

save FTP and then hide this alright

play14:56

so now I will try again to load the data

play15:01

and let's click on this button so now

play15:09

you can see that in addition to the

play15:13

general status that tells you that the

play15:17

data was successfully loaded and it also

play15:19

shows you the file name the data was

play15:21

uploaded to a dev analytics in and also

play15:24

for every line it shows you the status

play15:26

because sometimes not every line can be

play15:29

included into the file that will be

play15:32

centered they've been oolitic so here we

play15:34

know that every line that is in our

play15:36

spreadsheet was successfully uploaded to

play15:40

other B analytics and this is how it

play15:43

works but basically you know we have

play15:45

uploaded the data successfully now let's

play15:50

talk about the options that we have here

play15:53

the first option is unloaded rows or all

play15:57

rows so if we click again on load now

play16:00

you will see that nothing will happen

play16:03

with the data you can see that the

play16:06

general status tells you that no data

play16:09

was loaded at that time and that also

play16:11

shows you when the data was processed

play16:13

why the data wasn't loaded because all

play16:16

the line items in the table in your data

play16:19

were loaded previously and this is used

play16:23

to avoid duplicated uploads to other B

play16:26

analytics because if you have uploaded

play16:27

something why do you need to upload it

play16:30

again but if you really have such a case

play16:33

and write some cases but I'm pretty sure

play16:35

that in your case maybe you will always

play16:38

use only unloaded rows but anyway if you

play16:40

understand that you may want to load the

play16:43

data more than once you need to switch

play16:45

to all rows now if I click on the load

play16:48

Now button you will see that for every

play16:49

line there will be in you date or

play16:54

timestamp under the column B so let's do

play16:58

this and this will mean that every line

play17:01

will be upload

play17:02

again so we can see that the file was

play17:06

successfully loaded we can see that the

play17:08

time has been updated and the status

play17:11

tells us that all the lines were loaded

play17:14

now let's talk about scheduling

play17:20

sometimes you really want your data to

play17:23

be loaded automatically on schedule and

play17:27

let's imagine the following scenario for

play17:31

email marketing we have our forecast

play17:34

what this forecast tells us is how many

play17:37

emails we plan to send in the future and

play17:41

maybe this is our kind of target

play17:43

operational target that we should send

play17:45

for example I know hundred thousand

play17:48

emails for next month

play17:51

maybe this is your business target like

play17:55

for example how many conversions should

play17:57

happen from those emails or how many

play18:01

leads or how much revenue you should get

play18:05

from this or that campaign and if you

play18:08

have such numbers that you plant or set

play18:11

as targets you will need to upload them

play18:15

with future dates but the thing is that

play18:18

when you upload the data sources with

play18:20

the dates that are in the future these

play18:23

data sources will be will not be

play18:26

processed by the B analytics by design

play18:28

because only the data sources with the

play18:30

dates in the past can be processed and

play18:34

in this case you would need to manually

play18:37

load almost every day the new data if

play18:40

you want to have the synchronized with

play18:43

your reports and this is not convenient

play18:45

and now I'm going to show you how you

play18:47

can automate it with this add-on for

play18:50

this purpose I will create in U Street

play18:54

so I will create a new sheet we'll go

play18:57

back to dimensions and metrics and I

play19:01

will add the variables date tracking

play19:04

code and I will add one event for email

play19:09

marketing specifically that will be used

play19:12

for forecasted

play19:16

number of emails so this will look like

play19:20

this so now I will also rename the tab

play19:24

so that it will be clear what this data

play19:27

source is and I will enter here email

play19:37

obviously the email id or dragon code

play19:42

will be everywhere in this table and

play19:43

this will be forecasted number of emails

play19:50

sent and here for example I will enter

play19:55

some values obviously this is something

play19:58

that I have to forecast first but this

play20:00

is just for example so something like

play20:15

that and as for the dates today is 29th

play20:20

of December so let's emulate this

play20:23

like we started from 20th of December

play20:30

2019 and what if the next line is in the

play20:35

future so for example this will be first

play20:38

of January 2020 and other days so if now

play20:47

we will try to load this data you will

play20:50

see that only the first line will be

play20:55

processed because 20th of December is in

play20:57

the past while the other dates are in

play21:00

the future and they can't be processed

play21:02

by data source so let's try to load this

play21:04

on to the same FTP. Oh, I forgot that the

play21:08

data sheet should now be the data source

play21:11

for custom URLs. All right

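To make the forecast sheet concrete, here is a sketch of the kind of tab-delimited rows such a data source upload carries. The tracking codes, values, and header labels are made up for illustration, and the add-on generates the real file for you:

```python
import csv
import io

# Illustrative forecast rows: date, email tracking code, forecasted emails sent
rows = [
    ("12/20/2019", "em:newsletter:jan", 3500),
    ("01/01/2020", "em:newsletter:jan", 4000),
    ("01/02/2020", "em:newsletter:jan", 4200),
]

def to_tab_delimited(rows):
    """Serialize rows as a tab-delimited block with a header line."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    writer.writerow(["Date", "Tracking Code", "Forecasted Emails Sent"])
    writer.writerows(rows)
    return buf.getvalue()
```

With a date in December 2019 and two dates in January 2020, only the first row would be processed on 29 December, as the demonstration below shows.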
play21:18

so now you can see that the data

play21:21

was processed and was uploaded to

play21:23

Adobe Analytics but the file included only

play21:27

one line which is this one that has

play21:31

status loaded the other lines have

play21:34

status skipped this means that these

play21:36

lines were not sent to Adobe Analytics

play21:39

at the time the spreadsheet was

play21:41

processed just because the dates are in

play21:44

the future however if you come back to

play21:48

the spreadsheet on the 2nd of January

play21:50

and click Load Now, you will see

play21:52

that a few other lines will be loaded

play21:55

successfully so on the 2nd of January

play21:59

these two lines will be loaded

play22:01

successfully just because these dates

play22:03

already either in the past or one of the

play22:06

dates is today and in that particular

play22:09

date but the other lines will not be

play22:12

loaded they will still be with the

play22:15

status skipped now imagine that you may

play22:18

want to load this manually and in this

play22:22

case you will need to open this spreadsheet

play22:23

every day or every week and click the

play22:26

button it's not convenient and what you

play22:29

can do instead you can schedule this and

play22:32

you can schedule this either to be

play22:34

processed hourly or daily in my case in

play22:38

this scenario it's enough to set daily

play22:41

so what this will mean every day the

play22:45

infrastructure will process this

play22:47

spreadsheet automatically and obviously

play22:50

those lines that can be loaded to Adobe

play22:52

Analytics will be loaded the other

play22:55

lines will still be in the state of

play22:57

skipped and they will be processed at

play22:59

the next iteration in the next job when

play23:03

this schedule will be activated so now

play23:06

let's try to activate this schedule

play23:09

option and once you clicked on this

play23:13

button you will see that there will be a

play23:15

list of available scheduled jobs for

play23:20

data loads so now we can see that this is

play23:23

the sheet or the tab name telling

play23:28

us what was scheduled here we can see

play23:32

the option which means that only the

play23:34

unloaded rows will be uploaded then we

play23:38

can see to what FTP the data is going to

play23:41

be loaded then we can see the scheduled

play23:44

this is daily if you have more than one

play23:47

job you will see them in the list here

play23:49

in this area if you don't want to load

play23:54

this data automatically maybe one day

play23:57

you realize that all the data has already

play23:59

been loaded so what you can do is you can

play24:02

delete this job click on this red button

play24:06

and the schedule will be deleted so this

play24:11

is how it works

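The behavior of the daily job described above can be modeled in a short sketch. This is a hypothetical model of the scheduling logic, not the add-on's code: each run uploads rows whose date has arrived and leaves future rows in the skipped state.

```python
from datetime import date

def run_scheduled_job(rows, statuses, today):
    """One scheduled run over the sheet. rows is a list of
    (date, value) tuples; statuses is a parallel list mutated in
    place ('loaded' or 'skipped'). Returns the rows uploaded now."""
    uploaded = []
    for i, (d, value) in enumerate(rows):
        if statuses[i] == "loaded":
            continue                      # already sent on an earlier run
        if d <= today:
            uploaded.append((d, value))   # the add-on would FTP these rows
            statuses[i] = "loaded"
        else:
            statuses[i] = "skipped"       # deferred until its date arrives
    return uploaded
```

Running it on 29 December uploads only the 20 December row; running it again on 2 January picks up the rows dated 1 and 2 January, matching the behavior demonstrated in the video.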
play24:12

so just to summarize when you want to

play24:15

load data sources you need to use these

play24:19

three links plus FTP

play24:24

accounts and you can start with a

play24:27

template then you can update the

play24:30

template the way you want just make sure

play24:33

that the first line is something that

play24:36

you can update the way you want and the

play24:39

second line shouldn't be changed at any

play24:41

time the blue columns will appear after

play24:44

the data is loaded these two columns are

play24:47

not included in the template so only the

play24:51

lines with all the columns with the

play24:53

yellow color are included in your data

play24:57

load and you can load the data either

play25:01

manually or automatically with the

play25:03

schedule and the only thing that you

play25:06

need to set up here is the FTP

play25:09

account the data should be sent to this

play25:12

is how data sources work here in this

play25:17

Data Connector for Adobe Analytics

play25:19

add-on for Google Sheets hope this was

play25:21

helpful thank you for watching
