Tableau Desktop Specialist Exam Practice Questions - Part 1 | Become a Certified Tableau Developer

AllAboutDATA
8 Oct 2023 · 17:31

Summary

TLDR: This YouTube video introduces a series on Tableau Desktop certification, offering tips and study materials for exam success. It covers exam logistics, including the structure of 45 multiple-choice and multiple-select questions, the 60-minute time limit, scoring, and fees. It then delves into the exam's four domains, focusing on 'Connecting to and Preparing Data' with topics like live vs. extract connections, metadata, and data relationships. The video promises a question-by-question breakdown in subsequent episodes, aiming to clarify concepts and boost certification chances.

Takeaways

  • πŸ˜€ The video is part of a Tableau Desktop Certification series aimed at sharing tips and tricks for exam success and providing study material.
  • πŸ“š Viewers are encouraged to subscribe to the channel and hit the Bell icon for updates on future videos.
  • ⏱ The exam consists of 45 multiple-choice and multiple-select questions with a time limit of 60 minutes and a passing score of 750 out of 1000.
  • πŸ’° The exam's registration fee is $100 plus GST.
  • πŸ‘©β€πŸ’» The target audience is individuals with at least 3 months of experience using Tableau Desktop.
  • πŸ”‘ The certification, once obtained, does not expire and is valid for a lifetime.
  • πŸ“ˆ The exam is divided into four domains, with varying numbers of questions expected from each.
  • πŸ” Domain one focuses on connecting to and preparing data, with topics such as live versus extract connections, metadata, and data relationships.
  • πŸ“ The video series will discuss 20 questions per video, aiming to cover as many as possible to assist viewers in passing the exam.
  • πŸ”‘ The first question of the series addresses when to use data blending to combine data, with the correct answer being when using data sources that cannot be combined with the default method of using a relationship.
  • πŸ”„ The difference between live and extract connections is highlighted, with live allowing real-time data and extract requiring periodic refreshes.
  • 🚫 Direct connections to extracts are not recommended due to various limitations, such as inability to refresh and potential data model loss.

Q & A

  • What is the primary purpose of the certification series discussed in the video?

    -The primary purpose of the certification series is not only to help viewers clear the certification exam but also to help them gain knowledge and skills related to Tableau Desktop.

  • How many questions are there in the Tableau Desktop certification exam?

    -There are 45 questions in the Tableau Desktop certification exam, which are in the form of multiple-choice and multiple-select items.

  • What is the time limit for completing the Tableau Desktop certification exam?

    -The time limit for the Tableau Desktop certification exam is 60 minutes.

  • What is the passing score for the Tableau Desktop certification exam?

    -The passing score for the Tableau Desktop certification exam is 750 out of 1000.

  • What is the registration fee for the Tableau Desktop certification exam, including GST?

    -The registration fee for the Tableau Desktop certification exam is $100 plus GST.

  • Who is the target audience for the Tableau Desktop certification exam?

    -The target audience for the Tableau Desktop certification exam is individuals who have Tableau Desktop skills and a minimum of 3 months of experience.

  • How many domains are there in the Tableau Desktop certification exam, and what are they?

    -There are four domains in the Tableau Desktop certification exam: 1) Connecting to and preparing data, 2) Exploring and analyzing the data, 3) Sharing insights, and 4) Understanding Tableau Concepts.

  • What are the focus topics for Domain 1: Connecting to and preparing data in the certification exam?

    -The focus topics for Domain 1 include live versus extract, what happens when you connect to the data, metadata, table extensions, relationships versus joins versus blends, and multiple ways to rename a field, among others.

  • What is the correct answer to the question 'When should you use data blending to combine data?'

    -The correct answer is when using data sources that cannot be combined with the default method of using a relationship.

  • What are the differences between live and extract connections in Tableau Desktop?

    -Live connections allow real-time data and work directly with the data source, reflecting any changes made to the dataset. Extract connections create a subset of the data that needs to be refreshed periodically. Live connections take more time to load data compared to extract connections.

  • Why are direct connections to the extract not recommended in Tableau Desktop?

    -Direct connections to the extract are not recommended because table names will be different, the extract cannot be refreshed, and the data model and relationships will be lost.

  • How many data sources can be blended in Tableau Desktop?

    -In Tableau Desktop, you can blend two data sources, creating a primary and a secondary dataset.

  • What is the advantage of using relationships to combine tables in Tableau Desktop?

    -The advantages of using relationships to combine tables include making it easier to analyze data across multiple tables at different levels of granularity and ensuring that tables are only queried when fields from the tables are added to the views.

  • What happens when you assign a Geographic role to a field in Tableau Desktop?

    -When you assign a Geographic role to a field in Tableau Desktop, the software adds two fields: latitude and longitude.

  • What are the options available while creating an extract in Tableau Desktop?

    -The options available while creating an extract in Tableau Desktop include filters, aggregation, number of rows, hide unused fields, and history and data storage.
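
Several of the answers above (relationship advantages, granularity, unions) come down to how row counts behave when tables are combined. The pandas sketch below is only an analogy, not Tableau itself: it shows why a flat join across tables of different granularity can inflate totals, which is the problem relationship-based models avoid by querying each table at its own level of detail. All table and column names are made up.

```python
import pandas as pd

# Hypothetical tables at different granularity: one row per order vs. one row per item.
orders = pd.DataFrame({"OrderID": [1, 2], "Amount": [100, 200]})
items = pd.DataFrame({"OrderID": [1, 1, 2], "Qty": [2, 3, 1]})

# A flat join repeats each order once per matching item row...
joined = orders.merge(items, on="OrderID")
print(joined["Amount"].sum())   # 400 -- order 1's amount counted twice, total inflated

# ...whereas querying each table separately keeps both totals correct,
# which is roughly what a relationship-based data model does.
print(orders["Amount"].sum())   # 300
print(items["Qty"].sum())       # 6
```

The inflated 400 versus the correct 300 is the double-counting effect the exam's relationship questions are getting at.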

Outlines

00:00

πŸ“š Introduction to Tableau Desktop Certification Series

The script introduces a new series on the YouTube channel dedicated to Tableau Desktop certification. The host offers to share tips, tricks, and study materials for exam preparation. Viewers are encouraged to subscribe and enable notifications for updates. The certification aims to validate skills and knowledge, with the exam consisting of 45 multiple-choice and multiple-select questions to be completed within 60 minutes. A passing score is 750 out of 1000, with a registration fee of $100 plus GST. The target audience includes individuals with at least three months of experience using Tableau Desktop. The certification, once obtained, does not expire and is valid for life. The video will cover the exam pattern, which is divided into four domains, with the first domain focusing on connecting to and preparing data.

05:01

πŸ“ Exam Details and Domain One Overview

This paragraph delves into the specifics of the certification exam, including the number of questions, types of questions, and the time limit. It outlines the passing score and registration fees. The script then moves on to describe the first domain in detail, which includes topics such as live versus extract connections, metadata, table extensions, relationships, joins, blends, and field renaming. The host plans to discuss 20 questions per video, aiming to cover as many as possible in the series to aid viewers in passing the exam. The first question discussed is about the appropriate use of data blending, with options provided and the correct answer explained.

10:07

πŸ” Differences Between Live and Extract Connections

The script continues with a comparison between live and extract connections in Tableau. It explains that live connections allow for real-time data interaction, while extracts are subsets of data that need periodic refreshing. Live connections are said to take more time to load data due to handling full data sets, unlike extracts. The paragraph also addresses misconceptions about live connections not slowing down operations or queries and clarifies that both connections have distinct functionalities. The correct options for the differences between live and extract are identified, with the script guiding viewers on the correct answers to questions about direct connections to extracts and data source details.

15:11

πŸ—ΊοΈ Geographic Roles and Data Source Editing

This section of the script discusses the assignment of geographic roles to fields in Tableau, which by default adds latitude and longitude fields. It also covers how to change the data type of a field within a view and add default comments to a field. The script provides step-by-step instructions for these actions. Additionally, it explores the effects of using a union to combine tables, explaining that it typically results in an increase in the number of rows. The paragraph concludes with a question about the color of tick marks for primary and secondary data sources in a blend, revealing the correct answer through a visual example.

πŸ› οΈ Managing Metadata and Creating Extracts

The final paragraph of the script focuses on managing metadata in Tableau, which allows users to view hidden fields, field names in the original data source, and to clean and fix issues. It also addresses the automatic assignment of data types and roles when connecting to a data source. The script clarifies that Tableau creates a live connection by default. Lastly, it discusses the options available when creating an extract, such as filtering, aggregating, specifying the number of rows, hiding unused fields, and managing history and data storage. The correct answers for the available options while creating an extract are provided, and the video concludes with a call to action for likes, shares, subscriptions, and staying tuned for future videos.

Keywords

πŸ’‘Tableau Certification

Tableau Certification is a professional credential offered by Tableau Software that validates one's skills in using Tableau Desktop for data visualization and analysis. In the video, the host is starting a series dedicated to helping viewers prepare for the certification exam, indicating the importance of this certification in the field of data analytics.

πŸ’‘Multiple Choice and Multiple Select

These terms refer to the types of questions that will appear on the Tableau certification exam. Multiple-choice questions have one correct answer, while multiple-select items allow for more than one correct answer. The script mentions that multiple-select items carry more weightage than multiple-choice, implying they may be more complex or carry more weight in scoring.

πŸ’‘Time Limit

The 'Time Limit' of 60 minutes is the maximum duration allowed for candidates to complete the Tableau certification exam. This constraint is crucial for exam preparation, as it sets the pace for how quickly one must work through the questions.

πŸ’‘Passing Score

A 'Passing Score' of 750 out of 1,000 is the minimum score required to pass the Tableau certification exam. This score represents the threshold that candidates must exceed to earn the certification, emphasizing the need for a strong understanding of Tableau Desktop.

πŸ’‘Registration Fees

The 'Registration Fees' of $100 plus GST is the cost associated with signing up for the Tableau certification exam. This fee is an important consideration for potential candidates who must budget for the exam as part of their certification journey.

πŸ’‘Target Audience

The 'Target Audience' for the Tableau certification exam is individuals who have skills with Tableau Desktop and a minimum of 3 months of experience. This indicates that the exam is designed for those who have practical experience with the software and are looking to validate their abilities.

πŸ’‘Exam Pattern

The 'Exam Pattern' refers to the structure of the certification exam, which is divided into four domains. Each domain covers different aspects of Tableau Desktop usage, and the script outlines the approximate number of questions and topics that candidates can expect from each domain.

πŸ’‘Data Blending

Data Blending is a technique used in Tableau to combine data from different sources when the data cannot be combined using a relationship. The script discusses when to use data blending, such as when using data sources that cannot be combined with the default method of using a relationship.
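
As a rough analogy (this is pandas, not Tableau's engine), a blend behaves like aggregating the secondary source up to the linking field and then left-joining it onto the primary source, so the primary row count is preserved. The table and column names here are invented for illustration.

```python
import pandas as pd

# Hypothetical primary and secondary sources sharing a "Region" linking field.
primary = pd.DataFrame({
    "Region": ["East", "West", "East"],
    "Sales":  [100, 200, 50],
})
secondary = pd.DataFrame({
    "Region": ["East", "West", "West"],
    "Target": [120, 90, 110],
})

# The secondary source is aggregated to the blend field before joining,
# so the primary keeps its 3 rows; West's two targets collapse to 200.
agg = secondary.groupby("Region", as_index=False)["Target"].sum()
blended = primary.merge(agg, on="Region", how="left")
print(blended)
```

Note the asymmetry: the primary drives the row count, and the secondary is always aggregated, which mirrors why a blend is not interchangeable with a join.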

πŸ’‘Live vs. Extract Connection

The terms 'Live' and 'Extract Connection' refer to two different ways of connecting to data sources in Tableau. Live connections allow for real-time data interaction, while extract connections involve a subset of data that needs to be refreshed periodically. The script differentiates between these two by explaining their functionalities and implications for data analysis.
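
A minimal Python sketch (purely illustrative, not the Tableau API) of the behavioral difference: a live connection reads through to the source on every query, while an extract serves a cached snapshot until it is refreshed. All class and method names here are made up.

```python
class Backend:
    """Stand-in for an underlying database table."""
    def __init__(self):
        self.rows = [1, 2, 3]

class LiveConnection:
    def __init__(self, backend):
        self.backend = backend
    def read(self):
        return list(self.backend.rows)  # always hits the source

class ExtractConnection:
    def __init__(self, backend):
        self.backend = backend
        self.snapshot = list(backend.rows)  # subset/copy taken at creation
    def read(self):
        return self.snapshot  # serves cached data only
    def refresh(self):
        self.snapshot = list(self.backend.rows)

backend = Backend()
live = LiveConnection(backend)
extract = ExtractConnection(backend)

backend.rows.append(4)   # the underlying data changes
print(live.read())       # [1, 2, 3, 4] -- live sees the change immediately
print(extract.read())    # [1, 2, 3]    -- extract still shows the old snapshot
extract.refresh()
print(extract.read())    # [1, 2, 3, 4] -- after refresh, extract catches up
```

This is the core of exam question two: live is current but slower per query; an extract is fast but stale until refreshed.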

πŸ’‘Metadata

In the context of Tableau, 'Metadata' refers to the information about the data source, such as data type, field names, and other properties. The script mentions metadata in relation to managing and understanding the structure of the data, which is crucial for effective data analysis and preparation.

πŸ’‘Pivot and Split

Pivot and Split are features in Tableau that allow users to reshape data fields. Pivoting converts columns into rows, turning a wide (crosstab) layout into a tall one, while splitting divides a field into multiple fields based on a separator. These features are essential for reshaping data to fit specific analysis needs, as mentioned in the script.

πŸ’‘Data Interpreter

The 'Data Interpreter' in Tableau is a tool that helps clean spreadsheet data for analysis. It can detect and strip out extraneous titles, notes, and empty cells, and identify the actual headers and values, which is an important step in the data preparation process discussed in the script.

πŸ’‘Union

A 'Union' in Tableau is a method of combining data from two tables by appending rows from one table to another. The script mentions that a union typically results in an increase in the number of rows, which is an important consideration when merging datasets.
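
The row-count effect is easy to see with a pandas sketch (an analogy, not Tableau; the monthly tables are hypothetical): unioning two tables with the same schema stacks their rows, so rows increase while the column set stays the same.

```python
import pandas as pd

# Two hypothetical monthly tables with an identical schema.
jan = pd.DataFrame({"Product": ["A", "B"], "Sales": [10, 20]})
feb = pd.DataFrame({"Product": ["A", "C"], "Sales": [15, 5]})

# A union appends rows: 2 + 2 rows in, 4 rows out, still 2 columns.
union = pd.concat([jan, feb], ignore_index=True)
print(union.shape)  # (4, 2)
```

Contrast this with a join, which adds columns from the other table rather than rows beneath it.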

πŸ’‘Geographic Role

Assigning a 'Geographic Role' to a field in Tableau allows the software to recognize the field as a geographic identifier, such as latitude and longitude. This enables users to perform spatial analysis and visualizations. The script notes that assigning this role adds specific fields to the dataset.

πŸ’‘Default Properties

In Tableau, 'Default Properties' are the initial settings that apply to a field when it is added to a visualization. These properties can include comments, data types, and other attributes that define how the field behaves and is displayed. The script provides an example of how to add a comment to a field using default properties.

Highlights

Introduction to the Tableau Desktop Certification series, offering tips and study materials for exam preparation.

The certification aims to not only clear the exam but also to gain practical skills.

Details of the exam structure, including the number of questions, question types, and time limit.

The passing score and registration fee for the certification exam.

Target audience for the certification, focusing on individuals with at least 3 months of experience with Tableau Desktop.

The certification is valid for a lifetime once obtained.

Exam pattern divided into four domains with the number of questions expected from each.

Domain one focuses on connecting to and preparing data, with key topics such as live versus extract, metadata, and table extensions.

Explanation of multiple-choice and multiple-select item questions and their differences.

Discussion on when to use data blending to combine data sources.

Differences between live and extract connections, including their impact on data loading times and refresh.

Reasons why direct connections to extracts are not recommended.

Information stored in Tableau Data Source (TDS) files, including data source type and connection information.

Limitations on the number of data sources that can be blended in Tableau.

The process of adjusting data granularity through aggregation and the use of 'Group by'.

Advantages of using relationships to combine tables in Tableau.

The correct pathway to edit the data source in Tableau.

The creation of aliases for fields and the types of fields that cannot be aliased.

The addition of geographic roles to fields and the default fields added by Tableau.

The method to change the data type of a field within a view.

Adding default comments to a field and the correct pathway to do so.

The effects of using a union on the number of columns and rows in a dataset.

Use cases for managing metadata in Tableau, including viewing hidden fields and the original field names.

The automatic assignment of data type and role when connecting to a data source in Tableau.

Options available when creating an extract in Tableau, such as filters, aggregation, and data storage.

Transcripts

play00:00

hello guys welcome back to my YouTube

play00:02

channel today I'm going to start Tableau

play00:05

desktop certification series here I will

play00:08

share tips and tricks how to clear the

play00:10

exam as well as their study material

play00:13

before starting this video if you have

play00:14

not subscribed to the channel hit the

play00:16

Subscribe button and press the Bell icon

play00:18

for upcoming and interesting videos so

play00:20

without any further delay let's get

play00:23

started the purpose of the certification

play00:25

series is not only to clear the

play00:28

certification but to gain the knowledge as

play00:31

well first let's discuss some of the

play00:34

details of the exam then we will move to

play00:36

the question so number of the questions

play00:38

are 45 and these question are multiple

play00:41

choice and multiple select items

play00:43

multiple select item question have more

play00:45

weightage than multiple choice we will

play00:47

discuss both type of question in our

play00:49

series and the time limit is 60 Minutes

play00:52

passing scores is 750 out of 1,000 and

play00:56

registration fees is $100 plus GST

play00:59

Target target audience target audience

play01:01

is basically who has skills with Tableau

play01:03

desktop with the minimum of 3 months of

play01:06

experience okay maintaining

play01:09

certification so this certification

play01:11

doesn't expire once you clear this

play01:13

certification this certification will

play01:15

valid

play01:17

lifetime now discuss about the exam

play01:21

pattern so exam is divided into four

play01:24

domains so first domain is connecting to

play01:27

and preparing data you can expect 10 to

play01:30

11 question from this domain domain two

play01:32

is exploring and analyzing the data you

play01:35

can expect 16 to 17 question from this

play01:39

domain domain three is sharing insights

play01:41

you can expect 11 or 11 to 12 question

play01:44

from this

play01:46

domain understanding Tableau Concepts you can

play01:48

expect six to seven question from this

play01:50

domain okay you can expect one or two

play01:54

question extra or less per domain okay

play01:57

so today we are going to discuss domain

play02:00

one domain one is connecting to and

play02:02

preparing the data you will get 10 to 11

play02:05

question from this domain so these are

play02:08

the focus topic for this domain one is

play02:10

live versus extract so you you can

play02:13

expect one or two or maybe three

play02:15

question from this second one is what

play02:17

happen when you connect to the data this

play02:19

is like directly mentioned in our

play02:22

syllabus of the Tableau Desktop series and

play02:25

third one is metadata table extension so

play02:28

mainly focus on TDS you can expect one or

play02:30

two question from the table extension

play02:33

relationship versus join versus blend

play02:36

you can expect 3 to four or maybe more

play02:38

question based on relationship versus

play02:41

join versus blend and then multiple ways

play02:44

to rename a field alias default

play02:47

properties data types and how to change

play02:49

their data type how to access default

play02:52

properties pivot and split and data

play02:54

interpreter so these are 11 topics that

play02:57

are mainly focused in this domain

play03:00

okay now let's move to our question part

play03:04

and discuss some of the question one by

play03:06

one in every video I'm going to discuss

play03:09

20 question and I'll try to discuss as

play03:12

much as question I can in this series so

play03:14

that it would be beneficial for you to

play03:16

clear the exam the first question of our

play03:19

series is when should you use data

play03:22

blending to combine the data let's check

play03:25

their option when the data has same

play03:27

structure but is for different time of

play03:30

period this is wrong because in the data

play03:32

blending we can connect the data from

play03:35

different time period as well okay when

play03:37

data is from same connection this is

play03:39

also wrong because we use data blending

play03:42

to connect data from multiple data

play03:44

sources or multiple connection when the

play03:46

data will be used in multiple worksheet

play03:48

within a workbook okay this is also

play03:51

wrong because for this we can use join

play03:54

and relationship as well so this is also

play03:56

wrong and last one is D when using data

play03:59

sources that cannot be combined with the

play04:02

default method of using a relationship

play04:04

this is right one because let's say we

play04:07

are working uh with the published data

play04:09

source so with published data source the

play04:12

only way to combine the data is data

play04:14

blending okay the answer is

play04:22

D second one is what is the difference

play04:25

between live and extract connection

play04:28

select all that apply let's check their

play04:32

option live allows realtime data while

play04:36

extracts are kind of batches that need

play04:38

to be refreshed from time to time get

play04:41

the updated data okay this is correct

play04:44

because live basically work we are

play04:46

working directly with the data sources

play04:48

and any changes made to the data set

play04:50

will be reflect in our database so this

play04:53

is right and extracts are kind of batches

play04:56

batches is basically subset of the data

play04:58

that needs to be refreshed so this is

play05:01

Right Live connection takes more time to

play05:04

load the data than extract this is also

play05:06

correct because as we are working with

play05:08

the live data set or full set of the

play05:10

data so live take more time to load the

play05:13

data rather than extract connection okay

play05:15

A and B are right live connection

play05:18

doesn't slow down operation queries

play05:20

which result in fetching data quickly

play05:23

this is wrong because if we calculate

play05:25

the time taken by live and extract to

play05:28

fetch out the data so live takes more

play05:31

time as compared to extract so C option

play05:33

is wrong both the connection same time

play05:36

to work with and they don't have any

play05:38

functional difference this is also wrong

play05:40

because live is full

play05:44

data and extract is subset of the data

play05:46

so this is also wrong so the right

play05:48

option

play05:49

is a and

play05:55

b third one is why are the direct

play05:58

connection to the extract is not

play06:00

recommended select all apply the table

play06:03

names will be different this is right

play06:06

extract cannot be refreshed this is also

play06:10

right data model and the relationship

play06:12

will be lost this is also right the extract

play06:14

connection cannot be removed once it is

play06:17

connected directly this is wrong for

play06:19

this let's discuss the official document

play06:21

of the Tableau to know more about this

play06:24

option okay so from this article you can

play06:27

see why we don't connect directly to the

play06:30

extract table names will be different

play06:32

you cannot refresh the extract the data

play06:34

model will be lost okay you can read the

play06:37

more details about here I will provide

play06:39

this link in the description box

play06:44

okay but the right option

play06:48

is

play06:51

a b and c okay now next question is data

play06:57

source. TDS contains only information we

play07:01

need to connect to the data source

play07:03

including the following select all apply

play07:06

so basically in this question we need to

play07:09

tell which type of information TDS store

play07:12

okay so it store data source type this

play07:15

is correct group sets calculated Fields

play07:18

bins this is also correct connection

play07:20

information specified on the data source

play07:23

EX for example data server address Port

play07:26

location of the file table this is also

play07:27

right default field properties for

play07:30

example number format aggregation and so

play07:32

this is also right okay

play07:35

so answer

play07:36

is all

play07:42

option okay next question is how many

play07:45

data sources can be blended in Tableau 2

play07:49

3 4 5 so the right answer is two because

play07:53

when we connect the data by using data

play07:55

blending it create two data set that is

play07:57

one is primary and secondary okay so the

play08:00

right answer is two in relationship we

play08:03

need to define the following join type

play08:06

inner join left join right join or it

play08:09

is not required to define a join in

play08:12

relationship if you know the answer

play08:14

please write down in the comment

play08:16

section right answer is it is not

play08:19

required to define a join in

play08:21

relationship because relationship

play08:23

doesn't require a join to connect the

play08:25

data okay so the right answer is d

play08:32

which option leads to adjusting the

play08:34

granularity of the data in the

play08:37

table aggregate Group by Group by field

play08:42

and extract answer is aggregate because

play08:46

Aggregate and the granularity are

play08:48

opposite to each other if more

play08:50

granularity less aggregation less

play08:53

granularity more aggregation okay so the

play08:55

right answer

play08:58

is

play09:02

a next question is advantage to using

play09:05

relationship to combine table select all

play09:08

apply first option is make it easier to

play09:11

change the key Fields used to combine

play09:13

the table this is wrong because we can

play09:16

do this by using data blending as well

play09:18

so this is not an advantage make it

play09:20

easier to analyze data across multiple

play09:23

tables at different levels of granularity this

play09:26

is correct because relationship keeps

play09:29

both the table as a separated table and

play09:31

we can analyze both the tables at

play09:33

different level of granularity okay

play09:35

makes it easier to combine rows from one

play09:38

table with rows from another this is

play09:40

wrong because we can do this by using

play09:42

Union not with relationship tables are

play09:45

only queried when fields from the

play09:48

tables are added to the Views this is

play09:50

correct right answer is B and

play09:55

D next question is select the correct

play09:58

Pathway to to edit the data source let's

play10:01

try it how we can edit the data

play10:06

source go to data then our data source

play10:10

and then added data source let's check

play10:13

which option is matching with this data

play10:15

menu select a data source and added data

play10:18

source so the right answer is

play10:23

a aliases cannot be created for the

play10:27

following select all applies options are

play10:30

discrete Dimension continuous Dimension

play10:32

measures and dates let's try one by

play10:35

one so this is discrete Dimension when

play10:38

we right click on it we are getting the

play10:40

option so this is wrong and now change

play10:44

it to continuous we are not getting any

play10:47

option to change alias so this is right

play10:49

option let's try on the data as well we

play10:53

are not getting any option to change the

play10:54

alias for the date so this is also right

play10:57

option we can check for for the ship

play10:59

data as well we are not getting and for

play11:03

any measures we are not getting okay so

play11:07

the right answer is b c

play11:17

d next question is when assigning

play11:20

Geographic role to a field table adds

play11:22

two fields in the measures area of the data

play11:24

pane okay so by default Tableau adds five

play11:27

fields in the tables that is measure

play11:32

name latitude longitude and here you

play11:35

sometime you will get order of count

play11:37

number of rows extract count or order of

play11:41

count okay and the fifth one is measure

play11:43

value okay so for but in this question

play11:46

it is asking for only for Geographic

play11:48

role so for Geographic role it adds

play11:50

latitude and longitude so the right

play11:53

answer is

play11:57

a

play12:00

choose the correct path to change the

play12:02

data type of a field in the view let's

play12:05

check how we can change the data type of

play12:07

a field in the

play12:09

view let's say we want to change it for

play12:12

customer name right click on it go to

play12:15

change data type and then select the

play12:18

data type okay let's check which option

play12:21

is matching with this right click the

play12:23

field in the data pane that is common in

play12:25

all the fields change the data type that

play12:28

is also matching

play12:29

and then choose the appropriate data

play12:31

type so B option is matching if you look

play12:34

at this there is one extra step data

play12:36

type that is not available when we check

play12:40

how to change the data type okay so the

play12:42

right answer is

play12:47

B next question is choose the correct

play12:50

path to add a default commands for a

play12:52

field let's see how we can add a

play12:54

commment to a field for this right click

play12:57

on any field

play13:00

then go to default properties then go to

play13:02

comments okay from here you can add the

play13:05

comments let's say hello and click on

play13:09

okay you can see when you over on it you

play13:11

will get the comments okay so let's

play13:14

check which option is matching right

play13:17

click a field in the data pane that is

play13:19

common in all then go to default

play13:21

properties and then command so a option

play13:23

is matching so right answer is

play13:27

a

play13:32

a union of the two tables usually results

play13:34

in increase in the number of columns

play13:37

increase in the number of rows decrease

play13:40

in the number of column decrease in the

play13:41

number of rows so how a union works so

play13:45

Union basically appends the data of the

play13:47

two tables so right option is increase

play13:50

in the number of rows that is

play13:57

B let's move to the next

play14:00

slide next question is when using a

play14:04

blend what is the color of the tick mark

play14:06

on the primary and the secondary data

play14:08

source respectively options are red and

play14:10

blue orange and blue blue and red blue

play14:13

and orange let's check it I have already

play14:17

added two data

play14:18

sources let's drag one field from

play14:21

primary and one field from secondary

play14:24

so you can see blue color is assigned to

play14:26

the primary and the orange assigned to

play14:29

secondary the right answer is

play14:38

D next question is valid use case of

play14:41

managing metadata select all apply

play14:44

options are to view all hidden Fields

play14:47

this is correct because by using metadata

play14:49

we can view all the hidden fields from

play14:51

the dashboards or the worksheets to see

play14:54

the field name in the original data so

play14:55

this is also correct to see the table a

play14:58

field belong this is also correct to

play15:00

clean and automatically fix the issue

play15:02

this is wrong so the right answer is a b

play15:05

and

play15:10

c when user connect to table the data

play15:14

fields in their data set are

play15:16

automatically assigned a dash and dash

play15:19

options are type and Rule data type and

play15:22

value role and type Dimension and

play15:25

measures the right answer is first we

play15:28

assign type type and then assign role so

play15:30

right answer is

play15:35

a so what are type and role type is

play15:38

basically data type Tableau assigns a

play15:41

specific data type to each field based

play15:43

on their data and role is whether it is

play15:47

discrete or continuous Dimension or

play15:49

discrete or continuous measure

play15:52

okay dash is used so that labels

play15:56

appear differently in the view so alias by

play16:00

default what does Tableau do when you

play16:02

connect to the data so Tableau creates a

play16:04

live connection to the data so if you

play16:06

notice whenever we connect to the data

play16:09

Tableau automatically connects data to the

play16:11

live connection okay so the right answer

play16:12

is create live connection to the data

play16:15

this question is important because this

play16:17

is directly mentioned in the

play16:19

syllabus let's move to the last question

play16:22

of this video what are the options

play16:24

available while creating extract select

play16:27

all apply first let's go and create

play16:37

extract click on

play16:39

extract go to edit and you can see we

play16:43

are getting filter option aggregate

play16:47

number of rows hide unused field and

play16:50

history and data storage as well okay

play16:53

let's see how many options are matching

play16:55

with

play16:57

this

play17:01

filters aggregation hide unused field

play17:04

and history means all are matching the

play17:07

right answer

play17:11

is

play17:13

all I have not added all in the option

play17:16

menu because I want you to check all the

play17:18

option and then select correct options

play17:21

okay that's it for this video thank you

play17:24

so much guys if you like the video

play17:26

please do like share and subscribe and

play17:28

stay tuned for upcoming parts


Related Tags
Tableau Certification, Data Analysis, Exam Tips, Study Material, Data Blending, Live Connection, Extract Refresh, Data Modeling, Certification Series, Tableau Desktop, Data Visualization