Requirements, admin portal, gateways in Microsoft Fabric | DP-600 EXAM PREP (2 of 12)

Learn Microsoft Fabric with Will
18 Apr 2024 · 29:20

Summary

TL;DR: This first chapter of the DP-600 exam preparation course guides viewers through planning a data analytics environment in Microsoft Fabric. It covers identifying solution requirements, including capacity planning, data ingestion methods, and storage options. The chapter also shows how to use the Fabric admin portal for settings management and how to create custom Power BI report themes. Interactive scenarios and practice questions test understanding, aiming to solidify key concepts for the exam.

Takeaways

  • πŸ“˜ The course is designed to prepare for the DP-600 exam, focusing on planning a data analytics environment in Microsoft Fabric.
  • πŸ” The first chapter covers crucial elements like identifying solution requirements, recommending settings in the Fabric admin portal, choosing data gateway types, and creating custom Power BI report themes.
  • 🎯 The goal is to extract client requirements through a workshop to build a plan for a new Fabric environment and potentially secure a new contract.
  • 🏒 Capacity planning involves determining the number and size (SKU) of capacities needed, influenced by factors like data residency regulations, billing preferences, and workload types.
  • πŸ“Š Data ingestion methods are explored, with options including shortcuts, database mirroring, ETL via dataflows or pipelines, and eventstreams, depending on the data source and team skills.
  • πŸš€ The importance of choosing the right data storage option (Lakehouse, data warehouse, or KQL database) based on data type, team skills, and real-time needs is emphasized.
  • πŸ› οΈ The Fabric admin portal is introduced as a key tool for managing tenant settings, capacities, and other organizational configurations.
  • 🌐 Understanding data gateways, both on-premises and virtual network, is crucial for securely accessing and ingesting data into Fabric.
  • 🎨 Customizing Power BI report themes for consistency in reporting is discussed, including updating existing themes or importing new ones via JSON files.
  • πŸ“ The session concludes with practice questions to reinforce learning and a teaser for the next lesson, which will delve into setting up access control, sensitivity labeling, workspaces, and capacities in Fabric.

Q & A

  • What is the main focus of the first chapter in the DP600 exam preparation course?

    -The main focus of the first chapter is how to plan a data analytics environment in Microsoft Fabric, covering key elements such as identifying solution requirements, recommending settings in the Fabric admin portal, choosing data gateway types, and creating custom Power BI report themes.

  • What are the three elements to focus on when identifying requirements for a Fabric solution?

    -The three elements to focus on when identifying requirements for a Fabric solution are capacities, data ingestion methods, and data storage options.

  • How do data residency regulations impact the number of capacities required in a Fabric environment?

    -Data residency regulations can impact the number of capacities required because the capacity dictates where data is stored. For example, if data must reside in the EU due to GDPR, a separate capacity is needed for EU data.

  • What factors can influence the sizing of a capacity in Fabric?

    -The factors that can influence the sizing of a capacity in Fabric include the intensity of expected workloads, such as high volumes of data ingestion, heavy data transformation, and machine learning training. The client's budget and their propensity to wait for data processing also influence capacity sizing.
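The SKU reasoning above can be sketched as a tiny helper. Two assumptions to flag: that Fabric F SKUs run F2 through F2048 with the SKU number equal to its Capacity Units, and that the F64-and-above feature gate works as described in the video; verify both against current Microsoft documentation.

```python
# Illustrative sketch (not an official API): Fabric F-SKU sizes and
# the F64 feature gate. Assumes the SKU number equals its Capacity
# Units (F2 = 2 CUs ... F2048 = 2048 CUs).

F_SKUS = {2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048}

def capacity_units(sku: str) -> int:
    """Return the Capacity Units for an F SKU string like 'F64'."""
    n = int(sku.lstrip("Ff"))
    if n not in F_SKUS:
        raise ValueError(f"unknown Fabric F SKU: {sku}")
    return n

def unlocks_f64_features(sku: str) -> bool:
    """Features such as Copilot require an F64 capacity or above."""
    return capacity_units(sku) >= 64
```

For example, `unlocks_f64_features("F32")` is `False`, so a client who wants Copilot would need at least an F64.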

  • What are the different data ingestion methods available in Fabric, and what are the deciding factors for choosing between them?

    -The data ingestion methods available in Fabric include shortcuts, database mirroring, ETL via dataflows, ETL via data pipelines or notebooks, and eventstreams. The deciding factors include the location of the external data, the skills existing in the team, the volume of data, and the need for real-time data streaming.
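Those deciding factors can be condensed into a small decision helper. The rules and source names below are heuristics paraphrased from the video, not an official decision matrix.

```python
# Heuristic sketch of the ingestion decision described in the video.
# Source names and rules are illustrative, not an official matrix.

SHORTCUT_SOURCES = {"adls_gen2", "amazon_s3", "s3_compatible",
                    "google_cloud_storage", "dataverse"}
MIRROR_SOURCES = {"azure_sql", "azure_cosmos_db", "snowflake"}

def suggest_ingestion(source: str, streaming: bool = False,
                      on_premises: bool = False) -> str:
    if streaming:
        return "eventstream"
    if source in MIRROR_SOURCES:
        return "database mirroring"
    if source in SHORTCUT_SOURCES:
        return "shortcut (ETL remains an option)"
    if on_premises:
        return "ETL via dataflow/pipeline through an on-premises data gateway"
    return "ETL via dataflow, pipeline, or notebook"
```

For an exam-style prompt like "my data is in ADLS Gen2", this returns the shortcut suggestion while noting ETL is still possible.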

  • Why might an organization choose to use the on-premises data gateway in Fabric?

    -An organization might choose the on-premises data gateway to securely access and bring data into Fabric from on-premises SQL Servers or other on-premises data sources.

  • What is the role of the virtual network data gateway in Fabric?

    -The virtual network data gateway provides a secure mechanism to access data stored in Azure behind a virtual network, such as in Blob Storage or ADLS Gen2.
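The two gateway answers reduce to a simple rule of thumb, sketched here; the mapping paraphrases the video's guidance and is not an exhaustive decision tree.

```python
# Rule-of-thumb sketch for gateway choice, paraphrasing the video.

def choose_gateway(on_premises: bool, behind_azure_vnet: bool) -> str:
    if on_premises:
        return "on-premises data gateway"
    if behind_azure_vnet:
        return "virtual network (VNet) data gateway"
    return "no gateway needed; connect to the cloud source directly"
```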

  • What are the key settings that Camila, as a Fabric administrator, needs to understand in the Fabric admin portal?

    -Camila needs to understand key settings such as allowing users to create Fabric items, enabling preview features, managing guest users, single sign-on options, blocking public internet access, enabling Azure Private Link, and allowing service principal access to the Fabric APIs.

  • How can custom Power BI report themes be created and applied?

    -Custom Power BI report themes can be created by updating the current theme in Power BI Desktop, writing a JSON template by hand, or using a third-party online tool. A theme is applied by importing its JSON file in Power BI Desktop under the 'View' tab in the 'Themes' section.
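A minimal theme file can be assembled in any language and saved as JSON; a Python sketch follows. The top-level property names (`name`, `dataColors`, `background`, `foreground`, `tableAccent`) are standard report-theme fields, while the theme name and colours here are placeholder values.

```python
import json

# Minimal Power BI report theme written out as the JSON file you
# would import in Power BI Desktop (View > Themes > Browse for themes).
# The theme name and colour values are placeholders.
theme = {
    "name": "Corporate Theme",
    "dataColors": ["#1F6FEB", "#2DA44E", "#D29922", "#CF222E"],
    "background": "#FFFFFF",
    "foreground": "#24292F",
    "tableAccent": "#1F6FEB",
}

with open("corporate_theme.json", "w") as f:
    json.dump(theme, f, indent=2)
```

Once imported, the report's visuals pick up `dataColors` in order, which is what gives a BI team consistent-looking reports out of the box.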

  • What is the process for setting up a virtual network data gateway in Fabric?

    -Setting up a virtual network data gateway involves registering the Power Platform resource provider in your Azure subscription, creating a private endpoint and subnet in the Azure environment, and then creating a new virtual network data gateway connection in Fabric.


Transcripts

[00:00] Hello and welcome to this first chapter in this DP-600 exam preparation course. We're going to be looking at how to plan a data analytics environment in Fabric. This is the first of 11 chapters that will teach you everything you need to know to hopefully pass the DP-600 exam. In this chapter we're covering exactly what you need to know: if you look at the study guide in Microsoft Learn, these are the elements we'll cover. How do we identify requirements for a solution (the various components, features, performance, capacity SKUs, that kind of thing), and how do we make decisions about that? We'll also look at how to recommend settings in the Fabric admin portal, how to choose data gateway types, and how to create custom Power BI report themes. Towards the end of the lesson we'll be testing your knowledge with five sample questions. Just as a reminder, all of the lesson notes, key points, and links to further learning resources will be published in the Skool community, so if you're not already a member I'll leave a link in the description.

[01:05] Now, you play the main character in a scenario, and this scenario is going to walk you through everything you need to know for those four elements of the study guide. Are you ready? Let's begin. So, you are a

[01:15] consultant, and you're starting your first day on a new project. This is Camila; she is your client for the project. On the phone before the meeting, Camila mentioned that she wants to implement Fabric but doesn't really know where to start, and that's where you come in. You're going to start with a requirements-gathering workshop: you organize a full-day workshop with Camila, the client, to truly understand her business and her requirements. Your goals for this workshop are to extract a set of requirements from the client to help you build a plan for their new Fabric environment, and to do such a great job in planning their environment that the client gives you a new contract by the end of it to build the thing. Okay, so this

[02:00] requirements-gathering workshop: what are you going to ask Camila? What do you need to know? When you're identifying the requirements, you should think about focusing on three elements. To begin with, the capacities: how many do we need, and what sizing do the capacities need to be in this new environment? Then we're going to look at data ingestion methods: there are lots of different ways we can ingest data into Fabric, and you're going to ask a set of questions to deduce the best method for getting data into Fabric based on the requirements. Similarly, we've got data storage: there are three different options for storing data in Fabric, so how do you ask the right questions and identify the requirements to choose the right one?

[02:40] Let's start off thinking about capacity requirements. The requirements we need here are really the number of capacities required and the sizing, so the SKU (stock keeping unit). You probably know by now that in Fabric we have capacities of varying sizes. So what determines the number of capacities required? From previous videos you've probably understood that one of the things that impacts the number of capacities required is compliance with data residency regulations, because the capacity dictates where your data is stored. If you have regulations that dictate your data must reside in the EU, for example for GDPR, that's going to be one capacity in your business; if you have other requirements that say these datasets need to be stored in the US, you're going to have to have a separate capacity for that as well. Another thing that can impact the number of capacities is billing preference: the capacity is how you get billed in Fabric, so some organizations might want to separate the billing between different departments, for example one capacity for the finance department, one for the consulting division, and one for the marketing department. Another thing that could determine the number of capacities you need is segregating by workload type. If you have a lot of heavy, intensive data engineering workloads, you might want to put those in a separate capacity and give it enough resource to do that work in a confined capacity; your serving of business intelligence you might then want to do in a separate capacity, so that the read performance on those dashboards is not impacted by the heavy data engineering or machine learning stuff being done in other capacities. You might also want to segregate by department just through business preference, aligned with that billing preference: some companies like to have their capacities aligned to the various departments within their business. So these are the things you need to extract in terms of requirements when you're talking with this client. And what about the sizing?

[04:35] Well, we've touched on that already, but some of the things that impact the sizing of a capacity are the intensity of the expected workloads. Are you going to be doing high volumes of data ingestion? Are you going to be getting gigabytes, or even terabytes, of fresh data into Fabric every day? These are going to use a lot of your resources, and to go through them quickly it helps if you have a higher capacity. Similarly, heavy data transformation: if you're doing a lot of heavy transformations in Spark, that's going to use a lot of resources, so if that's something you're going to be doing regularly in your business, you want to choose a high capacity. Again, machine learning training can be very resource intensive; it can take hours or sometimes even days to train a machine learning model, so if that's something you'll be doing regularly, you want that on a high capacity. The budget of your client also dictates the sizing of the capacity you're going to choose: obviously the more resources, the higher the SKU, the more expensive it's going to be, and some clients might be very sensitive about cost.

[05:40] Related to that: can you, or the client, afford to wait? If you procure an F2 SKU, it's probably going to get through your data, but it might take a very long time, and in some businesses that might not be a problem. Maybe you're just doing data ingestion once per day: you ingest all of your fresh data, and it might take a lot longer on an F2 capacity, but that's not necessarily a problem; maybe you can do it overnight, and by the time people come in in the morning all of that data has been ingested and transformed and is ready for consumption. So what's your propensity to wait? Some other companies might have regular data coming in every hour, gigabytes of data every hour, and in that scenario you really need a high capacity to be able to churn through all of that and get it processed before the next hourly load.

[06:28] Another thing that can determine the sizing of the capacity is: does the client want access to F64 features? There are actually quite a lot of features that open up when you get to F64, Copilot being a good example currently, and there are many more; I'll list them on the screen here. These features are only available if you choose an F64 capacity or above, so that's something to bear in mind: if you want to use any of these features, you need an F64 plus. So what

[06:57] about the data ingestion requirements? Here what we really need to know is: what are the Fabric items and/or features you need to get data into Fabric, and how are you going to configure those items once you've built them? Some of the options here (and this is not an exhaustive list; there are lots of different options) are the shortcut, database mirroring, ETL via a dataflow, ETL via a data pipeline or a notebook, and the eventstream. These are some of the options you might want to consider. So what are the questions you need to ask a client when you're identifying the requirements, to help you make the decision here? These are some of the deciding factors.

[07:35] The main one, really, is: where is the external data stored? If it's in ADLS Gen2, Amazon S3 or an S3-compatible storage location (like Cloudflare, for example), Google Cloud Storage, or Dataverse, then these are the ones that are going to be available for you to shortcut into Fabric. So if you get any questions in the exam along the lines of "my data is stored in ADLS Gen2", then obviously the shortcut is a good option. It's not necessarily the only option; you can still do ETL from any of these storage locations, but it does open up that shortcut possibility. Now, if you see Azure SQL, Azure Cosmos DB, or Snowflake mentioned, then immediately you should start thinking: okay, this could be database mirrored. You can use database mirroring to create that kind of live link to the database, and it's going to maintain a mirror inside Fabric. Is it on-premises? If your data is stored on-premises, then you probably want to use ETL via dataflows or data pipelines, because these two items allow you to create that on-premises data gateway on your on-premises server and then connect to it via the dataflow or the data pipeline. And if you've got real-time events, real-time streaming data, you probably want to use the eventstream to get that data into Fabric. Anything else, really, you're going to be looking at ETL via either the dataflow, the data pipeline, or the notebook, and for when to choose which one, I've done a very long video (I'll leave a link in the description, or you can click here) on making that decision about which of these is best for a particular organization.

[09:09] Related to that is also: what skills exist in the team? You don't want to build a solution that can't be maintained and managed by your client's company. So if you're looking for a predominantly no- and low-code experience, you're going to want to focus on ETL via dataflows and data pipelines; both of these are fairly low- and no-code experiences that help you get data into Fabric. If you've got a lot of SQL experience in your team, then you can use the data pipeline; you can use the script activities to do transformations on your data as it's coming in. And if you have people familiar with Spark, Python, Scala, that kind of thing, then you can use the ETL notebook; if, say, you've got data coming from a REST API and you want to use Python libraries to get it in, that's a good option for you there. So once we're on

[10:01] the topic of data ingestion, there are a few other features you need to be aware of that might come up in the exam and that can help you identify different requirements for getting data into Fabric. These are the on-premises data gateway, which we've mentioned; the VNet (virtual network) data gateway; Fast Copy; and staging. You might be asked some questions about these things in the exam as well. So when do we decide on these sorts of things? Well, you need to ask how the data in the external system is being secured. If it's on-premises, say an on-premises SQL Server, you have to use the on-premises data gateway. If your data is living in Azure behind some sort of virtual network or private endpoint, that kind of thing, then you want to set up the VNet data gateway to access it.

[10:44] The volume of data is also going to have an impact on the items you choose for doing your data ingestion, and on some of the features available. If you've got low or medium data volumes per day, well, if it's low then you probably don't need any of these specific features; out-of-the-box solutions will be good enough. But if you've got quite a lot of data, gigabytes per day, in that kind of range, you want to use some of the features like Fast Copy and staging. Similarly, if you've got very high amounts of data, you're going to want to use Fast Copy and staging if you're using dataflows; alternatively you can use data pipelines, and if you can get the data in via a Fabric notebook, that's another option as well. So

[11:24] before we move on, I just want to mention a bit more detail around the data gateways. As you probably know already, there are two types of data gateway that we can configure in Microsoft Fabric: number one is the on-premises data gateway, and number two is the virtual network data gateway. A data gateway, in essence, helps us access data that's otherwise secured. If data is on an on-premises SQL Server, for example, it gives us a secure way to access that data and bring it into Fabric; likewise, if you've got data behind a virtual network, secured in Azure in, say, Blob Storage or ADLS Gen2, it provides us with a secure mechanism to access that data. I'm not going to show you step by step how to set up a data gateway in this lesson, but what I have done is link to two other videos by other creators that show the process in detail, if you want to go and have a look; I'll leave that in the Skool community. But I do want to cover the high-level process for each of them, just so you understand a bit more about what that looks like if you've never set one up before.

[12:27] For the on-premises data gateway there are a few high-level steps. Number one, we need to install the data gateway on the on-premises server; and if you've already got an on-premises data gateway set up on your on-premises server (perhaps you're using it for traditional Power BI dataflows, for example), then you're going to need to update it to the latest version, because that's going to be compatible with Microsoft Fabric. The next step is, in Fabric, to create a new on-premises data gateway connection, and then from that you can connect to the data gateway from either a dataflow and now also a data pipeline. The data pipeline support was added recently, in the last few weeks, and I think that connection is still in preview, so you might not get asked about it in the exam, but it's good to know that it's now possible via both the dataflow and the data pipeline.

[13:11] To set up the VNet data gateway, we're going to start in Azure. There are a few settings you need to configure in your Azure environment before you can set up the VNet data gateway connection. You're going to need to register the Power Platform resource provider within your Azure subscription, and then, within the item you want to access (for example your Azure Blob Storage item in Azure), you need to create a private endpoint in the networking settings and then create a subnet. We're then going to use that in Fabric to create a new virtual network data gateway connection, and again, from that you can connect to it via your dataflow to be able to access the data that is behind that virtual network in Azure. So next, let's look at the data

play13:53

Azure so next let's look at the data

play13:54

storage requirements and when we're

play13:57

talking to our client here ident ifying

play13:59

requirements really what we're trying to

play14:01

extract is okay what fabric data stores

play14:05

are going to be best for these

play14:07

requirements and what overall

play14:09

architectural pattern are we going to be

play14:11

aiming for with this solution now the

play14:13

options here are obviously The Lakehouse

play14:15

the data warehouse and the kql database

play14:18

and some of the deciding factors to

play14:19

choose between these are what's the data

play14:22

type okay so is it structured or semi

play14:26

structured or even unstructured so are

play14:28

you going to be getting raw files CSV

play14:31

Json maybe from AR rest API is it

play14:34

unstructured is it image data video is

play14:37

it audio data for example these are all

play14:40

going to be wanted to store in The

play14:41

Lakehouse because this is kind of the

play14:43

only place in fabric where you can store

play14:45

a variety of different file formats if

play14:48

your data is relational and structured

play14:50

then obviously you can keep that in

play14:52

either the lake house or the data

play14:54

warehouse and if it's real time and

play14:55

streaming you're going to be one to

play14:56

streaming that into your kql data datase

play14:59

next up another important consideration

play15:01

when choosing a data store is what

play15:04

skills exist in the team so if you're

play15:06

predominantly tsql based then you're

play15:08

going to be want to be using the data

play15:10

warehouse experience if you're

play15:12

predominantly spark and python Scala

play15:15

that kind of thing then you're going to

play15:16

be waned to storing your data

play15:17

predominantly in the lake housee and if

play15:19

you're predominantly using kql in your

play15:21

organization that's going to be want to

play15:23

using kql database for your data storage
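The storage guidance above can be condensed into a small chooser. This is a hedged sketch of the video's heuristics only; the real decision also weighs architectural patterns not enumerated here.

```python
# Condenses the video's data-store heuristics: data shape first,
# then team skills as the tie-breaker for structured data.

def suggest_store(data_shape: str, team_skills: str = "") -> str:
    if data_shape == "streaming":
        return "KQL database"
    if data_shape in {"files", "semi-structured", "unstructured"}:
        return "Lakehouse"
    # Structured/relational data: Lakehouse or warehouse, by skills.
    return {"t-sql": "Data warehouse",
            "spark": "Lakehouse",
            "kql": "KQL database"}.get(team_skills,
                                       "Lakehouse or Data warehouse")
```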

[15:27] Congratulations, you've completed your first engagement for Camila. You've convinced her to set up a proof-of-concept project in her organization, so she's already created the Fabric free trial and set up her environment, but immediately she's hit a bit of a hurdle. This is your next mission. She rings you up and says: "Hey, I need some help. I opened the Fabric admin portal and nearly had a heart attack. Please can you help me understand all of these settings?" So you set up a call with Camila to help her understand the Fabric admin portal. How are you going to teach her, and what are you going to teach her about the admin portal? What are the most important settings she needs to know about?

[16:05] Okay, just before we get into the Fabric admin portal and look at some of the settings available to us in there, it's important to note that to be able to access the admin portal, of course, first you need a Fabric license, but you also need to have one of the following roles: you need to be either a Global Administrator, a Power Platform Administrator, or a Fabric Administrator.

[16:27] Within the admin portal in Fabric you'll see this menu on the left-hand side, and these are some of the important settings. In tenant settings you can allow users to create Fabric items: if you've just set up Fabric in your organization, you need to allow people to actually create Fabric items, because without that you can't really get very far. Enable preview features: every time Microsoft releases new features, they normally put them in the admin portal, and you can allow or disallow users in your organization to use them. You can also allow users to create workspaces. There's a whole host of security-related features you can manage and get control over in your tenant: for example, how you manage guest users; allowing single sign-on options for things like Snowflake, BigQuery, and Redshift accounts; how you block public internet access (that's really important to know); enabling other features like Azure Private Link; and allowing service principal access to the Fabric APIs, because if you're going to be doing some automation, you need to allow service principals access to the API. There are also options in there for allowing Git integration, so if you're setting up version control, that needs to be enabled there, and there are features like allowing Copilot within the organization as well. Now, in general,

[17:43] some of the settings can be one of three things. A setting can be enabled for the entire organization; it can be enabled for specific security groups (say you only want super users, or admins within your Fabric environment, to be able to use a specific feature); or it can be enabled for all except certain security groups, so everyone in your organization gets access apart from those people (guest users are perhaps a good example). Other settings in the Fabric tenant settings are binary: you either enable them or you disable them for the entire organization.

play18:21
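The service principal setting mentioned above is what unlocks automation against the Fabric REST API. As a hedged sketch — the tenant ID, client ID, and secret below are placeholders, and this assumes the "Service principals can use Fabric APIs" tenant setting is enabled for a group containing your app registration — authenticating with client credentials and listing workspaces looks roughly like this:

```python
import json
import urllib.parse
import urllib.request

TENANT_ID = "<tenant-id>"         # placeholder values — supply your own
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"

def token_request(tenant_id: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the client-credentials token request against Microsoft Entra ID."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests whatever Fabric permissions the app has been granted
        "scope": "https://api.fabric.microsoft.com/.default",
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

def list_workspaces_request(access_token: str) -> urllib.request.Request:
    """Build a GET request against the Fabric 'list workspaces' endpoint."""
    return urllib.request.Request(
        "https://api.fabric.microsoft.com/v1/workspaces",
        headers={"Authorization": f"Bearer {access_token}"},
    )

# With real credentials you would then execute the calls, e.g.:
# with urllib.request.urlopen(token_request(TENANT_ID, CLIENT_ID, CLIENT_SECRET)) as r:
#     token = json.load(r)["access_token"]
# with urllib.request.urlopen(list_workspaces_request(token)) as r:
#     print(json.load(r))
```

If the tenant setting is disabled, the token is still issued but Fabric rejects the API call — which is exactly why the admin portal step has to come first.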

Another important area of the Fabric admin portal is the capacity settings section. In there you can create new capacities, delete capacities, manage capacity permissions, and change the size of a capacity. These are important capacity settings to be aware of — understand how they work and how to manage them within your Fabric environment.

You've taught Camila about the Fabric admin portal, and she's very grateful — but before the meeting ends, she has one more thing to ask. She says: "One final thing before you go. It might seem a bit random, but if we do move to Fabric, I want our BI team to create more consistent reports. Have you got any ideas about how we can achieve that?" The first thing you think of, of course, is custom Power BI report themes.

There are several ways to create a custom report theme in Power BI: you can update the current theme in Power BI Desktop; you can write a JSON template yourself using the documentation, if you're feeling a bit brave; or you can use a third-party online tool — there are quite a few report theme generator tools online, but it's unlikely you'll be tested on those in the exam. Your task is to show Camila how to create a custom report theme, so let's look at how to do that within Power BI Desktop.

Here I've got a report. To access report themes, go to the View tab, where you can see the preset themes — you can simply click one to update the current theme. But to do most of the customization, click the dropdown button, which gives you access to the current theme and all the themes currently installed on this machine. There are a few settings further down that are quite important to know. "Browse for themes" lets you import a Power BI report theme: if you've already got a theme file, you can select it and install it into your environment. "Customize current theme" brings you to a UI for changing colors, text, and visual settings.

For this section of the exam, think about how you could be tested. For the Power BI report theme material, you're likely to be tested on these buttons and what they do. They could also show you a JSON theme and ask how to edit it, or what doesn't look right in it — so it's good to have some familiarity with the different sections of these JSON files: the theme's name, the data colors stored as a list, and some of the other settings. You're probably not expected to memorize every setting in JSON format, but you might be shown a theme in JSON format and asked to modify it or comment on it in some way.
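Since the exam may show you a theme in JSON form, it helps to know the shape of the file. Here's a minimal sketch that writes a theme file you could import via View > Themes > Browse for themes — the "Contoso" name and hex values are invented for illustration, while `name`, `dataColors`, `background`, `foreground`, and `tableAccent` are documented top-level theme properties:

```python
import json

# Hypothetical brand palette — replace with your organization's colors.
theme = {
    "name": "Contoso Corporate",    # shown in the Themes dropdown after import
    "dataColors": ["#1F6FC5", "#E8A33D", "#2E8B57", "#C0504D"],
    "background": "#FFFFFF",
    "foreground": "#252423",
    "tableAccent": "#1F6FC5",
}

def save_theme(theme: dict, path: str) -> None:
    """Write the theme as a .json file that Power BI Desktop can import."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(theme, f, indent=2)

save_theme(theme, "contoso-theme.json")
```

Real theme files can go much further (per-visual styles, text classes), but a small file like this is the kind of thing an exam question is likely to show.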

To export the current theme, use "Save current theme", which exports a JSON file you can share within your organization. You've also got access to the theme gallery, which takes you to the theme gallery website where you can download other people's themes for your reports.

To finish up this video and this lesson, we'll go through five practice questions to solidify that knowledge and make sure you understand the key concepts in the context of a scenario.

The first question: you're running an F2 capacity and regularly experience throttling. A number of long-running Spark jobs take on average 3 hours to complete, and you need them to complete in under 1 hour, so you plan to increase the SKU of the capacity. Where would you go to make this change? A) the workspace settings, to configure Spark settings; B) the admin portal's capacity settings section, then click through to Azure to update your capacity; C) the monitoring hub, to look at the run history; or D) the Capacity Metrics app. Pause the video here and have a think, then I'll move on.

The answer is B: you manage your capacity settings in the admin portal, under capacity settings, which gives you a link through to the Azure portal — and that's where you change the capacity size. It can't be the Spark settings: those manage the configuration of your Spark cluster within a workspace. The monitoring hub has nothing to do with capacity settings; it just tells you how your jobs are running. And the Capacity Metrics app is a read-only app for looking at how your capacity is being used, so that wouldn't be suitable either.

Question 2: your data governance team would like to certify a semantic model to make it discoverable in your organization, and only the data governance team should be able to do this. In what order should you complete the following tasks to certify a semantic model? Look at the five actions and put them in order — first, second, third, fourth, and fifth — then we'll move on.

The correct order looks like this. Start by creating a security group for the data governance team. The clue in the question was that only the data governance team should be able to do this — when you see that, you should think they need to be in a security group. Number two (and you could argue that steps one and two are interchangeable, but these are the first two items either way): enable the "make certified content discoverable" setting — in the admin portal's tenant settings there's a discovery section, and you need to enable that for the organization. After that, make sure the setting is applied only to the data governance security group you set up. Then ask the data governance team to go into the semantic model's settings, under endorsement and discovery, and click certify for that semantic model. Finally, validate that it's been set up correctly and that a business user can see the certified semantic model in the OneLake data hub.

Question 3: you join a new company and are given a Power BI report theme as a JSON file to use for all new projects. How do you apply this JSON theme to the report you're currently developing? A) in Power BI Desktop, go to View > Themes > Customize current theme; B) go to the Fabric admin portal, click on custom branding, and set the default report theme; C) use Tabular Editor 2 to update the theme; or D) in Power BI Desktop, go to View > Themes > Browse for themes.

The answer is D: in Power BI Desktop, go to View > Themes and then Browse for themes. D and A are quite similar, but A is for customizing the current theme — it won't let you import a JSON file; it just gives you a user interface for updating the current theme. We want to import a JSON file as our report theme, which is what D does. The functionality in B doesn't actually exist: custom branding does exist, but it only lets you update the colors and icons within Fabric, not set a default report theme. And Tabular Editor 2 is also the incorrect answer.

Question 4: you have 1,000 JSON files stored in Azure Data Lake Storage (ADLS) Gen2 that you want to bring into Fabric, and the ADLS Gen2 storage account is secured using a virtual network. Which of these actions would you need to perform first? A) in Fabric, go to manage connections and gateways and click to create a new virtual network data gateway; B) create a shortcut to the ADLS Gen2 storage account; C) in Azure, register a new resource provider and create a private endpoint and subnet; D) install an on-premises data gateway on an Azure virtual machine in the same virtual network; or E) enable public access in the storage account's network settings.

You'll remember that the answer is C. The first step in setting up a virtual network data gateway is to go into Azure and perform some network configuration: register the Microsoft Power Platform resource provider in your subscription, and then create a private endpoint and a subnet. Some of the other options are steps in the process, but not the first step — and the question asked which action you'd need to perform first. Yes, you do need to do A, but it's not the first thing you'd do. B is a bit of a red herring: you might have seen ADLS Gen2 and thought "shortcut", but you need to configure the virtual network data gateway before you can even think about connecting. As for D, installing an on-premises data gateway — we know we're dealing with a virtual network here, so you'd choose the virtual network data gateway rather than an on-premises data gateway. And E, enabling public access in the storage account's network settings, would expose your data to the public internet, so it's not advisable.

Question 5: you have data stored in tables in Snowflake. Which of the following cannot be used to bring the data into Fabric? A) use the data pipeline Copy data activity; B) create a shortcut to the Snowflake tables from your lakehouse; C) use a Dataflow Gen2 with the Snowflake connector; or D) use database mirroring to create a mirrored Snowflake database in Fabric.

The answer here is B: create a shortcut to the Snowflake tables from your lakehouse. As you'll know, you can only create shortcuts to ADLS Gen2, Amazon S3, or Google Cloud Storage — the ability to shortcut generally applies to files. When we're talking about tables in databases, all three of the other options work: you can use a Copy data activity in a data pipeline to copy the data in, you can use a Dataflow Gen2, or you can use database mirroring, because Snowflake is one of the databases where database mirroring is supported.

Camila says thanks — she's seriously impressed with your knowledge. Well done! In this lesson we've looked at how to identify requirements for a Fabric solution, the different types of data gateways available to us in Fabric, the settings in the admin portal, and how to create custom Power BI report themes. The good news is you've won an extension to the contract: Camila would like you to implement and manage her data analytics environment, so you've got the next stage of the engagement. In the next lesson we'll look at how to do that — how to set up access control, sensitivity labeling, workspaces, capacities, all that kind of thing, and how to configure it inside Fabric. Make sure you click here for the next lesson.
