Effortlessly Scrape Data from Websites using Power Automate and Power Apps

DamoBird365
10 Sept 2022 · 17:35

Summary

TL;DR: In this tutorial, the presenter demonstrates how to scrape exchange rate data from a website using a Power Automate cloud flow. They extract the table data, convert it into a JSON array, and show how that data can be used in applications such as Power Apps, Excel, or Dataverse. The process relies on an HTTP action and data manipulation expressions such as substring and nthIndexOf, and concludes with the scraped data displayed in a Power Apps gallery. The video is a practical guide for anyone interested in web data extraction and integration.

Takeaways

  • 🌐 The demonstration focuses on scraping data from a website using Power Automate cloud flow.
  • 📊 The example data is a table of exchange rate information which is extracted and converted into a JSON array.
  • 🔄 The JSON array can be utilized for various purposes such as updating Excel tables, adding items to a list, or saving records to Dataverse.
  • 🔧 The process involves using an HTTP action to access the website and retrieve the HTML content.
  • 🔍 The script uses 'indexOf' and 'nthIndexOf' to locate the specific table containing the exchange rate data within the HTML.
  • 📖 The data is extracted as a substring between the opening and closing tags of the table.
  • 🔄 The extracted HTML table is converted to XML and then to JSON for easier manipulation and use.
  • 🛠️ The JSON data is simplified using 'select' and 'compose' actions to extract only the necessary information.
  • 📱 The final step involves passing the JSON array to a Power App where it can be displayed in a gallery or used for other purposes.
  • 🔗 The demonstration also covers updating the Power Automate flow to respond to a Power App trigger and creating a JSON schema for the response data.

Q & A

  • What is the main focus of the demonstration in the video script?

    -The main focus of the demonstration is to show how to scrape data from a website using a Power Automate cloud flow, specifically targeting a table of exchange rate data.

  • What is the final format of the scraped data in the demonstration?

    -The final format of the scraped data is a JSON array, which can be used for various purposes such as adding data to a table in Excel, a list, or saving as records to Dataverse.

  • How does the presenter plan to use the scraped data in their scenario?

    -In the scenario presented, the scraped data is planned to be passed to a Power App and displayed in a gallery.

  • What is the first step the presenter takes in building the flow for scraping data?

    -The first step in building the flow is to use a manual trigger and an HTTP action to access the website via a GET request and retrieve the HTML content.

  • Why does the presenter use the 'nthIndexOf' expression in the flow?

    -The presenter uses the 'nthIndexOf' expression to find the second occurrence of the opening table tag in the HTML content, as the data of interest is within the second table on the webpage.

  • How does the presenter extract the table data from the HTML content?

    -The presenter extracts the table data by finding the positions of the opening and closing table tags, then using the 'substring' expression to get the data between these tags.

  • What conversion does the presenter perform on the extracted table data?

    -The presenter converts the extracted table data into XML first and then into a JSON array for easier manipulation and use in Power Apps.

  • What Power Automate expression is used to simplify the JSON array data into a more usable format?

    -The presenter uses the 'Select' action in Power Automate to simplify the JSON array data by looping through each object and extracting the desired values based on the column headers (see the expression sketch after this Q&A section).

  • How does the presenter update the Power Automate flow to respond back to the Power App?

    -The presenter updates the flow to use an HTTP response action to send the data back to the Power App, and creates a JSON schema based on the output of the 'select' action.

  • What is the method used in the Power App to trigger the flow and display the scraped data?

    -In the Power App, the presenter uses a button to trigger the flow and populates a gallery with the data by using a collection that is created as a result of running the flow.
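
As a rough sketch of the 'Select' configuration described above: the From input is the Compose holding the table-row array, and the Map pairs each output column with an item() expression that digs into a row object. The column names (Currency, Buy, Sell) and the exact key paths are placeholders, not the site's real headers; they depend on how each cell happens to convert from XML to JSON.

    From: outputs('Compose')                          // the table-row (tr) array from the previous step
    Map:
      Currency : item()?['td']?[0]?['p']              // first cell: its text converts directly under 'p'
      Buy      : item()?['td']?[1]?['p']?['#text']    // later cells appear to carry attributes, so text sits under '#text'
      Sell     : item()?['td']?[2]?['p']?['#text']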

Outlines

00:00

🌐 Introduction to Web Scraping with Power Automate

The speaker introduces a tutorial on how to scrape data from a website using a Power Automate cloud flow. The example focuses on extracting exchange rate data from a table on a website and converting it into a JSON array. The potential uses of the data once converted are discussed, such as adding it to an Excel table, a list, or saving it as records in Dataverse. The speaker encourages viewers to watch the demonstration if interested and reminds them to like and subscribe.

05:01

🔗 Setting Up the Power Automate Flow

The tutorial continues with the setup of a Power Automate flow that starts with a manual trigger. The speaker plans to switch this to a Power Apps trigger later, but keeps the manual trigger for initial testing; the HTTP action used to call the website is a premium connector. The HTTP action retrieves the page via a GET request, with the URL entered as a parameter. After saving and testing, the HTML of the webpage is returned and inspected for the table data. The speaker demonstrates how to find the table's opening and closing tags and extract the data between them using the 'substring' expression.
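
A minimal sketch of the two Compose actions described here, assuming the HTTP action keeps its default name, the target is the second table element on the page, and the Composes are renamed "Opening Tag" and "Closing Tag":

    nthIndexOf(body('HTTP'), '<table', 2)     // Opening Tag: position of the second opening table tag
    nthIndexOf(body('HTTP'), '</table', 2)    // Closing Tag: position of the second closing table tag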

10:01

🔄 Converting HTML to XML and JSON

The speaker explains the next steps in processing the extracted HTML table data. The data is first converted into XML to facilitate easier conversion into JSON format. The XML conversion is done using the 'xml' expression, and then the JSON conversion is performed using the 'json' expression. The speaker then demonstrates how to extract the table rows from the JSON array and prepare the data for further use, such as sending it to a PowerApp or creating new list items.
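
The conversion boils down to three one-line expressions, sketched here with assumed Compose names ("Table", "XML", "JSON Array"); adjust the outputs() references to match whatever your actions are actually called:

    xml(outputs('Table'))                       // Compose "XML": parse the extracted HTML table as XML
    json(outputs('XML'))                        // Compose "JSON Array": convert that XML into JSON
    outputs('JSON_Array')?['table']?['tr']      // Compose: pull out just the array of table rows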

15:03

📱 Integrating with PowerApp and Displaying Data

The final part of the tutorial involves updating the Power Automate flow to use a PowerApps trigger and responding back to the PowerApp with the scraped data. The speaker shows how to create a JSON schema based on the data's structure and use the HTTP response action to return the data to the PowerApp. In the PowerApp, a button is used to trigger the flow, and a gallery is populated with the data retrieved from the website. The speaker concludes the demonstration by showing the real-time updating of the data in the gallery and encourages viewers to like and subscribe for more content.
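
On the Power Apps side, the button formula is roughly the Power Fx below, assuming the flow was added to the app as "ScrapeDatafromWeb", the collection is named "colScrapedData" (both placeholders), and the Response action returns the array produced by the Select:

    // Button.OnSelect: run the flow and store the returned array in a collection
    ClearCollect(colScrapedData, ScrapeDatafromWeb.Run())

    // Gallery.Items: bind the gallery to that collection
    colScrapedData

    // A label inside the gallery, using one of the column names defined in the Select map
    ThisItem.Currency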

Keywords

💡Power Automate

Power Automate, formerly known as Microsoft Flow, is a service that helps you create automated workflows between your favorite apps and services to synchronize files, get notifications, collect data, and more. In the video, the presenter demonstrates how to use Power Automate to scrape data from a website, showcasing its capability to interact with web content and automate tasks.

💡Cloud Flow

A cloud flow in Power Automate is a type of workflow that runs in the cloud and is triggered by events or on a schedule. It's used to automate processes across various applications and services. The video's demonstration focuses on creating a cloud flow to scrape data from a website, emphasizing the cloud-based, automated nature of the task.

💡JSON Array

JSON (JavaScript Object Notation) is a lightweight data-interchange format that is easy for humans to read and write and easy for machines to parse and generate. A JSON array is an ordered collection of values, represented in square brackets. In the context of the video, the presenter extracts data from a website and converts it into a JSON array, which is then used to pass data to a Power App, illustrating the format's utility in data interchange.
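
For illustration only, the simplified array the flow ends up returning looks something like the following; the field names and values are placeholders rather than the site's actual headers or rates:

    [
      { "Currency": "1 EURO", "Buy": "10.50", "Sell": "10.80" },
      { "Currency": "1 USD",  "Buy": "9.90",  "Sell": "10.10" }
    ]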

💡Power App

Power Apps is a service that allows users to build custom applications for web and mobile devices without the need for extensive coding knowledge. In the video, the presenter mentions using a Power App to display data in a gallery, highlighting how Power Automate can integrate with Power Apps to create dynamic and interactive user interfaces.

💡HTTP Action

HTTP actions in Power Automate are used to interact with web services via HTTP requests, allowing you to get, post, update, or delete data from a web endpoint. The video script describes using an HTTP action to access a website and retrieve HTML content, which is a fundamental step in the data scraping process.

💡Table Tag

In HTML, a table tag is used to create a table. The tag defines the structure of the table, including rows and columns. The video script refers to searching for a table tag in the HTML source of a website, which is part of the process to identify and extract the data that the presenter wants to scrape.

💡XPath

XPath is a query language for selecting nodes from an XML document. Although not explicitly mentioned in the script, the process of finding the 'nth' occurrence of a tag is similar to using XPath to navigate through the structure of an HTML document to locate specific elements. The video demonstrates using a method akin to XPath to find the second table on a webpage for data extraction.

💡Substring

A substring is a contiguous sequence of characters within a string. In the video, the presenter uses the concept of a substring to extract the portion of the HTML content that contains the table data of interest, which is a crucial step in the data scraping process.
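
In Power Automate the expression takes the source text, a zero-based start index and a length, for example:

    substring('hello world', 6, 5)    // returns 'world'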

💡XML

XML (eXtensible Markup Language) is a markup language much like HTML, but is used to store and transport data. The video script mentions converting the scraped HTML table data into XML, which is an intermediate step before converting it into a JSON format, demonstrating the versatility of data formats in data manipulation processes.

💡Dataverse

Dataverse is a scalable data service and app platform that allows you to securely store and manage data. In the video, the presenter briefly mentions the possibility of saving scraped data as records to Dataverse, indicating one of the many potential uses for the data once it has been extracted and formatted.

💡Gallery

In the context of Power Apps, a gallery is a control that displays a collection of items, such as images or text, in a responsive layout. The video demonstrates how to use a gallery to display data that has been scraped from a website and passed to a Power App, showcasing the practical application of data within an app interface.

Highlights

Introduction to scraping data from a website using Power Automate cloud flow.

Focus on extracting exchange rate data from a table on a website.

Demonstration of pulling data into a JSON array for use in Power Apps.

Explanation of the versatility of using data arrays for various applications.

Step-by-step guide to building a flow from scratch to scrape data.

Use of HTTP action to access the website and retrieve HTML content.

Identification of the specific table containing the desired data.

Technique to find the nth occurrence of a tag using 'indexOf' and 'nthIndexOf'.

Extraction of the table's HTML content using substring and dynamic values.

Conversion of the HTML table into XML for easier data manipulation.

Transformation of XML data into a JSON array for Power Apps compatibility.

Accessing the table row objects from the JSON array for data extraction.

Use of 'select' action to loop through objects and extract specific values.

Creation of a simplified data array based on the table's column headers.

Integration of the scraped data into a Power App using a button trigger.

Displaying the scraped data in a Power App gallery for real-time updates.

Overview of the practical applications of the scraped data, such as updating Excel tables or lists.

Conclusion of the demonstration with a summary of the key learning points.

Transcripts

Hi there folks, in today's demonstration I'm going to show you how we can scrape data from a website using a Power Automate cloud flow. In today's example I'm going to look at a table of exchange rate data. I'll pull that from the website into a JSON array, which in my scenario I'm then going to pass to a Power App and display in a gallery. Of course, because the data is in an array, you could use it to add data to a table in Excel, add some items into a list, or indeed save it as records to Dataverse; there are plenty of options once the data is converted to an array. If that's something that interests you, please make sure you watch on, and if you haven't already, make sure you like and subscribe. Without further ado, let's jump into the demonstration.

So here you can see the table in question on our website. We've got three columns of data, with the currency and then two different values. I'm going to pull this across into a flow, which I'm going to build from scratch today. If I start with my flow, you can see that I have the manual trigger; I will eventually convert that into the Power Apps trigger, but for now, for testing, I'm going to kick things off with a premium connection: we need to use the HTTP action, which will allow us to access that website via a GET request, and I can paste the URL of the page I'm after into this parameter here. Now, if I go ahead and save this and test it, we'll see that just from this quick action we've added, we're able to get all of that HTML back into our flow.

If I open that up we can have a look at the body here, and here we have all the HTML. So if I jump back onto that website, right click and go to View Page Source, I can then do a search for the table tag, which is what I'm going to be doing today in Power Automate. You can see that I have three potential opening table tags there. If I jump down onto the second one and start scrolling across a bit, hopefully we'll start seeing some data that's recognisable. I can see at least one of the columns there, the currency column; my French is not great, so apologies, but we're definitely pulling through some of those values. We've got the currency for euros, and this is the string of data that we're going to look for in our flow.

So how do we get that? Well, we're going to have to use Compose several times throughout this solution, and I'm going to use indexOf, which of course will allow me to return the position of a string. I'm going to look at that body and I want to return that opening tag of table. But I just did a search there on the page source, and it wasn't the first table, it was actually the second table, so I'm going to use the expression nthIndexOf, which will allow me to find the nth occurrence; in this case, if I put in a comma, I want to find the second occurrence of that tag. Now I've highlighted that expression, I'm going to copy it and say OK, and we'll rename this action to Opening Tag. Then I'm going to create another Compose to find the closing tag, which is why I copied that expression. So, Closing Tag: if I jump into the expression builder here and paste that in, rather than the opening tag I now want to find the closing tag, which just has the forward slash in front of it. If I hit OK, we now have the position of the second occurrence of the opening tag and the second occurrence of the closing tag.

With the next Compose we can go and get that substring, and we're going to use the expression substring to do that. In terms of dynamic values, we want to check that body for this particular substring. The substring expression looks for a starting index, which is our opening tag, so we can pick the Opening Tag there, and then we're looking for a length. If the closing tag is at position 500 and the opening tag is at position 400, then the length is going to be the closing tag minus the opening tag, so we can use sub, open and close brackets, and insert the closing tag, a comma, and then the opening tag. All being well, if I hit OK, that should update and save, and that will get us our table string; I'll call that Table. I'm going to go ahead and save and test, and there's one thing to note at this point: the closing tag value is the position of the beginning of that closing tag, but I need everything up to the end of that closing tag. If I expand this Compose action here, we can see that whilst I've got the opening tag, if I go all the way to the end I'm missing the closing tag. So, a simple step: all I need to do is type in that closing tag after the expression, and that completes our HTML table.
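
Put together, the "Table" Compose described above works out to something like the expression below; the action names are whatever the Composes were renamed to (spaces become underscores in the outputs() reference), and the literal closing tag is appended because substring stops at the start of the closing tag:

    concat(
      substring(body('HTTP'), outputs('Opening_Tag'), sub(outputs('Closing_Tag'), outputs('Opening_Tag'))),
      '</table>'
    )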

The next step is all about converting it into XML, because if we convert it into XML I can convert it much more easily into JSON. So, another Compose, which I'm going to rename XML, and it's quite simple: I type in the expression xml, open and close brackets, select the output from that Compose Table, and say OK. Now that it's XML, I can easily convert it into JSON, and I can do that using the json expression: type in json, open and close brackets, insert that Compose XML and hit OK. I'm just going to rename this as well so that I know this is my JSON array. If I go ahead and save and test that, we can have a look at the output of both the XML and the JSON Array actions.

Now, I've got a video on doing this in more detail for more complex situations, but this is a relatively straightforward case of converting into XML and then into JSON. We can see now that we have our JSON array, and if we expand this, you can see that the table tag is now a key for one of the objects, and we've got a table header array here with the column names. Then the important bit is this table row array, because you can see that we have these repeating objects that have the currency and then the two values we're looking to select. For sending this back to Power Apps, or doing anything else with it, such as creating a new list item, I want to simplify this data quite significantly.

So I'm going to copy this object, and I'll open it up in Notepad++ shortly to have a look at it, but the first thing I want to do is get access to this table row object, so that I can get access to the full array. If I look at the path, I can see that it will be table, and then following this line down, tr. So go back into edit, and I'm going to add a Compose and create a new expression. A little tip here: to get the expression reference for this action, I go into the expression tab, type in the number one, pick that Compose JSON Array from the dynamic content, and then just get rid of the one. Then, because we looked at that array a minute ago, I want to get the table and then the tr object, and hopefully, if I say OK to that and hit test, it should return all those objects with the row data. So test that and have a look, and we can see that we now have, if I expand it, all of the individual objects containing each of those rows.

This is where we can use a Select. The Select will allow us to pick out the header we've got here for the currency and then those two values; it will let us loop through each of these objects individually and grab those values. If I go back into edit, I'm going to use my Select action, and if we insert that Compose as the input, we then need to define our map. We need those three column headers, so I'm going to jump back onto the website and just grab them; of course I could call them anything I want, to be honest, but I'll just go with the values that are on the website for now. I'm not even going to try to pronounce them, because I cannot speak a word of French. We'll grab that last one there and chuck it in, and just tidy this up with a few extra return lines. Now let's bring across Notepad++ and have a look at one of these row objects.

You can see that, first of all, we need to get into this td key, and then we need to get into this p key in order to get the euro string, but it is actually different for the other two objects beneath. If you know how to access objects within an array, they're called by integer indexes: this is integer index zero, this is one, this is two, and we want to loop through all of these in our Select. What we do here in the expression tab is type in item, a question mark, and then we need to think about what we want to retrieve from this object. item on its own is going to return everything; we need to get into this td key, so I type td in single quotes, and then we want to get into the first object in the next array. Thinking about integer indexes, that will be zero, so I can put in a question mark and then zero in brackets. After that I put another question mark and, jumping back onto Notepad++, I want to get into the value p for this particular one, so all I need is square brackets, single quotes and the letter p. I'm going to copy that and think about the next one. If I paste it in and bring up the notepad: if I were to return p, I would get everything in this object, but I want just the text. So I can copy that key name, remembering also that we've got several objects, so I'm no longer in object zero, I'm in object one. Going back here, I need to change this to object one, and rather than it just being p, it needs to be the p text, so I can use a forward slash and text; the other option would be to put another question mark and put the text value in square brackets. If I copy that, say OK, and go into this last one here, I can paste it in; all I need to change is the index from one to two, because we're now into the last object. Just to demonstrate, if I were to remove this text here, I could put in a question mark and the hash text, so that would be the alternative, in single quotes, if you're not familiar with using the forward slashes. So if I say OK, save that and test, that should hopefully get us a nicely repurposed array of data based on those three columns. It's run OK, and if I go to the Select, here we go: we can see we've got the euros, sterling, dollars and so on, so all those values have been nicely pulled through.

Based on your requirements, you could now save that to Excel, into a list, to Dataverse and so on, but for today I'm going to pass this back to my Power App. For the Power App, the first thing I need to do is update the trigger, so if I delete that trigger I can then go and select the Power Apps triggers. There are versions one and two; we're going to go with version two. It doesn't really matter here, but version two has the improvements for the input parameters; we don't need those today, but I prefer version two. Then, in terms of the response: whilst there is a 'Respond to a PowerApp or flow' action, which you see here, it currently only allows you to return a string. There is a new feature that came out this week, which sadly hasn't yet reached my tenant, that does allow you to parse strings in Power Apps to create an array; when that comes out I'll maybe do a quick video. For today's video, because we're already using premium actions, I'm going to use the HTTP Response action, which is here; it's a premium action, and it allows me to respond back to the Power App with the data from the Select. So the data is, as I mentioned, from the Select; we do, however, need to create this JSON schema, and the way we do that is by running the flow. I go ahead and test that again, which should run the flow, and then we can jump into the Select action and copy the output to create our schema. Highlighting all of it, we can do Ctrl+A and Ctrl+C, then back in edit I can go into Generate from sample, insert my payload and say Done, and you'll see that I have a nicely created schema here that will now allow this action to return the data back to Power Apps.
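
For reference, Generate from sample with a payload shaped like the Select output produces a schema along these lines; the property names stand in for the real column headers used in the Select map:

    {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Currency": { "type": "string" },
          "Buy":      { "type": "string" },
          "Sell":     { "type": "string" }
        },
        "required": [ "Currency", "Buy", "Sell" ]
      }
    }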

So go ahead and give that a save; I had also better give it a more meaningful name or I'll never find it: Scrape Data from Web. Hit the save button and I'll jump onto my Power App. With the Power App, I'm just going to use a button today to run the flow; you could do it via the OnVisible property of your screen. I'm going to populate a gallery, so I'll go and insert my vertical gallery, and I need to attach my flow to my Power App: if I go into the Power Automate button here on the left-hand side, go to Add flow and search for this new solution, Scrape Data from Web, that just attaches the flow to my Power App. Once that's done, if I go to my button, I should be able to find that new expression for the flow, so pressing the button now will run it. But I want the results to be in a collection, so I can use ClearCollect and type in my collection name, which can be my scraped data; I've called it ScrapedData. I need to put the closing bracket at the end there to keep the expression happy, and then I need to update my gallery so that it's now using that collection, which is created as a result of running my flow. If I put the app into play mode and hit the button, hopefully it will bring through the data, which is fantastic. Then, if I want to display other fields, I can Ctrl+C and Ctrl+V to create a duplicate text field, which we can see here on the left-hand side, and change the expression to return the other value.

And there we go: we have scraped the data from the website, returned it as a JSON array, then passed it on to our Power App into a gallery, and we can now see that data. This will be real time: if this website gets an update and these prices change, and we then trigger that flow, it will of course run this HTTP action, pull the data as it is, find the opening and closing tags, get the table, convert it to XML and then to JSON, and then run our Select action to produce our nicely repurposed array of data. So that marks the end of the demonstration. Plenty to take in there again, and some good use of the data operations: Select and item(), substring, and indexOf or nthIndexOf if you're looking to find a particular string or occurrence of a string. If you haven't already, please make sure you like and subscribe, and I hope to see you again sometime soon. Thanks very much for watching, cheers.


Related tags
Data Scraping, Power Automate, JSON Conversion, Web Data, Automation, Exchange Rates, PowerApps, Data Parsing, XML to JSON, Flow Trigger