The easiest way to get data from ANY site in minutes

Mike Powers
20 Apr 2024 · 15:08

Summary

TL;DR: In this tutorial, Mike introduces Browse AI, a user-friendly tool that simplifies web scraping without coding. It features robots that crawl pages through a low-code UI, along with integrations for seamless data transfer to CRMs or databases. Browse AI offers pre-built templates for popular sites like YouTube, Yelp, and LinkedIn, and also allows custom scraper creation. Mike demonstrates how to extract government contracts from sam.gov and set up an integration with Google Sheets, showcasing the tool's efficiency in gathering and organizing data.

Takeaways

  • 🌐 Browse AI is an AI-powered tool designed for easy data extraction and monitoring from any website.
  • 🤖 The platform uses robots that can crawl web pages through a user-friendly, low-code UI.
  • 🔗 Browse AI offers numerous integrations with CRMs and databases, allowing seamless data transfer.
  • 📚 Pre-built templates are available for popular websites like YouTube, Yelp, Google, LinkedIn, and more, simplifying data scraping.
  • 🛠️ Users can also create custom robot scrapers to extract specific information and organize it into CSV or Excel files.
  • 🔗 The tool includes a feature to navigate to the next page and capture more data beyond the initial page view.
  • 📈 Browse AI can be integrated with Google Sheets, automatically updating the spreadsheet with scraped data.
  • 📋 A bulk run feature allows users to run multiple URLs at once, significantly speeding up the scraping process.
  • 👀 The monitor tab enables users to set up automatic, recurring scraping tasks at specified intervals.
  • 📈 Users can choose from a wide range of pre-built robots for various services, making web scraping accessible to non-technical users.
  • 📚 Data can be downloaded as CSV or JSON, and can also be sent directly to Google Sheets for ongoing updates.

Q & A

  • What is the main purpose of Browse AI as described in the script?

    -Browse AI is a tool designed to scrape and monitor data from any website easily and quickly, using a low-code UI and pre-built templates for popular websites.

  • Who is the speaker in the video, and what is his focus?

    -The speaker is Mike, who focuses on discussing AI and automation to help save time and make more money in businesses.

  • What are the key features of Browse AI that the speaker highlights?

    -The key features highlighted are the use of robots for web crawling, a user-friendly low-code UI, integrations with various CRMs and databases, and pre-built templates for scraping data from popular websites.

  • How does Browse AI make web scraping accessible to non-technical users?

    -Browse AI makes web scraping accessible by providing a simple interface where users can click on elements they want to scrape without needing to write any code.

  • What types of integrations does Browse AI support?

    -Browse AI supports integrations with various systems such as CRMs, databases, and services like Google Sheets, allowing users to pass the scraped data through their preferred platforms.

  • Can Browse AI be used to scrape data from government contracting websites?

    -Yes, the script demonstrates using Browse AI to scrape government contracts from sam.gov, a U.S. Government Contracting site.

  • How does Browse AI handle pagination to scrape more than one page of results?

    -Browse AI allows users to set up navigation to the next page, enabling it to scrape data from multiple pages and compile it into a single dataset.

  • What is the process of creating a custom robot scraper in Browse AI?

    -The process involves installing the Browse AI Chrome extension, granting permissions, recording actions on the website, selecting elements to scrape, naming variables, and configuring the robot with a name and search parameters.

  • How can users utilize Browse AI to scrape data from job listing websites like Indeed?

    -Users can use pre-built templates in Browse AI to input job titles, locations, and the number of job listings they want to scrape, and the tool will extract the relevant data.

  • What is the 'Bulk Run' feature in Browse AI, and how does it save time?

    -The 'Bulk Run' feature allows users to run a robot on multiple URLs at once by inputting a list of URLs and limits in a CSV file, which significantly speeds up the scraping process for multiple pages or websites.

  • How can users stay updated with the most recent data from websites they are interested in?

    -Users can set up a monitor in Browse AI to automatically run the scraper at set intervals, and they'll receive emails with the most up-to-date results.

Outlines

00:00

🤖 Introduction to Browse AI for Web Scraping

The video introduces a powerful AI tool called Browse AI, which simplifies the process of extracting data from websites. Mike, the presenter, highlights the tool's ability to transform a website's data into a more organized format. He emphasizes the ease of use, especially for non-coders, and mentions the availability of pre-built templates for popular websites like YouTube, Yelp, and LinkedIn. The video also covers the integration capabilities of Browse AI with CRMs and databases, and encourages viewers to follow along by clicking a link in the description.

05:02

πŸ” Demonstrating Browse AI's Web Scraping Capabilities

Mike demonstrates how to use Browse AI to scrape data from sam.gov, a US Government Contracting site. He guides viewers through the process of setting up a scraping task, including installing the Browse AI Chrome extension and granting necessary permissions. The tutorial covers capturing lists and text from web pages, and how to save and name variables for the scraped data. Mike also shows how to navigate through pagination to extract more data and configure the scraping robot. The results are then displayed, showing how Browse AI can automatically organize the scraped data into a structured format.

10:05

📈 Advanced Features of Browse AI: Bulk Scraping and Integrations

The video continues with Mike showcasing the advanced features of Browse AI, such as bulk scraping and integrations. He explains how to use a CSV file to input multiple URLs and limits for scraping, which can significantly save time. Mike also demonstrates how to set up a workflow with Google Sheets to automatically update the spreadsheet with the scraped data. Additionally, he introduces the 'monitor' feature, which allows for automatic and recurring scraping at set intervals, ensuring up-to-date information.
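The bulk-run input described above is just a CSV of URL/limit pairs. As a rough sketch of that format and how such a file gets parsed, here is a minimal example using Python's standard library; the column names `originUrl` and `limit` are illustrative stand-ins, since the exact headers come from Browse AI's own sample CSV.

```python
import csv
import io

# Hypothetical bulk-run input: one row per robot run, each with a
# start URL and a row limit. Column names are illustrative only.
bulk_csv = """originUrl,limit
https://sam.gov/search?keywords=buildings,40
https://sam.gov/search?keywords=food,30
"""

# Parse the CSV into one dict per planned run.
runs = list(csv.DictReader(io.StringIO(bulk_csv)))

for run in runs:
    # In an actual bulk run, the platform launches the robot once per row.
    print(run["originUrl"], run["limit"])
```

Reading the file with `csv.DictReader` mirrors what any bulk runner has to do: iterate the rows and kick off one scrape per URL with its own limit.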

15:05

🌐 Exploring Browse AI's Pre-built Templates and Future Scraping Plans

Mike explores Browse AI's pre-built templates for various websites, such as Expedia and Indeed, showing how easy it is to extract specific data like hotel listings or job postings. He emphasizes the tool's user-friendly interface and the value it provides, especially for those without coding skills. The video concludes with a teaser for another tool, 'scrape table', which is designed for scraping Google Maps data. Mike invites viewers to check out that video and encourages them to share their scraping experiences with Browse AI in the comments.

Keywords

💡 Browse AI

Browse AI is the main tool discussed in the video. It is an AI-powered tool designed to scrape data from websites quickly and easily. The video demonstrates how Browse AI can extract data from various websites, showing its practical applications and benefits for users looking to gather information without needing to code.

💡 Web Scraping

Web scraping refers to the process of extracting data from websites. In the video, the host explains how Browse AI facilitates web scraping by allowing users to gather information from different websites and store it in organized formats like CSV or Excel files. The tool's ease of use and low-code interface make web scraping accessible to non-technical users.
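To make concrete what a scraper does under Browse AI's point-and-click surface, here is a minimal sketch using only Python's standard-library `html.parser`: it pulls the text of every link marked with an assumed `class="title"` out of a made-up page snippet (the HTML and class name are invented for the example, not taken from any real site).

```python
from html.parser import HTMLParser

# Made-up page fragment standing in for a real results listing.
PAGE = """
<ul>
  <li><a class="title" href="/c/1">Parking structure expansion</a></li>
  <li><a class="title" href="/c/2">Courtyard upgrade</a></li>
</ul>
"""

class TitleExtractor(HTMLParser):
    """Collects the text of every <a class="title"> element."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and ("class", "title") in attrs:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.titles.append(data.strip())

parser = TitleExtractor()
parser.feed(PAGE)
print(parser.titles)  # ['Parking structure expansion', 'Courtyard upgrade']
```

Tools like Browse AI generate the equivalent of these selectors automatically when you click on elements, which is what makes the no-code workflow possible.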

💡 Templates

Templates in Browse AI are pre-built configurations that allow users to scrape data from popular websites quickly. The video highlights how these templates cover various websites like YouTube, Yelp, Google, and more. Using templates can save users time and effort, as they provide ready-made solutions for common scraping tasks.

💡 Low-code UI

Low-code UI refers to a user interface that requires minimal coding knowledge to operate. Browse AI's low-code UI is praised in the video for its simplicity and accessibility, allowing users to set up web scraping tasks through a graphical interface rather than writing complex code. This feature broadens the tool's appeal to users with limited technical skills.

💡 Integration

Integration is the ability to connect Browse AI with other tools and services. The video mentions that Browse AI supports various integrations, enabling users to pass scraped data into CRM systems, Google Sheets, and other databases. This functionality helps automate workflows and ensures the seamless use of data across different platforms.

💡 Sam.gov

Sam.gov is a U.S. government contracting website used as an example in the video. The host demonstrates how to use Browse AI to scrape government contract data from Sam.gov, showing the practical application of the tool in a real-world context. This example illustrates the tool's capability to handle complex websites and extract valuable data.

💡 Custom Robots

Custom robots in Browse AI are user-defined scrapers tailored to specific data extraction needs. The video shows how users can create custom robots to scrape unique information from websites that may not be covered by pre-built templates. This flexibility allows users to adapt the tool to their specific requirements.

💡 CSV/Excel

CSV (Comma-Separated Values) and Excel files are common formats for storing scraped data. The video demonstrates how Browse AI can export extracted information into these formats, making it easy for users to analyze and manipulate the data using spreadsheet software. This feature enhances the usability of the scraped data.
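Since the tool also exports JSON, converting scraped results between the two formats is a one-step job. The sketch below turns a small JSON result set into CSV with Python's standard library; the field names loosely mirror the variables named in the sam.gov demo, and a missing field is written as a blank cell, matching the blank-when-absent behavior the video describes.

```python
import csv
import io
import json

# Small stand-in for an exported JSON result set; field names are
# illustrative, echoing the variables named in the sam.gov demo.
scraped = json.loads("""[
  {"title": "Toilet replacements", "department": "DoD", "notice_type": "Solicitation"},
  {"title": "Courtyard upgrade", "department": "GSA"}
]""")

fields = ["title", "department", "notice_type"]
out = io.StringIO()
# restval="" leaves a blank cell when a record is missing a field.
writer = csv.DictWriter(out, fieldnames=fields, restval="")
writer.writeheader()
writer.writerows(scraped)
print(out.getvalue())
```

The second row comes out as `Courtyard upgrade,GSA,` with a trailing blank, so spreadsheet columns stay aligned even for incomplete listings.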

💡 Pagination

Pagination refers to the process of navigating through multiple pages of data on a website. The video explains how Browse AI can handle pagination by clicking through pages to gather additional data beyond what is visible on the first page. This capability ensures comprehensive data extraction from websites with multi-page listings.
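Conceptually, the "click to navigate to the next page" behavior boils down to a loop: keep pulling rows from successive pages until the requested limit is met or the pages run out. The sketch below shows that logic with a simulated two-page site in place of real HTTP requests (the page data and 25-rows-per-page layout are invented to echo the sam.gov demo).

```python
# Simulated site: 25 rows per page, like the sam.gov results in the demo.
PAGES = {
    1: [f"contract-{i}" for i in range(1, 26)],
    2: [f"contract-{i}" for i in range(26, 51)],
}

def fetch_page(page_number):
    """Stand-in for fetching and parsing one page of results."""
    return PAGES.get(page_number, [])

def scrape_with_pagination(limit):
    """Collect rows page by page until `limit` rows are gathered."""
    rows, page = [], 1
    while len(rows) < limit:
        batch = fetch_page(page)
        if not batch:       # no next page to click through to
            break
        rows.extend(batch)
        page += 1
    return rows[:limit]     # trim down to the requested row count

print(len(scrape_with_pagination(40)))  # 40
```

Asking for 40 rows forces the loop onto page 2 for the extra 15, which is exactly why the video sets the row limit above 25 when testing pagination.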

💡 Automation

Automation in the context of Browse AI involves setting up recurring tasks to scrape data at regular intervals. The video highlights the tool's ability to schedule scraping tasks, ensuring that users receive updated information without manual intervention. This feature is particularly useful for monitoring websites for changes or new data.
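The monitor feature runs server-side on Browse AI's schedule, but the underlying idea — run the same task on a fixed interval — can be sketched with Python's standard-library `sched` module. The scrape itself is a stub here, and the tiny interval is only so the demo finishes immediately.

```python
import sched
import time

results = []

def scrape_once():
    """Stub standing in for one run of the scraping robot."""
    results.append(f"run at {time.monotonic():.2f}")

def monitor(interval_seconds, runs):
    """Schedule `runs` executions of scrape_once, spaced by the interval."""
    s = sched.scheduler(time.monotonic, time.sleep)
    for i in range(runs):
        s.enter(i * interval_seconds, priority=1, action=scrape_once)
    s.run()  # blocks until every scheduled run has fired

monitor(interval_seconds=0.01, runs=3)  # tiny interval, demo only
print(len(results))  # 3
```

A real monitor would loop indefinitely (or use cron or a task queue) and email or sync the results after each run, as the hosted feature does.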

Highlights

Introduction to Browse AI, an AI-powered tool for easy data extraction from websites.

Browse AI's user-friendly low-code UI and robot feature for web scraping.

Integration options with various CRMs and databases for data management post-extraction.

Pre-built templates available for popular websites like YouTube, Yelp, and LinkedIn for quick data scraping.

Demonstration of building custom robot scrapers for specific data needs.

Tutorial on scraping sam.gov, a US Government Contracting site, using Browse AI.

Explanation of how to install and use the Browse AI Chrome extension for web scraping.

Step-by-step guide on capturing lists and text from a webpage using Browse AI.

How to navigate and scrape multiple pages for extended data collection.

Setting up a workflow with Google Sheets to automatically update with scraped data.

Utilizing Browse AI's bulk feature for running multiple URLs at once.

Accessing and using pre-built robots for various websites without coding.

Example of extracting hotel listings from Expedia using a pre-built Browse AI template.

Setting up a monitor for automatic recurring scraping at set intervals.

Extraction of job listings from Indeed using Browse AI's pre-built template.

Downloading and exporting scraped data in CSV or JSON formats.

Final thoughts on the simplicity and effectiveness of Browse AI for non-coders.

Transcripts

00:00

This is by far one of the easiest ways to extract any data from any website in minutes. If you're looking to turn a website from this to this, well, you've come to the right place. I'm going to show you an AI-powered tool to do just that. What's going on guys, I'm Mike. I talk about all things AI and automation to help you save time and make more money inside of your business. Let's get into this AI tool that allows us to scrape any website in a matter of minutes.

00:30

The tool that I want to showcase today is called Browse AI, and just as it says on the landing page, it is one of the easiest ways to extract and monitor data from any website. The basis of Browse AI is that they have these robots that allow us to crawl any page in a very easy-to-use, low-code UI, which I love. You can see they have a ton of different integrations, which we can use so that when we scrape our data from a website we can then pass it through whatever kind of CRM or database we're working with. One of the reasons why I love Browse AI is that they have a bunch of pre-built templates, which we're going to be checking out, to get all kinds of really cool data from a bunch of popular websites. We have everything from YouTube to Yelp, Zapier, Google, LinkedIn, Glassdoor — I know Indeed is around here somewhere — TikTok, eBay. There's a ton of great pre-built templates in here for us to extract information in a matter of seconds. But I'm also going to show you how we can build custom robot scrapers to get any kind of information and have it in a neatly organized CSV or Excel file.

01:32

If you want to follow along in this tutorial, you can click the link in the description below to take you to Browse AI. If you sign up with that link it does help out the channel. This is not sponsored by Browse AI — I just really like the tool, and I think a lot of you non-coders out there are really going to appreciate how simple this is. So you can click the link down below to get started, and let's hop right into Browse AI.

01:50

I've logged into Browse AI, and right at the start we're ready to put in our URL to extract all of the data we're looking for. You see we have our little robot guy in the middle here — basically the robot that's going to control our web scraping and allow us to get whatever kind of information we're looking for. So let's find a site to scrape. Today we're going to be scraping sam.gov. If you don't know what sam.gov is, it's basically a US Government contracting site which hosts a ton of free government contracts in a database for people to bid and put quotes on. There's a ton of really cool and interesting information on here, so you can take some time to poke around with it. What we're going to do is head over to the search tab and search for a specific keyword to get some results. I'm going to type in the word "buildings", and you can see we already got a list of contracts here — expanding a parking structure, a courtyard upgrade, toilet replacements, lodging spaces, civil works. There's all kinds of really interesting stuff on here, so if you've ever been interested in government contracting this is a great place to start. But this is also going to be a great website for us to learn how Browse AI works, so we can scrape some government contracts and put them in a nice organized CSV or Excel file.

03:02

I'm going to take this URL, with the "buildings" search query already inside of it, head back over to Browse AI, and pop it right in here. If it's your first time using Browse AI, it's going to prompt you to install the Browse AI Chrome extension, so make sure you install that too, because this is what's going to allow us to build out our scraper on their platform. I'm going to add this to my browser. It'll also ask us to grant some permissions in order to record our actions — just click on this and click Allow. Same thing, we're going to need to allow recording in incognito mode, so we're just going to open our settings here. I'm on Brave, so it's "Allow in Private" here, but I'm pretty sure on Chrome it's "Allow in Incognito" — you're just going to switch that on, and we should be all good to go.

03:46

We have our URL in the origin URL, we click on "start training this robot", and it pops up a brand new tab with the same sam.gov URL we put in. I'll click OK here, and now you'll notice at the top right we have our Browse AI guy, who is going to help us scrape this website. If we give him a click, you'll notice we have a couple of different options. We can capture a list, which is exactly how it looks in the picture there — it allows us to select items that are similar in their structure on a website. Or we can use capture text, which lets us select a specific part of the page — this is going to be good for things like product pages on, say, a Shopify store. There's also capturing screenshots of a page, so if you want to capture a screenshot, this is how you do it. But for today on sam.gov, we're going to be using the capture list option.

04:33

All we have to do is hover over the lists that we're looking to scrape. As I hover over the sections here, Browse AI uses the different sections in the page to determine the different lists we want to scrape, and if I hover right about here, you'll notice we actually get all of the different list items we're looking to scrape. We want these 25 government contracts on sam.gov and all the information inside each of the boxes, so I'm going to go over here and give this a click, and you'll notice we get all of the different contracts on this page.

05:02

Now we can pick the information we want from each of the lists. I'm going to hover over the title of this contract, and if I get it lined up right I can click on it, and there are two options here — you can save two different variables from this one piece of text. We can capture the visible text, which I will do, and then we're also going to go back over it again — that didn't work, we'll go back over it one more time — and capture the link. So this captures two different variables: the text from the title and also the link from the title. Let's also capture the notice ID, and then this description. We'll capture the visible text for the department/agency, and the same thing with the sub-tier — just the visible text. We're also going to get the office information, the current date offers are due, the notice type, the updated date — we're just going to grab the capture-visible-text option — and also the publish date. Once we have all the information from our one contract, you'll notice it grabs the same information for all of the different listings on this page, and then we just hit enter.

06:06

After that, it's going to ask us to name each of these variables, so when it scrapes this page it'll put them in a row according to these headers. For this, we're just going to name this "title" and hit enter; this is for the link, so I'll name it "link"; we have the notice ID, so I'll name this "ID"; we have the description here; I'll name this one "department" for the department/agency, as well as the "sub tier"; "office" for the office information; "offer due" for the due date of the offer; "type" for the notice type; "updated date" for the update date; and the publish date will be "published date". There we go — now we have all of our variables, and you'll see as soon as we get the last one entered, we have all of our information from the page. What's great too is that even if a field doesn't exist, it'll leave it blank, which is nice because there'll be no formatting issues if for some reason a listing doesn't have that information populated.

06:59

From here we've got to do a couple more things. We're going to name this list — I'm going to name it "Sam contracts" — and choose the number of rows that we want to extract. Actually, what we're going to do is this: let's click on "please select the pagination type", and we're going to click on "click to navigate to the next page". What this will do is allow us to go to the bottom here, and once we get through the last of the 25 we can tell it to click on this button right here. If I move my camera, this will allow us to capture more than the 25 rows of information that are on this page. So instead of putting 25 in here, let's put in something like 40, because 40 means it has to at least go to the next page in order to grab the additional 15 that are there — there are only 25 rows per page. You'll notice once we have all this populated, we get a new button here, "capture list". Click on this, then click on "finish recording", and it'll upload our brand new robot.

07:53

Now all we've got to do is configure it by giving it a name — that's fine — and give it a search. This is going to take a couple of minutes; it's basically doing the first initial run-through of our robot, so I'll come back to you when that's done. Awesome — a couple of seconds later, we get our information. You can see here we have "Sam contracts", 40 results, and we have all of the information from the pages. This only shows the first 10, but if you click on "see all 40 items" we get all 40 results. What's great too is that it even goes to the next page, through that pagination feature we clicked on, to get all of the information for us. Sweet. From here, all we have to do is head down and click on "yes, looks good". If there are any issues with it, you can always retrain it or delete it, but we'll click on yes, and now our robot is pretty much all good to go.

08:41

From here we can do a lot of different things to integrate this bot into whatever kind of workflow we want. We can click on the tables here to see tables of our past searches, and the integrate tab is another thing you're going to want to be looking at — there's a ton of great integrations here that we can use to send our data once we run our robot inside of Browse AI. I'm going to set up a workflow with Google Sheets, but you can obviously use any of the ones listed here. I'll enable syncing with Google Sheets, log into my account, create a new spreadsheet, name it "Sam contracts", and then just click on "create spreadsheet and activate integration". So now, every time we run this workflow, it's going to take the information we get from the contracts and put it inside of our Google Sheet.

09:21

Let's give this a shot. If we head back to "run task" here, you'll notice we have options to change our origin URL as well as our contract limit. I'm going to head back over to sam.gov, and instead of putting in "buildings" here, let's remove this keyword and put in something like "food". There's a ton of contracts here — chicken eggs, freezer-refrigerators — yeah, a lot of stuff here you can look through. We're going to take this URL, stick it inside of our origin URL, and then change our contract limit to whatever we're looking for. Let's say we want to scrape 30 instead of 40 this time, and then click on "run task".

09:56

While this is running, I want to show you guys the bulk run feature. If you want to do more than one URL at a time, this feature is going to save you a ton. You click on "bulk run tasks" here — I would recommend downloading the sample CSV input, which will show you the format it's looking for — but basically, if you just make a CSV file with all of your URLs and the limit for each of those URLs in a list, you can take that file and pop it right into here, and it will go through and run the robot for every single one of those rows with a URL and a limit. Which is amazing — this has saved me a ton of time when I'm doing bulk scraping.

10:27

All right, I had to rerun it because it wasn't working the first time, so I got 40. You can see here we've got our brand new list of the food contracts, like I was saying earlier, and if we take a look at our Google Sheets, you'll notice we have a new "Sam contracts" tab right here with all the data we got from Browse AI. Isn't that awesome? It just automatically gets posted right in here once it's done scraping.

10:50

But I want to show you some of the other templates that I mentioned at the beginning of this video, which will let you hop right on here and start playing around and scraping all kinds of websites in minutes. If we head back over to the dashboard, we can choose from "browse pre-built robots" right here, which lets us select from all the robots on the pre-built robots page, and there are a ton of good ones in here. We have all kinds of services: Airbnb, Amazon, Chrome, Fiverr, Glassdoor, Google, Indeed, LinkedIn, LoopNet, Monster, Reddit, Product Hunt, TikTok, Tripadvisor, Y Combinator, YouTube, Zillow, ZoomInfo. So let's try a couple of these out. Personally, I want to try this Expedia one — it's kind of piquing my interest. We have "extract hotels list from Expedia", so we can just click on "use this automation", and it will basically copy this template of a robot and put it inside of our account. From here, all we have to do is input our hotel list from Expedia. I'm looking to go to the Florida Keys pretty soon, so I'm going to copy this search result for the Florida Keys for the next couple of days. I'll head back over to the robot page, put in my URL, and then we can change however many hotels we want — I'm going to go with 20 hotels — and then we'll click on "next step". It'll show us the configuration and just start extracting.

12:02

In my experience, I've had a lot fewer hiccups and issues just by using the pre-built templates that Browse AI has already made, and it's honestly where I see most people getting value out of Browse AI. Especially for people who are non-technical, don't know how to code, and don't know how to build web scrapers, it's a very friendly and intuitive way to make web scraping accessible to pretty much anyone. And there we go — we've got our 20 search results right here from Expedia; you can see all 20 items.

12:30

Another thing that I forgot to mention while we were scraping Sam is this monitor tab. We can actually create a monitor for this scraper, to automatically run the scraper on a set interval. For instance, if you want to scrape a product page so you always have the most up-to-date price, you would use this so it constantly scrapes the website to grab that pricing information. You can have it run once on any day you'd like, in your time zone, on the specific URL, with however many results you want, and you can just save it right here. Then it will run on that interval every day and send you an email with the results. It's pretty straightforward and intuitive. Once again, if you're looking to get recurring information, or the most up-to-date information on any kind of website you're scraping, this is how you would set that schedule.

13:15

Before we go, let's look at one more pre-built template. If I'm looking to extract stuff from Indeed, for instance, I'm going to use "extract job listings from Indeed". I'll click on this one and let's try it out — we'll use this automation. For this one, it looks like we don't even need a link; we can just put in our job title, location, and the number of jobs. I'm going to go for "automation expert" — this will be interesting — let's put it as remote, and let's see if we can get 30 jobs. This will be interesting, I haven't tried this one out yet. And here we go — looks like we get design engineer, senior full stack — okay, so I guess it's just coding jobs, which makes sense — and we get our 30 items for that search result. We have the Indeed link, when it was posted, a little bit of a description, the location, the company, and the position. We can download this data if we want, either as CSV or JSON — and there we go, we've got all of our Indeed data right here. Or, as we saw earlier, you can use the integration and set it up with a Google Sheet — you can just make a brand new Google Sheet with the Indeed data, so whenever the robot runs, it sends the data over to Google Sheets.

14:22

This is going to be one of the simplest and easiest ways to scrape any website in a matter of minutes. What I love about Browse AI is that there's no coding at all — all it is is clicking on what you want and hitting the button to go, and then you get a nice layout of all the information you're looking for. Once again, I'll have a link down in the description below for you to get started using Browse AI, and also let me know down in the comments: what websites are you going to be scraping with Browse AI? But if you're looking for Google Maps data specifically, how about you check out this video here, where I showcase my app Scrape Table, which allows you to scrape unlimited Google Maps data for free in a matter of seconds. If you haven't seen that video, it's an absolute banger, so make sure you go check it out, and I'll see you over there.


Related Tags
Web Scraping, AI Automation, Data Extraction, Business Tool, Low Code UI, CRM Integration, Pre-built Templates, Custom Scrapers, CSV Export, Excel Integration