Web Scraping Tutorial | Data Scraping from Websites to Excel | Web Scraper Chrome Extension

Azharul Rafy
20 Aug 2021 | 16:26

Summary

TL;DR: In this tutorial video, Azharul Rafy demonstrates how to use a free Google Chrome extension called 'Web Scraper' to extract data from multiple web pages automatically. He provides a step-by-step guide to scraping information from the Yellow Pages business directory, focusing on car insurance service providers in New York City. The data collected includes business names, phone numbers, addresses, websites, and email addresses. The video also covers navigating pagination and setting up selectors for efficient data extraction.

Takeaways

  • 💻 The video demonstrates how to scrape data from websites using a free Google Chrome extension called Web Scraper.
  • 🏢 The target data source is the Yellow Pages business directory, specifically car insurance service providers in New York City and State.
  • 📊 The scraping process collects details such as business name, phone number, address, website, and email address from multiple pages.
  • 🔄 Web Scraper automates the process by moving from one page to the next after scraping the 30 results on each page.
  • 🛠️ To begin, users install the Web Scraper extension from the Chrome Web Store and then reload the target website.
  • 🖱️ Using 'Inspect', users open the developer tools and create a sitemap with selectors that scrape specific information from the webpage.
  • 🌐 The tutorial covers selecting business listings, extracting details such as name, phone number, and website, and handling multi-page navigation for continuous scraping.
  • 📈 The tool lets users adjust the scraping interval to avoid hitting website rate limits or being blocked.
  • 📥 After scraping completes, users can export the gathered data to a CSV file for further use and cleaning.
  • 🔧 The video emphasizes cleaning data post-extraction, such as removing the 'mailto:' prefix from email addresses.

Q & A

  • What is the main topic of the video?

    -The main topic of the video is how to scrape data from websites using a free Google Chrome extension called Web Scraper.

  • What specific information is the presenter going to extract from the Yellow Pages business directory?

    -The presenter is going to extract car insurance service providers' information from New York City and State, including business profiles' names, phone numbers, addresses, website addresses, and email addresses.

  • How does the tool handle pagination on the website?

    -The tool automatically visits subsequent pages after completing the data extraction from the first page, continuing to scrape data from each page.

  • What is the name of the Google Chrome extension used in the video?

    -The Google Chrome extension used in the video is called 'Web Scraper'.

  • How does one install the Web Scraper extension on Google Chrome?

    -To install the Web Scraper extension, one needs to visit the extension page, click on 'Add to Chrome', and then confirm by clicking 'Add extension'.

  • What is a sitemap in the context of web scraping with the Web Scraper extension?

    -A sitemap in the context of web scraping with the Web Scraper extension is a configuration that defines how the tool navigates and extracts data from a website.

  • How does the presenter select the data points to be scraped from each business listing?

    -The presenter selects data points by clicking on 'Add new selector', choosing the type (text or link), and then selecting the specific elements on the webpage such as business name, phone number, address, website, and email.

  • What is the purpose of setting a delay between requests when scraping?

    -Setting a delay between requests prevents the scraper from being blocked by the website due to too many rapid requests, as most websites have limitations on the number of accesses per user per day.

  • How can one export the scraped data from the Web Scraper extension?

    -The scraped data can be exported by clicking on the 'Export data' button and then choosing 'Export data as CSV' to download the data into an Excel document.

  • What is the final format of the scraped data as mentioned in the video?

    -The final format of the scraped data is a CSV file containing information such as the business's or person's name, phone number, address, website, and email.

  • How does the presenter clean the extracted email addresses in the CSV file?

    -The presenter cleans the extracted email addresses by using the 'Find and Replace' feature in Excel to remove the 'mailto:' prefix from each email address.

Outlines

00:00

🚀 Introduction to Web Scraping with Chrome Extension

The video starts with a warm welcome from the presenter, Azharul Rafy, who explains that the tutorial focuses on scraping data from websites using a free Chrome extension. He plans to demonstrate this by extracting car insurance service provider information from New York's Yellow Pages directory. The tool will scrape business names, phone numbers, addresses, website URLs, and email addresses. He also highlights that the tool can automatically scrape multiple pages of data, with each page containing 30 results, continuing until all pages are scraped.

05:00

🧩 Installing the Web Scraper Chrome Extension

This section explains the step-by-step process of installing the Web Scraper Chrome extension. The presenter provides a detailed guide on how to navigate to the extension page, add it to Chrome, and verify the installation. Once installed, the user is advised to reload the search result page, after which they can begin scraping. He then shows how to open the browser’s developer tools and highlights the 'Inspect' function, which allows access to the Web Scraper extension within the browser.

10:01

📑 Setting Up Selectors for Web Scraping

In this paragraph, the presenter explains how to create a new sitemap within the Web Scraper tool and set up selectors for collecting data. He describes how to identify and select business listings by clicking on their links, assigning IDs, and selecting multiple listings. He walks through selecting the first few business profiles and how the tool automatically recognizes the rest, making the process efficient. Once the business listings are selected, the user is instructed to save the selector.

15:03

πŸ— Collecting Business Details: Name, Phone, Address, and Website

This part dives into the process of extracting specific data points from each business listing. The presenter shows how to set up new selectors to scrape business names, phone numbers, addresses, websites, and email addresses. He describes how to handle different business profiles, whether personal or organizational, and ensure all relevant fields are captured. He also explains how the tool can automatically scrape multiple pages, collecting data from various business profiles listed on Yellow Pages.

🔄 Automating the Scraping Process Across Multiple Pages

In this section, the presenter explains how to make the scraping tool visit subsequent pages after finishing the first one. He shows how to set up a new selector for pagination, selecting all the page numbers and the 'Next' button to ensure the tool can scrape across multiple pages. He advises adjusting settings to avoid scraping restrictions imposed by business directories and websites and demonstrates how to view the selector graph to visualize the scraping structure.

🛠 Running and Configuring the Scraping Script

Here, the presenter discusses how to configure the scraping script by setting appropriate time intervals to avoid triggering website restrictions. He walks through the process of starting the scraping operation and monitors the tool as it begins to extract data from multiple pages. The scraping process continues in the background, collecting data such as business names, phone numbers, addresses, and emails, all while avoiding website blocking issues.

📊 Viewing and Exporting the Scraped Data

The presenter shows how to access and review the collected data within the Web Scraper tool. He explains how users can refresh the tool to view updated information and export the data in a CSV format. The presenter demonstrates how to clean the data in Excel, removing unwanted fields, formatting emails, and avoiding duplication. He explains that while some businesses may list multiple contacts, this is not an issue, as each entry is distinct.

✅ Final Thoughts and Tips for Successful Web Scraping

The video concludes with the presenter cleaning up the scraped data and summarizing the process of automatic data extraction from a business listing site. He offers tips on using the 'Find and Replace' function in Excel to clean up email addresses and ensure a neat data structure. Finally, he encourages viewers to like, share, and comment if they found the video helpful, and invites them to subscribe for more tutorials.
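The 'Find and Replace' cleanup described above can also be done in a script instead of Excel. A minimal Python sketch, where the addresses are invented examples:

```python
def clean_email(value):
    """Remove the 'mailto:' prefix that a link-type selector captures.

    str.removeprefix requires Python 3.9+ and leaves strings
    without the prefix untouched.
    """
    return value.removeprefix("mailto:")

# Invented examples of what the raw email column looks like.
raw = ["mailto:info@acme.example", "jane@acme.example"]
cleaned = [clean_email(v) for v in raw]
```

This mirrors Excel's Replace All: the prefix is stripped where present and other values pass through unchanged.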

Keywords

💡 Web Scraping

Web scraping is the process of automatically extracting data from websites. In the video, the speaker demonstrates how to scrape data from the Yellow Pages business directory using a Google Chrome extension. Web scraping is central to the video's theme as it automates the collection of business details like names, addresses, and contact information from multiple web pages.

💡 Google Chrome Extension

A Google Chrome extension is a small software module that customizes the Chrome browsing experience. In the video, the speaker uses the 'Web Scraper' extension, which allows users to scrape data from websites directly through their browser. This tool is pivotal for automating the data extraction process, and its installation and usage are key steps shown in the tutorial.

💡 Sitemap

A sitemap is a structured model used in web scraping to define which elements of a webpage will be scraped. In the video, the speaker creates a sitemap within the Web Scraper extension to specify how the data will be extracted from Yellow Pages listings. Sitemaps help guide the scraper through multiple web pages and elements like business names and contact details.
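The Web Scraper extension can import and export sitemaps as JSON. The sketch below shows what such a sitemap might look like for this tutorial; the start URL and CSS selectors are hypothetical placeholders, not the exact ones used in the video:

```json
{
  "_id": "yellow-page-extraction",
  "startUrl": ["https://www.yellowpages.com/search?search_terms=car+insurance&geo_location_terms=New+York%2C+NY"],
  "selectors": [
    {"id": "links", "type": "SelectorLink", "parentSelectors": ["_root", "pages"],
     "selector": "a.business-name", "multiple": true},
    {"id": "name", "type": "SelectorText", "parentSelectors": ["links"],
     "selector": "h1", "multiple": false},
    {"id": "pages", "type": "SelectorLink", "parentSelectors": ["_root", "pages"],
     "selector": "a.next", "multiple": true}
  ]
}
```

Note how the pagination selector lists itself as a parent, which is what lets the scraper follow page links recursively, as described in the video.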

💡 Selectors

Selectors are used in web scraping to identify and extract specific elements from a webpage, such as text, links, or images. In the video, selectors are created for extracting details such as business names, phone numbers, addresses, and emails from the Yellow Pages listings. They enable the scraper to target the right elements on a webpage and gather the required information.
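Conceptually, a text selector matches elements (here, by class name) and collects their text. A stdlib-only Python sketch of that idea; the sample HTML and class names are invented stand-ins for a Yellow Pages result card, not the site's real markup:

```python
from html.parser import HTMLParser

# Invented stand-in for one Yellow Pages result card.
SAMPLE = """
<div class="result">
  <a class="business-name">Acme Car Insurance</a>
  <div class="phones">(212) 555-0123</div>
  <div class="adr">350 5th Ave, New York, NY 10118</div>
</div>
"""

class ClassTextExtractor(HTMLParser):
    """Collects the text of every element carrying a given class.

    Assumes well-formed HTML without void tags inside matched elements.
    """
    def __init__(self, wanted_class):
        super().__init__()
        self.wanted_class = wanted_class
        self.depth = 0        # > 0 while inside a matching element
        self.results = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.depth or self.wanted_class in classes:
            self.depth += 1
            if self.depth == 1:       # entering a new matching element
                self.results.append("")

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:                # text inside a matching element
            self.results[-1] += data

def select_text(html, cls):
    parser = ClassTextExtractor(cls)
    parser.feed(html)
    return [text.strip() for text in parser.results]
```

For example, `select_text(SAMPLE, "business-name")` yields `["Acme Car Insurance"]`; real scrapers delegate exactly this matching step to CSS selector engines.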

💡 Yellow Pages

Yellow Pages is a business directory listing service where companies provide information such as names, phone numbers, and addresses. In the video, the speaker demonstrates scraping data from the Yellow Pages to collect information about car insurance providers in New York. Yellow Pages serves as the example website for showing how data can be extracted from such directories.

💡 Data Extraction

Data extraction refers to the process of retrieving specific data from websites or documents. In the context of the video, it involves extracting business profiles, including contact details like phone numbers, addresses, and websites from Yellow Pages listings. The video focuses on automating this process through the Web Scraper tool.

💡 Pagination

Pagination refers to the way content is divided into multiple pages on a website. The video explains how to scrape data across multiple pages of business listings by navigating through pagination on the Yellow Pages site. The speaker shows how the Web Scraper can automatically move from one page to the next, continuing to extract data.
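A hand-rolled scraper would enumerate the page URLs explicitly instead of clicking "Next". In this sketch the `?page=N` query parameter is an assumption about the site's URL scheme for illustration, not something shown in the video:

```python
def page_urls(base_url, last_page):
    """List the result-page URLs: page 1 is the base URL itself,
    later pages append ?page=N (an assumed URL scheme)."""
    urls = [base_url]
    for n in range(2, last_page + 1):
        urls.append(f"{base_url}?page={n}")
    return urls
```

The Web Scraper extension does the equivalent automatically by following the pagination links you select.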

💡 CSV File

A CSV (Comma-Separated Values) file is a simple file format used to store tabular data. In the video, after the data extraction process is complete, the speaker exports the scraped data into a CSV file, which can then be opened in spreadsheet software like Excel for further analysis. CSV export is an important final step to access and utilize the collected data.
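For anyone producing the same kind of file programmatically, Python's stdlib `csv` module writes an equivalent CSV. The sample row values here are invented:

```python
import csv
import io

# Invented sample row matching the columns exported in the video.
rows = [
    {"name": "Acme Car Insurance", "phone": "(212) 555-0123",
     "address": "350 5th Ave, New York, NY 10118",
     "website": "https://acme.example", "email": "info@acme.example"},
]

def to_csv(records):
    """Render scraped records as CSV text: a header row plus one line per record."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["name", "phone", "address", "website", "email"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

`DictWriter` quotes fields containing commas (such as the address) automatically, which is why the file opens cleanly in Excel.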

💡 Interval Settings

Interval settings in web scraping refer to the time delay between requests made to a website to avoid overwhelming the server or triggering anti-scraping mechanisms. In the video, the speaker sets the interval to 2,000 milliseconds to prevent being blocked by the Yellow Pages website while scraping multiple pages of data.
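In a custom scraper, the same idea is a sleep between requests. This sketch mirrors the extension's interval setting; the random jitter is my addition for illustration, not part of the tool:

```python
import random
import time

def polite_delay(base_ms=2000, jitter_ms=500):
    """Wait between requests; returns the delay actually used, in ms.

    A small random jitter makes request timing less uniform.
    """
    delay_ms = base_ms + random.randint(0, jitter_ms)
    time.sleep(delay_ms / 1000.0)
    return delay_ms
```

Calling `polite_delay()` before each page fetch reproduces the 2,000 ms gap configured in the video.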

💡 Duplicate Data

Duplicate data refers to repeated entries or information within a dataset. In the video, the speaker examines apparently duplicate website entries and finds they are not true duplicates: the same business can list multiple people, each with a distinct name and email address. Checking for duplicates like this is a common step in data processing after extraction.
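A duplicate check like the one described can be sketched in Python; the choice of key fields (name and email) and the sample records are assumptions for illustration:

```python
def dedupe(records, key=("name", "email")):
    """Keep the first occurrence of each (name, email) pair."""
    seen = set()
    unique = []
    for record in records:
        k = tuple(record.get(field) for field in key)
        if k not in seen:
            seen.add(k)
            unique.append(record)
    return unique

# Invented sample: the same contact listed twice, plus a second person
# at the same business (not a duplicate, per the video's reasoning).
records = [
    {"name": "Acme Insurance", "email": "info@acme.example"},
    {"name": "Acme Insurance", "email": "info@acme.example"},
    {"name": "Jane Doe", "email": "jane@acme.example"},
]
```

Keying on name plus email is what preserves multiple contacts at one business while dropping exact repeats.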

Highlights

Introduction to scraping data from websites using a free Google Chrome extension.

Demonstration of extracting business information from the Yellow Pages directory.

Focus on collecting car insurance service providers’ details from New York City and State.

The tool scrapes data such as business name, phone number, address, website, and email.

A single page contains 30 results, and the tool can navigate to subsequent pages automatically.

Installation process of the 'Web Scraper' extension from Google Chrome Web Store.

Step-by-step guide on using the Inspect tool in Chrome to identify elements for scraping.

Creation of a new sitemap for scraping, starting with the Yellow Pages directory URL.

Selecting multiple business listings for scraping using a 'link' type selector.

Detailing the process of creating specific selectors for scraping business names, phone numbers, and addresses.

Instructions for capturing website URLs and email addresses from the listings.

Automating the navigation between multiple pages for scraping beyond the first page of results.

Explanation of handling website limitations by setting interval gaps between requests to avoid restrictions.

Exporting the collected data as a CSV file and cleaning the data, including removing unnecessary fields.

The final cleaned CSV file includes essential information such as business name, phone number, address, website, and email.

Transcripts

[00:00] Hello and welcome back, this is Azharul Rafy once again. In this video I'm going to show you how to scrape data from websites: how to get information from multiple web pages automatically, in one go, by using a free Google Chrome extension. To demonstrate the full process step by step, I am going to extract data from the Yellow Pages business directory and collect car insurance service providers' information for New York City and State. To make it clearer, I'm going to collect each of these business profiles' names, phone numbers, address information, website addresses, and email addresses, and you can follow the same steps to collect any other information you need. One thing I'd love to mention first: notice that on the first page we have got 30 results, and after completing these 30 results the tool will start visiting the second page, the third, the fourth, the fifth, and so on, and get all the other business listings' information as well. So, to sum it up: the first page gives us 30 results, the second page 30 more for a total of 60, the third page another 30 for a total of 90, and it continues this way.

[01:26] So without further ado, let me install the required extension, Web Scraper, on Google Chrome. Let me open this link in a new tab; here it is, the extension page. I'm going to put this link in the video description for easy access. After visiting this page, simply click on the "Add to Chrome" button, then confirm by clicking "Add extension", and within seconds it is added. Just take a look: "Web Scraper - free web scraping" has been added to Chrome. Now we are all set with the tool installation, and it's time to reload our search result page on Yellow Pages, or whatever directory you would like to scrape data from. After reloading the page, right-click with your mouse and you will find the "Inspect" option, or use the shortcut Ctrl+Shift+I; I'd rather click the "Inspect" button so that the developer console opens in your browser. Up here you are going to notice the "Web Scraper" tab; if you have installed the extension on your Chrome browser, you will see this option, so let's click on it.

[02:52] Next, click on "Create new sitemap", then "Create sitemap", and give the sitemap a name; let's say I type "yellow page extraction". The start URL is going to be this URL, so copy and paste it, then click "Create sitemap". After that we have to add a new selector to this root sitemap, so click "Add new selector". In the first stage we are going to select all of the business listing links: this is the first link, for the first business professional's information; this is the second link, for the second business's or person's information; and so on, third, fourth, etc. So click here and provide an ID; let's say I name it "links". Then change the type from "text" to "link", and tick "multiple", as we are going to select multiple links from this page. Now click "Select", then click on the first listing, and just take a look: the name of this person has been selected. Go a little further down and click on the second listing, and you will notice that the rest of the listings have been selected automatically; the tool did the work for us. If I take you to the bottom of the page, you can see we have selected all 30 profiles already. Back at the top of the page, after selecting these listings, click the "Done selecting" button, then go a little down to the "Save selector" option and click it. We are done creating our very first selector.

[05:07] Now we have to go inside that selector, because we are going to collect information from inside each business listing, so click on it, then click on its ID; now we are inside the first selector. Click the "Add new selector" button. First we select the business name, so the ID is going to be, let's say, "business name", or let's just put "name", because I have noticed some of these profiles are personal profiles and some are business profiles. Keep the type as "text", click "Select", and select the business name; just take a look, it's selected. Click "Done selecting", then go to the bottom and click "Save selector". So we have set up the parameter that captures the name. Now it's time to add selectors for the phone number, the address, and the rest of the information. These are basically repeats of the first one, but we have to make a small change for the website and email address collection. So click "Add new selector", name it "phone number", keep it as "text", click "Select", click the phone number so it is selected, click "Done selecting", then "Save selector". After that, click "Add new selector" again, give it the name "address", keep it as "text", click "Select", then click so that both address lines are selected. Oh, sorry, we have to click "Done selecting" first, and then "Save selector". OK, so we are done with the phone number, address, and name selection. Time to select the website: click "Add new selector", name it "website", change the type from "text" to "link", click "Select", click on the "Visit website" button element, then "Done selecting" and "Save selector". Now let's do the same for the email address: click "Add new selector", type "email", change the type from "text" to "link", click "Select", click on the email link, click "Done selecting", then "Save selector". All right, we are done selecting this page's information.

[08:00] Now let's go back to the previous page. If we ran this script now, it would collect information from the 30 results on this page only, but we have got a few more pages appearing here, so it's time to select those pages as well, so that the tool will visit each of them and collect information just as it does from the first page's listings. To make this happen, go up to the "Sitemaps" option and click it, then open this sitemap, where we find our first "links" selector. Click the "Add new selector" button and add a new ID; these are pages, so I type "pages". Change the type from "text" to "link", and this should be "multiple", since as you can see we have got 2, 3, 4, 5, and "Next". Tick "multiple", click "Select", then click on 2 and 3, and just take a look: it automatically selected 2, 3, 4, and 5, and if there were pages 6, 7, 8, and 9 it would include those as well; "Next" is also selected. Now click the "Save selector" button. OK, I missed something, so let me click "Select" again and redo this part: I have provided the ID "pages", the type is "link", now tick "multiple", click "Select", click on 2 and 3 so all of these pages are selected, click "Done selecting" (that's the part I missed), and then "Save selector". One more thing: open the first selector's "Edit" option, and under "Parent selectors" select both "root" and "pages", then click "Save selector". Now if we open this menu and click "Selector graph", we see a graph of what we have done so far. Just take a look: from "links" we reach "name", "phone", "address", "website", and "email", the data points we selected, and if I expand "pages" it shows that from each page the tool will visit each of the links and then collect all the information from there. So our repetitive task has been set up properly.

[11:03] Now open this menu and click the "Scrape" button, and make sure you have set a large enough number here. 2000 is the default, but the bigger the gap you give between requests, the better it is going to work, because most of these business directories and websites limit how many pages a user can access each day. After you visit, say, 10 or 15 pages, or whatever their limit is, they are going to send you a restriction message or make the tool stop. So if you set, say, 7000, your chances of getting stopped quickly are lower. As this is for tutorial purposes, I'm going to keep 2000 milliseconds; it will work perfectly fine here. Now I click "Start scraping", and it starts visiting each of these pages automatically; just take a look, it is working. Let's wait: it's already on the 61st listing, and here it is visiting the first page. It is going to visit each person's or business's profile and collect the name, phone number, website, and email address when they are available. For some business listings you will notice that the website, the email address, or the phone number is missing, and this is because no information was provided for those fields; that happens on some pages.

[12:55] Now if I click this "Refresh" button, we can see that some information has already been populated here; just take a look. The longer it stays active in the background, the more data it is going to collect for us. As I have set everything up properly, it will visit each of these pages and pull their data automatically. It will keep doing its work, but I'm going to close the data extraction process now and show you how much we have got so far, which is enough to show you the process of downloading your results. I click on this button, then click "Export data as CSV" to download it as an Excel-compatible document, then click the "Download now" button and "Save", and the file is saved as a CSV. Now I open the file, and if you notice, we have got all the information; just take a look.

[14:04] Now I'm going to clean this up quickly to show you the end result. I'd love to keep these fields as a reference to each of these listings. Then I have got the business's or person's name, the phone number, and the address information, which includes the street address, city, state, and ZIP code. Then I have got the website; I'm going to delete this extra column. If you notice, it looks as if we have got some repetitive or duplicate websites, but I don't think they are duplicates, because if you look at the emails you will see that the emails are different and the person names are different: these businesses simply have multiple people listed on the same listing, so we are seeing some overlap, and it is not a problem. Then we have got this extra email-link field, which we don't need, and we don't need the pages column either. And if you notice, we have got "mailto:" text appearing on each of these emails, which we can get rid of easily: press Ctrl+F on the keyboard to find the "Find and Replace" option, click "Replace", type "mailto" and then the colon, and click "Replace All". Just take a look, we have got a clean email list already. We have got 27 listings; that's totally fine.

[15:54] So this was the process of automatically extracting data from any business listing website, or any website really. I believe you found this video helpful; if you did, please give it a like, share it to help your friends, and let me know if you have any questions by commenting below. Your opinion will be highly appreciated. Please subscribe to my channel to get more helpful videos in the near future. I hope to see you in my next videos. Have a good day, bye!


Related Tags
Data Scraping, Web Scraper, Chrome Extension, Yellow Pages, Business Listings, Data Collection, Automation Tools, Tutorial, Tech Guide, Web Scraping Tips