Easily Scrape 1001 Pages and Outperform the Competition - Find Out How!

Meydeey
16 Oct 2023 · 18:49

Summary

TLDR: In this video, the author reveals a secret technique for extracting and rewriting a competitor's blog articles using artificial intelligence. He shows how to analyze a website, retrieve the article links via sitemap.xml, and then build a CSV file containing the URLs, their associated keywords, and optimized SEO titles. The goal is to produce spun content with GPT-4 that is hard to detect, which can be used to improve organic search rankings or for international marketing campaigns.

Takeaways

  • 🔍 The example target is a site called Open AI Master, which publishes blog articles about AI and many other topics.
  • 🗂️ Access a WordPress site's sitemap by appending 'sitemap.xml' to the URL to list all of its content links.
  • 📄 Download the sitemap's XML file to analyze it and reuse the blog article links (a minimal sketch follows this list).
  • 🔧 Use a GPT prompt to extract the URLs, build a CSV file, and generate keywords and optimized SEO titles.
  • 🚀 Automate link retrieval and content creation with scraping and text-generation tools.
  • 📊 Analyze the structure of the XML file to understand the data and links it contains.
  • 🔄 Split the process into steps to avoid errors and improve the accuracy of the results.
  • 📈 Write unique, engaging page titles (title tags) that respect SEO fundamentals.
  • 🔄 Iterate to correct errors and improve the quality of the generated data and content.
  • 🌐 Consider rewriting the content in another language to reach an international audience or less developed markets.
  • 🎓 Learn automation and how to use ChatGPT to improve your SEO and scraping techniques.
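
A minimal sketch of the first two steps, assuming the target runs WordPress and exposes a standard sitemap; the domain below is a placeholder and the `requests` library stands in for the manual browser download shown in the video:

```python
# Minimal sketch: fetch a WordPress-style sitemap and save it locally.
# The domain is a placeholder; any site exposing /sitemap.xml works the same way.
import requests

BASE_URL = "https://example.com"          # placeholder for the competitor's domain
SITEMAP_URL = f"{BASE_URL}/sitemap.xml"   # WordPress usually answers here (or at /sitemap_index.xml)

response = requests.get(SITEMAP_URL, timeout=30)
response.raise_for_status()               # stop if the sitemap is not reachable

with open("sitemap.xml", "wb") as f:      # equivalent of "save as" in the browser
    f.write(response.content)

print(f"Saved {len(response.content)} bytes from {SITEMAP_URL}")
```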

Q & A

  • What is the goal of the technique presented in this video?

    -The goal is to retrieve and analyze a competitor's articles in order to rewrite ('spin') them using an automated workflow.

  • Why does the author use a site called 'open ai master'?

    -The author uses 'open ai master' because it is an example of a site with rapidly growing traffic that publishes many blog articles about artificial intelligence and other topics.

  • How does the author access the elements of the WordPress site?

    -He appends 'sitemap.xml' to the site's URL, which exposes all of the site's elements because it runs on WordPress.

  • What are the main steps of the technique?

    -The main steps are: analyze the structure of the sitemap's XML file, extract the URLs from the 'loc' tags, create a CSV file with the URLs and their metadata (keywords and titles), and finally use that data to rewrite ('spin') the content.

  • How does the author use a data-analysis tool to extract the URLs?

    -He uses a data-analysis tool with a specific query that looks for the opening and closing 'loc' tags and writes the results to a CSV file.

  • What problems does the author run into when creating the metadata?

    -The keywords do not always match the URLs, and the titles do not always respect SEO fundamentals and can contain repetitions (a small validation sketch follows this Q&A).

  • How does the author fix the mismatch between keywords and URLs?

    -He asks the data-analysis process to extract more relevant keywords directly from the URLs and then to create new matching titles.

  • What is the end goal of the technique presented by the author?

    -The end goal is to rewrite ('spin') content in an automated, efficient way, using the data extracted from a competitor's site to create original, SEO-optimized content.

  • What are the advantages of this technique over manual work?

    -It saves a considerable amount of time, automates repetitive tasks, and can process volumes of data that would take days to handle by hand.

  • What does the author recommend to avoid repetitions in the generated titles?

    -He recommends using precise prompts and iterating step by step to avoid repetitions and to ensure the quality of the generated content.

  • What topic does the author plan to cover in upcoming videos?

    -He plans to cover scraping several sites at once to create content combinations that can be spun more effectively.
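
The metadata problems mentioned in the Q&A above can be caught before any rewriting with a quick check over the generated file. A small sketch, assuming the three-column layout described in the video (URL, keyword, title); the column names and file name are assumptions, not the author's exact file:

```python
# Sketch: sanity-check the generated CSV before reusing it.
# Assumes three columns named url, keyword and title (assumed names).
import csv
from urllib.parse import urlparse

def check_rows(path: str) -> None:
    seen_titles = set()
    with open(path, newline="", encoding="utf-8") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):  # header is line 1
            slug = urlparse(row["url"]).path.rstrip("/").split("/")[-1].replace("-", " ")
            if row["keyword"].lower() not in slug.lower():
                print(f"line {line_no}: keyword does not match the URL slug")
            if len(row["title"]) > 65:
                print(f"line {line_no}: title tag longer than 65 characters")
            if row["title"] in seen_titles:
                print(f"line {line_no}: duplicated title tag")
            seen_titles.add(row["title"])

check_rows("urls_keywords_titles.csv")  # placeholder file name
```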

Outlines

00:00

🔍 Analyzing a competitor's site and retrieving its links

In this section, the author presents a technique for analyzing a competitor's site and retrieving its blog links. He uses a site called 'open ai master' to show how to exploit sitemap.xml to reach every page of the site, with the goal of scraping the blog articles for spinning. He explains that most sites run on WordPress, so the sitemap can be reached by appending 'sitemap.xml' to the URL, and shows how to save that sitemap as an XML file for closer analysis.
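
The file saved this way is usually a sitemap index pointing at sub-sitemaps (posts, pages, categories, tags). A short sketch of listing them, assuming the file saved in the previous sketch and the standard sitemaps.org namespace:

```python
# Sketch: list the sub-sitemaps (posts, pages, categories, ...) referenced by a sitemap index.
# Assumes the sitemap.xml saved earlier and the standard sitemaps.org namespace.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.parse("sitemap.xml").getroot()

# A sitemap index uses <sitemap><loc>...</loc></sitemap> entries;
# a plain sitemap uses <url><loc>...</loc></url> entries instead.
for loc in root.findall(".//sm:loc", NS):
    print(loc.text)  # e.g. .../post-sitemap.xml, .../page-sitemap.xml, .../category-sitemap.xml
```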

05:02

📋 Building the CSV and extracting the links

This section explains how to create a CSV file by extracting the URLs from the 'loc' tags of the XML file. The author shows how to open and inspect the XML file in Visual Studio Code, then build a Google Sheet with the links in column A and the associated keywords in column B. He stresses the importance of avoiding repetition and of writing a unique, engaging title for each link.
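
A sketch of that extraction step, assuming a downloaded posts sub-sitemap (the file name `post-sitemap.xml` is a placeholder) and skipping the image:loc entries the author wants to ignore:

```python
# Sketch: pull only the page URLs (<loc> inside <url>) out of a posts sitemap
# and write them to column A of a CSV, one URL per row.
import csv
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

tree = ET.parse("post-sitemap.xml")       # placeholder name for the posts sub-sitemap

urls = []
for url_node in tree.getroot().findall("sm:url", NS):
    loc = url_node.find("sm:loc", NS)     # page URL only; nested image:loc entries are ignored
    if loc is not None and loc.text:
        urls.append(loc.text.strip())

with open("urls.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["url"])              # column A header
    writer.writerows([u] for u in urls)

print(f"Extracted {len(urls)} URLs")
```

The resulting CSV can then be imported into Google Sheets, as the author does in the video.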

10:03

🔄 Improving the data and creating the titles

In this section, the author discusses the errors encountered during the initial creation of the CSV files and titles. He had to correct the data by removing keywords that did not match the URLs and by writing new titles that are relevant and respect SEO rules. He underlines the importance of iteration for improving the quality of the data and of the results.
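
The fix the author settles on (deriving the keyword directly from the URL) can be sketched as follows; the title template and the uniqueness guard are simplified stand-ins for what the GPT step actually produces:

```python
# Sketch: derive a keyword from the URL slug and build a title tag of at most
# 65 characters while keeping titles unique. The title template is a simplified
# stand-in for what GPT generates in the video.
from urllib.parse import urlparse

def keyword_from_url(url: str) -> str:
    slug = urlparse(url).path.rstrip("/").split("/")[-1]   # last path segment
    return slug.replace("-", " ").strip() or "homepage"

def title_from_keyword(keyword: str, seen_titles: set) -> str:
    title = keyword.title()
    if len(title) > 65:
        title = title[:62].rstrip() + "..."                # respect the 65-character limit
    while title in seen_titles:                            # crude uniqueness guard
        title += " *"
    seen_titles.add(title)
    return title

seen = set()
url = "https://example.com/how-to-use-bing-ai-on-pc-mac/"  # placeholder URL
kw = keyword_from_url(url)
print(kw, "->", title_from_keyword(kw, seen))
```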

15:06

🚀 Automation and using GPT to spin the content

The author concludes by explaining how automation and GPT can be used to rewrite content from the extracted data. He suggests preparing a 'diabolical' scenario to spin the content with GPT-4 in a way that is hard to detect, using the generated keywords and titles. He also mentions the value of producing content in other languages to target less developed countries, which can be an effective SEO strategy.
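
In the video this rewriting step is done in the ChatGPT interface. As a rough, hedged equivalent using the OpenAI Python client: the model name, prompt wording and function below are assumptions rather than the author's exact setup, and the legality/ethics caveat from the video applies:

```python
# Rough API-based equivalent of the rewriting step shown in the ChatGPT interface.
# Model name, prompt wording and function name are assumptions, not the author's exact setup.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def rewrite_article(source_text: str, keyword: str, title: str) -> str:
    prompt = (
        "Rewrite the following article with an original angle.\n"
        f"Target keyword: {keyword}\n"
        f"Title tag (max 65 characters): {title}\n"
        "Keep the meaning, change the structure and wording, respect SEO basics.\n\n"
        f"{source_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```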

Keywords

💡secret technique

The 'secret technique' mentioned in the script refers to an undisclosed method for extracting and analyzing a competitor's articles. It is used to scrape and 'spin' (rewrite with an original angle) the content, a practice whose legality and ethics are debatable. In the context of the video, it means retrieving a site's blog articles via its sitemap.xml.

💡scraping

'Scraping' is the process of extracting data from a website for other purposes, often analysis or content reuse. In the script, scraping is used to obtain a competitor's blog articles so they can be rewritten and published on another site.

💡SEO

'SEO' stands for 'Search Engine Optimization', the practice of improving a site's ranking in search results to increase organic traffic. In the script, the narrator discusses creating unique titles and keywords to optimize the spun articles and improve their ranking in search engines.

💡sitemap.xml

A 'sitemap.xml' is a file that lists all of a website's links, helping search engines understand the site's structure and index its content more efficiently. In the script, the narrator uses a competitor's sitemap.xml to identify and extract the blog articles to scrape.

💡WordPress

WordPress is a popular content management system (CMS) used to build and manage websites. It is mentioned because, according to the narrator, around 80% of websites run on WordPress, which means the scraping technique shown in the tutorial applies to a large number of sites.

💡content spinning

'Content spinning' is the practice of rewriting existing content to produce new articles by changing the words or sentence structure while keeping the overall meaning. It is used to avoid duplicate-content detection by search engines and to increase the amount of unique content on a site.

💡Visual Studio Code

Visual Studio Code is a free, versatile source-code editor developed by Microsoft. In the script, it is used to open and inspect the XML file downloaded from the competitor's sitemap.

💡CSV

A CSV (Comma-Separated Values) file is a simple file format for storing tabular data such as tables or spreadsheets. In the script, the narrator creates a CSV file to store the extracted URLs, the associated keywords, and the optimized SEO titles.
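
For illustration only, this is roughly what the three-column file looks like; the rows below are made-up examples, not data scraped from the site:

```python
# Illustrative only: the three-column CSV layout described in the video.
# The URLs, keywords and titles below are made-up examples, not scraped data.
import csv

rows = [
    ["url", "keyword", "title"],
    ["https://example.com/how-to-use-bing-ai-on-pc-mac/",
     "how to use bing ai", "How to Use Bing AI on PC and Mac"],
    ["https://example.com/what-is-gpt/",
     "what is gpt", "What Is GPT? A Plain-Language Overview"],
]

with open("example.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)
```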

💡destructive prompt

The 'destructive prompt' refers to a query or command used to extract and manipulate data in an aggressive, intensive way. In the script, the narrator uses such a prompt to extract the URLs from the sitemap and to prepare the content-spinning process.

💡GPT

GPT (Generative Pre-trained Transformer) is a machine-learning language model developed by OpenAI. In the script, GPT is used to generate optimized SEO titles and to spin the content of the extracted blog articles.

💡habysal

'Habysal' is most likely a mistranscription of the name of a text-to-image automation tool. In the script, the narrator notes that the competitor site uses such a tool (or similar ones) to automatically generate images for its blog articles.

💡iteration

'Iteration' refers to a repetitive process in which adjustments or improvements are made to a piece of work or a project. In the script, iteration describes the continual refinement of the extracted data and of the spun content.

Highlights

The video introduces a secret technique for scraping and spinning a competitor's blog articles.

The technique involves using the site's sitemap to access all blog posts efficiently.

The speaker uses a site called 'open ai master' as an example to demonstrate the process.

The process includes downloading the sitemap.xml file to extract all the blog post URLs.

The speaker uses Visual Studio Code to analyze and manipulate the XML file.

A prompt is created to structure the extraction and processing of URLs and creation of a CSV file.

The process involves creating keywords and title tags for SEO optimization based on the extracted URLs.

The technique emphasizes the importance of avoiding repetition and ensuring uniqueness in the generated content.

The video demonstrates the potential of automating content creation and SEO processes using AI.

The speaker discusses the legality and ethics of scraping and spinning content, leaving it to the viewer's judgment.

The method can be applied to multiple sites and languages, potentially targeting less developed markets.

The video is aimed at SEO professionals, writers, and those interested in content scraping and spinning.

The speaker plans to create more videos on scraping and combining content from multiple sources.

The process saves significant time compared to manual content creation, potentially saving days of work.

The video concludes with a call to action for viewers to apply the learned techniques in their own projects.

Transcripts

play00:00

welcome to this new video today I'm going to show you a secret technique I

play00:04

don't think there is anyone who has already made a video on this really there is it

play00:09

legal is it is not legal for you to judge but the goal is to take a competitor so

play00:14

someone that you want to completely scrape his site his blog articles and do spinning

play00:21

so I will show you directly the strategy the technique a little brutal I'm going to put myself here

play00:26

we will be better on the right so there I went to a site called open ai master which writes

play00:33

blog articles on AI on many things which is in place and I had

play00:38

done an analysis of his site his traffic is exploding the guy writes a lot of blog articles so you

play00:42

can already see it: yesterday yesterday October 8, 1 2 3 4 5 6, here he made six just yesterday so

play00:51

he is a guy who has done automation he has set up systems and so on who

play00:54

writes uh he makes images it seems to me with either Habysal or with other tools via uh

play01:00

in automation so once I'm on his site what I'm going to do is I'm going to

play01:05

go to the site map so I'm going to put a slash after his link and I type sitemap so

play01:09

sitemap.xml as it's a WordPress site I can see exactly all the elements

play01:17

so there on most of the sites so we will say 80% of the sites are WordPress

play01:21

or when you type sitemap.xml or sitemap_index.xml you will have direct access to the links

play01:30

so there it is simple we have, what do we have, we have here after the slash of the .com we have post we still have post post

play01:37

so that's the blogs pages these are the pages of the site so the pages maybe the guy he made

play01:41

a page a landing page or whatever any page categories it's these categories that

play01:47

he created so perhaps artificial intelligence chat GPT Midjourney it's categories and

play01:53

then the rest author that's him and post tag that doesn't interest us we're going to go to the first one

play01:58

so in the posts to retrieve the blog articles so I'm going to show you my combination there you

play02:03

see we have all the pages which are referenced from these blog articles so I open any

play02:08

random one like that in a new tab this one this one and this one and each of them

play02:12

are blog posts are blog posts from his site me what I'm going to do

play02:18

is is that I'm going to have fun with his site so I'm going to right click in XML sitemap here in

play02:24

the white I'm going to save under or control S and there I'm going to download an

play02:30

XML file so I called it deletion designed so this one I delete it you've never

play02:35

seen it I'm going to call it deletion designed so now I'm saving it so yes it's going to tell me that it

play02:41

's going to overwrite it that's normal so now I have my XML file from this site map here I'm going to open it

play02:46

to show you what it looks like in vs code so here I opened it in Visual Studio code

play02:51

which allows you to see a little bit what the file looks like the file is huge there is a mass

play02:56

of data and there are a lot of links so all the links of his of his we will say of his site

play03:02

and me what interests me if I look at all the structures if I tell him for example cat

play03:08

GPT to recover all the links it will take the pages from me it will also take the images there

play03:12

you see there is an image:loc so that is an image if I take it and I open it and I go

play03:17

to google I will show you directly what it looks like if I go here hop via the Site

play03:23

Map I can retrieve this image but what interests me is only the links that's why

play03:28

I prepared a prompt quite destructive quality so once I have my XML my

play03:34

site map there I had cleaned the link but we have it we don't need it for today we will

play03:39

go directly to the magic prompt so I will show you the structure of the prompt it's going to be

play03:44

simple we're going to do it already I'm going to copy that because it's my last prompt hop no I'm going to

play03:51

take this one instead here hop so we look at analyzing the structure of this XML file deletion

play03:57

designed so there I put it I'm even going to add an XML point like that at least he knows what

play04:00

it's going to be in extracting me only the URLs with this tag so here I'm going to ask for

play04:06

the loc tag quite simply because it's only this tag there which interests me to

play04:11

recover the links so loc opening tag and loc closing tag and here here it is image

play04:17

:loc so as I said that the loc it will only recover well what interests me therefore

play04:21

in the pages so that's perfect in the third step create a CSV file by listing all the

play04:27

URLs one by one in column a so here I am already preparing it to create a Google sheet therefore a

play04:32

CSV file and which directly puts the links which goes which will extract in column a then in

play04:40

4 we created a keyword in column B linked to the URL present in column a no keyword

play04:47

must contain a duplication so if we take the example here for example of this link so uh how

play04:54

to use Bing ai on PC Mac well he will put me how to use Bing PC on PC Mac he will put me that so

play05:01

here it is then uh and in the last step write a TITLE tag of less than 65 characters in

play05:07

column C respecting the fundamentals of SEO so there I will perhaps add it

play05:11

same natural referencing natural referencing each title tag must correlate with column

play05:20

A and B avoid repetitions and be engaging the title tags must all be unique and

play05:25

then I told him wait for my validation once you have completed a step quite simply

play05:30

because when we make him do FIVE steps at the same time he gets confused so the best is to

play05:34

do a step afterward he waits for the OK or yes and each time we take it one step at a time so

play05:39

here I take this nugget I take it directly so I take this prompt I'm going to make a new prompt

play05:45

to show you how it happens I'm going to advanced data analysis I'm going to paste my

play05:50

prompt here so quite simply with the steps as required and I will add in file my

play05:56

deletion file designed as I said above he analyzes this deletion file designed

play06:02

there all I have to do is press Validate and you will see what will happen so there

play06:07

he will follow the steps one by one and he will wait for my ok after each step so I can

play06:14

already prepare so begin my my my ok I will simply put a sentence you can move on

play06:22

to the next step so so if there is a bug it's just because I have a plugin once again

play06:28

I can write OK to it or wait for what it will ask you if it asks you for a yes or something to be

play06:33

able to continue so here we really analyze what it does so here I received the

play06:38

XML file to begin with I will analyze the structure it's good at least he understands the content he sees

play06:42

he analyzes I managed to analyze the structure of the file XML here is an overview of what I found

play06:47

so there he found a very good link so there he found the the the sitemaps.org namespace the URLs the lastmod

play06:53

now that we have an idea of ​​the structure let's take the next step question mark there

play06:58

yes I tell him you can move on to the next step perfect now I'm going to extract the

play07:04

URLs which are in the lock tags of the XML file a moment please so

play07:09

there you see that allows him to do a pause each time to do one step by one and you

play07:14

will allow you to analyze how it worked in the sense that if tomorrow a ChatGPT 5

play07:20

comes out and which is ultra powerful you will have trained enough to know that uh action one

play07:26

plus action 2 plus action 3 plus action 4 so the prompts that were defined were

play07:30

well written you had worked on them but maybe with gpt5 you can make it do 15 steps

play07:35

d 'suddenly and that's where it allows you to prepare for war in a way

play07:42

so he tells me great I managed to extract 100 URLs from this XML file ok very good that

play07:48

suits you we go to the next step okay me that suits me 100 I just extracted 100 URLs in

play07:53

just 30 seconds that suits me so I'm going to the next step to the next step

play07:59

so great I'm now going to create a file CSV and list all URLs in the column at one

play08:05

moment please so somehow I know it could do the five at once

play08:11

or maybe all four but it gave me some bugs I tested it before but

play08:15

just why I wanted to show you why to do it one by one because there you see

play08:21

it is only step 3 and it will offer me to download the CSV so I will be able to go

play08:27

directly to download this CSV it is not finished we are we are to agreement there he asks us to move on

play08:32

to the next step but as I am curious I am going to go and analyze what he did to me so

play08:37

I downloaded it so I come back to where I was and as I I've downloaded I'm going to go open it

play08:43

to see a little bit what it did to me like that URL list here it is I'm going to wait for what

play08:50

matters there I pulled out my column A as we can see all the URLs are there there are 1001 of them,

play08:56

I tested it several times before giving it to you so no there are even 1002, well whatever

play08:59

1001 because he didn't count the title at the start so very good it suits me perfectly

play09:04

I'm happy I can move on to the next step in the next step it's been twice that I'm

play09:08

wrong so I send the next step then logically we are at the step create a keyword

play09:15

so there in column B he must create a keyword for each of the links so we

play09:19

will see if he will respect what we asked him and as you can see it is really it is

play09:25

it is really impressive the fact of, as I say each time, doing iterations

play09:30

if you have a request that is too complicated to be able to break it down into several steps simplifying the

play09:35

work in GPT chat there I'm not saying you are an expert in data analysis data na it's

play09:40

a simple job for him if you detail it directly to him as required step one 2 3 4 5 you

play09:45

don't need to bother yourself go tell him you are an expert in data analysis of

play09:50

world renown for 25 years it's useless there step an analysis this structure this structure as it should

play09:56

1 2 3 4 5 everything is understandable there are no words which are useless and that's it so there uh for

play10:02

column B I'm going to create a keyword in link with each URL so he understood very well

play10:06

eh we can read there he gives us some perfect examples here is a little preview it's always good

play10:10

to have little glimpses besides what you can do is is that in your

play10:14

initial prompt you could add uh add me little previews each time like that at least

play10:19

you see the structure and you are ready you don't need to download your file and go

play10:24

open it and to go check so ok he created the files for me I'm happy we move on to

play10:29

the next step so we continue the iterations once again there we see keyword homepage keyword

play10:37

chat GPT keyword iOS keyword up so there it is isn't great at all just now he did things

play10:42

a little better for me so we're definitely going to give him an iteration at the end if if I see that it's not

play10:46

it's not great at all which I 'did uh because there he didn't respect me too much so

play10:52

I'll do I'll do an iteration at the end I'll ask him to modify so that's the advantage

play10:55

he is that once your steps one 2 3 4 are well done and it's only step 5 which has

play10:59

messed up a little well there you are going to use directly well a prompt, finally a prompt, to

play11:07

only make a modification in a column and there you see you see how he speaks to me yy it's done

play11:12

I created title tags ok so he made me title tags he added open a master it's

play11:20

perfect like that I'm going to show you exactly what I didn't want let him do it to me he did it to me

play11:23

earlier too and and you would have I could have told him also don't include the word open ai master

play11:29

which is of course the name of the site of this competitor openmaster. com I could very well have

play11:34

told him not to include it and because there it wasted directly on the characters

play11:40

so as I asked for 65 maximum for the TITLE tag for SEO there he gave me some that go over by

play11:44

a good ten so there it is working I will now save everything

play11:49

that in a CSV file for a moment OK and now download the CSV file we are going to download it we are going to

play11:57

come and open it so the very last one and we are going to browse and we are going to integrate it ok we are going to wait we are going to

play12:06

see what we have did we do the job well so there is a bug I

play12:09

think it's because I imported too many files it makes me re-import

play12:13

the old one so instead of opening I'm going to import it has already done it to me several times

play12:19

so here I import the data and I open ok so as we can see he did it to me he didn't

play12:27

do something great to me I'm going to do it for you show exactly what I did earlier when

play12:31

I reworked it, it was really perfect I'm going to show you the example from earlier as

play12:35

you can see there's a lot of iteration I I did a lot of tests where it was not bad at all

play12:40

it was here I think here it was not bad this is what came out to me how to use being a as

play12:48

I said earlier more afterwards directly he had removed open a master so it wasn't bad at

play12:53

all so I'm not happy with his result and I'm going to tell him the keywords don't match

play13:00

there the keywords in column B don't match the URL and and he did

play13:07

anything to me in in in column C ok so I'm going to write it here like this because

play13:14

with the plugin bug then the keywords in column B don't match the URLs in the

play13:26

column A corrects this then the title tags are not at all correlated with column

play13:42

A and B of the CSV file don't write again at least improve the CSV and delete the word open ai

play14:00

master point here's my little prompt here I'm going to send him that and we're going to send it so I think

play14:08

it's good in fact he did it for me perfectly just now I thought my prompt was

play14:11

perfect and then he made me a mistake so so much the better that way at least it allows me to see

play14:15

what I could have improved perhaps be a little more precise because at one point I tell him

play14:19

I think I know where it comes from it's at one point I talk to him about column A and the column B

play14:24

here and I didn't tell him what it was in column A and what it was

play14:28

in column B so he must have mixed it up at that time I think so we'll see what

play14:32

he's going to do to us I'm really sorry for the mistake to correct this I'm going to take a

play14:36

new approach to extract more relevant keywords directly from the URLs then I'm

play14:40

going to create here he is telling me a little about his life I'm not interested there in a few seconds he

play14:46

extracted directly this best keyword it's the home page so there he did to me

play14:53

what uh specific word keyword how to use ah there you go there it's better there it got me a keyword

play14:59

exactly as I wanted what is gpt that's it even if it gives me the URL it's that's exactly

play15:06

it it's perfect it's the keyword c 'is a keyword of approximately medium long term so it's very

play15:09

good now let's go through the creation of the steps of Balis title so there because I think that because

play15:17

we have already told it to do iterations one by one well there he has he has he hasn't mixed everything

play15:24

he hasn't done both at once he did first the key and then he does the URLs so that's it's

play15:30

really it's really not bad I don't think that's it because

play15:34

I gave it iterations like that but I find that it's clear so here we look at the title

play15:40

tag it's the home page so that's logical that he has he in addition he understood that this is the

play15:43

home page he is smart next title tag how to use Bing on PC so the guide a

play15:50

perfect guide it's good we'll see we'll see if it will suit us often repeat the word guide information I

play15:55

think it will often repeat the same structure but it doesn't matter afterwards either we

play16:00

iterate or because there are already because there are 1000 1000 lines so it is already a lot for

play16:04

him and he risks doing repetitions often, I will now save these improvements

play16:10

ok very good then I wait and there it is done I saved the file very well now

play16:19

uh can you give it to me please I will download it I will import a new file

play16:27

so the last hop and normally it should be much better than before and even especially for the

play16:34

title tags we will no longer have open ai master so that's very good so I'm going to scroll through the links

play16:40

well not to the end because we don't care a bit there the titles are very good he recovered

play16:44

in fact he didn't get too bored he just recovered the the how to say the

play16:49

slugs the URLs and he put them there so it's very good and then as for the

play16:54

title tags it's not crazy at all but yeah it's not crazy at all but it can still do it if

play17:02

there are still a lot of repetitions I think that maybe delete guide information all that

play17:06

or ask him to iterate but in itself in 5 minutes you did what a person would have

play17:12

spent perhaps whole days doing by hand it would have taken you three days

play17:18

if this person is efficient so that's it and then after that I'll let you

play17:24

learn automation thanks to my training my videos which will allow you thanks to this

play17:29

data to this data I think I will make a microtraining one once you have this now

play17:34

you have to prepare a diabolical scenario which will allow you to rewrite each of these contents

play17:40

via this keyword via this title tag and to spin all these elements therefore to make a kind of

play17:46

spinning content in gpt4 by being undetectable or why not write in another language in

play17:51

another country make a point is a point be in another language which allows you to place yourself

play17:56

on countries which are less developed like uh uh finally less developed than France that the

play18:01

United States which everyone aims for every time there are also other countries which are interesting

play18:05

so there I leave you with that I hope you liked it little blue thumb it's a pleasure uh

play18:10

you subscribe you share with your friends who are in SEO in writing or in scraping

play18:15

or even guys who are starting to know a little bit about GPT chat and its screws because this

play18:20

video is very technical and it can help you to iterate to do this even on

play18:24

several sites in fact I am preparing quite interesting videos where we will try to scrape

play18:29

at least I will try to scrape several sites at the same time to make combinations which

play18:32

are quite, how to say, evil, which will allow spinning content which is already spun so

play18:37

these are words that I use a little a little SEO a little blackhat and well it's not too too blackhat there

play18:42

but for the moment here we say see you soon it was my ID for an upcoming Ciao video


Related Tags
Web Content, SEO, Content Spinning, Automation, Site Analysis, WordPress, GPT, Copyright, Efficiency