Why I'm placing a lot more focus on learning Python... and how I'm doing it

Enterprise DNA
16 Jun 2024 · 18:05

Summary

TL;DR: In this video, Sam emphasizes the growing importance of enhancing data skills, particularly in Python, because of its prevalence in AI systems and automations. He shares why he has been learning Python for its versatility and the value it adds, and recommends it to others. Sam demonstrates using a Google Colab notebook for data analysis, showing how to load data, produce statistical summaries, and visualize insights with Python libraries. He also discusses how AI assistance makes Python easier to learn, how to evaluate errors, and why understanding Python matters for grasping AI automation workflows.

Takeaways

  • 🚀 The importance of improving data skills, especially in Python, is highlighted as crucial for personal value and AI automations.
  • 🐍 Python is emphasized as a versatile skill that is integral to many AI systems and automations currently being developed.
  • 💡 The speaker has been diving deeper into Python recently, recognizing its potential for increasing personal value and enabling more capabilities.
  • 🔍 Google Colab notebooks are introduced as a tool that can be used without extensive Python knowledge, focusing on how to operate and use them.
  • 📈 The speaker demonstrates how to use Google Colab for simple data analysis, showcasing its ease of use and intuitive interface.
  • 📊 The script includes a practical example of loading data into Google Colab, producing summary statistics, and visualizing data, emphasizing the efficiency of Python for data analysis.
  • 📚 The use of AI systems like ChatGPT to generate random datasets for analysis is mentioned, highlighting the utility of AI in data exploration.
  • 📉 The script discusses the benefits of using tools like Google Colab for data exploration before moving on to more detailed analysis in other platforms like Excel or Power BI.
  • 🔍 The speaker shares tips on using Google Colab, including error evaluation and code explanation features, to enhance learning and debugging.
  • 🔄 The process of debugging and correcting code in Google Colab is demonstrated, showing that learning and mastering Python involves repetition and problem-solving.
  • 🌐 The speaker concludes by expressing a commitment to mastering Python and AI automations, and plans to share more content on this journey.

Q & A

  • What is the main focus of the video script?

    -The main focus of the video script is to emphasize the importance of improving data skills, particularly with Python, due to its versatility and relevance in AI systems and automations.

  • Why is Python considered a versatile skill in the context of AI?

    -Python is considered versatile in AI because many AI systems and automations are developed and tested using Python code, making it crucial for understanding and interacting with these technologies.

  • What is the speaker's recommendation for those who haven't used Python before?

    -The speaker recommends diving into Python, exploring its capabilities, and learning how to write and execute code, especially in the context of AI agents and automations.

  • What is a Google Colab notebook and how does it relate to Python coding?

    -Google Colab (short for Colaboratory) is a tool that allows users to write and execute Python code in the browser. It is presented as a way to operate and use Python without needing extensive coding knowledge, making it accessible for data exploration and analysis.

  • What are some benefits of using Google Colab notebooks for data analysis?

    -Google Colab enables quick data exploration and analysis, producing summary statistics and other insights with minimal code. It can be a useful step before creating more detailed reports or analyses.

  • How does the speaker use AI to generate a random dataset for analysis?

    -The speaker uses ChatGPT to generate a random dataset based only on the column structure, then shares just an abstract of the data and its columns with an AI chat experience for analysis suggestions. This approach avoids data security issues while still allowing for meaningful analysis.

  • What is the significance of analyzing summary statistics in the given dataset?

    -Analyzing summary statistics like mean, total, and average values provides a quick overview of key attributes in the data, such as passengers, distance traveled, stops, and fuel consumption, which can guide further detailed analysis.

  • How does the speaker plan to showcase the use of Python and AI in data analysis?

    -The speaker plans to showcase the use of Python and AI in data analysis by building simple notebooks in Google Colab, demonstrating how to load data, perform calculations, and visualize results.

  • What is the speaker's approach to learning Python and AI for data analysis?

    -The speaker's approach involves diving into new tools and methods, using AI systems to generate code and insights, and learning through repetition and practical application, such as building notebooks and analyzing data.

  • Why is the speaker interested in exploring the new framework called AutoGen from Microsoft?

    -The speaker is interested in AutoGen from Microsoft because it is a framework for AI agent workflows and automation, which are increasingly implemented in Python, and understanding these workflows enhances one's ability to work with AI systems.
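
The video doesn't show AutoGen code itself, but a rough sketch of the two-agent pattern it refers to might look like the following (pyautogen 0.2-style API; the model name, API key, and task message are placeholders, not details from the video):

```python
# Rough sketch only -- assumes `pip install pyautogen` (0.2-era API);
# the model name, API key, and task below are placeholders.
from autogen import AssistantAgent, UserProxyAgent

config_list = [{"model": "gpt-4", "api_key": "YOUR_API_KEY"}]

# The assistant proposes Python code; the user proxy executes it locally
# and feeds results (or errors) back, looping until the task is finished.
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

user_proxy.initiate_chat(
    assistant,
    message="Load transportation_data.csv and summarise fuel consumption per route.",
)
```

This write-code / run-code / feed-back-errors loop is the "automating analysis" behaviour described above, which is why familiarity with plain Python and pandas makes the agent workflows easier to follow.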

Outlines

00:00

🐍 Embracing Python for Data Skills

Sam introduces the importance of enhancing data skills, particularly with Python, due to its prevalence in AI systems and automations. He emphasizes Python's versatility and its potential to increase personal value. Sam also highlights the usefulness of Google Colab notebooks for those unfamiliar with Python coding, suggesting that one can operate and utilize them effectively without extensive coding knowledge. He plans to demonstrate how to use these tools for data exploration and analysis, starting with a simple notebook in Google Colab using a random transportation dataset generated through an AI chat experience.
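
As a minimal sketch of that starting point (this only runs inside Colab, and the CSV filename below is a placeholder rather than the video's actual file):

```python
# Run inside a Google Colab cell; the filename stands in for the
# randomly generated transportation CSV described above.
from google.colab import files
import pandas as pd

uploaded = files.upload()                    # opens a file picker in the browser
df = pd.read_csv("transportation_data.csv")  # upload() saves the file to the working dir

# One line for the summary statistics the video walks through:
print(df.describe())  # count, mean, std, min, quartiles, max for numeric columns
```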

05:02

📊 Analyzing Transportation Data with Python

Sam demonstrates how to use Python and Google Colab to analyze a transportation dataset. He shows how to load data into a DataFrame and calculate summary statistics such as total passengers, distance traveled, stops, and fuel consumption. He also explains how to calculate the average number of passengers per route and the total distance traveled per route using Python's Pandas Library. Sam further illustrates how to visualize data, such as total passengers per month and year, and total fuel consumed per month, using column charts. He emphasizes the ease of use and intuitiveness of Google Colab for building analyses and the importance of learning through trial and error.
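
A small self-contained sketch of those steps, using synthetic rows and assumed column names in place of the video's randomly generated dataset:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Synthetic stand-in for the transportation DataFrame; column names are assumed.
df = pd.DataFrame({
    "Date": pd.date_range("2023-01-01", periods=120, freq="D"),
    "Route_ID": ["R001", "R002", "R003", "R004"] * 30,
    "Passengers": [48, 61, 39, 55] * 30,
    "Distance_km": [27.5, 31.2, 18.4, 22.9] * 30,
})

# Per-route aggregations shown in the video (via the pandas groupby API).
avg_passengers_per_route = df.groupby("Route_ID")["Passengers"].mean()
total_distance_per_route = df.groupby("Route_ID")["Distance_km"].sum()
print(avg_passengers_per_route)
print(total_distance_per_route)

# Column chart of total passengers per month and year.
monthly = df.groupby(df["Date"].dt.to_period("M"))["Passengers"].sum()
monthly.index = monthly.index.astype(str)  # "YYYY-MM" labels on the x-axis
monthly.plot(kind="bar")
plt.xlabel("Month and year")
plt.ylabel("Total passengers")
plt.title("Total passengers per month and year")
plt.tight_layout()
plt.show()
```

Grouping by `Date.dt.to_period("M")` is one straightforward way to get month-and-year buckets on the x-axis, which is the adjustment the speaker later asks the formula fixer to make.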

10:03

🔍 Deepening Insights with Data Visualization

Continuing the data analysis, Sam discusses the process of visualizing total fuel consumption per month in a column chart, highlighting the need to adjust the format to display months and years side by side. He also touches on the use of error evaluation tools to understand and correct issues in the code. Sam then moves on to more complex analysis, such as investigating correlations between fuel consumption and distance traveled, using a heatmap. He stresses the importance of learning Python and understanding AI agent workflows, particularly with the new framework called AutoGen from Microsoft.
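
A hedged sketch of that correlation-heatmap step, again on synthetic data with assumed column names (seaborn is used here for the heatmap, which may differ from whatever library the generated code actually used):

```python
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Synthetic stand-in data; column names are assumptions, not the video's schema.
rng = np.random.default_rng(42)
distance = rng.uniform(5, 60, size=200)
df = pd.DataFrame({
    "Distance_km": distance,
    "Fuel_Consumption_L": distance * 2 + rng.normal(0, 5, size=200),
})

# Correlation matrix between the two measures, rendered as a heatmap.
corr = df[["Fuel_Consumption_L", "Distance_km"]].corr()
sns.heatmap(corr, annot=True, cmap="coolwarm", vmin=-1, vmax=1)
plt.title("Correlation: fuel consumption vs distance traveled")
plt.tight_layout()
plt.show()
```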

15:03

🚀 Advancing Skills with Python and AI

In the conclusion, Sam wraps up the session by emphasizing the need to master Python and AI workflows, particularly as they relate to automating analysis. He mentions his intention to produce more content on this topic, encouraging viewers to stay tuned. Sam reflects on the process of learning through repetition and the importance of understanding the automations happening behind the scenes in AI systems. He invites viewers to join him on this journey of mastering Python and AI, promising to share his learning experiences along the way.


Keywords

💡Data Skills

Data skills refer to the ability to work with, analyze, and interpret data. In the video, the speaker emphasizes the importance of improving data skills in the current era, particularly with the rise of AI systems and automations. The speaker suggests that these skills are crucial for personal growth and value addition in the professional sphere.

💡Python

Python is a high-level programming language known for its simplicity and versatility. It is widely used in AI systems and automations, as mentioned in the script. The speaker has been focusing on enhancing their Python skills to stay relevant in the tech industry, highlighting its significance in the development and testing of AI technologies.

💡AI Systems

AI Systems, or Artificial Intelligence Systems, are computerized systems that perform tasks typically requiring human intelligence, such as learning, problem-solving, and decision-making. The script discusses how many AI systems are being developed and tested, and they predominantly use Python for their operations, making Python a vital skill for those working with AI.

💡Google Colab Notebook

Google Colab, short for Colaboratory, is a free cloud service for machine learning education and research. It allows users to write and execute Python code in a notebook format. The speaker mentions using Google Colab to perform data analysis, emphasizing its ease of use and the ability to operate it without extensive Python coding knowledge.

💡Data Analysis

Data analysis is the process of inspecting, cleaning, transforming, and modeling data to extract useful information, draw conclusions, and support decision-making. In the video, the speaker demonstrates how to perform simple data analysis using Python and Google Colab, showcasing the power of these tools in quickly generating insights from data.

💡Pandas Library

The Pandas Library is a Python package for data manipulation and analysis. It provides data structures and operations for manipulating numerical tables and time series. The speaker uses Pandas in the script to load data into Google Colab, perform statistical analysis, and calculate summary statistics, demonstrating its utility in data analysis tasks.

💡Summary Statistics

Summary statistics are quantitative measures that summarize a data set, such as the mean, median, mode, and standard deviation. In the script, the speaker uses Python code to quickly generate summary statistics for their transportation data set, illustrating how these statistics can provide a quick overview of the data's key attributes.

💡Data Frame

A data frame (a pandas DataFrame) is a two-dimensional data structure used to store and manipulate tabular data. The speaker mentions creating a DataFrame from the transportation dataset in Google Colab, which serves as the foundation for further analysis and manipulation of the data.
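
For illustration only, a tiny DataFrame built from a couple of made-up rows with hypothetical column names, which is incidentally the same structure-only kind of sample the speaker suggests sharing with an AI system instead of real data:

```python
import pandas as pd

# Two made-up rows with hypothetical column names -- enough to convey the
# structure of the transportation data without exposing anything real.
df = pd.DataFrame({
    "Route_ID": ["R001", "R002"],
    "Passengers": [48, 52],
    "Distance_km": [27.5, 31.2],
    "Stops": [14, 18],
    "Fuel_Consumption_L": [54.0, 58.5],
})
print(df)
print(df.dtypes)
```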

💡Visualization

Visualization refers to the graphical representation of data and information. The speaker discusses using visualization libraries in Python, such as Matplotlib, to create visual representations of their data, such as column charts, to better understand patterns and trends.

💡Correlation

Correlation is a measure that expresses the extent to which two variables are linearly related. In the script, the speaker investigates the correlation between fuel consumption and distance traveled using a heat map, demonstrating how such analysis can reveal relationships within the data.
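
If only the underlying number is needed rather than a chart, pandas can return the Pearson coefficient directly; the column names and values below are made up for illustration:

```python
import pandas as pd

# Hypothetical column names and values; any two numeric columns work the same way.
df = pd.DataFrame({
    "Distance_km": [27.5, 31.2, 18.4, 22.9, 40.1],
    "Fuel_Consumption_L": [54.0, 58.5, 39.2, 47.8, 77.3],
})

# Pearson correlation coefficient between the two series.
r = df["Fuel_Consumption_L"].corr(df["Distance_km"])
print(f"Correlation between fuel consumption and distance: {r:.2f}")
```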

💡Error Evaluation

Error evaluation is the process of identifying and understanding errors in code or analysis. The speaker mentions using error evaluation tools to quickly diagnose and resolve issues in their Python code, such as key errors, which is crucial for effective data analysis and debugging.
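
A minimal sketch of the kind of KeyError hit in the video, where generated code assumed column names that don't exist in the DataFrame (all names below are hypothetical):

```python
import pandas as pd

# A DataFrame whose real column names differ from what the generated code expects.
df = pd.DataFrame({"route_id": ["R001"], "fuel_consumed": [54.0], "distance_travelled": [27.5]})

expected = ["Fuel_Consumption_L", "Distance_km"]  # names the generated code assumed
try:
    subset = df[expected]
except KeyError as err:
    # Roughly the diagnosis the error-evaluation tool produced:
    # the data is there, but the columns are named differently.
    print(f"Column lookup failed: {err}")
    print("Available columns:", list(df.columns))
```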

Highlights

The importance of improving data skills, especially in Python, due to its use in AI systems and automations.

Python's versatility as a skill that can increase personal value and enable more capabilities in AI.

The recommendation for beginners to start learning Python for its role in AI and automation.

The capabilities of Python in writing, executing code automatically, and its use in loops for AI.

The significance of understanding Python for future possibilities in AI development.

Google Colab notebooks as a tool for data analysis without extensive Python knowledge.

The ease of using Google Colab for creating code, understanding it, and resolving errors.

The benefits of using notebooks for quick data exploration before creating reports or analyses.

A demonstration of loading data into Google Colab and performing summary statistics with Python.

The simplicity of obtaining summary statistics on key data attributes using one line of Python code.

The process of calculating the average number of passengers per route using Python's Pandas Library.

A tip on using the 'code explainer' feature in Google Colab for understanding specific code segments.

The creation of a column chart to visualize total passengers per month and year using Python.

Adjusting code to display data in the desired format, such as month and year on the x-axis.

Using error evaluation tools to quickly understand and resolve Python code errors.

Investigating correlations between fuel consumption and distance traveled with Python.

The use of a heat map to visualize the correlation between two data variables.

The speaker's commitment to mastering Python for understanding AI agent workflows and automating analysis.

Plans for releasing more content on Python and AI to help others learn and master these skills.

Transcripts

play00:01

Hey everyone, Sam here. What I think is becoming more important now than ever is how to improve your data skills. One of the things I've been diving into a lot more in recent months is Python, my Python skills. One of the reasons why is because a lot of the AI systems and AI automations being developed and tested right now are all being run with Python code. I realised this a while ago, and I've been diving into it a lot more than I ever have in the past, because I think it's an incredibly versatile skill that is going to enable you to personally increase your value and also do a lot more. That is absolutely true in my mind. So I've been expanding my horizons a lot recently, and I really believe you should as well. If you haven't done much with Python previously, I really recommend it. I've been diving into a lot of brand-new things that you can do with AI, particularly with AI agents, and it's all happening in Python, everything: the ability to write code, execute code automatically, do this on loops. It's incredible what is possible now and what is going to be possible in the future. I don't know exactly the direction it's going to go, but all I know is that a lot of this is being done within Python, and having a really solid understanding of how Python works is crucial.

play01:35

One thing I'll say before we dive into this Google Colab notebook is that you can do a lot without knowing how to write Python code. All you need to know is more like how to operate it, how to use it. There's a whole range of tools now that can help you create the code, understand the code, understand errors, a whole range of things, and that's what I'm planning on showcasing to you a bit more today.

play02:04

Okay, how do we get started? I'm just going to do some simple analysis; we're going to build a simple notebook in Google Colab. The reason we're doing a notebook here, and this is a bit different to, say, doing something in Excel or Power BI, is that there are some real benefits to using these tools even if they're not the end product you want to create. With the help of AI they let you do a lot of exploration around your data quite quickly, which I find very useful. So maybe an hour or two spent in here before you actually do anything in a report or other analysis can, I think, make a big difference.

play02:49

Okay, so what I did: we've got some transportation data here. I just created this; it's a totally random dataset, generated initially through ChatGPT a couple of weeks ago. To start off my analysis I just put the data into our Enterprise DNA AI chat experience. This is our own one, the one we've created, but you can put this into anything — ChatGPT, Gemini, Copilot — you can put this sort of thing anywhere. I'm using our environment because I like it; I built it exactly how I want to use it.

play03:24

So I just gave it an abstract of the data. I didn't actually give it too much; I just said, here are the columns in my dataset, this is not the actual data, I want you to give me analysis. The reason I'm doing it like this is because I know there are a lot of data security issues: no one wants you to put your entire dataset into these AI systems. Well, just put a subset in, like one or two rows, at least give it the data structure. I don't see any data security issues with that, and you can still do a lot, or learn a lot, on data which is relevant to you. If you really can't even put in a row or two, just put the columns in, get a random dataset created by ChatGPT around those columns, and then put that into the system. It's still going to give you the right code and the right abilities to find good analysis.

play04:18

Okay, so here's some analysis based on your dataset: route performance. I'm just going to go through a few of these to create some analysis. Actually, what I've already done is load the data into Google Colab, so this is some code that enables you to do that right here — let me just make this a little bit bigger. I've loaded it in, I've taken the file from my computer and put it into the Google Colab environment, it's built what's called a DataFrame around that, and then I've already run some statistics, these summary statistics. These can take some time to do elsewhere, but literally with one line of code here, which I just got out of our AI system, it gave me all the summary statistics on a whole range of key attributes of the data: total passengers, distance traveled, how many stops, fuel consumption. So if we just look at the mean, each route has on average 50 passengers, 27 km is the average distance traveled, average stops is 16, and the average fuel consumption is 55 litres by the looks of it; I think that's how we can read it. So, quick and easy to get that information. I might actually delete some of this stuff and we'll just start from scratch.

play05:44

Okay, Google Colab is really easy to use; it's so intuitive. You're literally just building one piece of analysis on top of the other, and you can do that by using code here. You can also add text very easily, so: "above shows the summary stats of our transportation data" — simple things like that. Then I can just click this and that embeds it in, and then I can put some more code below. I'll add "the data frame is already available as df", so I'm telling it that we've already got a DataFrame with the variable df and not to recreate anything. Okay: "Sure, you can calculate the average number of passengers per route using Python's — the Pandas library, sorry — here is the code snippet." Average passengers per route. So let's copy this across; all I have to do is copy this in here. Cool, so there are a lot of routes, so this is just giving us an idea — about 100. Let's move on to the next one: determine the total distance traveled per route, same thing. "To determine the total distance traveled per route in Python, use the Pandas library." So again, let's just come in here — total distance per route — and then I can just hit play. Cool.

play07:47

Here's a little tip for you as well, a cool thing that we have embedded in here: you can click once and go to the code explainer. If you want a little bit more detail — see, it's not very descriptive here — I can open the code explainer like this, it will paste that code in automatically, and then I just run it and it gives me more detail about the specific code. I could also copy and paste this, by the way, but we try to make it really quick. So: calculating total distance traveled per route — the code provided assumes the library is installed, the code groups the DataFrame by the column route ID, this operation creates groups. Cool. This is how we can learn quite quickly, and it's so powerful. Before we even had GPT you would just be stuck in forum hell trying to learn all this stuff; it's crazy how much quicker it is now.

play08:50

Okay, now I want to calculate something quite simple. I just want to show, in a column chart, the total passengers per month and year. Let's see what it comes up with. Okay, so matplotlib is going to be our visualization library, assuming the date is in datetime format, yep. Let's give this a whirl and see what it comes up with. Okay, it's kind of interesting, but I actually wanted it by month and year. Anyway, it's still giving us something; we just need to adjust the insight a little bit. This is not going to be some sort of final product I show anyone, so I'm just going to leave it as is for now and find some other insights.

play10:02

Can you now show me the total fuel consumed per month in a column chart? Please show each unique month side by side. I think it's done exactly the same thing, but let's just have a look. Cool. Okay, so it's a little bit annoying that it's not in the format I want, but if we do find this, what you can do is change the code around. Actually, let's use the formula fixer. I'm going to type in the code and say: I want to show this as month and year, with each column next to each other and the same size, and the x-axis to be month and year. Let's see what it comes up with — these are just some simple tools that we've created that you can use. Here's the corrected code, and let's see what this comes up with. Could you imagine me writing all this code out? It would be terrible. Okay, cool, finally got what we wanted.

play12:07

So, just a few quick things you can use: if I want to learn a little bit more about this, I can quickly come to the code explainer, boom, like that, and then I'll get a detailed description of how this was actually created. This is how I am learning Python, whether it's just simple analysis or not. I'm literally plugging myself into all sorts of new ways I can learn, and this just gives me a whole new dimension to how fast I can learn things.

play12:42

Okay, now what can we round off with? Let's do something a little bit more complex: investigate correlations between fuel consumption and distance traveled. I want code to complete this analysis, please. Okay, so just popping this in here... aha, we've got an issue. Here's a little tip for you, this is what I've been doing: I can go to error evaluation. You can pop this into any of these systems, but this is a new tool that I created just so I can quickly try to understand what these errors are, because what you find is that these error messages are just hopeless; they're so hard to understand. I mean, just look at that, it's confusing, right? So we're trying as best we can to give simple explanations: corrected code with comments. Okay, let's just try out this new code. Hmm... okay, I didn't actually load the data in, did I?

play14:38

Okay, while that's running, let's have a look here. So yeah: ensure the DataFrame is loaded and contains the columns. Yeah, okay, there we go. So this is just a little bit of debugging that we need to do. This is where you can get a little bit stuck, but honestly, trust me, when you're using Google Colab, working through errors, like what I've just done, is part of the process. Nothing ever works first go around. So let's just try to understand what's going on here: the user encountered a KeyError when trying to create a subset of the DataFrame. It's probably because we haven't actually selected columns... yeah, okay, the error message says that the columns mentioned do not exist. It's because they're actually named differently, that's why. Okay: please redo with the correct column names from below. That must be why. Let's have a look. Okay, cool, so it's just creating a heatmap to see whether there is a correlation between distance traveled and fuel consumption. Yes, there is — it would be around zero or negative if there wasn't one. Okay, cool.

play17:04

Right, I'm going to wrap up, and I'll do more of these; I'm going to start doing a lot more of these, because there's a reason behind this. As I've been digging into this new framework called AutoGen from Microsoft, so much of the AI agent workflow of automating analysis is being done with Python. The more we become familiar with how this actually works, the more we will be able to understand what the automations are doing behind the scenes. That is the big reason why I'm diving into this more and more. I'm really testing myself, I'm doing things that I haven't really done a lot of before, but I'm going to master them by repetition, and I'm going to show you how I'm doing it along the way. Okay, I'm going to wrap up; I'll be putting out a lot more content about this and around this, so keep a watch out, and I'll talk to you again soon. See you later.


Related Tags
Python Skills, AI Automation, Data Analysis, Google Colab, Pandas Library, AI Agents, Code Execution, Transportation Data, Statistical Insights, Correlation Analysis